
EP-4742159-A1 - METHOD AND COMPUTING APPARATUS FOR DETERMINING ADJUSTED CALIBRATION INFORMATION

EP 4742159 A1

Abstract

A method and a computing apparatus (110) for determining adjusted calibration information for a system (100) are disclosed. The computing apparatus (110) obtains (A110) preliminary calibration information for the system (100), wherein the preliminary calibration information indicates a preliminary three-dimensional rotation for the first image sensor (121) and the second image sensor (122) of the system (100), and obtains (A120) a plurality of point pairs, each point pair representing a respective observation of a detected object (130) projected onto a respective two-dimensional image plane of each of the first image sensor (121) and the second image sensor (122). The computing apparatus (110) determines (A130) the adjusted calibration information for the system (100) by solving (A140) an optimization problem whose objective function is dependent on the adjusted calibration information, the objective function being formulated in terms of a reprojection error and a difference measure between triangulated distances and known distances.

Inventors

  • HUGMARK, Joakim
  • GÖTBERG, Per Thomas Olof

Assignees

  • Topgolf Sweden AB

Dates

Publication Date
2026-05-13
Application Date
2025-11-12

Claims (15)

  1. A method, performed by a computing apparatus (110), for determining adjusted calibration information for a system (100), wherein the system (100) comprises the computing apparatus (110), a first image sensor (121) and a second image sensor (122), wherein each image sensor of the first image sensor (121) and the second image sensor (122) is configured to capture respective two-dimensional information representing a detected object (130) travelling through a three-dimensional space within a respective field of view of each image sensor, wherein the adjusted calibration information indicates an adjusted three-dimensional rotation of each of the first image sensor (121) and the second image sensor (122) in relation to a fixed coordinate system (200) related to the three-dimensional space, wherein at least one calibration object (140) is located within the respective field of view of the first image sensor (121) and within the respective field of view of the second image sensor (122), wherein the method comprises: obtaining (A110) preliminary calibration information for the system (100), wherein the preliminary calibration information indicates a preliminary three-dimensional rotation for the first image sensor (121) and the second image sensor (122) in relation to the fixed coordinate system (200); obtaining (A120) a plurality of point pairs, wherein each point pair of the plurality of point pairs represents a respective observation of the detected object (130) projected onto a respective two-dimensional image plane of each of the first image sensor (121) and the second image sensor (122); and determining (A130) the adjusted calibration information for the system (100) by solving (A140) an optimization problem having an objective function that is dependent on the adjusted calibration information, wherein the objective function is formulated in terms of a reprojection error with respect to the plurality of point pairs, wherein the objective function further is 
formulated in terms of a difference measure, expressed in the fixed coordinate system (200), between triangulated distances and known distances, wherein the triangulated distances span between each of the first image sensor (121) and the second image sensor (122) and each of the at least one calibration object (140), and wherein the known distances span between each of the first image sensor (121) and the second image sensor (122) and each of the at least one calibration object (140).
  2. The method according to claim 1, wherein the difference measure is a dimensionless, relative measure between the triangulated distances and the known distances.
  3. The method according to claim 1 or 2, wherein the reprojection error is measured in units, such as pixels, of the image plane.
  4. The method according to any preceding claim, wherein an initial estimate for the optimization problem is based on the preliminary calibration information.
  5. The method according to any preceding claim, wherein the solving (A140) of the optimization problem comprises iteratively performing (A150) a set of actions including: triangulating (A151) an updated three-dimensional position of each point pair of the plurality of point pairs using the first and second image sensors (121, 122) and the adjusted calibration information; determining (A153) an updated reprojection error for the detected object (130) corresponding to each point pair; and optimizing (A155) the objective function to obtain the adjusted calibration information.
  6. The method according to claim 5, wherein the adjusted calibration information comprises an updated value for a relative three-dimensional rotation of the second image sensor (122) in relation to the first image sensor (121), and wherein the method comprises: determining (A160) an updated value for a three-dimensional rotation of the first image sensor (121) in the fixed coordinate system (200) by solving (A164) an equation system defined by relating a known three-dimensional position of the first image sensor (121) in the fixed coordinate system (200) to a known three-dimensional position of the second image sensor (122) in the fixed coordinate system (200) and to the updated value for the relative three-dimensional rotation of the second image sensor (122) in relation to the first image sensor (121); and adjusting (A168) a measure of a rotation of the first image sensor (121) about a first axis based on a difference between a known position in the fixed coordinate system (200) of the at least one calibration object (140) and a triangulated position of the at least one calibration object (140), wherein the triangulated position is determined using the first and second image sensors (121, 122), e.g., the locations thereof, optionally wherein the first image sensor (121) and the second image sensor (122) are arranged at a horizontal distance from each other, and wherein the adjusting (A168) of the measure of the rotation of the first image sensor (121) is an adjustment of a measure of a tilt of the first image sensor (121).
  7. The method according to any one of claims 5-6, wherein the optimizing of the objective function comprises minimizing the objective function to obtain the adjusted calibration information, optionally wherein the objective function is a loss function.
  8. The method according to any preceding claim, wherein the preliminary calibration information comprises a preliminary measure of the preliminary three-dimensional rotation of each of the first and second image sensors (121, 122) in relation to the fixed coordinate system (200), and wherein the method comprises: determining (A170) the preliminary measure of the three-dimensional rotation of each of the first and second image sensors (121, 122) by determining (A180) respective coordinates in the respective two-dimensional image plane of each of the first and second image sensors (121, 122) for each of the at least one calibration object (140); solving (A190) an equation system defined by relating the respective coordinates to a rotational and translational three-dimensional transformation of the second image sensor in relation to the first image sensor to obtain a value for a relative three-dimensional rotation of the second image sensor in relation to the first image sensor; and obtaining (A200) a value for a three-dimensional rotation of the first image sensor in the fixed coordinate system (200) by solving an equation system defined by relating a known three-dimensional position of the first image sensor in the fixed coordinate system (200) to a known three-dimensional position of the second image sensor in the fixed coordinate system (200) and to the updated value for the relative three-dimensional rotation of the second image sensor in relation to the first image sensor; and adjusting a measure of a rotation of the first image sensor about a first axis based on a difference between a known position in the fixed coordinate system (200) of a calibration object of the at least one calibration object and a triangulated position of the calibration object, the triangulated position being determined using the first and second image sensors (121, 122).
  9. The method according to any preceding claim, further comprising: determining (A105), in the fixed coordinate system (200), a three-dimensional location of the first image sensor (121) and a three-dimensional location of the second image sensor (122), optionally wherein the determining (A105) of the three-dimensional location of the first and second image sensors (121, 122) is performed using an electronic distance measurement device.
  10. The method according to claim 9, wherein the adjusted calibration information comprises: an updated value for the three-dimensional location in the fixed coordinate system (200) of the first image sensor by relating a point on the first image sensor (121), the three-dimensional location in the fixed coordinate system (200) of which is known, to an optical center of the first image sensor (121) using an updated estimate regarding a three-dimensional rotation in the fixed coordinate system (200) of the first image sensor (121), and/or an updated value for the three-dimensional location in the fixed coordinate system (200) of the second image sensor (122) by relating a point on the second image sensor (122), the three-dimensional location in the fixed coordinate system (200) of which is known, to an optical center of the second image sensor (122) using an updated estimate regarding a three-dimensional rotation in the fixed coordinate system (200) of the second image sensor (122).
  11. The method according to any preceding claim, further comprising: detecting (A210) a change in rotation of the first and/or second image sensor (121, 122), referred to as "present image sensor (121/122)", between a first captured image of the present image sensor (121/122) and a second captured image of the present image sensor (121/122), the second captured image being captured after the first captured image; determining (A220), using manual input or an automatic image correlation procedure and expressed in units of the image plane of the present image sensor (121/122), an offset between the first and second captured images; translating (A230) the offset into a corresponding rotation of the present image sensor (121/122); and updating (A240) the adjusted calibration information based on the corresponding rotation.
  12. The method according to the preceding claim, wherein the preliminary calibration information comprises a preliminary measure of the three-dimensional rotation of each of the first and second image sensors (121, 122) in relation to the fixed coordinate system (200), and wherein the method comprises: determining (A250) the preliminary measure of the three-dimensional rotation of each of the first and second image sensors (121, 122) by determining (A252) an expected location, in the image plane, of the at least one calibration object (140) with a known position in the fixed coordinate system (200), wherein at least one of the first and second image sensors (121, 122) has a known position in the fixed coordinate system (200); determining (A254), using manual input or an automatic image correlation procedure and expressed in units of the image plane, an offset of the at least one calibration object (140) based on an image captured by the present image sensor (121/122) and the expected location; translating (A256) the offset into a corresponding rotation of the present image sensor (121/122); and determining (A258) the preliminary measure of the three-dimensional rotation of the present image sensor (121/122) based on the corresponding rotation.
  13. The method according to claim 11 or 12, further comprising: determining (A260) an expected location, in the image plane, of at least one calibration object (140) with a known position in the fixed coordinate system (200), using a known position in the fixed coordinate system (200) of the present image sensor (121/122); detecting (A262), using manual input or an automatic image correlation procedure and expressed in units of the image plane, an offset of the at least one calibration object (140) based on an image captured by the present image sensor (121/122) and the corresponding expected location; triangulating (A264), using the offset, an updated three-dimensional position in the fixed coordinate system (200) of the at least one calibration object (140); and determining (A266) the adjusted calibration information based on the triangulated three-dimensional position in the fixed coordinate system (200) of the at least one calibration object (140) instead of a known 3D position in the fixed coordinate system (200) of the at least one calibration object (140).
  14. A computing apparatus (110) configured for determining adjusted calibration information for a system (100), wherein the system (100) comprises the computing apparatus (110), a first image sensor (121) and a second image sensor (122), wherein each image sensor of the first image sensor (121) and the second image sensor (122) is configured to capture respective two-dimensional information representing a detected object (130) travelling through a three-dimensional space within a respective field of view of each image sensor, wherein the adjusted calibration information indicates an adjusted three-dimensional rotation of each of the first image sensor (121) and the second image sensor (122) in relation to a fixed coordinate system (200) related to the three-dimensional space, wherein at least one calibration object (140) is located within the respective field of view of the first image sensor (121) and within the respective field of view of the second image sensor (122), wherein the computing apparatus (110) is configured to perform the method of any one of claims 1 to 13.
  15. A computer program product (905) for determining adjusted calibration information for a system (100), wherein the system (100) comprises the computing apparatus (110), a first image sensor (121) and a second image sensor (122), wherein each image sensor of the first image sensor (121) and the second image sensor (122) is configured to capture respective two-dimensional information representing a detected object (130) travelling through a three-dimensional space within a respective field of view of each image sensor, wherein the adjusted calibration information indicates an adjusted three-dimensional rotation of each of the first image sensor (121) and the second image sensor (122) in relation to a fixed coordinate system (200) related to the three-dimensional space, wherein at least one calibration object (140) is located within the respective field of view of the first image sensor (121) and within the respective field of view of the second image sensor (122), the computer program product (905) comprising a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by a computing apparatus (110) to cause the computing apparatus (110) to perform the method of any one of claims 1 to 13.
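To make the geometry referenced throughout the claims concrete (point pairs observed in two image planes, a triangulated three-dimensional position, and a reprojection error measured in units of the image plane), below is a minimal, generic sketch using linear (DLT) triangulation. This is an illustration of the underlying standard geometry, not the claimed method itself; all intrinsics, rotations, and sensor positions are hypothetical example values.

```python
import numpy as np

def pmat(K, R, C):
    """3x4 projection matrix P = K [R | -RC] for a sensor with
    intrinsics K, rotation R and center C in the fixed coordinate system."""
    return K @ np.hstack([R, (-R @ C).reshape(3, 1)])

def project(P, X):
    """Project a 3D point onto a sensor's 2D image plane (pixel units)."""
    p = P @ np.append(X, 1.0)
    return p[:2] / p[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point pair, i.e. one
    observation of the detected object in each image plane."""
    A = np.vstack([x1[0] * P1[2] - P1[0], x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0], x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]  # (near-)null-space vector of A
    return X[:3] / X[3]

def reprojection_error(P, X, x):
    """Pixel-space distance between an observed point and the
    projection of the triangulated 3D point."""
    return np.linalg.norm(project(P, X) - x)

# Hypothetical sensors: shared intrinsics, sensor 2 yawed by 0.05 rad
# and offset 1 m along the x axis (illustration values only).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
th = 0.05
R2 = np.array([[np.cos(th), 0.0, np.sin(th)],
               [0.0, 1.0, 0.0],
               [-np.sin(th), 0.0, np.cos(th)]])
P1 = pmat(K, np.eye(3), np.zeros(3))
P2 = pmat(K, R2, np.array([1.0, 0.0, 0.0]))

# Project a known 3D point into both image planes, then recover it
# from the resulting point pair.
X_true = np.array([2.0, 1.0, 10.0])
x1, x2 = project(P1, X_true), project(P2, X_true)
X_est = triangulate(P1, P2, x1, x2)
```

With noiseless synthetic observations the triangulated point matches the true point and the reprojection error is zero up to numerical precision; with real observations and imperfect calibration, the residual reprojection error is exactly the quantity the claimed objective function penalizes.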

Description

TECHNICAL FIELD

The embodiments herein relate to calibration of two cameras having at least partially overlapping fields of view. In particular, a method and computing apparatus for determining adjusted calibration information are disclosed. A corresponding computer program and a carrier therefor, as well as a related system, are also disclosed.

BACKGROUND

When tracking sports projectiles, such as golf balls, one or more tracking sensors can be used to capture the sports projectile's trajectory as the sports projectile moves. The tracking sensors can be video cameras, radar sensors, high-speed still imaging devices, or the like. In a scenario with two tracking sensors, a calibration between said two tracking sensors can be used to triangulate points observed by the two tracking sensors. As an example, at a driving range, a distance to an observed golf ball can be estimated using data maps from the two tracking sensors and the calibration. However, a problem relates to achieving and maintaining an accurate calibration between the two tracking sensors.

SUMMARY

An object may be to overcome, or at least reduce, one or more of the above-mentioned problems and/or disadvantages. This, and other objects, may be achieved by the solutions set forth in the appended independent claims. According to an aspect, the object is achieved by a method, performed by a computing apparatus, for determining adjusted calibration information for a system. The system comprises the computing apparatus, a first image sensor and a second image sensor. Each image sensor of the first image sensor and the second image sensor is configured to capture respective two-dimensional information representing a detected object travelling through a three-dimensional space within a respective field of view of said each image sensor.
The adjusted calibration information indicates an adjusted three-dimensional rotation of each of the first image sensor and the second image sensor in relation to a fixed coordinate system related to the three-dimensional space. At least one calibration object is located within the respective field of view of the first image sensor and within the respective field of view of the second image sensor. The computing apparatus obtains preliminary calibration information for the system. The preliminary calibration information indicates a preliminary three-dimensional rotation for the first image sensor and the second image sensor in relation to the fixed coordinate system. The computing apparatus obtains a plurality of point pairs. Each point pair of the plurality of point pairs represents a respective observation of the detected object projected onto a respective two-dimensional image plane of each of the first image sensor and the second image sensor. The computing apparatus determines the adjusted calibration information for the system by solving an optimization problem having an objective function that is dependent on the adjusted calibration information. The objective function is formulated in terms of a reprojection error with respect to the plurality of point pairs. The objective function is further formulated in terms of a difference measure, expressed in the fixed coordinate system, between triangulated distances and known distances. The triangulated distances span between each of the first image sensor and the second image sensor and each of said at least one calibration object, and the known distances span between each of the first image sensor and the second image sensor and each of said at least one calibration object. As a result of the method, also referred to as the calibration procedure herein, the rotation of the two tracking sensors, located in the three-dimensional space, e.g., a golf range, can be determined with respect to the fixed coordinate system.
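The objective function described above combines a pixel-space reprojection error over the point pairs with a dimensionless relative difference between triangulated and known sensor-to-calibration-object distances. The sketch below illustrates this combination under simplified assumptions: a single yaw angle stands in for the full three-dimensional rotations being adjusted, a coarse grid search stands in for whatever solver an implementation would use, and all camera parameters and object positions are hypothetical illustration values, not values from the disclosure.

```python
import numpy as np

# Hypothetical intrinsics shared by both sensors (illustration only).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])

def yaw(theta):
    """Rotation about the vertical axis; stands in for the full 3D
    rotation the optimization would adjust."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def pmat(R, C):
    """Projection matrix P = K [R | -RC] for a sensor with rotation R
    and known center C in the fixed coordinate system."""
    return K @ np.hstack([R, (-R @ C).reshape(3, 1)])

def project(P, X):
    p = P @ np.append(X, 1.0)
    return p[:2] / p[2]

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point pair."""
    A = np.vstack([x1[0] * P1[2] - P1[0], x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0], x2[1] * P2[2] - P2[1]])
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]

# Known sensor positions, one calibration object with a known position
# (so its distance to each sensor is known), and a "true" yaw of
# sensor 2 that is unknown to the solver.
C1, C2 = np.zeros(3), np.array([1.0, 0.0, 0.0])
calib = np.array([0.0, 0.0, 12.0])
known_d = [np.linalg.norm(calib - C1), np.linalg.norm(calib - C2)]
true_theta = 0.04
P1 = pmat(np.eye(3), C1)
P2_true = pmat(yaw(true_theta), C2)

# Point pairs: observations of a detected object along a track.
track = [np.array([x, 1.0, 8.0 + x]) for x in (0.0, 0.5, 1.0, 1.5)]
pairs = [(project(P1, X), project(P2_true, X)) for X in track]
calib_pair = (project(P1, calib), project(P2_true, calib))

def objective(theta):
    """Reprojection error over the point pairs plus a dimensionless
    relative difference between triangulated and known distances."""
    P2 = pmat(yaw(theta), C2)
    reproj = 0.0
    for x1, x2 in pairs:
        X = triangulate(P1, P2, x1, x2)
        reproj += np.sum((project(P1, X) - x1) ** 2)
        reproj += np.sum((project(P2, X) - x2) ** 2)
    Xc = triangulate(P1, P2, *calib_pair)
    tri_d = [np.linalg.norm(Xc - C1), np.linalg.norm(Xc - C2)]
    dist = sum(((t - k) / k) ** 2 for t, k in zip(tri_d, known_d))
    return reproj + dist

# Coarse grid search standing in for a proper solver: the minimum lies
# at the candidate rotation closest to the true yaw.
grid = np.linspace(0.0, 0.08, 81)
best = grid[int(np.argmin([objective(th) for th in grid]))]
```

Because the distance term is a relative (dimensionless) measure, it is insensitive to the overall scale of the scene, which matches the normalization rationale given in the next paragraph.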
Since the objective function can further be formulated in terms of the difference measure, e.g., a dimensionless relative measure according to some examples, the method can normalize the distances. In this manner, a scale of the distances can be compensated for. In some embodiments, the difference measure is a dimensionless, relative measure between the triangulated distances and the known distances. In some embodiments, the reprojection error is measured in units, such as pixels, of the image plane. In some embodiments, an initial estimate for the optimization problem is based on the preliminary calibration information. In some embodiments, the solving of the optimization problem comprises iteratively performing a set of actions including: triangulating an updated 3D position of said each point pair using the first and second image sensors and the adjusted calibration information, determining an updated reprojection error for the detected object corresponding to said each point pair, and optimizing the objective function to obtain the adjusted calibration information. In some embodiments, the adjusted calibration information comprises an