EP-4736123-A1 - SYSTEM AND METHOD FOR CALIBRATING A CAMERA AND OBJECT TRACKING SYSTEM USING A CALIBRATED CAMERA
Abstract
A method for calibrating a camera without the decomposition of camera parameters into extrinsic and intrinsic components is provided. Further, there is provided a method for tracking an object in motion comprising capturing one or more image frames of an object in motion, using one or more calibrated cameras that have been calibrated according to a calibration method that generates and uses a respective transformation matrix for mapping three-dimensional (3D) real world model features to corresponding two-dimensional (2D) image features. The tracking method further comprises determining, using a hardware processor, motion characteristics of the object in motion based on the captured one or more image frames from each one or more calibrated cameras, the determining of the motion characteristics based on implicit intrinsic camera parameters and implicit extrinsic camera parameters of the respective transformation matrix from each respective one or more calibrated cameras.
Inventors
- LIPUNOV, EVGENY
- YILDIRIM, BOTAN
- XU, Dejiang
- VISHWANATH, ADITYA
- NATESAN, SIVAPRAKASH
- SAVAK, YASAR BURAK
- OKUR, BATUHAN
Assignees
- Rapsodo Pte. Ltd.
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2025-02-13
Claims (20)
- 1. A method for tracking an object in motion comprising: capturing, from each of one or more calibrated cameras, one or more image frames of an object in motion, each said one or more calibrated cameras having been calibrated according to a calibration method that generates and uses a respective transformation matrix for mapping three-dimensional (3D) real world model features to corresponding two-dimensional (2D) image features; and determining, using a hardware processor, motion characteristics of the object in motion based on said captured one or more image frames from each said one or more calibrated cameras, said determining the motion characteristics based on implicit intrinsic camera parameters and implicit extrinsic camera parameters of the respective transformation matrix from each respective one or more calibrated cameras.
- 2. The method as claimed in Claim 1, wherein the calibration method comprises calibrating each of the one or more calibrated cameras without decomposing the plurality of the camera parameters into explicit extrinsic and explicit intrinsic camera parameters.
- 3. The method as claimed in Claim 2, wherein each of the one or more calibrated cameras is further calibrated by modifying one or more implicit extrinsic camera parameters to obtain explicit extrinsic camera parameters, said modifying comprising adjusting implicit extrinsic camera parameters through an affine correction.
- 4. The method as claimed in Claim 3, wherein the capturing, by a calibrated camera, of one or more image frames comprises: controlling a timing of a first calibrated camera and a second calibrated camera physically spaced apart from said first calibrated camera such that each said first and second calibrated camera captures said one or more image frames of the object in motion.
- 5. The method as claimed in Claim 4, wherein the controlling the timing of the first calibrated camera and the second calibrated camera comprises a hard synchronizing the timing of the first calibrated camera and the second calibrated camera such that each captures an image frame at identical time instances, the first calibrated camera and the second calibrated camera running according to a same internal clock reference.
- 6. The method as claimed in Claim 4, wherein the controlling the timing of a first calibrated camera and the second calibrated camera comprises a soft synchronizing the timing of the first calibrated camera and the second calibrated camera such that each captures an image frame at different time instances, each calibrated camera capturing frames of a position of the object in motion in a free run mode with each calibrated camera having its own internal synchronization.
- 7. The method as claimed in Claim 4, further comprising: stitching together said captured camera frames into a trajectory parametric space to obtain a position of the object in motion.
- 8. An object tracking system comprising: a camera system comprising one or more calibrated cameras, each camera capturing one or more image frames of a position of an object in motion, each said one or more calibrated cameras having been calibrated according to a calibration method that generates and uses a respective transformation matrix for mapping 3D real world model features to corresponding 2D image features; and a hardware processor coupled to a memory storing instructions that, when executed by the processor, configure the hardware processor to determine motion characteristics of the object based on said captured one or more image frames, said determining of motion characteristics of the object is based on implicit intrinsic camera parameters and implicit extrinsic camera parameters of the respective transformation matrix from each respective one or more calibrated cameras.
- 9. The system as claimed in Claim 8, wherein the calibration method comprises calibrating each of the one or more calibrated cameras without decomposing the plurality of the camera parameters into explicit extrinsic and explicit intrinsic camera parameters.
- 10. The system as claimed in Claim 9, wherein at the respective calibrated camera, said hardware processor is further configured to: modify one or more implicit extrinsic camera parameters by adjusting implicit extrinsic camera parameters of said calibrated camera through an affine correction.
- 11. The system as claimed in Claim 10, wherein to capture by a calibrated camera of one or more image frames, said hardware processor is further configured to: control a time synchronization of a first calibrated camera and a second calibrated camera such that each said first and second calibrated camera captures said one or more image frames of the object in motion.
- 12. The system as claimed in Claim 11, wherein to control the timing of the first calibrated camera and the second calibrated camera, said hardware processor is further configured to: hard synchronize the timing of the first calibrated camera and the second calibrated camera such that each said first and second calibrated camera captures image frames synchronously at identical time instances, the first calibrated camera and the second calibrated camera running according to a same internal clock reference.
- 13. The system as claimed in Claim 11, wherein to control the timing of the first calibrated camera and the second calibrated camera, said hardware processor is further configured to: soft synchronize the timing of the first calibrated camera and the second calibrated camera such that each said first and second calibrated camera captures image frames asynchronously at different time instances, each calibrated camera capturing frames of a position of the object in motion in a free run mode with each calibrated camera having its own internal synchronization.
- 14. The system as claimed in Claim 11, wherein said hardware processor is further configured to: stitch together said captured one or more image frames of the object in motion in a parametric space to obtain a position of the object in motion.
- 15. A method of calibrating a camera comprising: providing a transformation matrix (H) that represents a plurality of camera parameters; and aligning 2D image features (q) with 2D image features of a reference (q′) by applying one or more corrections (HA) to said plurality of camera parameters, to obtain an updated transformation matrix (H′), wherein said 2D image features (q) and said 2D image features of the reference (q′) are represented in pixel coordinates.
- 16. The method as claimed in Claim 15, wherein the plurality of the camera parameters includes implicit extrinsic and implicit intrinsic camera parameters.
- 17. The method as claimed in Claim 15, wherein the camera is calibrated without decomposing the plurality of the implicit camera parameters into explicit extrinsic and explicit intrinsic camera parameters.
- 18. The method as claimed in Claim 17, further comprising: modifying one or more implicit extrinsic camera parameters, said modifying comprising adjusting implicit extrinsic parameters through an affine correction, said affine correction recovering 6 Degrees of Freedom (DoF) representing said explicit extrinsic camera parameters.
- 19. The method as claimed in Claim 17, wherein the transformation matrix (H) is an initial transformation matrix (Hk) for transforming pixel coordinates (q) relating to a reference camera image object in a 2D camera field of view of the camera with corresponding known real world location coordinates in a 3D global reference space, the method further comprising: aligning a calibration image to the reference camera image object in the 3D global reference space to obtain implicit camera parameters that include implicit extrinsic and implicit intrinsic camera parameters; and building the updated transformation matrix (Hk+1) based on said aligning of the calibration image to the reference camera image object and applying said one or more corrections (HA) to said initial transformation matrix (Hk), said updated transformation matrix (Hk+1) comprising explicit extrinsic camera parameters.
- 20. The method as claimed in Claim 19, further comprising: applying said updated transformation matrix (Hk+1) to a 3D model to obtain pixel coordinates (qk+1) relating to the reference camera image object; comparing said pixel coordinates (qk+1) against the transformed pixel coordinates (qk) evaluated with respect to the reference camera image object; determining whether a difference between the obtained pixel coordinates (qk+1) and the transformed pixel coordinates (qk) is above a threshold; and responsive to determining that the difference between the obtained pixel coordinates (qk+1) and transformed pixel coordinates (qk) is above the threshold, repeating said aligning the calibration image, said building the one or more corrections (HA) and said applying said one or more corrections (HA) to said initial transformation matrix (Hk) to further update the transformation matrix (Hk+1).
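Claims 19 and 20 above describe an iterative loop: project the 3D model with the current matrix, compare the resulting pixel coordinates against the reference, and apply a correction until the difference falls below a threshold. The sketch below is only one plausible reading of that loop, assuming a simple least-squares 2D affine correction (HA) as the update step; the helper names and the correction model are illustrative assumptions, not the patented method.

```python
import numpy as np

def project(H, pts3d):
    # Apply a 3x4 transformation matrix H to 3D model points (homogeneous
    # coordinates) and return 2D pixel coordinates.
    ph = np.hstack([pts3d, np.ones((len(pts3d), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

def fit_affine(src, dst):
    # Least-squares 2D affine correction mapping src pixels to dst pixels
    # (an assumed, simplified stand-in for the claims' correction HA).
    A = np.hstack([src, np.ones((len(src), 1))])
    X, *_ = np.linalg.lstsq(A, dst, rcond=None)
    H_A = np.eye(3)
    H_A[:2, :] = X.T
    return H_A

def refine(H_k, pts3d, q_ref, threshold=1e-6, max_iter=10):
    # Iterative refinement in the spirit of claims 19-20: project the 3D
    # model, compare against the reference pixel coordinates, and apply a
    # correction while the difference exceeds the threshold.
    for _ in range(max_iter):
        q_k = project(H_k, pts3d)
        if np.max(np.abs(q_k - q_ref)) <= threshold:
            break
        H_A = fit_affine(q_k, q_ref)   # build the correction
        H_k = H_A @ H_k                # updated matrix (Hk+1)
    return H_k
```

Because the assumed correction is affine in pixel space, a purely affine misalignment is recovered in a single iteration; richer error models would need more iterations or a different correction family.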
Description
SYSTEM AND METHOD FOR CALIBRATING A CAMERA AND OBJECT TRACKING SYSTEM USING A CALIBRATED CAMERA

TECHNICAL FIELD

[0001] The present disclosure relates to a camera calibration method and a method and system of using a calibrated camera(s) for tracking and measuring a motion of a target object in a three-dimensional space.

BACKGROUND

[0002] Fundamentally, a camera provides an image mapping of a three-dimensional space onto a two-dimensional space or image plane. Current camera calibration techniques supply model parameter values that are needed to compute the line of sight rays in space that correspond to a point in the image plane.

[0003] A calibration or "projection" matrix, which is estimated during a camera calibration, is typically decomposed into eleven geometric parameters that define the standard pinhole camera model. Typically, camera model parameters include extrinsic and intrinsic parameters. The extrinsic camera parameters include the 3D location and orientation of a camera in the world, and the intrinsic camera parameters include, among others, a focal length and relationships between pixel coordinates and camera coordinates.

[0004] In many applications, camera calibration is necessary to recover 3D quantitative measures about an observed scene from 2D images. For example, from a calibrated camera, it can be determined how far an object is from the camera, or the height of the object, etc. Typical calibration techniques use a 3D, 2D or 1D calibration object whose geometry in 3D space is known with very good precision.

[0005] From a set of world points and their image coordinates, one object of the camera calibration is to find a projection "matrix" and subsequently find intrinsic and extrinsic camera parameters from that matrix in a decomposition step. However, the decomposition into extrinsic and intrinsic camera parameters is one of the major issues in calibration due to reprojection error.
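The background above describes the conventional pinhole model, in which a 3x4 projection matrix is composed from (and later decomposed back into) intrinsic and extrinsic parts. As an illustrative sketch of that conventional model (not part of the disclosure; all numeric values are assumed examples), the composition P = K[R | t] and the projection of a world point can be written as:

```python
import numpy as np

# Intrinsic matrix K: focal lengths (fx, fy) in pixels and principal
# point (cx, cy) -- assumed example values.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Extrinsic parameters: camera orientation R and position t in the world
# (here an identity rotation and a translation along the optical axis).
R = np.eye(3)
t = np.array([[0.0], [0.0], [5.0]])

# Full projection matrix P = K [R | t]: eleven geometric parameters
# up to an overall scale, as noted in the background.
P = K @ np.hstack([R, t])

# Project a 3D world point (homogeneous coordinates) onto the image plane.
X = np.array([1.0, 0.5, 10.0, 1.0])
x = P @ X
u, v = x[0] / x[2], x[1] / x[2]
print(round(u, 2), round(v, 2))  # prints: 373.33 266.67
```

The disclosure's point is that the reverse step, recovering K, R and t from an estimated P, is error-prone; the calibration method described here instead keeps those parameters implicit inside the matrix.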
Further, in the decomposition step, to extract the extrinsic and intrinsic camera parameters, several assumptions and constraints are made which may not hold in practice, e.g., no lens distortion or no tilt.

[0006] Accordingly, it is desirable to have a camera calibration method that does not require decomposition into intrinsic and extrinsic camera parameters. In addition, it is also desirable to have a camera system for taking measurements that avoids having to make any assumptions regarding intrinsic or extrinsic camera parameters.

SUMMARY

[0007] There is provided a camera system and method for taking measurements of a moving object that avoids having to make any assumptions regarding intrinsic camera parameters.

[0008] Further, there is provided a camera system and method for taking measurements of a moving object that avoids having to split camera parameters apart from the camera's calibration matrix; as a result, the camera is ready to work with any lenses and any shift or tilt (intentional or unintentional) in the setup of the camera for tracking an object in motion.

[0009] Additionally, there is provided a camera system calibration method for calibrating a camera used in taking measurements of a moving object without the decomposition of camera parameters into extrinsic and intrinsic parts.

[0010] In an embodiment, the camera system and method include a single camera device.

[0011] In one embodiment, during the calibration process, a virtual reference is aligned to a physical object in a global reference space to obtain the camera parameters.

[0012] There is also provided a robust camera calibration system and method whereby a user can use any camera without the need for fine tuning the camera parameters to a global reference.

[0013] According to one aspect, there is provided a method for tracking an object in motion.
The method comprises: capturing, from each of one or more calibrated cameras, one or more image frames of an object in motion, each of the one or more calibrated cameras having been calibrated according to a calibration method that generates and uses a respective transformation matrix for mapping three-dimensional (3D) real world model features to corresponding two-dimensional (2D) image features; and determining, using a hardware processor, motion characteristics of the object in motion based on the captured one or more image frames from each of the one or more calibrated cameras, the determining of motion characteristics based on implicit intrinsic camera parameters and implicit extrinsic camera parameters of the respective transformation matrix from each respective one or more calibrated cameras.

[0014] In a further aspect, there is provided an object tracking system. The object tracking system includes a camera system comprising one or more calibrated cameras, each camera capturing one or more image frames of a position of an object in motion, each of the one or more calibrated cameras having been calibrated according to a calibration method that generates and uses a respective transformation matrix fo
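The tracking method above works directly with the full transformation matrix, leaving intrinsic and extrinsic parameters implicit. One standard way to obtain such a matrix from known 3D-2D correspondences without any decomposition step is the Direct Linear Transform (DLT); the sketch below is illustrative only and not taken from the disclosure. It estimates a 3x4 matrix H from at least six non-coplanar correspondences, and H can then be used for projection as-is.

```python
import numpy as np

def estimate_H_dlt(pts3d, pts2d):
    # Direct Linear Transform: solve for the 3x4 matrix H that maps 3D
    # model features to 2D pixel features. The intrinsic and extrinsic
    # parameters remain implicit inside H -- no decomposition is performed.
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        # Each correspondence contributes two linear equations in the
        # twelve entries of H (eleven degrees of freedom up to scale).
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # The solution is the right singular vector associated with the
    # smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)
```

With noise-free correspondences the recovered matrix reproduces the input pixel coordinates exactly (up to numerical precision); with measured data a normalization of the point coordinates is usually added for conditioning.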