CN-122017804-A - Multi-sensor joint calibration method and system based on scale compensation

CN 122017804 A

Abstract

The invention provides a multi-sensor joint calibration method and system based on scale compensation. The method collects synchronized data from a laser radar, a camera and an RTK receiver; obtains the pose sequence of each sensor through odometry; converts the RTK data into ENU local coordinates; performs time synchronization and trajectory registration with a nearest-neighbour matching strategy under a time threshold; estimates the initial value of an anisotropic scale compensation matrix from the ratio of the standard deviations of the displacement increments; estimates the initial rotation matrix by singular value decomposition (SVD) of a cross-covariance matrix; estimates the initial translation vector from the trajectory-point means; constructs pose-increment constraints and geometric-consistency constraints; performs a joint optimization; and outputs the sensor extrinsic parameters and the scale compensation matrices. By introducing an anisotropic scale compensation matrix, the invention achieves full three-dimensional attitude alignment, requires neither a calibration plate nor a special motion trajectory, and effectively handles the different degrees of scale drift of the odometry in each direction.

Inventors

  • Cui Dongshun
  • Gao Zhenyu
  • Liu Shede
  • Ouyang Tinghui
  • Li Xin
  • Zhang Weichao

Assignees

  • 广智微芯(扬州)有限公司

Dates

Publication Date
2026-05-12
Application Date
2025-12-22

Claims (10)

  1. A multi-sensor joint calibration method based on scale compensation, characterized by comprising the following steps: performing data acquisition on a mobile robot platform, synchronously acquiring laser radar point cloud data, camera image data and RTK positioning data, so that the data of each sensor share a uniform time reference; according to the acquired data, obtaining a pose sequence in the laser radar coordinate system through a laser radar odometer, obtaining a pose sequence in the camera coordinate system through a visual odometer, and converting the WGS-84 geodetic coordinates output by the RTK receiver into a global pose sequence in an ENU coordinate system with the first-frame position as origin; adopting a nearest-neighbour matching strategy based on a time threshold, searching the RTK sequence for the as-yet-unmatched point with the smallest time difference for each odometer track point, and establishing a pairing when the time difference satisfies the threshold condition, to obtain a registered set of track-point pairs; computing the adjacent displacement increments of the odometer track and the RTK track by displacement-increment-based statistical analysis, estimating the initial value of the scale compensation matrix from the ratio of the standard deviations of the displacement increments in each direction, constructing a cross-covariance matrix from the displacement-increment sequences and performing singular value decomposition to estimate the initial rotation matrix, and computing the initial estimate of the translation vector from the track-point means; defining a laser radar-RTK residual and a camera-RTK residual based on pose-increment residual constraints according to the initial estimates of the scale compensation matrix, the rotation matrix and the translation vector, and constructing a multi-sensor geometric-consistency constraint, so as to construct an overall optimization objective function; and iteratively solving the overall optimization objective function with a nonlinear optimization algorithm, and outputting the extrinsic parameters from the laser radar to the global coordinate system, the extrinsic parameters from the camera to the global coordinate system, and the scale compensation matrix of each sensor.
  2. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the scale compensation matrix is an anisotropic three-dimensional positive-definite diagonal matrix, specifically: S = diag(s_x, s_y, s_z), where s_x, s_y, s_z are independent scale factors along the x, y and z directions of the odometer coordinate system, used to compensate for the inconsistent degree of odometer drift in each direction.
  3. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein converting the WGS-84 geodetic coordinates output by the RTK receiver into a global pose sequence in an ENU coordinate system with the first-frame position as origin specifically comprises: defining the WGS-84 ellipsoid parameters and calculating the radius of curvature of the prime vertical; converting the geodetic coordinates into Earth-centered rectangular (ECEF) coordinates; and converting the rectangular coordinates into ENU local coordinates through a rotation matrix, with the first-frame RTK position as the reference origin.
  4. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the time-threshold matching satisfies the following condition: j* = argmin_{j∈U} |t_i^O − t_j^W|, with the pair accepted only if |t_i^O − t_{j*}^W| ≤ δ_t; where t_i^O is the timestamp of the odometer track point, t_j^W is the timestamp of the RTK track point, U is the index set of unmatched RTK points, and δ_t is the time synchronization threshold.
  5. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the initial value of the scale compensation matrix is estimated as: s_x^(0) = σ(Δx_W)/σ(Δx_O), s_y^(0) = σ(Δy_W)/σ(Δy_O), s_z^(0) = σ(Δz_W)/σ(Δz_O); where σ(·) is the standard deviation operator, Δx_W, Δy_W, Δz_W are the displacement increments in the global coordinate system, and Δx_O, Δy_O, Δz_O are the displacement increments in the odometer coordinate system; the initial scale compensation matrix is then constructed as S^(0) = diag(s_x^(0), s_y^(0), s_z^(0)).
  6. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the initial rotation matrix is obtained as follows: a cross-covariance matrix of displacement increments H = Σ_{k=1}^{K} (p_k^O − p̄^O)(p_k^W − p̄^W)^T is constructed, where p^O is an odometer track point, p^W is the corresponding track point in the global coordinate system, and K is the number of successfully matched point pairs; singular value decomposition H = UΣV^T is performed to obtain the initial rotation matrix R^(0) = VU^T, where U and V are orthogonal matrices and Σ is a diagonal matrix; according to the initial values of the scale compensation matrix and the rotation matrix, the initial estimate of the translation vector is computed from the track-point means as t^(0) = p̄^W − R^(0) S^(0) p̄^O, where S^(0) is the initial scale compensation matrix.
  7. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the pose-increment residual is defined using the Lie-group logarithmic map, the laser radar-RTK residual being: r_i^L = Log((ΔT_i^W)^{-1} · T_LW · S_L(ΔT_i^L) · T_LW^{-1}); where ΔT_i^L is the pose increment of the laser radar odometer, ΔT_i^W is the pose increment in the global coordinate system, T_LW is the extrinsic transform from the laser radar to the global coordinate system, and S_L is the scale compensation matrix applied to the translation part of the increment.
  8. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the geometric-consistency constraint is: T_LW = T_LC · T_CW; where T_LW is the extrinsic transform from the laser radar to the global coordinate system, T_CW is the extrinsic transform from the camera to the global coordinate system, and T_LC is the extrinsic transform from the laser radar to the camera; the corresponding residual is defined as r_geo = Log(T_LW^{-1} · T_LC · T_CW).
  9. The multi-sensor joint calibration method based on scale compensation according to claim 1, wherein the overall optimization objective function is: J = Σ_{i∈ε_L} ρ(‖r_i^L‖²) + Σ_{i∈ε_C} ρ(‖r_i^C‖²) + λ‖r_geo‖²; where ε_L is the set of laser radar pose-increment edges, ε_C is the set of camera pose-increment edges, ρ(·) is a robust kernel function, and λ is the weight coefficient of the geometric-consistency constraint.
  10. A multi-sensor joint calibration system based on scale compensation, comprising: a data acquisition module for synchronously acquiring laser radar point cloud data, camera image data and RTK positioning data; a pose estimation module for obtaining the pose sequences of the laser radar and the camera through odometry and converting the RTK geodetic coordinates into a global pose sequence in an ENU coordinate system; a time synchronization module for registering the odometer tracks and the RTK track with a nearest-neighbour matching strategy based on a time threshold; an initial value estimation module for estimating the initial value of the scale compensation matrix from the standard-deviation ratio of the displacement increments and estimating the initial rotation matrix through singular value decomposition of the cross-covariance matrix; and an optimization solving module for constructing an objective function comprising pose-increment residuals and the geometric-consistency constraint, and outputting the extrinsic parameters and scale compensation matrix of each sensor through iterative solution with a nonlinear optimization algorithm.
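
The coordinate chain of claim 3 (geodetic → Earth-centered rectangular → ENU) can be sketched as follows. This is a minimal sketch: the ellipsoid constants are the standard WGS-84 values (the patent does not list them), and the function names are illustrative.

```python
import math

# WGS-84 ellipsoid parameters (standard values; assumed, as claim 3 does not list them)
A = 6378137.0               # semi-major axis [m]
F = 1.0 / 298.257223563     # flattening
E2 = F * (2.0 - F)          # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert WGS-84 geodetic coordinates to Earth-centered rectangular (ECEF)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime-vertical radius of curvature
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z

def ecef_to_enu(xyz, ref_lla):
    """Rotate the ECEF offset from the reference point into local ENU axes."""
    lat, lon = math.radians(ref_lla[0]), math.radians(ref_lla[1])
    x0, y0, z0 = geodetic_to_ecef(*ref_lla)
    dx, dy, dz = xyz[0] - x0, xyz[1] - y0, xyz[2] - z0
    e = -math.sin(lon) * dx + math.cos(lon) * dy
    n = (-math.sin(lat) * math.cos(lon) * dx
         - math.sin(lat) * math.sin(lon) * dy
         + math.cos(lat) * dz)
    u = (math.cos(lat) * math.cos(lon) * dx
         + math.cos(lat) * math.sin(lon) * dy
         + math.sin(lat) * dz)
    return e, n, u
```

In the patent's setting the reference origin `ref_lla` would be the first-frame RTK fix, so the resulting ENU track starts at the origin.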
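
The nearest-neighbour, time-thresholded pairing of claim 4 can be sketched as a simple greedy scan over the RTK sequence. The function name and the quadratic search are illustrative choices, not the patent's implementation.

```python
def match_trajectories(odo_times, rtk_times, delta_t):
    """Greedy nearest-neighbour pairing under a time threshold.

    For each odometry timestamp, find the as-yet-unmatched RTK timestamp with
    the smallest time difference; accept the pair only if the difference is
    within delta_t. Returns a list of (odo_index, rtk_index) pairs.
    """
    used = set()
    pairs = []
    for i, t in enumerate(odo_times):
        best_j, best_dt = None, float("inf")
        for j, tr in enumerate(rtk_times):
            if j in used:
                continue
            dt = abs(tr - t)
            if dt < best_dt:
                best_j, best_dt = j, dt
        if best_j is not None and best_dt <= delta_t:
            used.add(best_j)          # each RTK point is matched at most once
            pairs.append((i, best_j))
    return pairs
```

For sorted timestamps a binary search would reduce the cost to O(n log m), but the greedy form above matches the claim's description most directly.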
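
The closed-form initialisation of claims 5 and 6 (per-axis scale from standard-deviation ratios of displacement increments, rotation from an SVD of the cross-covariance, translation from the trajectory means) can be sketched as below. The exact centring and the point at which the scale is applied are assumptions where the claims are silent.

```python
import numpy as np

def initial_values(p_odo, p_rtk):
    """Closed-form initial guesses for the scale matrix, rotation and translation.

    p_odo, p_rtk: (K, 3) arrays of matched odometer / ENU trajectory points.
    Returns (S0, R0, t0).
    """
    d_odo = np.diff(p_odo, axis=0)          # per-axis displacement increments
    d_rtk = np.diff(p_rtk, axis=0)
    # anisotropic scale: ratio of displacement-increment standard deviations
    s = d_rtk.std(axis=0) / d_odo.std(axis=0)
    S0 = np.diag(s)

    # rotation: SVD of the cross-covariance of centred, scaled points (Kabsch-style)
    q_odo = (p_odo - p_odo.mean(axis=0)) @ S0
    q_rtk = p_rtk - p_rtk.mean(axis=0)
    H = q_odo.T @ q_rtk                     # H = sum_k q_odo_k q_rtk_k^T
    U, _, Vt = np.linalg.svd(H)
    R0 = Vt.T @ U.T
    if np.linalg.det(R0) < 0:               # guard against a reflection
        Vt[-1] *= -1
        R0 = Vt.T @ U.T

    # translation: align the trajectory means, t0 = mean_W - R0 S0 mean_O
    t0 = p_rtk.mean(axis=0) - R0 @ S0 @ p_odo.mean(axis=0)
    return S0, R0, t0
```

Note that with a large rotation between the frames the per-axis standard-deviation ratio mixes axes, so these values serve only as initial guesses for the subsequent joint optimization, as in the patent.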
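
A minimal sketch of a pose-increment residual in the spirit of claim 7: it uses the SO(3) logarithm for the rotation part and a plain difference for the translation part, a common simplification of the full SE(3) logarithmic map. The exact residual formula appears only as an image in the source and is not reproduced here; everything below is an illustrative assumption.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_increment_residual(dT_odo, dT_world, R_ext, S):
    """Simplified pose-increment residual.

    dT_odo, dT_world: 4x4 homogeneous pose increments from the odometry and
    the RTK/ENU trajectory. R_ext: 3x3 extrinsic rotation, S: 3x3 diagonal
    scale compensation matrix (applied to the translation increment only).
    """
    # predicted increment in the global frame: conjugated rotation, scaled translation
    R_pred = R_ext @ dT_odo[:3, :3] @ R_ext.T
    t_pred = R_ext @ (S @ dT_odo[:3, 3])
    # rotation error via the SO(3) log (rotation vector of the relative rotation)
    r_rot = Rotation.from_matrix(dT_world[:3, :3].T @ R_pred).as_rotvec()
    r_trans = t_pred - dT_world[:3, 3]
    return np.concatenate([r_rot, r_trans])
```

A perfectly consistent increment pair yields a zero residual; the optimizer of claim 1 would drive all such residuals toward zero jointly.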
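
The overall cost of claim 9, robustified increment residuals plus a weighted geometric-consistency term, can be assembled as follows. The choice of the Huber kernel for ρ(·) and the value of λ are illustrative assumptions; the patent leaves both open.

```python
import numpy as np

def huber(sq_norm, delta=1.0):
    """Huber robust kernel rho(.) applied to a squared residual norm."""
    n = np.sqrt(sq_norm)
    return np.where(n <= delta, sq_norm, 2.0 * delta * n - delta ** 2)

def objective(residuals_lidar, residuals_cam, r_geo, lam=10.0):
    """Overall cost: robustified lidar and camera pose-increment terms plus a
    weighted geometric-consistency term (lam plays the role of lambda)."""
    cost = sum(huber(float(r @ r)) for r in residuals_lidar)
    cost += sum(huber(float(r @ r)) for r in residuals_cam)
    cost += lam * float(r_geo @ r_geo)
    return cost
```

In practice this scalar cost would be minimized over the extrinsics and scale matrices with a nonlinear least-squares solver (e.g. Levenberg-Marquardt), as claim 1 prescribes.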

Description

Multi-sensor joint calibration method and system based on scale compensation

Technical Field

The invention relates to the technical field of navigation, and in particular to a multi-sensor joint calibration method and system based on scale compensation.

Background

Multi-sensor fusion is a key technology for high-precision positioning and environment sensing in autonomous driving and mobile robotics. In practical applications a single sensor can hardly meet the positioning requirements of complex environments. A laser radar provides high-precision three-dimensional point cloud information and is well suited to sensing and mapping in structured environments, but lidar-based odometry accumulates drift over long-distance operation. A camera acquires rich texture and semantic information at low cost, but a monocular vision system lacks absolute scale, while stereo and RGB-D cameras can recover scale only within a limited working distance. RTK (real-time kinematic carrier-phase differential positioning) provides global positioning with centimeter-level accuracy, does not drift in open environments, and can therefore serve as an absolute position reference. Fusing these sensors effectively exploits their complementary advantages and achieves full-scene, high-precision and highly robust positioning and sensing. The precondition of multi-sensor fusion is accurate knowledge of the extrinsic relationships between the sensors, i.e., the rigid-body transformations between the different sensor coordinate systems. The accuracy of the extrinsic calibration directly affects the overall performance of the fusion system.
Traditional extrinsic calibration methods are usually carried out in a controlled environment with specific calibration objects (such as a checkerboard calibration plate or reflective targets); they are complex to operate and difficult to adapt to changing deployment scenes. In recent years, target-free calibration methods have attracted attention: they estimate the extrinsic parameters by analysing the pose changes of the sensors during motion, and offer high flexibility and strong adaptability. Coordinate-system alignment is an important technique of this kind, estimating the relative pose between sensors by aligning the pose trajectories of the different sensors to a unified reference frame. However, existing coordinate-system alignment methods have the following shortcomings. First, most methods consider only the alignment of the two-dimensional heading angle (yaw), under the assumption that the sensor is mounted on a horizontal plane with rotational deviation only about the vertical axis. In practice, because of installation errors, mechanical vibration, uneven ground and similar factors, the mounting attitude of a sensor often also deviates in pitch and roll, so a yaw-only alignment method introduces systematic errors. Second, existing methods generally assume that the trajectory output by the sensor odometry has the same scale as the real trajectory, or compensate with a single isotropic scale factor. In practice, however, the degree of scale drift accumulated during long-distance operation differs between directions, depending on factors such as environmental structure, motion pattern and feature distribution.
An isotropic scale factor can hardly describe such direction-dependent scale drift accurately. In summary, the prior art is deficient in the coordinate-system alignment step of multi-sensor extrinsic calibration and cannot adapt to the different scale drift of the odometry in different directions, which degrades fusion accuracy and robustness.

Disclosure of the Invention

The invention aims to overcome the deficiencies of the coordinate-system alignment step of multi-sensor fusion extrinsic calibration and to adapt to the direction-dependent scale drift of the odometry, thereby improving fusion accuracy and robustness. To this end, in one aspect, an embodiment of the present invention provides a multi-sensor joint calibration method based on scale compensation, the method comprising the following steps: data acquisition is carried out on a mobile robot platform, and laser radar point cloud data, camera image data and RTK positioning data are synchronously acquired, so that the data of each sensor share a uniform time reference; according to the dat