
CN-121994278-A - Posture error compensation method of large-view-field remote vision measurement system

CN121994278A

Abstract

The invention discloses an attitude error compensation method for a large-field-of-view remote vision measurement system, belonging to the technical field of precision optical measurement and attitude calculation. The method comprises: obtaining a telemetry data packet and parsing it to obtain an attitude measurement data set; grid partitioning, namely establishing a set of grid-cell centre coordinates according to the pixel-coordinate distribution range of the target feature points in all attitude measurement data sets and determining the grid cell to which each measurement belongs; calculating a global average rotation matrix and the grid average rotation matrix corresponding to each grid cell; constructing an attitude compensation matrix for each grid cell; compensating the rotation matrix obtained from the original attitude solution corresponding to each grid cell; and optimizing the rotation matrix according to the compensated rotation matrices. The method can improve attitude-angle measurement accuracy to the arc-second level, offers higher accuracy and robustness than traditional methods, and maintains high-accuracy attitude estimation even under low signal-to-noise ratio or non-uniform noise.

Inventors

  • MA YUEBO
  • ZHAO RUJIN
  • LONG HONGFENG

Assignees

  • Institute of Optics and Electronics, Chinese Academy of Sciences (中国科学院光电技术研究所)

Dates

Publication Date
2026-05-08
Application Date
2026-02-13

Claims (10)

  1. A method for compensating attitude errors of a large-field-of-view remote vision measurement system, the method comprising: step 1, acquiring a telemetry data packet and parsing it to obtain an attitude measurement data set containing multiple groups of attitude data, wherein each group of attitude data comprises a rotation matrix of the target coordinate system relative to the laser tracker coordinate system, a translation vector, a rotation matrix of the target coordinate system relative to the camera coordinate system, and the preset pixel coordinates (u, v) of a plurality of target feature points on the image plane; step 2, grid partitioning, namely generating, according to the pixel-coordinate distribution range of the target feature points in all attitude measurement data sets, a rectangular bounding frame covering all target imaging areas on the image plane, dividing the rectangular bounding frame into m × n grid cells, establishing a set of grid-cell centre coordinates, and determining, from the centre coordinates of the target feature points, the grid cell to which each measurement belongs; step 3, calculating the rotation matrix of the camera coordinate system relative to the laser tracker coordinate system to obtain a global average rotation matrix, and calculating the corresponding rotation matrices within each grid cell to obtain a grid average rotation matrix; step 4, constructing an attitude compensation matrix for each grid cell according to the global average rotation matrix and the grid average rotation matrix; step 5, compensating, according to the attitude compensation matrix, the rotation matrix of the camera coordinate system relative to the target coordinate system obtained from the original attitude solution corresponding to each grid cell, to obtain a compensated rotation matrix; and step 6, optimizing the rotation matrix of the camera coordinate system relative to the laser tracker coordinate system according to the compensated rotation matrix.
  2. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 1, wherein step 2 comprises: establishing a mapping relation between the rotation matrix of the camera coordinate system relative to the laser tracker coordinate system and the target centre coordinates, so as to obtain the rotation matrix and the target centre coordinates at the kth measurement, where k is the measurement sequence number and runs from 1 to the maximum number of measurements; dividing the image plane into m × n grid cells; defining a grid-centre coordinate set; and determining the grid index to which the target centre coordinates belong according to the Euclidean distance between the current target centre coordinates and each grid-cell centre coordinate.
  3. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 2, wherein the target centre coordinates are calculated as the arithmetic mean of the pixel coordinates, on the image plane, of the multiple target feature points in the kth measurement, where i = 1, ..., N and N is the number of target feature points.
  4. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 3, wherein the plurality of target feature points consists of 6 target feature points with pixel coordinates (u_i^k, v_i^k), i = 1, 2, ..., 6, and the calculation formula is: ū_k = (1/6) Σ_{i=1}^{6} u_i^k, v̄_k = (1/6) Σ_{i=1}^{6} v_i^k; wherein (u_i^k, v_i^k) represents the pixel coordinates of the ith target feature point on the image plane at the kth measurement.
  5. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 3, wherein determining the grid index to which the target centre coordinates belong comprises: defining the grid-centre coordinate set, whose elements represent the centre coordinates of the grid cell in the ith row and jth column; calculating, at the kth measurement, the Euclidean distance between the current target centre coordinates and the centre coordinates of each grid cell; and determining the grid index (i, j) to which the target centre coordinates belong according to the minimum-Euclidean-distance principle.
  6. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 5, wherein step 2 further comprises: depositing the rotation matrix obtained at the kth measurement into the rotation-matrix set of the grid cell to which it belongs, for subsequent attitude statistics and compensation calculation over the corresponding grid region, the set consisting of all rotation matrices belonging to the grid cell in the ith row and jth column.
  7. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 6, wherein step 3 comprises: performing statistics on the rotation matrices obtained at each measurement and computing the global average rotation matrix as a global attitude reference; and, for each grid cell, solving the set of rotation matrices stored in the grid cell of the ith row and jth column by a quaternion-averaging algorithm to obtain the grid average rotation matrix of that grid cell.
  8. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 7, wherein step 4 comprises: calculating the attitude compensation matrix of the grid cell in the ith row and jth column from the relation between the global average rotation matrix and the grid average rotation matrix of that grid cell.
  9. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 8, wherein step 5 comprises: calculating, according to the attitude compensation matrix of the grid cell in the ith row and jth column and the original rotation matrix at the kth measurement, the rotation matrix after gridded compensation.
  10. The method for compensating attitude errors of a large-field-of-view remote vision measurement system according to claim 9, wherein step 6 comprises: calculating, according to the compensated rotation matrix and the rotation matrix of the target coordinate system relative to the laser tracker coordinate system, the optimized rotation matrix of the camera coordinate system relative to the laser tracker coordinate system at the kth measurement; and outputting the optimized rotation matrix as the final attitude measurement result, in place of the attitude result calculated from the original uncompensated rotation matrix, thereby completing the optimization correction of the attitude estimation result.
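The grid-based compensation of claims 3 to 9 can be sketched in Python. This is a minimal illustration, not the patent's implementation: the function names, the eigenvector form of quaternion averaging, and the compensation formula Delta = R_global @ R_grid.T (which re-aligns a cell's mean attitude onto the global reference so that Delta @ R_grid = R_global) are assumptions filled in where the published text elides the formulas.

```python
import numpy as np

def target_center(points):
    # Arithmetic mean of the feature-point pixel coordinates (claims 3-4).
    return np.asarray(points, dtype=float).mean(axis=0)

def grid_index(center, grid_centers):
    # Nearest grid-cell centre by Euclidean distance (claim 5).
    # grid_centers: (m, n, 2) array of cell-centre pixel coordinates.
    d = np.linalg.norm(grid_centers - np.asarray(center, dtype=float), axis=2)
    return np.unravel_index(np.argmin(d), d.shape)   # (row i, column j)

def quat_from_matrix(R):
    # w-first unit quaternion; assumes the rotation angle is well below
    # 180 degrees, as is the case for small attitude errors.
    w = 0.5 * np.sqrt(max(1.0 + R[0, 0] + R[1, 1] + R[2, 2], 0.0))
    return np.array([w,
                     (R[2, 1] - R[1, 2]) / (4 * w),
                     (R[0, 2] - R[2, 0]) / (4 * w),
                     (R[1, 0] - R[0, 1]) / (4 * w)])

def matrix_from_quat(q):
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def quat_average(rotations):
    # Quaternion averaging (claim 7): the mean rotation corresponds to the
    # eigenvector of M = sum(q q^T) with the largest eigenvalue.
    M = sum(np.outer(q, q) for q in map(quat_from_matrix, rotations))
    _, v = np.linalg.eigh(M)                 # eigh sorts eigenvalues ascending
    return matrix_from_quat(v[:, -1])        # sign ambiguity of q is harmless

def compensation_matrix(R_global, R_grid):
    # Assumed form of claim 8: Delta maps the cell mean onto the global mean.
    return R_global @ R_grid.T

def compensate(Delta, R_k):
    # Claim 9: apply the cell's compensation to a raw attitude solution.
    return Delta @ R_k
```

As a sanity check, if every pose falling in one grid cell carries the same small rotational bias, the compensation matrix built from that cell's quaternion average removes the bias exactly.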

Description

Attitude Error Compensation Method for a Large-Field-of-View Remote Vision Measurement System

Technical Field

The invention relates to the technical field of precision optical measurement and attitude calculation, and in particular to an attitude error compensation method for a large-field-of-view remote vision measurement system.

Background

High-precision attitude measurement plays an important role in fields such as spacecraft positioning, precision alignment, and industrial metrology. Conventional vision-based pose estimation usually images a known target with a camera and applies a computer-vision algorithm, such as a PnP (Perspective-n-Point) solution, to obtain the pose matrix of the camera relative to the target. However, owing to factors such as limited camera imaging resolution, lens distortion, and noisy feature extraction, the accuracy of traditional vision attitude measurement struggles to reach the arc-second level. In particular, under large-field-of-view, long-range conditions, imaging distortion and noise are often distributed unevenly between the centre and the edge of the field of view, so that the measurement errors of different image regions carry systematic deviations. For example, when a target is imaged in the central region of the field of view, it is less affected by optical distortion and feature extraction is more accurate; when the target approaches the edge of the field, factors such as lens distortion and non-uniform illumination introduce larger positioning errors, so the attitude angles obtained by vision measurement contain region-dependent systematic errors. In addition, small deviations in the camera's intrinsic parameters (such as the principal-point position) also introduce systematic biases into the pose calculation, affecting the measured attitude and position.
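The region-dependent error the background describes can be illustrated with the standard Brown radial-distortion model. The coefficient below is purely illustrative and not taken from the patent; the point is only that the pixel displacement grows with r², so edge-of-field targets are displaced far more than central ones:

```python
import numpy as np

def radial_shift(x, y, k1=-1e-8):
    # Brown model, first radial term only: x_d = x * (1 + k1 * r^2),
    # with (x, y) in pixels relative to the principal point.
    # k1 is an illustrative, made-up coefficient.
    s = k1 * (x * x + y * y)
    return x * s, y * s

# Displacement magnitude near the centre vs. near the edge of the field.
near_centre = float(np.hypot(*radial_shift(50.0, 50.0)))
near_edge = float(np.hypot(*radial_shift(900.0, 900.0)))
```

With these numbers the same optic leaves a central feature essentially unaffected (a few thousandths of a pixel) while shifting an edge feature by tens of pixels, which is exactly the kind of region-dependent systematic error the gridded compensation targets.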
Summary of the Invention

In the prior art, camera distortion is usually corrected, and measurement accuracy improved, by a global calibration method; but such global correction cannot compensate separately for the error differences of different field-of-view regions, which limits further improvement of attitude measurement accuracy. How to compensate for the error characteristics of different image regions, so as to achieve arc-second-level attitude estimation, has therefore become a technical problem to be solved.

The invention aims to provide an attitude error compensation method for a large-field-of-view remote vision measurement system, to address the problem that existing vision attitude measurement accuracy is limited by non-uniform error distribution. The method comprises: step 1, acquiring a telemetry data packet and parsing it to obtain an attitude measurement data set containing multiple groups of attitude data, wherein each group comprises a rotation matrix of the target coordinate system relative to the laser tracker coordinate system, a translation vector, and a rotation matrix of the target coordinate system relative to the camera coordinate system; step 2, grid partitioning, namely generating, according to the pixel-coordinate distribution range of the target feature points in all attitude measurement data sets, a rectangular bounding frame covering all target imaging areas on the image plane, dividing the rectangular bounding frame into m × n grid cells, establishing a set of grid-cell centre coordinates, and determining the grid cell from the centre coordinates of the target feature points; step 3, calculating the rotation matrix of the camera coordinate system relative to the laser tracker coordinate system to obtain a global average rotation matrix, and calculating the corresponding rotation matrices within each grid cell to obtain a grid average rotation matrix; step 4, constructing an attitude compensation matrix for each grid cell according to the global average rotation matrix and the grid average rotation matrix; step 5, compensating, according to the attitude compensation matrix, the rotation matrix of the camera coordinate system relative to the target coordinate system obtained from the original attitude solution corresponding to each grid cell, to obtain a compensated rotation matrix; and step 6, optimizing the rotation matrix of the camera coordinate system relative to the laser tracker coordinate system according to the compensated rotation matrix.

The beneficial effects of the invention are as follows: by introducing a gridded compensation strategy, the attitude measurement errors of different image regions are corrected in a targeted manner, achieving higher-precision attitude measurement. With the technical scheme provided by the invention, attitude-angle measurement accuracy can be improved to the arc-second level, and the method offers higher accuracy and robustness compared with the traditional method.
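Steps 3 and 6 both rely on chaining the three coordinate frames (target, camera, laser tracker). Under the usual composition convention p_L = R_CL · p_C and p_C = R_TC · p_T, the camera-to-tracker rotation follows as R_CL = R_TL · R_TCᵀ. The snippet below verifies this chain numerically on made-up poses; the convention itself is an assumption, since the published text elides the formulas:

```python
import numpy as np

def rotz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def roty(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

# Made-up test poses: camera w.r.t. laser tracker, target w.r.t. camera.
R_CL_true = roty(0.3) @ rotz(-0.2)
R_TC = rotz(0.7)

# The target-w.r.t.-tracker pose follows from composing the chain.
R_TL = R_CL_true @ R_TC

# Recover the camera-w.r.t.-tracker rotation as used in steps 3 and 6
# (the inverse of a rotation matrix is its transpose).
R_CL = R_TL @ R_TC.T
```

The recovered R_CL matches the pose it was built from, confirming that the transpose correctly inverts the target-to-camera rotation in the chain.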