CN-122023536-A - Collaborative calibration device control method, collaborative calibration method and equipment and engineering vehicle
Abstract
The disclosure provides a control method for a collaborative calibration device, a collaborative calibration method, related equipment, and an engineering vehicle. The control method of the collaborative calibration device comprises: adjusting the brightness of an LED array arranged inside each of four two-dimensional codes according to the ambient light intensity acquired in real time by a light sensor, wherein the four two-dimensional codes are respectively arranged in areas adjacent to the four corners of a rectangular calibration plate in the collaborative calibration device; and adjusting at least one of the height and the pitch angle of an adjustable support bracket of the calibration plate so that the calibration plate appears in the fields of view of an image acquisition device, a laser radar and a millimeter wave radar on an unmanned engineering vehicle.
Inventors
- WANG DONGXU
- ZHOU HONGFU
- REN LIANGCAI
Assignees
- Jiangsu XCMG Guozhong Laboratory Science and Technology Co., Ltd. (江苏徐工国重实验室科技有限公司)
- Xuzhou XCMG Mining Machinery Co., Ltd. (徐州徐工矿业机械有限公司)
Dates
- Publication Date
- 20260512
- Application Date
- 20251104
Claims (20)
- 1. A collaborative calibration device control method, performed by a control device in a collaborative calibration device, comprising: adjusting the brightness of a light-emitting diode (LED) array in each of four two-dimensional codes according to an ambient illumination intensity acquired in real time by a light sensor, wherein the four two-dimensional codes are respectively arranged in areas adjacent to the four corners of a rectangular calibration plate in the collaborative calibration device; and adjusting at least one of a height and a pitch angle of an adjustable support bracket of the calibration plate so that the calibration plate appears in the fields of view of an image acquisition device, a laser radar and a millimeter wave radar on an unmanned engineering vehicle.
- 2. The collaborative calibration device control method according to claim 1, wherein adjusting at least one of the height and the pitch angle of the adjustable support bracket of the calibration plate comprises: adjusting at least one of the height and the pitch angle of the adjustable support bracket of the calibration plate according to indication information sent by the unmanned engineering vehicle, wherein the indication information comprises a type of the unmanned engineering vehicle and distance information between the unmanned engineering vehicle and the collaborative calibration device.
- 3. The collaborative calibration device control method according to claim 1 or 2, wherein the brightness of the LED array in each two-dimensional code is positively correlated with the ambient illumination intensity.
- 4. A control apparatus, comprising: a memory; and a processor coupled to the memory, the processor configured to implement the collaborative calibration device control method of any one of claims 1 to 3 based on execution of instructions stored in the memory.
- 5. A collaborative calibration device, comprising: the control apparatus according to claim 4; a calibration plate, wherein the calibration plate is rectangular, its surface is covered with a diffuse reflection coating, a two-dimensional code is arranged in an area adjacent to each corner of the calibration plate, and a light-emitting diode (LED) array is arranged in each two-dimensional code, the LED array adjusting its brightness under control of the control apparatus; a plurality of light sensors for detecting ambient illumination intensity, the plurality of light sensors being uniformly arranged on the calibration plate; a corner reflector arranged at the geometric center point of the back surface of the calibration plate; and an adjustable support bracket for mounting the calibration plate, the adjustable support bracket being capable of adjusting at least one of a height and a pitch angle of the calibration plate.
- 6. The collaborative calibration device according to claim 5, wherein the adjustable support bracket is configured to adjust at least one of the height and the pitch angle of the calibration plate according to control of the control apparatus.
- 7. A collaborative calibration method, performed by a parameter configuration device in an unmanned engineering vehicle, comprising: reading initial values of external parameters of an image acquisition device, a laser radar and a millimeter wave radar; controlling the image acquisition device, the laser radar and the millimeter wave radar to acquire time-synchronized first image data, a first laser radar point cloud and first millimeter wave radar data; preprocessing the first image data, the first laser radar point cloud and the first millimeter wave radar data respectively to obtain second image data, a second laser radar point cloud and second millimeter wave radar data; extracting a first feature, a second feature and a third feature of a collaborative calibration device from the second image data, the second laser radar point cloud and the second millimeter wave radar data respectively, wherein the collaborative calibration device is the collaborative calibration device according to claim 5 or 6; and jointly optimizing the initial values of the external parameters of the image acquisition device, the laser radar and the millimeter wave radar according to the first feature, the second feature and the third feature of the collaborative calibration device, to respectively obtain external parameter calibration results of the image acquisition device, the laser radar and the millimeter wave radar.
- 8. The collaborative calibration method according to claim 7, wherein extracting the first feature from the second image data comprises: establishing a world coordinate system with a geometric center point of a calibration plate in the collaborative calibration device as an origin; processing the second image data to identify four two-dimensional codes on the calibration plate; solving, according to the four two-dimensional codes, a rotation matrix and a translation vector of an image coordinate system relative to the world coordinate system by using a Perspective-n-Point (PnP) algorithm; projecting four corner points of the calibration plate from the world coordinate system to the image coordinate system according to preset size information of the calibration plate, the rotation matrix and the translation vector; extracting two-dimensional coordinates of the four corner points in the image coordinate system; and sorting the two-dimensional coordinates of the four corner points in a specified order, and taking the obtained first sorting result as the first feature.
- 9. The collaborative calibration method according to claim 8, wherein extracting the second feature from the second laser radar point cloud comprises: rotating the second laser radar point cloud onto a designated plane; fitting a minimum bounding box of the second laser radar point cloud in the designated plane, and extracting four vertices of the minimum bounding box as four candidate corner points; converting the four candidate corner points into three-dimensional space to obtain three-dimensional coordinates of the four corner points of the calibration plate in a laser radar coordinate system; and sorting the three-dimensional coordinates of the four corner points in a specified order, and taking the obtained second sorting result as the second feature.
- 10. The collaborative calibration method according to claim 9, wherein extracting the third feature from the second millimeter wave radar data comprises: performing Euclidean clustering and target tracking on the second millimeter wave radar data to obtain a first tracking target set; selecting target points meeting a preset condition from the first tracking target set to generate a second tracking target set, wherein a target point meets the preset condition when the deviation between its radar cross section (RCS) value and a preset RCS value of a corner reflector arranged at the geometric center point of the back surface of the calibration plate is smaller than a preset RCS threshold; determining three-dimensional coordinates of the geometric center point of the calibration plate in the laser radar coordinate system according to the three-dimensional coordinates of the four corner points of the calibration plate in the laser radar coordinate system; converting the three-dimensional coordinates of the geometric center point of the calibration plate from the laser radar coordinate system into a millimeter wave radar coordinate system; selecting a target point corresponding to the corner reflector from the second tracking target set, wherein the deviation between the three-dimensional coordinates of the corresponding target point and the three-dimensional coordinates of the geometric center point of the calibration plate in the millimeter wave radar coordinate system is smaller than a preset distance threshold; and taking the three-dimensional coordinates of the corresponding target point in the millimeter wave radar coordinate system as the third feature.
- 11. The collaborative calibration method according to claim 10, wherein jointly optimizing the initial values of the external parameters of the image acquisition device, the laser radar and the millimeter wave radar comprises: jointly optimizing the initial values of the external parameters of the image acquisition device, the laser radar and the millimeter wave radar by using a nonlinear least squares algorithm, according to the deviation between the first sorting result and the second sorting result and the deviation between the three-dimensional coordinates of the corresponding target point in the millimeter wave radar coordinate system and prestored three-dimensional coordinates of the geometric center point of the corner reflector in the millimeter wave radar coordinate system.
- 12. The collaborative calibration method according to claim 7, wherein preprocessing the first image data comprises: when the ambient illumination intensity is greater than a specified illumination intensity threshold, preprocessing the first image data by using a histogram equalization algorithm to obtain the second image data; and when the ambient illumination intensity is not greater than the specified illumination intensity threshold, preprocessing the first image data by using at least one of an adaptive brightness compensation algorithm and a multi-scale contrast enhancement algorithm to obtain the second image data.
- 13. The collaborative calibration method according to claim 7, wherein preprocessing the first laser radar point cloud comprises: determining a region of interest according to an estimated position of the calibration plate; extracting points located in the region of interest from the first laser radar point cloud as a first point cloud to be processed; downsampling the first point cloud to be processed according to the beam density of the laser radar to obtain a second point cloud to be processed; extracting the plane of the calibration plate by using a plane segmentation algorithm according to preset normal direction information of the calibration plate; and deleting outlier points from the second point cloud to be processed according to the plane to obtain the second laser radar point cloud.
- 14. The collaborative calibration method according to claim 7, wherein preprocessing the first millimeter wave radar data comprises: determining a region of interest according to an estimated position of the calibration plate; and extracting points located in the region of interest from the first millimeter wave radar data as the second millimeter wave radar data.
- 15. The collaborative calibration method according to claim 7, further comprising: visually presenting the external parameter calibration results to determine a reprojection error; and, when the reprojection error is greater than a preset error threshold, re-executing the step of controlling the image acquisition device, the laser radar and the millimeter wave radar to acquire time-synchronized first image data, a first laser radar point cloud and first millimeter wave radar data.
- 16. The collaborative calibration method according to claim 7, further comprising: determining the distance between the unmanned engineering vehicle and the collaborative calibration device according to the second laser radar point cloud; and sending indication information to the collaborative calibration device, wherein the indication information comprises a type of the unmanned engineering vehicle and distance information indicating the distance.
- 17. The collaborative calibration method according to any one of claims 7 to 16, further comprising: configuring parameters of the image acquisition device by using internal parameters of the image acquisition device and the external parameter calibration result of the image acquisition device; configuring parameters of the laser radar by using the external parameter calibration result of the laser radar; and configuring parameters of the millimeter wave radar by using the external parameter calibration result of the millimeter wave radar.
- 18. A parameter configuration apparatus, comprising: a memory; and a processor coupled to the memory, the processor configured to implement the collaborative calibration method of any one of claims 7 to 17 based on execution of instructions stored in the memory.
- 19. An unmanned engineering vehicle, comprising: the parameter configuration apparatus according to claim 18; an image acquisition device configured to acquire image data; a laser radar configured to acquire a laser radar point cloud; and a millimeter wave radar configured to acquire millimeter wave radar data.
- 20. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement the method of any one of claims 1 to 3 and 7 to 17.
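Editorial note: the corner-projection step of claim 8 (once a PnP solver has produced the rotation matrix and translation vector, the four board corners are projected from the plate-centred world frame into the image) can be sketched in NumPy under an assumed pinhole camera model. The intrinsic matrix, plate size, and pose below are illustrative placeholders, not values from the patent.

```python
import numpy as np

def project_plate_corners(R, t, K, plate_w, plate_h):
    """Project the 4 corners of a plate-centred world frame into the image.

    R, t : rotation matrix / translation vector of the world (plate) frame
           relative to the camera, e.g. as returned by a PnP solver.
    K    : 3x3 camera intrinsic matrix.
    Corners are ordered top-left, top-right, bottom-right, bottom-left
    in the plate frame (the "specified order" of claim 8 is not given).
    """
    w, h = plate_w / 2.0, plate_h / 2.0
    corners_world = np.array([[-w,  h, 0.0],
                              [ w,  h, 0.0],
                              [ w, -h, 0.0],
                              [-w, -h, 0.0]])
    cam = corners_world @ R.T + t      # world frame -> camera frame
    uvw = cam @ K.T                    # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]    # normalise by depth

# Illustrative pose: identity rotation, plate 2 m in front of the camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
pix = project_plate_corners(np.eye(3), np.array([0.0, 0.0, 2.0]), K, 1.0, 0.8)
```

Sorting `pix` in a fixed corner order then yields the "first sorting result" used as the first feature.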
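Editorial note: the minimum-bounding-box fit of claim 9 can be approximated with a PCA-aligned box once the point cloud has been rotated onto the designated plane. This is a sketch, not the patented procedure: PCA gives the exact minimum-area box only for roughly rectangular clusters such as a calibration plate; a rotating-calipers search would be the general solution.

```python
import numpy as np

def min_bounding_box_corners(points_2d):
    """Approximate the minimum bounding box of planar points via PCA
    and return its 4 vertices (the candidate corner points of claim 9).

    points_2d: (N, 2) array of plate points already rotated into a plane.
    """
    pts = np.asarray(points_2d, dtype=float)
    centre = pts.mean(axis=0)
    # Principal axes of the cluster.
    _, _, vt = np.linalg.svd(pts - centre)
    local = (pts - centre) @ vt.T          # rotate into the PCA frame
    lo, hi = local.min(axis=0), local.max(axis=0)
    box_local = np.array([[lo[0], lo[1]],
                          [hi[0], lo[1]],
                          [hi[0], hi[1]],
                          [lo[0], hi[1]]])
    return box_local @ vt + centre         # back to the input frame

# A 2x1 rectangle plus an interior point; the corners should be recovered.
rect = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [0.0, 1.0],
                 [1.0, 0.5]])
corners = min_bounding_box_corners(rect)
```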
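Editorial note: the Euclidean-clustering and RCS-filtering steps of claim 10 can be sketched as below. The greedy clustering and the dict-based target layout (`'xyz'`, `'rcs'` keys) are illustrative assumptions; production code would use a KD-tree and the radar driver's own target type.

```python
import numpy as np

def euclidean_cluster(points, radius):
    """Greedy Euclidean clustering: a point joins a cluster if it lies
    within `radius` of any member (a simple stand-in for the clustering
    step of claim 10)."""
    pts = np.asarray(points, dtype=float)
    unassigned = set(range(len(pts)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unassigned
                    if np.linalg.norm(pts[j] - pts[i]) <= radius]
            for j in near:
                unassigned.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append(sorted(cluster))
    return clusters

def filter_by_rcs(targets, rcs_ref, rcs_tol):
    """Keep tracked targets whose radar cross section is within `rcs_tol`
    of the corner reflector's nominal RCS (the preset-condition filter
    of claim 10)."""
    return [t for t in targets if abs(t['rcs'] - rcs_ref) < rcs_tol]

pts = [[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [5.0, 5.0, 0.0]]
clusters = euclidean_cluster(pts, radius=0.5)
kept = filter_by_rcs([{'xyz': (0, 0, 0), 'rcs': 9.8},
                      {'xyz': (5, 5, 0), 'rcs': 1.0}],
                     rcs_ref=10.0, rcs_tol=1.0)
```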
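Editorial note: the nonlinear-least-squares joint optimization of claim 11 is sketched here on a deliberately reduced toy problem: estimating only a 3-D translation between two point sets with a Gauss-Newton loop. The real method stacks residuals from all three sensors and also optimizes rotations; none of the numbers below come from the patent.

```python
import numpy as np

def gauss_newton_translation(src, dst, iters=10):
    """Estimate a translation t minimising ||src + t - dst||^2 by
    Gauss-Newton (residual r = src + t - dst, Jacobian dr/dt = I, so
    this toy converges in one step; it only illustrates the shape of
    the claim-11 optimization loop)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    t = np.zeros(3)
    for _ in range(iters):
        r = (src + t - dst).ravel()            # stacked residual vector
        J = np.tile(np.eye(3), (len(src), 1))  # Jacobian of r w.r.t. t
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        t += step
        if np.linalg.norm(step) < 1e-12:
            break
    return t

src = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
dst = [[1.0, 2.0, 3.0], [2.0, 2.0, 3.0], [1.0, 3.0, 3.0]]
t = gauss_newton_translation(src, dst)
```

In the full problem of claim 11, the residual vector would concatenate the corner-point deviations (first vs. second sorting result) and the corner-reflector center deviation, and the parameter vector would hold all sensor extrinsics.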
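Editorial note: the illumination-dependent branch of claim 12 can be sketched as below with a plain histogram-equalization routine. The lux threshold is an illustrative placeholder, and the low-light branch is a simple gain stub, not the patented adaptive brightness compensation or multi-scale contrast enhancement.

```python
import numpy as np

LUX_THRESHOLD = 500.0  # illustrative threshold, not from the patent

def equalize_histogram(img):
    """Plain histogram equalization for an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalise to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)
    return lut[img]

def preprocess(img, ambient_lux):
    """Claim 12 branches on ambient illumination: equalize under strong
    light; under weak light apply brightness compensation, stubbed here
    as a fixed gain (placeholder for the patented algorithms)."""
    if ambient_lux > LUX_THRESHOLD:
        return equalize_histogram(img)
    return np.clip(img.astype(np.int32) * 2, 0, 255).astype(np.uint8)

dark = np.array([[10, 10], [20, 20]], dtype=np.uint8)
bright_out = preprocess(dark, ambient_lux=800.0)  # equalization branch
dim_out = preprocess(dark, ambient_lux=100.0)     # compensation branch
```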
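Editorial note: the first two preprocessing steps shared by claims 13 and 14 (region-of-interest extraction around the estimated plate position, then downsampling) can be sketched as below. The voxel-grid averaging is one common downsampling choice; the patent only says the rate follows the lidar's beam density.

```python
import numpy as np

def crop_roi(points, roi_min, roi_max):
    """Keep points inside an axis-aligned region of interest around the
    estimated calibration-plate position (claims 13 and 14)."""
    pts = np.asarray(points, float)
    mask = np.all((pts >= roi_min) & (pts <= roi_max), axis=1)
    return pts[mask]

def voxel_downsample(points, voxel):
    """Downsample by averaging points that fall into the same voxel;
    the voxel size would be chosen from the lidar's beam density."""
    pts = np.asarray(points, float)
    keys = np.floor(pts / voxel).astype(np.int64)
    out = {}
    for k, p in zip(map(tuple, keys), pts):
        sum_, n = out.get(k, (np.zeros(3), 0))
        out[k] = (sum_ + p, n + 1)
    return np.array([s / n for s, n in out.values()])

cloud = np.array([[0.0, 0.0, 0.0], [0.02, 0.0, 0.0],
                  [1.0, 1.0, 1.0], [9.0, 9.0, 9.0]])
roi = crop_roi(cloud, roi_min=np.array([-1.0, -1.0, -1.0]),
               roi_max=np.array([2.0, 2.0, 2.0]))   # drops the far point
down = voxel_downsample(roi, voxel=0.1)             # merges the close pair
```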
Description
Collaborative calibration device control method, collaborative calibration method and equipment and engineering vehicle
Technical Field
The disclosure relates to the technical field of parameter calibration, and in particular to a collaborative calibration device control method, a collaborative calibration method and equipment, and an engineering vehicle.
Background
With the development of mine intelligence, unmanned engineering vehicles rely on multi-sensor fusion technology (e.g., camera, laser radar, millimeter wave radar) to achieve environmental perception. In this context, the accuracy of sensor external parameter calibration is directly related to the operating safety and reliability of the unmanned engineering vehicle, and the efficiency and robustness of the calibration also directly affect the vehicle's operating efficiency. Currently, external parameter calibration is generally performed with a camera and a laser radar.
Disclosure of Invention
The inventors noted that the related art generally calibrates external parameters only between a camera and a laser radar, and cannot achieve integrated calibration across multiple modalities (camera, laser radar, millimeter wave radar). Accordingly, the collaborative calibration device control method and the collaborative calibration method of the disclosure can effectively achieve integrated calibration of a camera, a laser radar and a millimeter wave radar, and can help ensure the operating safety of an unmanned engineering vehicle.
In a first aspect of the disclosure, a control method of a collaborative calibration apparatus is provided, which is performed by a control device in the collaborative calibration apparatus, and includes adjusting the brightness of a Light Emitting Diode (LED) array disposed inside each of four two-dimensional codes, which are respectively disposed in areas adjacent to the four corners of a rectangular calibration plate in the collaborative calibration apparatus, according to an ambient light intensity collected in real time by a light sensor, and adjusting at least one of a height and a pitch angle of an adjustable support bracket of the calibration plate so that the calibration plate appears in the fields of view of an image acquisition device, a laser radar, and a millimeter wave radar on an unmanned engineering vehicle. In some embodiments, adjusting at least one of the height and pitch angle of the adjustable support bracket of the calibration plate includes adjusting at least one of the height and pitch angle of the adjustable support bracket of the calibration plate based on indication information sent by the unmanned engineering vehicle, wherein the indication information includes a type of the unmanned engineering vehicle and distance information between the unmanned engineering vehicle and the collaborative calibration device. In some embodiments, the brightness of the LED array inside each two-dimensional code is positively correlated with the ambient illumination intensity. In a second aspect of the disclosure, a control apparatus is provided that includes a memory and a processor coupled to the memory, the processor configured to implement the collaborative calibration device control method of any of the embodiments above based on execution of instructions stored in the memory.
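Editorial note: the positive correlation between ambient illumination and LED brightness described above can be sketched as a clamped linear mapping from measured lux to a PWM duty cycle. The lux range and duty-cycle bounds are illustrative assumptions, not values from the disclosure.

```python
def led_duty_cycle(ambient_lux, lux_min=0.0, lux_max=100000.0,
                   duty_min=0.1, duty_max=1.0):
    """Map ambient illumination (lux) to an LED PWM duty cycle so that
    brighter surroundings yield a brighter two-dimensional code, keeping
    its contrast stable for the camera. The mapping is monotonically
    increasing (the positive correlation of the embodiments) and clamped
    to the panel's supported duty-cycle range."""
    frac = (ambient_lux - lux_min) / (lux_max - lux_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to the supported range
    return duty_min + frac * (duty_max - duty_min)

# Usage: the control device polls the light sensors and updates the LEDs.
duty_dim = led_duty_cycle(100.0)      # overcast pit: near minimum duty
duty_sun = led_duty_cycle(100000.0)   # full sun: maximum duty
```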
In a third aspect of the disclosure, a collaborative calibration apparatus is provided, which includes the control device according to any one of the embodiments; a calibration plate, wherein the calibration plate is rectangular, its surface is covered with a diffuse reflection coating, a two-dimensional code is arranged in an area adjacent to each corner of the calibration plate, and a Light Emitting Diode (LED) array is arranged in each two-dimensional code, the LED array adjusting its brightness under control of the control device; a plurality of light sensors for detecting ambient light intensity, the plurality of light sensors being uniformly arranged on the calibration plate; a corner reflector arranged at a geometric center point of a back surface of the calibration plate; and an adjustable support bracket for mounting the calibration plate, the adjustable support bracket being capable of adjusting at least one of a height and a pitch angle of the calibration plate. In some embodiments, the adjustable support bracket is configured to adjust at least one of a height and a pitch angle of the calibration plate according to control of the control device. In a fourth aspect of the disclosure, a collaborative calibration method is provided, which is executed by parameter configuration equipment in an unmanned engineering vehicle, and includes reading initial values of external parameters of an image acquisition device, a laser radar and a millimeter wave radar, and controlling the image acquisition device, the laser radar and the millimeter wave radar to acquire time-synchronized first image data, a first laser radar point cloud and first millimeter wave radar data.