CN-121982113-A - Camera and radar external parameter optimization method based on operation site moving target
Abstract
The embodiment of the invention discloses a camera and radar external parameter optimization method based on moving targets at a working site. The method collects continuous frames of images and point clouds, extracts moving targets from the images by an optical flow method and from the point clouds by a frame difference method, projects the point-cloud moving targets onto the images with the device's existing calibration matrix, matches them via rigid transformation, and stores the matching results in a set M. It then extracts and matches the edge features of the images and point clouds, obtains the external parameters with the PnP algorithm, and calculates the projection error: if the error meets the standard the result is output; otherwise the moving target with the largest error is removed and feature matching and the subsequent steps are repeated. By using the point-cloud and image features of moving targets to iteratively optimize the external parameter matrix and removing moving targets with larger errors, the invention realizes camera and laser radar external parameter calibration based on moving targets in any scene, effectively solves the problem that calibration depends on a calibration board and external parameters cannot be corrected during field use, and can well meet the measurement requirements of transformer substations.
Inventors
- LIAO ZHIPENG
- LU JUN
- HE HAIQIANG
- HAN CUN
- LUO JIANKENG
- XIA GUOHUA
- LI MENG
- LI JUAN
- FANG YONGFENG
- JIANG WEIZHEN
Assignees
- Guangdong Weiheng Power Transmission and Transformation Engineering Co., Ltd. (广东威恒输变电工程有限公司)
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2025-12-25
Claims (3)
- 1. A camera and radar external parameter optimization method based on a moving target of a working site, characterized by comprising the following steps: S1, synchronously acquiring images and point cloud data of continuous frames with the camera and the laser radar to be calibrated, the images being collected into a set I and the point clouds into a set P, with the specific expressions I = {I_1, I_2, ..., I_n} and P = {P_1, P_2, ..., P_n}, where I_n is the image data acquired the nth time and P_n the point cloud data, the acquired images and point clouds being stored in I and P in acquisition order; S2, extracting the moving targets O_i^img from the image I_i acquired the ith time by the optical flow method; S3, extracting the moving targets O_i^pc from the point cloud P_i acquired the ith time by the frame difference method; S4, projecting the point clouds of the moving-target set onto the image with the existing calibration matrix of the device, the projected point cloud being O'_i^pc; S5, matching O_i^img with O'_i^pc by a rigid-body transformation method and storing the matched moving targets in M, i.e. M = {(m_1^img, m_1^pc), ..., (m_k^img, m_k^pc)} for all image and point-cloud moving targets matched over k acquisitions, where (m_k^img, m_k^pc) is the kth matched pair of image moving target and point-cloud moving target; S6, if the number of moving targets in M is smaller than a threshold N_min, returning to step S2; otherwise traversing M, extracting the image edge features with the Canny operator and the point-cloud edge features with a normal-estimation method, registering the two sets of edge features with the Hungarian matching algorithm, and obtaining the feature set F of the images and point clouds of all moving targets in M; S7, performing external parameter calibration of the camera and the laser radar with the PnP algorithm using F to obtain an external parameter calibration matrix T; S8, projecting all the point-cloud moving targets of M onto the image with the external parameter calibration matrix T and calculating the Euclidean distances between them and the corresponding image features as projection errors; S9, calculating the root mean square error of the projection errors; if it is smaller than a threshold ε, taking T as the camera and laser radar external parameter calibration matrix; otherwise removing from M the moving target with the largest projection error and returning to S6.
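Steps S6-S9 above form a calibrate-project-prune loop. The following sketch illustrates that loop in Python, with a plain direct linear transform (DLT) estimate of a 3x4 projection matrix standing in for the claim's PnP solver, and with the Canny/normal-estimation feature extraction and Hungarian matching assumed to have already produced matched 2D-3D point sets per moving target; all function names, thresholds, and the camera model are illustrative, not from the patent.

```python
import numpy as np

def dlt_projection(pts3d, pts2d):
    """Estimate a 3x4 projection matrix from 3D-2D correspondences via the
    direct linear transform (a stand-in for the claim's PnP step, S7)."""
    n = pts3d.shape[0]
    Xh = np.hstack([pts3d, np.ones((n, 1))])      # homogeneous 3D points
    rows = []
    for i in range(n):
        rows.append(np.concatenate([Xh[i], np.zeros(4), -pts2d[i, 0] * Xh[i]]))
        rows.append(np.concatenate([np.zeros(4), Xh[i], -pts2d[i, 1] * Xh[i]]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)                   # null vector of the system

def project(P, pts3d):
    """Project 3D points with a 3x4 matrix and dehomogenize (step S8)."""
    Xh = np.hstack([pts3d, np.ones((pts3d.shape[0], 1))])
    p = (P @ Xh.T).T
    return p[:, :2] / p[:, 2:3]

def calibrate_iteratively(targets, rmse_thresh=1.0):
    """S6-S9 loop: targets is a list of (pts3d, pts2d) matched moving targets.
    Solve for the projection, measure reprojection RMSE, and while it exceeds
    the threshold drop the target with the largest mean error and re-solve."""
    targets = list(targets)
    while targets:
        X = np.vstack([t[0] for t in targets])
        x = np.vstack([t[1] for t in targets])
        P = dlt_projection(X, x)                               # S7
        residuals = np.linalg.norm(project(P, X) - x, axis=1)  # S8
        rmse = float(np.sqrt(np.mean(residuals ** 2)))         # S9
        if rmse < rmse_thresh or len(targets) <= 2:
            return P, rmse
        per_target = [np.linalg.norm(project(P, t3) - t2, axis=1).mean()
                      for t3, t2 in targets]
        targets.pop(int(np.argmax(per_target)))    # remove the worst target
    return None, float("inf")
```

In practice the PnP solve would enforce the known camera intrinsics (e.g. via an off-the-shelf PnP solver) rather than estimate a free projection matrix, but the prune-and-retry structure of S9 is the same.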
- 2. The camera and radar external parameter optimization method based on a moving target of the operation site according to claim 1, wherein step S2 comprises the following process: S2.1, preprocessing the image: converting it into a grayscale image and smoothing the noise in it with a Gaussian filtering algorithm; S2.2, calculating the gray-level change rate of each pixel of the grayscale image in the x-axis and y-axis directions with the Sobel operator to obtain the spatial gradient matrices Ix and Iy of the image; S2.3, subtracting the pixel values at corresponding positions of the (i-1)th acquired picture from those of the ith acquired picture to obtain the temporal gradient It of each pixel; S2.4, within a 3×3 neighbourhood window of each pixel, calculating the sum of squares of the x-direction spatial gradient, Gxx = Σ Ix(x,y)², where Ix(x,y) is the x-direction gradient of the gray value at position (x,y) of the image; the sum of products of the x-direction and y-direction spatial gradients, Gxy = Σ Ix(x,y)·Iy(x,y); the sum of squares of the y-direction spatial gradient, Gyy = Σ Iy(x,y)²; and, within the same neighbourhood window, the sums of products of the spatial gradients with the temporal gradient, Gxt = Σ Ix(x,y)·It(x,y) and Gyt = Σ Iy(x,y)·It(x,y), where the temporal gradient is scaled by Δt, the time difference between acquiring the two images; S2.5, constructing the linear equation system Gxx·u + Gxy·v = -Gxt and Gxy·u + Gyy·v = -Gyt; S2.6, solving the linear equation system to obtain the optical-flow velocity (u, v) of the pixel and thereby the moving targets in the image; the set of moving targets in the ith frame image is denoted O_i^img.
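Steps S2.2-S2.6 are the classical Lucas-Kanade equations: the 3×3-window sums Gxx, Gxy, Gyy, Gxt, Gyt form a 2×2 linear system whose solution is the per-pixel flow velocity. A minimal numpy sketch of that system (central differences stand in for the claim's Sobel operator; names are illustrative):

```python
import numpy as np

def lucas_kanade_flow(img_prev, img_curr, y, x, win=1):
    """Solve the claim's 3x3-window optical-flow system at pixel (y, x).
    img_prev/img_curr: consecutive grayscale frames as float arrays
    (frames i-1 and i of S2.3); returns the flow velocity (u, v)."""
    # S2.2: spatial gradients (central differences instead of Sobel)
    Iy_full, Ix_full = np.gradient(img_prev)
    It_full = img_curr - img_prev            # S2.3: temporal gradient
    w = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    Ix, Iy, It = Ix_full[w].ravel(), Iy_full[w].ravel(), It_full[w].ravel()
    # S2.4: windowed sums Gxx, Gxy, Gyy and Gxt, Gyt
    G = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    # S2.5/S2.6: solve the 2x2 linear system for the velocity (u, v)
    return np.linalg.solve(G, b)
```

The 2×2 system is only solvable where the window has gradient variation in both directions (G nonsingular), which is why textured moving regions are where the flow, and hence the moving target, is recovered.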
- 3. The camera and radar external parameter optimization method based on a moving target of the operation site according to claim 1, wherein step S3 comprises the following process: S3.1, filtering the point cloud, i.e. removing noise points with Gaussian filtering; S3.2, computing the Euclidean distance between each point of the point cloud P_i and the point cloud P_{i-1}; if the distance is greater than a threshold d_min, the point is considered a difference point; S3.3, after the difference points are obtained, performing connected-region analysis on them, merging adjacent difference points into connected regions, then clustering the difference points so that spatially close difference points fall into one class, forming several clusters, each of which may correspond to one moving target; the set of moving targets in the ith frame point cloud is O_i^pc.
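Claim 3's frame-difference pipeline (S3.2 difference points, S3.3 connected-region clustering) can be sketched as follows; brute-force O(N²) distances and greedy region growing keep the sketch dependency-free, while a KD-tree and a clustering library (e.g. DBSCAN) would be the practical choice. All names and threshold values are illustrative:

```python
import numpy as np

def frame_difference_targets(pc_prev, pc_curr, diff_thresh=0.5, cluster_eps=0.4):
    """S3.2-S3.3 sketch: find points of pc_curr with no near neighbour in
    pc_prev (difference points), then merge spatially adjacent difference
    points into clusters, each cluster a candidate moving target.
    Both clouds are (N, 3) arrays assumed already denoised (S3.1)."""
    # S3.2: nearest-neighbour distance of each current point to the
    # previous frame; far-away points are difference points
    d2 = np.sum((pc_curr[:, None, :] - pc_prev[None, :, :]) ** 2, axis=2)
    diff = pc_curr[np.min(d2, axis=1) > diff_thresh ** 2]
    # S3.3: greedy region growing; points within cluster_eps are connected
    clusters, unused = [], list(range(len(diff)))
    while unused:
        frontier = [unused.pop(0)]
        members = list(frontier)
        while frontier:
            i = frontier.pop()
            near = [j for j in unused
                    if np.linalg.norm(diff[i] - diff[j]) <= cluster_eps]
            for j in near:
                unused.remove(j)
            frontier.extend(near)
            members.extend(near)
        clusters.append(diff[members])
    return clusters
```

Each returned cluster is one candidate moving target of O_i^pc, ready for the projection and matching of steps S4-S5.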
Description
Camera and radar external parameter optimization method based on operation site moving target

Technical Field

The invention relates to the technical field of transformer substation management, in particular to a camera and radar external parameter optimization method based on moving targets at an operation site.

Background

In the safety management of transformer substations, the distance between energized equipment and surrounding objects needs to be measured, and the measurement process carries a hidden danger of electric shock. The traditional manual measurement mode has problems such as measurement positions being difficult to reach, low efficiency of the measurement process, and low measurement accuracy. In order to improve measurement accuracy and efficiency and ensure personnel safety, a camera and a laser radar are mainly used to fuse data and accurately measure the energized body and surrounding objects. Although current camera and laser radar external parameter calibration methods can achieve a certain precision in the overall fusion of image and point cloud, as the device is used, errors arise in the camera-radar calibration, so that the fusion of the wire point cloud and the image shows obvious deviation, and it is difficult to meet the requirement of measuring distances between the energized body and surrounding objects in the large scene of a transformer substation.

Disclosure of Invention

In order to overcome the defects of the above technology, the invention provides a camera and radar external parameter optimization method based on a moving target of a working site.
The embodiment of the invention provides a camera and radar external parameter optimization method based on a moving target of a working site, which comprises the following steps: S1, synchronously acquiring images and point cloud data of continuous frames with the camera and the laser radar to be calibrated, the images being collected into a set I and the point clouds into a set P, with the specific expressions I = {I_1, I_2, ..., I_n} and P = {P_1, P_2, ..., P_n}, where I_n is the image data acquired the nth time and P_n the point cloud data, the acquired images and point clouds being stored in I and P in acquisition order; S2, extracting the moving targets O_i^img from the image I_i acquired the ith time by the optical flow method; S3, extracting the moving targets O_i^pc from the point cloud P_i acquired the ith time by the frame difference method; S4, projecting the point clouds of the moving-target set onto the image with the existing calibration matrix of the device, the projected point cloud being O'_i^pc; S5, matching O_i^img with O'_i^pc by a rigid-body transformation method and storing the matched moving targets in M, i.e. M = {(m_1^img, m_1^pc), ..., (m_k^img, m_k^pc)} for all image and point-cloud moving targets matched over k acquisitions, where (m_k^img, m_k^pc) is the kth matched pair of image moving target and point-cloud moving target; S6, if the number of moving targets in M is smaller than a threshold, returning to step S2; otherwise traversing M, extracting the image edge features with the Canny operator and the point-cloud edge features with a normal-estimation method, registering the two sets of edge features with the Hungarian matching algorithm, and obtaining the feature set F of the images and point clouds of all moving targets in M; S7, performing external parameter calibration of the camera and the laser radar with the PnP algorithm using F to obtain an external parameter calibration matrix T; S8, projecting all the point-cloud moving targets of M onto the image with the external parameter calibration matrix T and calculating the Euclidean distances between them and the corresponding image features as projection errors; S9, calculating the root mean square error of the projection errors; if it is smaller than a threshold, taking T as the camera and laser radar external parameter calibration matrix; otherwise removing from M the moving target with the largest projection error and returning to S6.

Preferably, step S2 comprises the following process: S2.1, preprocessing the image, converting it into a grayscale image, and smoothing the noise in it with a Gaussian filtering algorithm; S2.2, calculating the gray-level change rate of each pixel of the grayscale image in the x-axis and y-axis directions with the Sobel operator to obtain the spatial gradient matrices in the x-axis and y-axis directions of the image; S2.3, subtracting the pixel values at corresponding positions of the (i-1)th acquired picture from those of the ith acquired picture to obtain the temporal gradient of each pixel; S2.4, within a 3×3 neighbourhood window of each pixel, calculating the sum of squares of the x-direction spatial gradient Gxx