
KR-20260064170-A - Method and apparatus for camera and LiDAR sensor-based real-time calibration misalignment detection and automatic correction


Abstract

A method and apparatus are provided for detecting and automatically correcting, in real time, calibration misalignment based on a camera and a LiDAR sensor. The automatic calibration method according to an embodiment of the present invention uses a pre-built 360° panoramic image and a 3D point cloud to detect and correct, in real time and automatically from sensor data, misalignment caused by changes in the external environment. Furthermore, by integrating the camera and the LiDAR sensor, an object can be detected in the 2D image and the 3D point cloud corresponding to the 2D area where the detected object is located can be extracted, improving both accuracy and speed. In addition, by performing calibration against a nearby sensor enclosure that has not experienced misalignment, multiple chained calibrations can be corrected effectively.
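The point cloud-based correction summarized above (claim 6 applies a scan matching algorithm to the reference object's point clouds from before and after the misalignment to recover its rotation and translation) can be illustrated with a closed-form Kabsch/SVD alignment step. The sketch below is a minimal Python illustration, not the patented implementation: it assumes exact one-to-one point correspondences, whereas a real scan matcher (e.g. ICP) would iterate correspondence search and alignment.

```python
import numpy as np

def estimate_rigid_transform(p_before, p_after):
    """Estimate rotation R and translation t mapping the reference
    object's point cloud before misalignment (p_before, Nx3) onto the
    cloud after misalignment (p_after, Nx3), via the Kabsch/SVD closed
    form. This is a single scan-matching step under known point
    correspondences."""
    mu_b = p_before.mean(axis=0)
    mu_a = p_after.mean(axis=0)
    # 3x3 cross-covariance of the centred clouds
    H = (p_before - mu_b).T @ (p_after - mu_a)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_a - R @ mu_b
    return R, t
```

With exact correspondences the SVD step recovers the rigid transform in one shot; iterating it against nearest-neighbour correspondences yields the classic ICP loop that "scan matching" usually refers to.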

Inventors

  • 박경원
  • 송병철

Assignees

  • 한국전자기술연구원 (Korea Electronics Technology Institute)

Dates

Publication Date
2026-05-07
Application Date
2024-10-31

Claims (12)

  1. A method for automatically correcting calibration misalignment, comprising: generating a panoramic image using a pre-installed camera; mapping a 3D point cloud from a pre-installed LiDAR onto the panoramic image; generating an image using the pre-installed camera; searching for a common reference object area by comparing the generated image with the previously generated panoramic image; detecting calibration misalignment of the camera and the LiDAR by comparing the positions of the found common reference objects; and correcting the detected calibration misalignment.
  2. The method of claim 1, wherein the camera and the LiDAR are installed within the same enclosure, and the enclosure is installed on a fixed facility at an intersection.
  3. The method of claim 2, wherein the searching comprises: dividing the panoramic image and the generated image into grids; inferring objects per grid cell from the gridded panoramic image and the gridded generated image; and comparing the positions of the common reference objects.
  4. The method of claim 3, wherein the common reference object is an object that is fixed and immobile at the intersection.
  5. The method of claim 3, wherein the correcting comprises: when misalignment is detected as a result of comparing the positions of the common reference objects, detecting a segmentation of the common reference object in the image; extracting the 3D point clouds that match the image segmentation at the points in time before and after the misalignment; and a first updating of the calibration parameters using the extracted 3D point clouds.
  6. The method of claim 5, wherein the first updating applies a scan matching algorithm to the 3D point cloud after the misalignment, with the 3D point cloud before the misalignment as reference, to calculate rotation and translation information of the common reference object.
  7. The method of claim 1, further comprising additionally correcting the calibration misalignment based on a 3D point cloud generated by a nearby LiDAR whose measurement area overlaps.
  8. The method of claim 7, wherein the additional correcting comprises: acquiring the 3D point cloud generated by the nearby LiDAR whose measurement area overlaps; extracting the 3D point cloud region common with the acquired 3D point cloud; and a second updating of the calibration parameters using the extracted 3D point clouds.
  9. The method of claim 8, wherein the nearby LiDAR is a LiDAR whose calibration reference has not been misaligned.
  10. A system for automatically correcting calibration misalignment, comprising: an edge computer that transmits images generated by a pre-installed camera; and a server computer that generates a panoramic image using an image received from the edge computer, maps a 3D point cloud from a pre-installed LiDAR onto the panoramic image, searches for a common reference object area by comparing the image received from the edge computer with the previously generated panoramic image, detects calibration misalignment between the camera and the LiDAR by comparing the positions of the found common reference objects, and corrects the detected calibration misalignment.
  11. A method for automatically correcting calibration misalignment, comprising: searching for a common reference object area by comparing an image generated by a pre-installed camera with a previously generated panoramic image; detecting calibration misalignment between the camera and a LiDAR by comparing the positions of the found common reference objects; correcting the detected calibration misalignment; and additionally correcting the calibration misalignment based on a 3D point cloud generated by a nearby LiDAR whose measurement area overlaps.
  12. A system for automatically correcting calibration misalignment, comprising: an edge computer that transmits images generated by a pre-installed camera; and a server computer that searches for a common reference object area by comparing an image received from the edge computer with a previously generated panoramic image, detects calibration misalignment between the camera and a LiDAR by comparing the positions of the found common reference objects, corrects the detected calibration misalignment, and additionally corrects the calibration misalignment based on a 3D point cloud generated by a nearby LiDAR whose measurement area overlaps.
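The detection pipeline of claims 1 and 3 (grid the panorama and the live camera image, infer objects per grid cell, compare the positions of common reference objects) can be sketched as below. This is an illustrative Python sketch, not the patented method: object detection is assumed to be done upstream, so each image is represented only by a {label: centroid} dictionary, and the 8×8 grid and 5-pixel threshold are hypothetical choices.

```python
import numpy as np

def grid_cell(pos, img_shape, grid=(8, 8)):
    """Map a pixel position (x, y) onto a (row, col) grid cell."""
    h, w = img_shape
    col = min(int(pos[0] / w * grid[1]), grid[1] - 1)
    row = min(int(pos[1] / h * grid[0]), grid[0] - 1)
    return row, col

def detect_misalignment(ref_objects, cur_objects, img_shape,
                        grid=(8, 8), threshold_px=5.0):
    """Compare positions of common reference objects between the
    pre-built panorama (ref_objects) and the current camera image
    (cur_objects), both given as {label: (x, y)} centroids from an
    upstream per-grid object detector. Misalignment is flagged when
    a common object moved farther than threshold_px or changed cell."""
    displacements = {}
    misaligned = False
    for label, ref_pos in ref_objects.items():
        if label not in cur_objects:
            continue  # object not detected in the current frame
        cur_pos = cur_objects[label]
        d = float(np.hypot(cur_pos[0] - ref_pos[0],
                           cur_pos[1] - ref_pos[1]))
        displacements[label] = d
        if d > threshold_px or (grid_cell(ref_pos, img_shape, grid)
                                != grid_cell(cur_pos, img_shape, grid)):
            misaligned = True
    return misaligned, displacements
```

Restricting the comparison to fixed, immobile intersection objects (claim 4), e.g. traffic lights or signs, is what makes a per-object displacement attributable to sensor drift rather than scene motion.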

Description

The present invention relates to sensor calibration, and more specifically, to a method and apparatus for detecting and automatically correcting, in real time, camera and LiDAR sensor calibration misalignment caused by external environmental factors.

In systems using multiple LiDAR sensors, accurate calibration of the point cloud data collected by each sensor is essential. If the position or orientation of a sensor shifts slightly due to changes in the external environment or physical impact, the resulting calibration error can degrade the performance of the entire system and undermine data reliability. Sensor data accuracy is particularly critical in fields such as autonomous vehicles, smart city infrastructure, and high-precision mapping. Existing calibration methods are performed mainly in pre-set, fixed environments and cannot respond immediately to changing conditions. A technology is therefore needed that can calibrate data from multiple LiDAR sensors in real time and automatically correct misalignment.

FIG. 1 illustrates a camera/LiDAR automatic calibration system according to an embodiment of the present invention. FIG. 2 illustrates a method for detecting and correcting camera/LiDAR sensor calibration misalignment according to another embodiment of the present invention. FIG. 3 is a detailed flowchart of the initial value setting and camera/LiDAR data reception step (S210). FIG. 4 shows an example installation of a camera, a LiDAR sensor, and an edge PC in a vehicle intersection environment. FIG. 5 is an example of constructing a 360° panoramic image. FIG. 6 is a detailed flowchart of the image gridding and common object area search step (S220). FIG. 7 is an example of panoramic image gridding. FIG. 8 is an example of comparing object positions per grid cell between the panoramic image and a camera image. FIG. 9 is a detailed flowchart of the calibration misalignment detection and correction step. FIG. 10 is an example of the reference object segmentation extraction and calibration correction process. FIG. 11 is a detailed flowchart of the precision calibration step (S240) using the common area of the LiDAR and camera of a nearby installed enclosure. FIG. 12 is an example of performing precision calibration using nearby LiDAR sensor data.

The present invention will be described in more detail below with reference to the drawings.

An embodiment of the present invention presents a method and apparatus for detecting and automatically correcting real-time calibration misalignment based on a camera and a LiDAR sensor. When calibration is misaligned by the external environment in a vehicle intersection setting, the technology uses a pre-built 360° panoramic image and a 3D point cloud together with live sensor data to detect and correct the misalignment in real time and automatically.

FIG. 1 illustrates the configuration of a camera/LiDAR automatic calibration system according to an embodiment of the present invention. The system detects calibration misalignment in real time based on a camera and a LiDAR sensor and automatically corrects it based on the detection results. It comprises Edge PCs (110, 120, ...) installed at different locations in a given environment and a Server PC (200) that communicates with the Edge PCs (110, 120, ...) and manages their data in an integrated manner. The Edge PCs are connected to the same network as the Server PC (200) via Ethernet. Although FIG. 1 shows multiple Edge PCs (110, 120, ...), a single unit is also possible.

Each Edge PC comprises a data acquisition unit (101), a processing unit (102), and a communication unit (103). The data acquisition unit (101) is connected to the cameras and LiDAR sensors installed at different locations in the intersection environment and acquires their data in real time, over wired or wireless links depending on each sensor's communication method. The camera and LiDAR are installed within a single enclosure and are pre-calibrated. The processing unit (102) converts the sensor data packets acquired through the data acquisition unit (101) into per-unit data: each camera packet is converted into the image's pixel count, width, height, and per-pixel values, and each LiDAR packet into X, Y, Z point coordinates and distance in meters. The converted camera and LiDAR data are placed in a ROS2-based message, and the ROS2-based message is published. The communica
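The packet-to-unit conversion performed by the processing unit (102) can be sketched as below. This is an illustrative Python sketch under an assumed packet layout (consecutive little-endian float32 x, y, z triples in metres); real LiDAR sensors each define their own packet format, and the subsequent ROS2 publishing step (an rclpy publisher carrying the converted data) is omitted here.

```python
import struct
from dataclasses import dataclass

@dataclass
class LidarPoint:
    """One LiDAR return in per-unit form: X, Y, Z coordinates
    and metric distance, as produced by the processing unit (102)."""
    x: float
    y: float
    z: float
    distance_m: float

def unpack_lidar_packet(packet: bytes) -> list:
    """Convert a raw LiDAR data packet into per-point units.
    Assumes a hypothetical layout of consecutive little-endian
    float32 (x, y, z) triples in metres."""
    points = []
    for (x, y, z) in struct.iter_unpack("<3f", packet):
        d = (x * x + y * y + z * z) ** 0.5  # Euclidean range in metres
        points.append(LidarPoint(x, y, z, d))
    return points
```

In a ROS2 deployment the resulting list would typically be serialized into a `sensor_msgs/PointCloud2` message and published, but the exact message layout is a design choice outside this sketch.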