JP-2026074630-A - Calibration system, calibration device, calibration method, calibration program

JP 2026074630 A

Abstract

[Problem] To provide a system that accurately calibrates the relationship between an imaging camera and an observation radar while the vehicle is in motion. [Solution] The processor of the system for calibrating the relationship between the imaging camera and the observation radar in a host vehicle is configured to: acquire image data Dc of the camera field of view captured by the imaging camera, together with point cloud data Dr of the radar field of view, which overlaps the camera field of view, observed by the observation radar; monitor the attitude angle change amount Δψ of the host vehicle that appears when the peak point in the correlation distribution of vertical angle, horizontal angle, and Doppler velocity represented by the point cloud data Dr for each observation point shifts relative to the reference point of the point cloud data Dr; and construct a calibration parameter Cp, corrected according to the attitude angle change amount Δψ, as a parameter for calibrating the image data Dc to be used for fusion with the point cloud data Dr. [Selection Diagram] Figure 4

Inventors

  • 志水 聖

Assignees

  • DENSO Corporation (株式会社デンソー)

Dates

Publication Date
2026-05-07
Application Date
2024-10-21

Claims (10)

  1. A calibration system having a processor (12) for calibrating between an imaging camera (3) and an observation radar (4) in a host vehicle (2), wherein the processor is configured to: acquire image data (Dc) captured by the imaging camera and covering the camera field of view (Ac) of the imaging camera, together with point cloud data (Dr) observed by the observation radar and covering the radar field of view (Ar) of the observation radar that overlaps the camera field of view; monitor an attitude angle change amount (Δψ) of the host vehicle that appears when a peak point (Pp) in a correlation distribution (α) of the vertical angle (θv), the horizontal angle (θh), and the Doppler velocity (Vd) represented by the point cloud data for each observation point shifts relative to a reference point (Pb) of the point cloud data; and construct a calibration parameter (Cp), corrected according to the attitude angle change amount, as a parameter for calibrating the image data to be used for fusion with the point cloud data.
  2. The calibration system according to claim 1, wherein the monitoring of the attitude angle change amount includes monitoring, as the attitude angle change amount, an angular deviation appearing in at least one of the vertical angle and the horizontal angle of the peak point relative to the reference point.
  3. The calibration system according to claim 1, wherein the construction of the calibration parameter includes correcting, according to the attitude angle change amount, a rotation matrix component (R) of an external parameter (Co) defined for the imaging camera among the calibration parameters.
  4. The calibration system according to claim 1, wherein the monitoring of the attitude angle change amount includes estimating a current estimated value (Δψce) of the attitude angle change amount in a current frame from a currently monitored value (Δψcm) of the attitude angle change amount observed in the current frame and a past estimated value (Δψpe) of the attitude angle change amount estimated in a past frame, and the construction of the calibration parameter includes constructing the calibration parameter corrected according to the current estimated value.
  5. The calibration system according to any one of claims 1 to 4, wherein the processor is further configured to: generate fusion data (Df) by fusing the image data, calibrated according to the calibration parameter, with the point cloud data; and output recognition data (Do) by recognizing a target in the fusion data.
  6. The calibration system according to claim 5, wherein the generation of the fusion data includes generating the fusion data by fusing, with the point cloud data, the image data converted to a bird's-eye view according to the calibration parameter constructed from an external parameter (Co) and an internal parameter (Ci) defined for the imaging camera.
  7. The calibration system according to claim 6, wherein the generation of the fusion data includes generating the fusion data for each feature quantity by fusing, with the point cloud data, each of a plurality of feature data (Dis) extracted from the image data according to feature quantity (5) and converted to the bird's-eye view, and the output of the recognition data includes outputting the recognition data so as to represent the recognized target for each of the fusion data for each feature quantity.
  8. A calibration device having a processor (12) for calibrating between an imaging camera (3) and an observation radar (4) in a host vehicle (2), the calibration device being configured to be mounted on the host vehicle, wherein the processor is configured to: acquire image data (Dc) captured by the imaging camera and covering the camera field of view (Ac) of the imaging camera, together with point cloud data (Dr) observed by the observation radar and covering the radar field of view (Ar) of the observation radar that overlaps the camera field of view; monitor an attitude angle change amount (Δψ) of the host vehicle that appears when a peak point (Pp) in a correlation distribution (α) of the vertical angle (θv), the horizontal angle (θh), and the Doppler velocity (Vd) represented by the point cloud data for each observation point shifts relative to a reference point (Pb) of the point cloud data; and construct a calibration parameter (Cp), corrected according to the attitude angle change amount, as a parameter for calibrating the image data to be used for fusion with the point cloud data.
  9. A calibration method performed by a processor (12) to calibrate the relationship between an imaging camera (3) and an observation radar (4) in a host vehicle (2), the method comprising: acquiring image data (Dc) captured by the imaging camera and covering the camera field of view (Ac) of the imaging camera, together with point cloud data (Dr) observed by the observation radar and covering the radar field of view (Ar) of the observation radar that overlaps the camera field of view; monitoring an attitude angle change amount (Δψ) of the host vehicle that appears when a peak point (Pp) in a correlation distribution (α) of the vertical angle (θv), the horizontal angle (θh), and the Doppler velocity (Vd) represented by the point cloud data for each observation point shifts relative to a reference point (Pb) of the point cloud data; and constructing a calibration parameter (Cp), corrected according to the attitude angle change amount, as a parameter for calibrating the image data to be used for fusion with the point cloud data.
  10. A calibration program stored in a storage medium (10) for calibrating between an imaging camera (3) and an observation radar (4) in a host vehicle (2), the program including instructions for causing a processor (12) to: acquire image data (Dc) captured by the imaging camera and covering the camera field of view (Ac) of the imaging camera, together with point cloud data (Dr) observed by the observation radar and covering the radar field of view (Ar) of the observation radar that overlaps the camera field of view; monitor an attitude angle change amount (Δψ) of the host vehicle that appears when a peak point (Pp) in a correlation distribution (α) of the vertical angle (θv), the horizontal angle (θh), and the Doppler velocity (Vd) represented by the point cloud data for each observation point shifts relative to a reference point (Pb) of the point cloud data; and construct a calibration parameter (Cp), corrected according to the attitude angle change amount, as a parameter for calibrating the image data to be used for fusion with the point cloud data.
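The monitoring step in the claims above can be sketched in code. The following is a minimal, hypothetical illustration, not the patented implementation: for a host vehicle moving at speed V, a stationary observation point at horizontal angle θh and vertical angle θv returns Doppler velocity Vd ≈ -V·cos(θh - ψh)·cos(θv - ψv), where (ψh, ψv) is the direction of travel in radar coordinates. Grid-searching the (ψh, ψv) that best explains the point cloud locates the peak point of the correlation distribution, and its shift from a reference point yields the attitude angle change amount, which can then be smoothed across frames in the spirit of claim 4. All function names, the grid-search approach, and the exponential filter are assumptions for illustration.

```python
import numpy as np

def estimate_attitude_shift(theta_v, theta_h, vd, ref=(0.0, 0.0)):
    """Locate the peak of the (theta_v, theta_h, Doppler) correlation
    distribution by grid search and return its angular shift from the
    reference point (hypothetical sketch of the claim 1/2 monitoring)."""
    grid = np.radians(np.arange(-5.0, 5.01, 0.25))  # candidate directions [rad]
    best, best_err = (0.0, 0.0), np.inf
    for ph in grid:            # candidate horizontal travel direction
        for pv in grid:        # candidate vertical travel direction
            model = -np.cos(theta_h - ph) * np.cos(theta_v - pv)
            speed = (vd @ model) / (model @ model)   # least-squares ego speed
            err = np.sum((vd - speed * model) ** 2)  # Doppler-model residual
            if err < best_err:
                best_err, best = err, (ph, pv)
    # angular deviation of the peak point relative to the reference point
    return best[0] - ref[0], best[1] - ref[1]

def smooth_shift(current_monitored, past_estimate, gain=0.3):
    """Blend the currently monitored value with the past estimate to obtain
    the current estimate (one simple reading of claim 4: exponential smoothing)."""
    return (1.0 - gain) * past_estimate + gain * current_monitored
```

With noiseless synthetic data the grid search recovers the travel direction to grid resolution; a real system would additionally have to reject moving targets before fitting this stationary-world Doppler model.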

Description

This disclosure relates to a calibration technique for calibrating between an imaging camera and an observation radar in a vehicle.

The technology disclosed in Patent Document 1 achieves calibration between an imaging camera and an observation radar in a vehicle using a dedicated calibration plate.

[Patent Document 1] European Patent Application Publication No. 4283328

Figure 1 is a block diagram showing the overall configuration of the calibration system according to the first embodiment. Figure 2 is a block diagram illustrating the functional configuration of the calibration system according to the first embodiment. Figure 3 is a bird's-eye view showing the respective fields of view of the imaging camera and the observation radar according to the first embodiment. Figure 4 is a flowchart showing the calibration flow of the first embodiment. Figure 5 is a graph illustrating the calibration flow of the first embodiment. Figure 6 is a schematic diagram illustrating the calibration flow of the first embodiment. Figure 7 is a flowchart showing the calibration flow of the second embodiment.

The following describes several embodiments of this disclosure with reference to the drawings. Note that in each embodiment, corresponding components are denoted by the same reference numerals, and redundant explanations may be omitted. Furthermore, if only a portion of a configuration is described in an embodiment, the configurations of other embodiments described earlier can be applied to the remaining parts of that configuration. Moreover, beyond the combinations explicitly stated in the description of the embodiments, configurations from multiple embodiments can also be partially combined, even if not explicitly stated, as long as the combination poses no particular problem.

(First Embodiment) The calibration system 1 of the first embodiment, shown in Figures 1 and 2, calibrates the relationship between the imaging camera 3 and the observation radar 4 in the host vehicle 2.
The host vehicle 2 can be regarded as the ego-vehicle from a viewpoint centered on that vehicle. The host vehicle 2 is a mobile object, such as an automobile, capable of traveling on a road with an occupant on board. The directions in the following description are defined with reference to the host vehicle 2 on the horizontal plane. Figures 1 and 2 show a representative example in which the calibration system 1 is mounted entirely on the host vehicle 2, implemented in the form of a calibration device such as a processing circuit (e.g., a processing ECU) or a semiconductor unit (e.g., a semiconductor chip). The host vehicle 2 provides an automated driving mode, categorized into levels according to the degree of manual intervention by the occupant in dynamic driving tasks. The automated driving mode may be implemented through autonomous driving control, such as conditional driving automation, advanced driving automation, or full driving automation, in which the system performs all dynamic driving tasks during operation. Alternatively, the automated driving mode may be implemented through advanced driver assistance control, such as driver assistance or partial driving automation, in which the occupant performs some or all dynamic driving tasks. The automated driving mode may also be implemented by either of these controls alone, by a combination of them, or by switching between them. The host vehicle 2 is equipped with at least one pair consisting of an imaging camera 3 and an observation radar 4, which are subject to mutual calibration. The pair of the imaging camera 3 and the observation radar 4 is arranged on the host vehicle 2 so that their respective fields of view Ac and Ar, shown in Figure 3, overlap.
In particular, in the horizontal view of Figure 3, the fields of view Ac and Ar overlap such that the camera field of view Ac of the imaging camera 3 falls within the radar field of view Ar of the observation radar 4. For ease of understanding, the following explanation focuses on the pair of the imaging camera 3 (forward camera 30) and the observation radar 4 (forward radar 40) shown in Figures 1 to 3 as a representative example. The imaging camera 3 comprises an image sensor unit 300 and an imaging circuit unit 302. The image sensor unit 300 is primarily composed of semiconductor elements, such as CMOS elements, with multiple pixels arranged in a two-dimensional array. The image sensor unit 300 captures, pixel by pixel, light images received from targets within the camera field of view Ac. The imaging circuit unit 302 is a semiconductor chip, such as an image processing circuit, that processes the imaging signals from each pixel of the image sensor unit 300. The imaging circuit unit 302 outputs image data Dc by converting the brightness value of each pixel, corresponding to the intensity of light received from targets within the camera field of view Ac, into two-dimensional data. The
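As a companion sketch to claim 3: image data is typically related to 3-D points through camera parameters of the pinhole form u ~ K [R | t] X, where K is the internal parameter and [R | t] the external parameter, and the rotation matrix component R can be corrected by the monitored attitude angle change before projection. The axis convention (camera y-axis taken as the vehicle's vertical axis) and all names below are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def yaw_correction(delta_psi):
    """Rotation by delta_psi about the camera y-axis (assumed here to be
    the vertical axis of the vehicle; an illustrative convention)."""
    c, s = np.cos(delta_psi), np.sin(delta_psi)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def project(point_radar, K, R, t, delta_psi=0.0):
    """Project a 3-D radar point into the image plane using extrinsics
    whose rotation component is corrected by the attitude angle change."""
    R_corr = yaw_correction(delta_psi) @ R   # claim-3 style correction of R
    p_cam = R_corr @ point_radar + t         # radar frame -> camera frame
    uvw = K @ p_cam                          # pinhole projection
    return uvw[:2] / uvw[2]                  # pixel coordinates (u, v)
```

For example, with identity extrinsics a point 10 m straight ahead lands at the principal point, and a small positive yaw correction shifts it horizontally without moving it vertically, which is the behavior an image/point-cloud fusion stage would rely on.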