CN-121994502-A - System and method for vehicle sensor calibration in a vehicle assembly environment

CN 121994502 A

Abstract

The present disclosure provides "systems and methods for vehicle sensor calibration in a vehicle assembly environment". A method includes receiving detection data and inertial measurement unit (IMU) data from a vehicle as the vehicle traverses a test lane, wherein the detection data corresponds to lidar detection, camera detection, and radar detection by one or more sensors of the vehicle, and the IMU data corresponds to one or more motion constraints determined from the one or more sensors and the IMU and one or more constraints determined between the IMU and the vehicle. A plurality of targets are arranged relative to the test lane and have a plurality of calibration elements configured to allow detection by the one or more sensors of the vehicle. The method further includes processing the received detection data and IMU data using a calibration algorithm, and generating one or more calibration parameters for the one or more sensors of the vehicle based on the processing.

Inventors

  • P. Kothari
  • E. Murphy
  • K. Yusef
  • L. Renjin
  • Q. Zhou
  • S. Williams

Assignees

  • Ford Global Technologies, LLC

Dates

Publication Date
2026-05-08
Application Date
2025-10-27
Priority Date
2024-11-01

Claims (15)

  1. A system, comprising: a plurality of targets arranged relative to a test lane, wherein the plurality of targets are configured as multi-sensor targets, wherein each target of the plurality of targets has a plurality of calibration elements configured to allow detection by one or more sensors of a vehicle, the plurality of calibration elements including one or more elements configured for lidar detection, one or more elements configured for camera detection, and one or more elements configured for radar detection, wherein the test lane is configured to allow the vehicle to traverse the test lane between the plurality of targets to detect the plurality of calibration elements, and wherein the test lane is configured to allow one or more motion constraints to be determined from the one or more sensors and an inertial measurement unit (IMU) and one or more constraints to be determined between the IMU and the vehicle to generate IMU data; and a calibration system configured to: receive detection data and the IMU data from the vehicle as the vehicle traverses the test lane, the detection data corresponding to lidar detection by the one or more sensors of the vehicle, camera detection by the one or more sensors of the vehicle, and radar detection by the one or more sensors of the vehicle; process the received detection data and the IMU data using a calibration algorithm; and generate one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
  2. The system of claim 1, wherein the calibration system is further configured to generate an effective L3 calibration of the one or more sensors of the vehicle using the one or more calibration parameters, thereby allowing L3 autonomous operation of the vehicle.
  3. The system of claim 2, wherein a spacing between one or more targets of the plurality of targets is different from a spacing between one or more other targets of the plurality of targets, and an orientation of one or more targets of the plurality of targets is different from an orientation of one or more other targets of the plurality of targets, the spacing and orientation being configured to allow data to be collected by the one or more sensors of the vehicle to allow the effective L3 calibration.
  4. The system of claim 3, wherein one or more targets of the plurality of targets are positioned along a first side of the test lane and one or more targets of the plurality of targets are positioned along a second side of the test lane, wherein the first side of the test lane is opposite the second side of the test lane relative to the vehicle traversing between the plurality of targets.
  5. The system of claim 1, wherein the plurality of targets are double-sided, having a first side and a second side opposite the first side, wherein a first set of calibration elements on the first side is different from a second set of calibration elements on the second side, and wherein the first set of calibration elements is configured for detection by one or more forward-facing sensors of the vehicle and the second set of calibration elements is configured for detection by one or more rearward-facing sensors of the vehicle.
  6. The system of claim 1, wherein the one or more elements configured for lidar detection comprise one or more cutouts, the one or more elements configured for camera detection comprise one or more visual tags, and the one or more elements configured for radar detection comprise one or more radar targets configured as corner reflectors.
  7. The system of claim 6, wherein the one or more cutouts, the one or more visual tags, and the one or more radar targets are disposed on at least one surface of the plurality of targets to allow detection data to be collected by the one or more sensors of the vehicle for L3 calibration.
  8. The system of claim 6, wherein the one or more cutouts are disposed at a central portion of the plurality of targets, the one or more visual tags are disposed along a top portion, the central portion, and a bottom portion of the plurality of targets, and the one or more radar targets are disposed at the bottom portion of the plurality of targets.
  9. The system of claim 6, wherein the one or more cutouts comprise four cutouts arranged in a symmetrical configuration forming a uniformly spaced square configuration at a central portion of the plurality of targets; the one or more visual tags comprise sixteen visual tags configured as visual fiducial markers, wherein six of the visual tags are arranged in a symmetrical rectangular configuration at a top portion of the plurality of targets, four of the visual tags are arranged in a diamond configuration at the central portion of the plurality of targets with a visual tag in a center of the diamond configuration, and five of the visual tags are arranged in a U-shaped configuration at a bottom portion of the plurality of targets, each visual tag having a different visual fiducial pattern than the other visual tags; and the one or more radar targets comprise one radar target positioned within the U-shaped configuration of visual tags at the bottom portion of the plurality of targets.
  10. The system of claim 1, wherein one or more targets of the plurality of targets are movable targets relative to the test lane and one or more targets of the plurality of targets are fixed targets that are not movable relative to the test lane.
  11. The system of claim 1, wherein the calibration system is further configured to process the received detection data using optimization to estimate a plurality of parameters including a plurality of vehicle states at each of a plurality of keyframe timestamps, a plurality of poses of the plurality of targets, and a plurality of extrinsic calibrations.
  12. The system of claim 11, wherein a plurality of constraints are used in the optimization, wherein the plurality of constraints include a sensor factor.
  13. The system of claim 1, wherein the calibration algorithm is configured to solve a nonlinear least squares optimization problem using a plurality of cost functions related to 3D-2D reprojection errors from camera target detection, errors from lidar and radar target detection, 6D-6D errors from sensor odometry, vehicle kinematic models, and a priori information in the form of CAD values.
  14. The system of claim 1, wherein the calibration system is further configured to receive detection data from the vehicle when the vehicle is stopped at a plurality of points along the test lane, and wherein a location of each of the plurality of points and a time period for which the vehicle is stopped at each of the plurality of points are defined.
  15. A method, comprising: receiving detection data from a vehicle as the vehicle traverses a test lane, the detection data corresponding to lidar detection by one or more sensors of the vehicle, camera detection by the one or more sensors of the vehicle, and radar detection by the one or more sensors of the vehicle, wherein a plurality of targets are arranged relative to the test lane, wherein the plurality of targets are configured as multi-sensor targets, wherein each target of the plurality of targets has a plurality of calibration elements configured to allow detection by the one or more sensors of the vehicle, and wherein the test lane is configured to allow determination of one or more motion constraints from the one or more sensors and an inertial measurement unit (IMU) and determination of one or more constraints between the IMU and the vehicle to generate IMU data; processing, by a processor, the received detection data and IMU data using a calibration algorithm; and generating, by the processor, one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
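Claims 11 and 15 describe estimating sensor extrinsics from target detections gathered as the vehicle traverses the test lane. The following is an illustrative sketch only, not the patented algorithm: it reduces the problem to a 2D toy in which target positions, vehicle keyframe poses, and the sensor's true extrinsic offset are all invented values, and recovers the extrinsic (x, y, yaw of the sensor in the vehicle frame) by nonlinear least squares over noiseless synthetic observations.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical target positions along a test lane (world frame, meters).
TARGETS = np.array([[5.0, 2.0], [10.0, -2.0], [15.0, 2.5], [20.0, -1.5]])

# Known vehicle poses at keyframe timestamps: (x, y, yaw).
VEHICLE_POSES = [(0.0, 0.0, 0.0), (4.0, 0.1, 0.02), (8.0, 0.0, -0.01)]

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def observe(extrinsic, pose, target):
    """Target position in the sensor frame, given the sensor-to-vehicle extrinsic."""
    ex, ey, eyaw = extrinsic
    vx, vy, vyaw = pose
    p_veh = rot(vyaw).T @ (target - np.array([vx, vy]))   # world -> vehicle frame
    return rot(eyaw).T @ (p_veh - np.array([ex, ey]))     # vehicle -> sensor frame

# Simulate detections with an (unknown to the solver) ground-truth extrinsic.
TRUE_EXTRINSIC = np.array([1.2, 0.3, 0.05])
OBS = [observe(TRUE_EXTRINSIC, p, t) for p in VEHICLE_POSES for t in TARGETS]

def residuals(extrinsic):
    # Stacked (predicted - observed) target positions across all keyframes.
    pred = [observe(extrinsic, p, t) for p in VEHICLE_POSES for t in TARGETS]
    return (np.array(pred) - np.array(OBS)).ravel()

result = least_squares(residuals, x0=np.zeros(3))
print(result.x)  # ≈ [1.2, 0.3, 0.05], the true extrinsic
```

In the full system the state vector would additionally carry the vehicle states at each keyframe and the target poses themselves, as claim 11 recites, but the residual-stacking structure is the same.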

Description

System and method for vehicle sensor calibration in a vehicle assembly environment

Technical Field

The present disclosure relates to calibrating sensors of a vehicle. More specifically, the present disclosure relates to calibrating vehicle sensors of a vehicle having autonomous operating capabilities.

Background

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art. A vehicle manufacturing environment may include one or more end-of-line (EOL) testing stations configured to calibrate and verify the functionality of various components of a vehicle. As an example, after the vehicle is assembled, a sensor calibration EOL station may calibrate one or more sensors of the vehicle. For example, a camera calibration EOL station may calibrate a forward-facing camera of the vehicle. However, such EOL calibration stations fail to calibrate all sensors of the vehicle to allow level 3 (L3) autonomous driving operation at the time of delivery of the vehicle. That is, in addition to factory calibration, additional calibration is performed after the vehicle leaves the final assembly plant in order to calibrate the sensors to allow L3 autonomous operation of the vehicle. The present disclosure addresses these and other issues related to factory calibration.

Disclosure of Invention

This section provides a general summary of the disclosure and is not a comprehensive disclosure of its full scope or all of its features.
The present disclosure provides a system comprising a plurality of targets arranged relative to a test lane, wherein the plurality of targets are configured as multi-sensor targets, wherein each of the plurality of targets has a plurality of calibration elements configured to allow detection by one or more sensors of a vehicle, the plurality of calibration elements including one or more elements configured for lidar detection, one or more elements configured for camera detection, and one or more elements configured for radar detection, and wherein the test lane is configured to allow the vehicle to traverse the test lane between the plurality of targets to detect the plurality of calibration elements. The system further comprises a calibration system configured to receive detection data from the vehicle as the vehicle traverses the test lane, the detection data corresponding to lidar detection by the one or more sensors of the vehicle, camera detection by the one or more sensors of the vehicle, and radar detection by the one or more sensors of the vehicle, to process the received detection data using a calibration algorithm, and to generate one or more calibration parameters for the one or more sensors of the vehicle based on the processing. The calibration system is further configured to generate an effective L3 calibration of the one or more sensors of the vehicle using the one or more calibration parameters, thereby allowing L3 autonomous operation of the vehicle. A spacing between one or more targets of the plurality of targets is different from a spacing between one or more other targets of the plurality of targets, and an orientation of one or more targets of the plurality of targets is different from an orientation of one or more other targets of the plurality of targets, the spacing and orientation being configured to allow data to be collected by the one or more sensors of the vehicle for the effective L3 calibration. One or more targets of the plurality of targets are positioned along a first side of the test lane and one or more targets of the plurality of targets are positioned along a second side of the test lane, the first side being opposite the second side relative to the vehicle traversing between the plurality of targets. The plurality of targets are double-sided, having a first side and a second side opposite the first side, wherein a first set of calibration elements on the first side is different from a second set of calibration elements on the second side, and wherein the first set of calibration elements is configured for detection by one or more forward-facing sensors of the vehicle and the second set of calibration elements is configured for detection by one or more rearward-facing sensors of the vehicle. The one or more elements configured for lidar detection comprise one or more cutouts, the one or more elements configured for camera detection comprise one or more visual tags, and the one or more elements configured for radar detection comprise one or more radar targets configured as corner reflectors. The one or more cutouts, the one or more visual tags, and the one or more radar targets are disposed on at least one surface of the plurality of targets to allow detection data to be collected by the one or more sensors of the vehicle for L3 calibration, wh
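Claim 13 recites a nonlinear least squares problem that combines heterogeneous cost terms, including 3D-2D camera reprojection errors and a priori information from CAD values. As a hedged illustration of how such terms can be stacked into one residual vector, the toy below estimates two invented camera parameters (a yaw angle and an x-translation) from synthetic fiducial observations under an assumed pinhole model, with a weak prior pulling the solution toward nominal CAD values; the intrinsics, point layout, weights, and parameterization are all assumptions for illustration, not values from the patent.

```python
import numpy as np
from scipy.optimize import least_squares

# Assumed pinhole intrinsics (illustrative values).
FX = FY = 800.0
CX, CY = 640.0, 360.0

def project(params, points_3d):
    """Project 3D target points into the image after a yaw rotation and x-translation."""
    yaw, tx = params
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])  # rotation about y-axis
    p = points_3d @ R.T + np.array([tx, 0.0, 0.0])
    u = FX * p[:, 0] / p[:, 2] + CX
    v = FY * p[:, 1] / p[:, 2] + CY
    return np.column_stack([u, v])

# Four fiducial corners on a hypothetical target 5 m ahead (meters).
POINTS = np.array([[-0.5, -0.5, 5.0], [0.5, -0.5, 5.0],
                   [0.5, 0.5, 5.0], [-0.5, 0.5, 5.0]])
TRUE = np.array([0.03, 0.15])          # ground-truth yaw (rad) and tx (m)
PIXELS = project(TRUE, POINTS)         # synthetic camera detections
CAD_PRIOR = np.array([0.0, 0.1])       # nominal values "from CAD"

def residuals(params):
    reproj = (project(params, POINTS) - PIXELS).ravel()  # 3D-2D reprojection error
    prior = 0.1 * (params - CAD_PRIOR)                   # weak prior toward CAD values
    return np.concatenate([reproj, prior])

sol = least_squares(residuals, x0=CAD_PRIOR)
```

Because the synthetic detections are noiseless and the prior weight is small, the solver recovers a result very close to the true parameters; in a real pipeline the lidar, radar, and odometry terms of claim 13 would be appended to the same residual vector with their own weights.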