US-20260125065-A1 - SYSTEMS AND METHODS FOR VEHICLE SENSOR CALIBRATION IN A VEHICLE ASSEMBLY ENVIRONMENT
Abstract
A method includes receiving detection data and Inertial Measurement Unit (IMU) data from a vehicle as the vehicle traverses a testing lane, wherein the detection data corresponds to lidar detections, camera detections, and radar detections by one or more sensors of the vehicle, and the IMU data corresponds to a determination of one or more motion constraints from the one or more sensors and an IMU and a determination of one or more constraints between the IMU and the vehicle. A plurality of targets are arranged with respect to the testing lane and have a plurality of calibration elements configured to allow detection by the one or more sensors of the vehicle. The method further includes processing the received detection data and IMU data using a calibration algorithm, and generating one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
Inventors
- Parul Kothari
- Elizabeth Murphy
- Khalid Yousif
- Larry Lenkin
- Quan Zhou
- Sean Williams
Assignees
- FORD GLOBAL TECHNOLOGIES, LLC
Dates
- Publication Date
- 2026-05-07
- Application Date
- 2024-11-01
Claims (20)
- 1 . A system comprising: a plurality of targets arranged with respect to a testing lane, wherein the plurality of targets are configured as multisensor targets with each target of the plurality of targets having a plurality of calibration elements configured to allow detection by one or more sensors of a vehicle, the plurality of calibration elements including one or more elements configured for lidar detection, one or more elements configured for camera detection, and one or more elements configured for radar detection, wherein the testing lane is configured to allow the vehicle to traverse the testing lane between the plurality of targets to detect the plurality of calibration elements, and wherein the testing lane is configured to allow determination of one or more motion constraints from the one or more sensors and an Inertial Measurement Unit (IMU) and determination of one or more constraints between the IMU and the vehicle to generate IMU data; and a calibration system configured to: receive detection data and the IMU data from the vehicle as the vehicle traverses the testing lane, the detection data corresponding to lidar detections by the one or more sensors of the vehicle, camera detections by the one or more sensors of the vehicle, and radar detections by the one or more sensors of the vehicle; process the received detection data and the IMU data using a calibration algorithm; and generate one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
- 2 . The system of claim 1 , wherein the calibration system is further configured to use the one or more calibration parameters to generate a valid L3 calibration of the one or more sensors of the vehicle, thereby allowing L3 autonomous operation of the vehicle.
- 3 . The system of claim 2 , wherein a spacing between one or more targets of the plurality of targets is different than a spacing between one or more other targets of the plurality of targets, and an orientation of one or more of the targets of the plurality of targets is different than an orientation of one or more other targets of the plurality of targets, the spacing and orientation configured to allow collection of data by the one or more sensors of the vehicle to allow for the valid L3 calibration.
- 4 . The system of claim 3 , wherein one or more targets of the plurality of targets are positioned along a first side of the testing lane and one or more targets of the plurality of targets are positioned on a second side of the testing lane, wherein the first side of the testing lane is on an opposite side to the second side of the testing lane relative to the vehicle traversing between the plurality of targets.
- 5 . The system of claim 1 , wherein the plurality of targets are double sided having a first side and a second side opposite the first side, wherein a first set of calibration elements on the first side is different than a second set of calibration elements on the second side, and wherein the first set of calibration elements are configured for detection by one or more sensors of the vehicle that are forward facing and the second set of calibration elements are configured for detection by one or more sensors of the vehicle that are rearward facing.
- 6 . The system of claim 1 , wherein the one or more elements configured for lidar detection comprise one or more cutouts, the one or more elements configured for camera detection comprise one or more visual tags, and the one or more elements configured for radar detection comprise one or more radar targets as corner reflectors.
- 7 . The system of claim 6 , wherein the one or more cutouts, the one or more visual tags, and the one or more radar targets are arranged on at least one surface of the plurality of targets to allow for collection of detection data by the one or more sensors of the vehicle for L3 calibration.
- 8 . The system of claim 6 , wherein the one or more cutouts are arranged at a center portion of the plurality of targets, the one or more visual tags are arranged along a top portion, the center portion, and a bottom portion of the plurality of targets, and the one or more radar targets are arranged at a bottom portion of the plurality of targets.
- 9 . The system of claim 6 , wherein the one or more cutouts comprise four cutouts arranged in a symmetrical configuration forming an evenly spaced square configuration at a center portion of the plurality of targets, the one or more visual tags comprise sixteen visual tags configured as visual fiducial markers with six of the visual tags arranged in a symmetrical rectangular configuration at a top portion of the plurality of targets, four of the visual tags arranged in a diamond configuration at the center portion of the plurality of targets with a visual tag in a center of the diamond configuration, and five of the visual tags arranged in a U-shaped configuration at a bottom portion of the plurality of targets, each visual tag having a different visual fiducial pattern than other visual tags, and the one or more radar targets comprise one radar target positioned at the bottom portion of the plurality of targets within the U-shaped configuration of the visual tags.
- 10 . The system of claim 1 , wherein one or more targets of the plurality of targets are movable targets relative to the testing lane and one or more targets of the plurality of targets are fixed targets and non-movable with respect to the testing lane.
- 11 . The system of claim 1 , wherein the calibration system is further configured to process the received detection data using an optimization to estimate a plurality of parameters including a plurality of vehicle states at each of a plurality of keyframe timestamps, a plurality of poses of the plurality of targets, and a plurality of extrinsic calibrations.
- 12 . The system of claim 11 , wherein a plurality of constraints are used in the optimization, wherein the plurality of constraints include sensor factors.
- 13 . The system of claim 1 , wherein the calibration algorithm is configured to solve a non-linear least squares optimization problem using a plurality of cost functions related to 3D-2D reprojection errors from camera target detection, errors from lidar and radar target detection, 6D-6D errors from sensor odometries, a vehicle kinematic model, and prior information as CAD values.
- 14 . The system of claim 1 , wherein the calibration system is further configured to receive detection data from the vehicle as the vehicle is stopped at a plurality of points along the testing lane, and wherein a location of each of the points and a time period that the vehicle stops at each point of the plurality of points is defined.
- 15 . A method comprising: receiving detection data from a vehicle as the vehicle traverses a testing lane, the detection data corresponding to lidar detections by one or more sensors of the vehicle, camera detections by one or more sensors of the vehicle, and radar detections by one or more sensors of the vehicle, wherein a plurality of targets are arranged with respect to the testing lane, wherein the plurality of targets are configured as multisensor targets with each target of the plurality of targets having a plurality of calibration elements configured to allow detection by the one or more sensors of the vehicle, and wherein the testing lane is configured to allow determination of one or more motion constraints from the one or more sensors and an Inertial Measurement Unit (IMU) and determination of one or more constraints between the IMU and the vehicle to generate IMU data; processing, by a processor, the received detection data and IMU data using a calibration algorithm; and generating, by the processor, one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
- 16 . The method of claim 15 , wherein the plurality of calibration elements include one or more elements configured for lidar detection, one or more elements configured for camera detection, and one or more elements configured for radar detection, wherein the testing lane is configured to allow a vehicle to traverse the testing lane between the plurality of targets to detect the plurality of calibration elements.
- 17 . The method of claim 15 , further comprising using the one or more calibration parameters to generate a valid L3 calibration of the one or more sensors of the vehicle, thereby allowing L3 autonomous operation of the vehicle, wherein a spacing between one or more targets of the plurality of targets is different than a spacing between one or more other targets of the plurality of targets, and an orientation of one or more of the targets of the plurality of targets is different than an orientation of one or more other targets of the plurality of targets, the spacing and orientation configured to allow collection of data by the one or more sensors of the vehicle to allow for the valid L3 calibration.
- 18 . The method of claim 17 , wherein the plurality of targets are double sided having a first side and a second side opposite to the first side, wherein a first set of calibration elements on the first side is different than a second set of calibration elements on the second side, and wherein the first set of calibration elements are configured for detection by one or more sensors of the vehicle that are forward facing and the second set of calibration elements are configured for detection by one or more sensors of the vehicle that are rearward facing.
- 19 . The method of claim 15 , further comprising processing the received detection data using an optimization to estimate a plurality of parameters including a plurality of vehicle states at each of a plurality of keyframe timestamps, a plurality of poses of the plurality of targets, and a plurality of extrinsic calibrations, wherein a plurality of constraints are used in the optimization, wherein the plurality of constraints include sensor factors.
- 20 . One or more non-transitory computer-readable media storing processor-executable instructions that, when executed by at least one processor, cause the at least one processor to: receive detection data from a vehicle as the vehicle traverses a testing lane, the detection data corresponding to lidar detections by one or more sensors of the vehicle, camera detections by one or more sensors of the vehicle, and radar detections by one or more sensors of the vehicle, wherein a plurality of targets are arranged with respect to the testing lane, and wherein the plurality of targets are configured as multisensor targets with each target of the plurality of targets having a plurality of calibration elements configured to allow detection by the one or more sensors of the vehicle, and wherein the testing lane is configured to allow determination of one or more motion constraints from the one or more sensors and an Inertial Measurement Unit (IMU) and determination of one or more constraints between the IMU and the vehicle to generate IMU data; process the received detection data and IMU data using a calibration algorithm; and generate one or more calibration parameters for the one or more sensors of the vehicle based on the processing.
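The calibration algorithm recited in claims 11-13 frames sensor calibration as a non-linear least squares problem over vehicle states, target poses, and sensor extrinsic calibrations. As an illustrative sketch only, not the claimed implementation, the following pure-Python example estimates a single 2-D sensor extrinsic (mounting offset and yaw in the vehicle frame) from noiseless synthetic target detections collected along a testing lane, using Gauss-Newton with a numeric Jacobian. All function names, the target layout, and the ground-truth extrinsic are hypothetical values chosen for the demonstration.

```python
import math

def rot(a):
    """2x2 rotation matrix for angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return ((c, -s), (s, c))

def apply(R, p):
    return (R[0][0] * p[0] + R[0][1] * p[1],
            R[1][0] * p[0] + R[1][1] * p[1])

def predict(extrinsic, vehicle_pose, target_world):
    """Predict a target's 2-D position in the sensor frame."""
    tx, ty, syaw = extrinsic            # sensor offset and yaw in vehicle frame
    vx, vy, vh = vehicle_pose           # vehicle position and heading in lane frame
    ox, oy = apply(rot(vh), (tx, ty))   # sensor origin offset, rotated to world
    Rws = rot(-(vh + syaw))             # world -> sensor rotation
    return apply(Rws, (target_world[0] - vx - ox,
                       target_world[1] - vy - oy))

def residuals(x, obs):
    """Stacked prediction-minus-measurement residuals over all detections."""
    r = []
    for pose, target, meas in obs:
        px, py = predict(x, pose, target)
        r.extend([px - meas[0], py - meas[1]])
    return r

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [list(A[i]) + [b[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda k: abs(M[k][col]))
        M[col], M[piv] = M[piv], M[col]
        for row in range(col + 1, 3):
            f = M[row][col] / M[col][col]
            for c in range(col, 4):
                M[row][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):
        x[row] = (M[row][3] - sum(M[row][c] * x[c] for c in range(row + 1, 3))) / M[row][row]
    return x

def gauss_newton(obs, x0=(0.0, 0.0, 0.0), iters=15, h=1e-6):
    """Non-linear least squares via Gauss-Newton, forward-difference Jacobian."""
    x = list(x0)
    for _ in range(iters):
        r = residuals(x, obs)
        J = []
        for i in range(3):
            xp = list(x)
            xp[i] += h
            J.append([(a - b) / h for a, b in zip(residuals(xp, obs), r)])
        JtJ = [[sum(J[i][k] * J[j][k] for k in range(len(r))) for j in range(3)]
               for i in range(3)]
        Jtr = [sum(J[i][k] * r[k] for k in range(len(r))) for i in range(3)]
        dx = solve3(JtJ, [-g for g in Jtr])
        x = [a + b for a, b in zip(x, dx)]
    return x

# Synthetic run: targets on both sides of the lane, vehicle poses along it.
true_ext = (0.5, -0.2, 0.05)   # hypothetical ground-truth extrinsic
targets = [(5.0, 3.0), (8.0, -3.0), (12.0, 3.0), (15.0, -3.0)]
poses = [(d, 0.0, 0.01 * d) for d in (0.0, 2.0, 4.0, 6.0)]
obs = [(p, t, predict(true_ext, p, t)) for p in poses for t in targets]
est = gauss_newton(obs)
```

On noiseless synthetic data the recovered extrinsic matches the ground truth to numerical precision; the full system in the claims additionally estimates vehicle states at keyframe timestamps and target poses, and adds lidar, radar, odometry, kinematic-model, and CAD-prior cost terms to the same optimization.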
Description
FIELD

The present disclosure relates to calibrating sensors of a vehicle. More specifically, the present disclosure relates to calibrating vehicle sensors of a vehicle having autonomous operation capabilities.

BACKGROUND

The statements in this section merely provide background information related to the present disclosure and may not constitute prior art. A vehicle manufacturing environment may include one or more end-of-line (EOL) testing stations that are configured to calibrate and verify the functionality of various components of a vehicle. As an example, after the vehicle is assembled, a sensor calibration EOL station may calibrate one or more sensors of the vehicle. For example, a camera calibration EOL station may calibrate a forward-facing camera of the vehicle. However, the calibration EOL stations are unable to calibrate all sensors of the vehicle to allow for Level 3 (L3) autonomous driving operation upon delivery of the vehicle. That is, in addition to factory calibration, additional calibration is performed after the vehicle leaves the final assembly plant in order to calibrate the sensors to allow for L3 vehicle autonomy. These issues with factory calibration, among other issues, are addressed by the present disclosure.

SUMMARY

This section provides a general summary of the disclosure and is not a comprehensive disclosure of its full scope or all of its features.
The present disclosure provides a system comprising: a plurality of targets arranged with respect to a testing lane, wherein the plurality of targets are configured as multisensor targets with each target of the plurality of targets having a plurality of calibration elements configured to allow detection by one or more sensors of a vehicle, the plurality of calibration elements including one or more elements configured for lidar detection, one or more elements configured for camera detection, and one or more elements configured for radar detection, wherein the testing lane is configured to allow the vehicle to traverse the testing lane between the plurality of targets to detect the plurality of calibration elements; and a calibration system configured to: receive detection data from the vehicle as the vehicle traverses the testing lane, the detection data corresponding to lidar detections by the one or more sensors of the vehicle, camera detections by the one or more sensors of the vehicle, and radar detections by the one or more sensors of the vehicle; process the received detection data using a calibration algorithm; and generate one or more calibration parameters for the one or more sensors of the vehicle based on the processing; wherein the calibration system is further configured to use the one or more calibration parameters to generate a valid L3 calibration of the one or more sensors of the vehicle, thereby allowing L3 autonomous operation of the vehicle; wherein a spacing between one or more targets of the plurality of targets is different than a spacing between one or more other targets of the plurality of targets, and an orientation of one or more of the targets of the plurality of targets is different than an orientation of one or more other targets of the plurality of targets, the spacing and orientation configured to allow collection of data by the one or more sensors of the vehicle to allow for the valid L3 calibration; wherein one or more targets of the plurality of targets are positioned along a first side of the testing lane and one or more targets of the plurality of targets are positioned on a second side of the testing lane, wherein the first side of the testing lane is on an opposite side to the second side of the testing lane relative to the vehicle traversing between the plurality of targets; wherein the plurality of targets are double sided having a first side and a second side opposite the first side, wherein a first set of calibration elements on the first side is different than a second set of calibration elements on the second side, and wherein the first set of calibration elements are configured for detection by one or more sensors of the vehicle that are forward facing and the second set of calibration elements are configured for detection by one or more sensors of the vehicle that are rearward facing; wherein the one or more elements configured for lidar detection comprise one or more cutouts, the one or more elements configured for camera detection comprise one or more visual tags, and the one or more elements configured for radar detection comprise one or more radar targets as corner reflectors; wherein the one or more cutouts, the one or more visual tags, and the one or more radar targets are arranged on at least one surface of the plurality of targets to allow for collection of detection data by the one or more sensors of the vehicle for L3 calibration; wherein the one or more cutouts are arranged at a center portion of the plurality of targets, the one or more visual tags are arranged along a top portion