DE-112023006720-T5 - METHOD AND DEVICE FOR RESTORING POINT CLOUD DISTORTION EFFECTS, ELECTRONIC DEVICE AND STORAGE MEDIUM

DE112023006720T5

Abstract

The present disclosure provides a method for restoring point cloud distortion effects, a device for restoring point cloud distortion effects, an electronic device, and a storage medium, and relates to the technical field of autonomous driving. The method comprises: when the simulation environment for autonomous driving is a low frame rate simulation environment, controlling a laser radar to simultaneously emit a plurality of groups of lasers along different angles according to a horizontal angular resolution, and determining the plurality of groups of lasers captured in each simulation image; within each simulation image, determining the coordinates of the collision points corresponding to each laser using a beam tracing algorithm, correcting the coordinates of the collision points corresponding to each laser based on the current velocity vector of the main vehicle and the laser emission time difference corresponding to each laser, and thereby determining corrected coordinates of the collision points; and obtaining a point cloud with an added distortion effect from the corrected coordinates of the collision points corresponding to each laser. In the present disclosure, the motion characteristics of the main vehicle are used to add a distortion effect to the point cloud, thereby making the simulation environment for autonomous driving more realistic and improving the accuracy of subsequent calculations.
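The correction step summarized in the abstract (shifting each collision point back by the main vehicle's displacement during the laser's emission time offset) can be sketched as follows. This is a minimal illustration only; the function name, tuple representation, and units (metres, seconds) are assumptions, not the patented implementation.

```python
# Minimal sketch of the collision point correction described in the abstract.
# Assumption: points and velocities are 3D tuples in metres and metres/second;
# names and signature are illustrative, not taken from the patent.

def correct_collision_point(point, velocity, time_offset):
    """Shift a collision point back by velocity * time_offset, component-wise."""
    return tuple(p - v * time_offset for p, v in zip(point, velocity))

# Example: main vehicle moving 10 m/s along x, laser emitted 0.001 s
# after the simulation image was captured.
corrected = correct_collision_point((50.0, 2.0, 0.0), (10.0, 0.0, 0.0), 0.001)
```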

Inventors

  • Dalin HU
  • Zhenxing Yang
  • Qiang Yang

Assignees

  • Beijing Saimo Technology Co., Ltd.

Dates

Publication Date
2026-05-13
Application Date
2023-10-31
Priority Date
2023-07-25

Claims (14)

  1. A method for restoring point cloud distortion effects, wherein the method for restoring point cloud distortion effects comprises: acquiring a simulation frame rate corresponding to an autonomous driving simulation environment and assessing, based on the simulation frame rate, whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment, wherein the autonomous driving simulation environment includes a main vehicle equipped with a laser radar, the main vehicle being in a moving state; controlling the laser radar to rotate and emit a plurality of groups of lasers sequentially to obtain a point cloud for adding a distortion effect when the autonomous driving simulation environment is an ultra-high frame rate simulation environment; controlling the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to a horizontal angular resolution, and determining the plurality of groups of lasers captured in each simulation image, when the autonomous driving simulation environment is a low frame rate simulation environment; within each simulation image, determining the coordinates of the collision points corresponding to each laser in the simulation image using a beam tracing algorithm, correcting the coordinates of the collision points corresponding to each laser based on the current velocity vector of the main vehicle and the laser emission time difference corresponding to each laser, and thereby determining corrected coordinates of the collision points; and obtaining the point cloud for adding the distortion effect from the corrected coordinates of the collision points corresponding to each laser.
  2. The method for restoring point cloud distortion effects according to Claim 1, wherein assessing whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment is carried out by: acquiring a rotational frequency and a horizontal angular resolution corresponding to the laser radar; determining a laser emission time difference corresponding to the laser radar based on the rotational frequency and the horizontal angular resolution; calculating a simulation time difference based on the simulation frame rate; determining the autonomous driving simulation environment to be the ultra-high frame rate simulation environment if the laser emission time difference is greater than or equal to the simulation time difference; and determining the autonomous driving simulation environment to be the low frame rate simulation environment if the laser emission time difference is less than the simulation time difference.
  3. The method for restoring point cloud distortion effects according to Claim 2, wherein determining the laser emission time difference corresponding to the laser radar based on the rotational frequency and the horizontal angular resolution comprises: determining a total rotation time for one revolution of the laser radar based on the rotational frequency corresponding to the laser radar; determining a total number of lasers emitted in one revolution of the laser radar based on the horizontal angular resolution; and determining the ratio of the total rotation time to the total number of lasers emitted as the laser emission time difference.
  4. The method for restoring point cloud distortion effects according to any one of Claims 1 to 3, wherein, when the autonomous driving simulation environment is an ultra-high frame rate simulation environment, the point cloud for adding the distortion effect is obtained as follows: emitting the lasers sequentially based on the laser emission time difference; and performing the following processing for each emitted laser: extracting environmental information of the main vehicle and a position of the main vehicle recorded in the simulation image corresponding to the emitted laser; calculating beam tracing data generated by the emitted laser in the corresponding simulation image using the beam tracing algorithm, the environmental information of the main vehicle, and the position of the main vehicle, wherein the beam tracing data includes distance information and emission angle information between the main vehicle and the collision point in the environment; and performing a polar coordinate conversion based on the distance information and the emission angle information corresponding to each emitted laser to obtain the point cloud for adding the distortion effect over one rotation of the laser radar.
  5. The method for restoring point cloud distortion effects according to any one of Claims 1 to 4, wherein the corrected coordinates of the collision points corresponding to each laser are determined by: determining the product of the current velocity vector of the main vehicle and the laser emission time difference corresponding to the laser as a displacement vector of the main vehicle corresponding to the laser; and determining the difference between the coordinate of the collision point and the displacement vector of the main vehicle as the corrected coordinate of the collision point corresponding to the laser.
  6. The method for restoring point cloud distortion effects according to any one of Claims 1 to 5, wherein obtaining the point cloud for adding the distortion effect from the corrected coordinates of the collision points corresponding to each laser comprises: performing, for each laser, the polar coordinate conversion of the corrected coordinate of the collision point of the laser to obtain the corresponding distance information and emission angle information of the laser; and outputting the point cloud for adding the distortion effect using the distance information and emission angle information corresponding to each laser.
  7. A device for restoring point cloud distortion effects, wherein the device for restoring point cloud distortion effects comprises: a frame rate simulation environment evaluation module, wherein the frame rate simulation environment evaluation module is configured to: acquire a simulation frame rate corresponding to an autonomous driving simulation environment and assess, based on the simulation frame rate, whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment, wherein the autonomous driving simulation environment includes a main vehicle equipped with a laser radar, the main vehicle being in a moving state; a high frame rate distortion recovery module, wherein the high frame rate distortion recovery module is configured to: control the laser radar to rotate and emit a plurality of groups of lasers sequentially to obtain a point cloud for adding a distortion effect when the autonomous driving simulation environment is an ultra-high frame rate simulation environment; a first low frame rate distortion recovery module, wherein the first low frame rate distortion recovery module is configured to: control the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to a horizontal angular resolution and determine the plurality of groups of lasers captured in each simulation image when the autonomous driving simulation environment is a low frame rate simulation environment; a correction module, wherein the correction module is configured to: within each simulation image, determine the coordinates of the collision points corresponding to each laser in the simulation image using a beam tracing algorithm, correct the coordinates of the collision points corresponding to each laser based on the current velocity vector of the main vehicle and the laser emission time difference corresponding to each laser, and thereby determine corrected coordinates of the collision points; and a second low frame rate distortion recovery module, wherein the second low frame rate distortion recovery module is configured to: obtain the point cloud for adding the distortion effect from the corrected coordinates of the collision points corresponding to each laser.
  8. The device for restoring point cloud distortion effects according to Claim 7, wherein the frame rate simulation environment evaluation module is further configured such that assessing whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment is carried out by: acquiring a rotational frequency and a horizontal angular resolution corresponding to the laser radar; determining a laser emission time difference corresponding to the laser radar based on the rotational frequency and the horizontal angular resolution; calculating a simulation time difference based on the simulation frame rate; determining the autonomous driving simulation environment to be the ultra-high frame rate simulation environment if the laser emission time difference is greater than or equal to the simulation time difference; and determining the autonomous driving simulation environment to be the low frame rate simulation environment if the laser emission time difference is less than the simulation time difference.
  9. The device for restoring point cloud distortion effects according to Claim 8, wherein the frame rate simulation environment evaluation module is further configured such that determining the laser emission time difference corresponding to the laser radar based on the rotational frequency and the horizontal angular resolution includes: determining a total rotation time for one revolution of the laser radar based on the rotational frequency corresponding to the laser radar; determining a total number of lasers emitted in one revolution of the laser radar based on the horizontal angular resolution; and determining the ratio of the total rotation time to the total number of lasers emitted as the laser emission time difference.
  10. The device for restoring point cloud distortion effects according to any one of Claims 7 to 9, wherein the high frame rate distortion recovery module is further configured such that, when the autonomous driving simulation environment is an ultra-high frame rate simulation environment, the point cloud for adding the distortion effect is obtained as follows: emitting the lasers sequentially based on the laser emission time difference; and performing the following processing for each emitted laser: extracting environmental information of the main vehicle and a position of the main vehicle recorded in the simulation image corresponding to the emitted laser; calculating beam tracing data generated by the emitted laser in the corresponding simulation image using the beam tracing algorithm, the environmental information of the main vehicle, and the position of the main vehicle, wherein the beam tracing data includes distance information and emission angle information between the main vehicle and the collision point in the environment; and performing a polar coordinate conversion based on the distance information and the emission angle information corresponding to each emitted laser to obtain the point cloud for adding the distortion effect over one rotation of the laser radar.
  11. The device for restoring point cloud distortion effects according to any one of Claims 7 to 10, wherein the correction module is further configured such that the corrected coordinates of the collision points corresponding to each laser are determined by: determining the product of the current velocity vector of the main vehicle and the laser emission time difference corresponding to the laser as a displacement vector of the main vehicle corresponding to the laser; and determining the difference between the coordinate of the collision point and the displacement vector of the main vehicle as the corrected coordinate of the collision point corresponding to the laser.
  12. The device for restoring point cloud distortion effects according to any one of Claims 7 to 11, wherein the second low frame rate distortion recovery module is further configured such that obtaining the point cloud for adding the distortion effect from the corrected coordinates of the collision points corresponding to each laser includes: performing, for each laser, the polar coordinate conversion of the corrected coordinate of the collision point of the laser to obtain the corresponding distance information and emission angle information of the laser; and outputting the point cloud for adding the distortion effect using the distance information and emission angle information corresponding to each laser.
  13. An electronic device, comprising: a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; wherein, when the electronic device is running, the processor communicates with the memory via the bus; and wherein the machine-readable instructions, when executed by the processor, carry out the steps of the method for restoring point cloud distortion effects according to any one of Claims 1 to 6.
  14. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and wherein the computer program, when executed by a processor, carries out the steps of the method for restoring point cloud distortion effects according to any one of Claims 1 to 6.
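The timing logic of Claims 2 and 3 and the correction and conversion of Claims 5 and 6 can be sketched together as follows. This is a hedged illustration only: it assumes a mechanically rotating radar with a uniform horizontal angular resolution and a planar (2D) collision point for the polar conversion, and all function names are hypothetical rather than taken from the patent.

```python
import math

def laser_emission_time_difference(rotational_frequency_hz, horizontal_resolution_deg):
    """Claim 3 sketch: rotation time per revolution divided by emissions per revolution."""
    total_rotation_time = 1.0 / rotational_frequency_hz        # seconds per revolution
    lasers_per_revolution = 360.0 / horizontal_resolution_deg  # emissions per revolution
    return total_rotation_time / lasers_per_revolution

def is_ultra_high_frame_rate(simulation_frame_rate_hz, rotational_frequency_hz,
                             horizontal_resolution_deg):
    """Claim 2 sketch: ultra-high iff emission time difference >= simulation time difference."""
    emission_dt = laser_emission_time_difference(rotational_frequency_hz,
                                                 horizontal_resolution_deg)
    simulation_dt = 1.0 / simulation_frame_rate_hz
    return emission_dt >= simulation_dt

def corrected_polar(collision_point_xy, velocity_xy, emission_dt):
    """Claims 5-6 sketch: shift the point by -v * dt, then convert to (range, azimuth deg)."""
    x, y = (p - v * emission_dt for p, v in zip(collision_point_xy, velocity_xy))
    return math.hypot(x, y), math.degrees(math.atan2(y, x))

# A 10 Hz radar at 0.2 deg resolution emits every 0.1 s / 1800, roughly 55.6 microseconds;
# a 1 kHz simulation step (1 ms) is coarser than that, so this counts as low frame rate.
print(is_ultra_high_frame_rate(1000.0, 10.0, 0.2))
```

Under this sketch, a simulation stepped faster than one emission per step (for example 100 kHz with the same radar parameters) would instead be classified as ultra-high frame rate and handled by the sequential-emission path of Claim 4.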

Description

Cross-reference to related applications

The present disclosure claims priority from Chinese patent application No. 202310913118.2, entitled "Method and apparatus for restoring point cloud distortion effects, electronic device and storage medium", filed with the State Intellectual Property Administration of China on July 25, 2023, which is incorporated by reference in its entirety as part of the present disclosure.

Technical field

The present disclosure relates to the technical field of autonomous driving, in particular to a method and a device for restoring point cloud distortion effects, an electronic device, and a storage medium.

State of the art

The point cloud generated by laser radar plays a crucial role in perception within autonomous driving technology. Owing to the density of the point cloud it produces, laser radar has become one of the most important perception tools in the field of autonomous driving. When developing simulation software for autonomous driving, the laser radar point cloud must be simulated as real-world perception data to validate the control algorithm as well as the perception and fusion algorithms. Beam tracing is currently an important technique for generating laser radar point clouds and is frequently used for laser radar simulation. One drawback of point clouds generated with beam tracing is that they do not reproduce distortion phenomena: the beam tracing algorithm operates on a static image, whereas the real-world scene is dynamic and the laser pulse emission of a scanning laser radar is likewise dynamic. As a result, the point cloud generated by the beam tracing algorithm is too perfect to accurately reflect a real-world driving environment.
Description of the invention

In light of this, the present disclosure provides at least a method for restoring point cloud distortion effects, a device for restoring point cloud distortion effects, an electronic device, and a storage medium. In both high- and low-frequency simulations, the motion characteristics of the main vehicle are used to add a distortion effect to the point cloud, thereby making the simulation environment for autonomous driving more realistic and improving the accuracy of subsequent calculations.

The present disclosure mainly covers at least the following aspects.

Some embodiments of the present disclosure provide a method for restoring point cloud distortion effects, wherein the method may include the following: acquiring a simulation frame rate corresponding to an autonomous driving simulation environment and, based on the simulation frame rate, assessing whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment, wherein the autonomous driving simulation environment includes a main vehicle equipped with a laser radar, the main vehicle being in a moving state; controlling the laser radar to rotate and emit a plurality of groups of lasers sequentially to obtain a point cloud for adding a distortion effect if the autonomous driving simulation environment is an ultra-high frame rate simulation environment; controlling the laser radar to simultaneously emit a plurality of groups of lasers along different angles according to a horizontal angular resolution and determining the plurality of groups of lasers captured in each simulation image if the autonomous driving simulation environment is a low frame rate simulation environment; within each simulation image, determining the coordinates of the collision points corresponding to each laser in the simulation image using a beam tracing algorithm, correcting the coordinates of the collision points corresponding to each laser based on the current velocity vector of the main vehicle and the laser emission time difference corresponding to each laser, and thereby determining corrected coordinates of the collision points; and obtaining the point cloud for adding the distortion effect from the corrected coordinates of the collision points corresponding to each laser.

In a preferred embodiment, assessing whether the autonomous driving simulation environment is an ultra-high frame rate simulation environment may be accomplished by: acquiring a rotational frequency and a horizontal angular resolution corresponding to the laser radar; determining a laser emission time difference corresponding to the laser radar based on the rotational frequency and the horizontal angular resolution; calculating a simulation time difference based on the simulation frame rate; determining the autonomous driving simulation environment to be the ultra-high frame rate simulation environment if the laser emission time difference is greater than or equal to the simulation time difference; and determining the autonomous driving simulation environment to be the low frame rate simulation environment if the laser emission time difference is less than the simulation time difference. In