CN-114074667-B - Stationary object detection
Abstract
The present disclosure relates to stationary object detection, and in particular to a method of detecting stationary objects implemented by a control system of an autonomous vehicle, comprising: receiving sensor signal data comprising stationary detections and non-stationary detections from the ambient environment of the vehicle; determining at least one combination of stationary detections that satisfies one or more lateral-position selection criteria based on the lateral position of each stationary detection relative to the direction in which the vehicle is facing; determining at least one combination of stationary detections that satisfies one or more combination-regularity selection criteria based on the regularity of the position differences between pairs of stationary detections positioned sequentially in the combination along the direction in which the vehicle is facing; determining one or more combinations of stationary detections that satisfy both the lateral-position selection criteria and the combination-regularity selection criteria to be combinations of stationary detections corresponding to at least one stationary object; and removing the stationary detections corresponding to the at least one stationary object from the sensor signal data output to one or more data processing components of the control system.
Inventors
- PETER BART
Assignees
- 哲内提
Dates
- Publication Date: 2026-05-08
- Application Date: 2021-08-20
- Priority Date: 2020-08-20
Claims (15)
- 1. A method implemented by a control system (12) of an autonomous vehicle (10), the method comprising: receiving (700, 800) sensor signal data comprising stationary detections and non-stationary detections from the surroundings of the autonomous vehicle; determining (702) at least one combination of stationary detections that satisfies one or more lateral-position selection criteria based on the lateral position of each stationary detection relative to the direction in which the autonomous vehicle is facing; determining (704) at least one combination of stationary detections that satisfies one or more combination-regularity selection criteria based on the regularity of the position differences between pairs of stationary detections positioned sequentially in the combination along the direction in which the autonomous vehicle is facing; determining (706) one or more combinations of stationary detections that satisfy both the lateral-position selection criteria and the combination-regularity selection criteria to be combinations of stationary detections corresponding to at least one predetermined type of stationary object; and removing (708) the combinations of stationary detections corresponding to the at least one predetermined type of stationary object from the sensor signal data output to one or more data processing components (28) of the control system (12).
- 2. The method of claim 1, wherein determining (706) one or more combinations of stationary detections that satisfy the lateral-position selection criteria and the combination-regularity selection criteria to be combinations of stationary detections corresponding to at least one stationary object comprises: determining (702) at least one lateral combination of stationary detections satisfying the one or more lateral-position selection criteria based on the lateral position of each stationary detection in the combination relative to the direction in which the autonomous vehicle is facing; and, for each of the at least one lateral combination of stationary detections, determining (704) whether the lateral combination satisfies the one or more combination-regularity selection criteria and, if so, determining that the stationary detections in that lateral combination correspond to the at least one stationary object.
- 3. The method of claim 1, wherein determining (704) at least one combination of stationary detections that satisfies one or more combination-regularity selection criteria comprises: for each stationary detection in the at least one combination, determining the distance of the position of the stationary detection from the autonomous vehicle along the direction in which the autonomous vehicle is facing; determining (814) a set of distance differences between pairs of positions arranged sequentially from the position of the autonomous vehicle; and determining (816) whether the determined set of distance differences satisfies at least one distance-regularity criterion.
- 4. The method of claim 3, wherein the distance-regularity criteria comprise at least a criterion that each distance difference in the longitudinal direction relative to the autonomous vehicle, within each selected lateral combination, is below a predetermined threshold.
- 5. The method of claim 3 or 4, wherein the distance-regularity criteria comprise at least a criterion that the distance differences lie within a predetermined distance range.
- 6. The method of any of claims 1-4, wherein determining (702) a combination of stationary detections that satisfies one or more lateral-position selection criteria based on the lateral position of each stationary detection relative to the direction in which the autonomous vehicle is facing comprises: determining (804) from the received signal the lateral position of each stationary detection relative to the direction in which the autonomous vehicle is facing; assigning (806) each stationary detection to a lateral combination based on its lateral position; laterally ordering (810) the lateral combinations based on the lateral position of each combination relative to the direction in which the autonomous vehicle is facing; determining whether the number of stationary detections in each lateral combination associated with a lateral-position range satisfies a member-number threshold; and, for each laterally ordered combination of stationary detections satisfying the member-number threshold, determining whether the number of stationary detections in that combination satisfies a peak-threshold condition compared with the average number of stationary detections in one or more adjacent laterally ordered combinations.
- 7. The method of any of claims 1-4, wherein removing (708) the combination of stationary detections corresponding to the at least one stationary object comprises: classifying (818) each stationary detection in the combination satisfying the selection criteria as a stationary sensing detection corresponding to a stationary object of a predetermined type; and filtering (820) the classified combination of stationary detections from the sensor signal data output to one or more data processing components (28) of the control system (12) of the autonomous vehicle.
- 8. The method of claim 7, further comprising: determining (902) a confidence that the regularity of the stationary detections corresponds to the predetermined type of stationary object; removing (820) the stationary detections from the sensor signal data output to one or more data processing components (28) of the control system (12) of the autonomous vehicle based on the confidence assessment that the stationary detections correspond to the predetermined type of stationary object; and outputting an indication of the classification of each candidate stationary object sensing detection and/or a confidence assessment associated with the classification.
- 9. The method of any of claims 1-4, wherein the at least one stationary object (52, 58, 68) is positioned adjacent to a road segment along the direction in which the autonomous vehicle is facing and comprises an elongated structure.
- 10. The method of any of claims 1-4, wherein the method is implemented by a preprocessing component (26), the preprocessing component (26) being configured to provide the output to the one or more data processing components (28) of the control system (12) of the autonomous vehicle (10).
- 11. The method of any of claims 1-4, wherein the one or more data processing components (28) of the control system (12) comprise one or more components of the following types: a sensor data fusion processing component; an object tracking processing component; and a road estimation component.
- 12. The method of any of claims 1-4, wherein the sensor signal data comprises radar signals generated by an array of one or more radar sensors (16) of the autonomous vehicle (10).
- 13. A control system configured to implement the method of detecting a stationary object according to any of the preceding claims, the control system comprising: means for receiving (700) sensor signal data comprising stationary detections and non-stationary detections from an ambient environment of the autonomous vehicle; means for determining (702) at least one combination of stationary detections that satisfies one or more lateral-position selection criteria based on the lateral position of each stationary detection relative to the direction in which the autonomous vehicle is facing; means for determining (704) at least one combination of stationary detections that satisfies one or more combination-regularity selection criteria based on the regularity of the position differences between pairs of stationary detections positioned sequentially in the combination along the direction in which the autonomous vehicle is facing; means for determining (706) one or more combinations of stationary detections that satisfy the lateral-position selection criteria and the combination-regularity selection criteria to be combinations of stationary detections corresponding to at least one stationary object; and means for removing (708) the combinations of stationary detections corresponding to the at least one stationary object from the sensor signal data output to one or more data processing components (28) of the control system (12).
- 14. A vehicle (10) comprising the control system (12) according to claim 13, wherein the control system (12) comprises: speed-determining means (20, 22) for monitoring the speed of the vehicle (10), comprising an inertial monitoring unit (20) and a positioning system (22); and a perception system (14) for monitoring an ambient environment (32) of the vehicle (10), comprising at least one sensor (16, 18).
- 15. A computer readable storage medium storing one or more programs configured for execution by one or more processors of a vehicle control system, the one or more programs comprising instructions for performing the method of any of claims 1-12.
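The distance-regularity test recited in claims 3-5 (steps 814 and 816) can be illustrated with a minimal sketch. This is not the patented implementation; the function name and the threshold values (`max_deviation`, `gap_range`) are illustrative assumptions:

```python
def meets_regularity(longitudinal_distances,
                     max_deviation=0.8,      # assumed spread threshold (claim 4)
                     gap_range=(0.5, 8.0)):  # assumed admissible gap range (claim 5)
    """Return True when the gaps between successively positioned stationary
    detections are regular enough to suggest a single elongated structure
    such as a guardrail."""
    xs = sorted(longitudinal_distances)
    # (814) set of distance differences between sequentially arranged positions
    gaps = [b - a for a, b in zip(xs, xs[1:])]
    if not gaps:
        return False
    mean_gap = sum(gaps) / len(gaps)
    # (816) every gap must stay close to the mean and inside the admissible range
    return (all(abs(g - mean_gap) <= max_deviation for g in gaps)
            and all(gap_range[0] <= g <= gap_range[1] for g in gaps))
```

Under these assumed thresholds, evenly spaced detections (e.g. at 0, 2, 4, 6 m) pass the test, while a set containing one large gap (e.g. 0, 1, 9, 10 m) fails the spread criterion.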
Description
Stationary object detection

Technical Field

The disclosed technology relates to methods of detecting stationary objects, for example methods of detecting stationary objects having a particular type of physical attribute in object data that includes both sensed stationary objects and sensed non-stationary objects, and to related aspects. The disclosed technology is applicable to vehicles, including vehicles that assist the driver in a semi-automated manner as well as fully automated vehicles. The disclosed technology may be implemented in a control system of such a vehicle, also referred to herein as an ego vehicle.

The disclosed techniques may be implemented in a control system of an autonomous vehicle to detect and remove one or more types of stationary object detections from the stream of sensed object detections generated by one or more radar sensors of the vehicle. In some embodiments, the disclosed techniques are particularly useful for removing the sensed detections of certain types of stationary objects, such as road-side or lane-side structures (e.g. guardrails or similar barrier-type structures that may be disposed alongside the path traversed by the vehicle). Such structures can extend over long distances and can generate a large number of sensor detections in a single sensing cycle.

The disclosed technology seeks to solve the problem of how to reliably reduce the amount of data that must be processed by the perception system of an autonomous vehicle in order to provide reliable information about the vehicle's sensed surroundings. It seeks to provide a way to quickly determine whether a combination of sensed stationary detections sharing one or more attributes is collectively the result of sensing part (or all) of a particular type of stationary object, such as part (or all) of a road-side or lane-side barrier, guardrail, or the like.
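The overall flow described above (group stationary detections by lateral position, test the longitudinal spacing for regularity, then remove matching combinations) might be sketched as follows. This is a minimal sketch under assumed parameter values; the class and function names are hypothetical, not taken from the patent:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Detection:
    lateral: float        # offset from the direction the ego vehicle is facing (m)
    longitudinal: float   # distance ahead of the ego vehicle (m)
    stationary: bool

def remove_guardrail_detections(detections,
                                bin_width=0.5,       # assumed lateral bin size (m)
                                min_members=5,       # assumed member-number threshold
                                max_deviation=0.8):  # assumed gap-spread threshold (m)
    """Filter out stationary detections that form a laterally aligned,
    regularly spaced combination, e.g. a guardrail or roadside barrier."""
    # (702) assign stationary detections to lateral bins
    bins = defaultdict(list)
    for d in detections:
        if d.stationary:
            bins[round(d.lateral / bin_width)].append(d)
    to_remove = set()
    for members in bins.values():
        if len(members) < min_members:
            continue
        # (704) check the regularity of the longitudinal gaps within the bin
        xs = sorted(m.longitudinal for m in members)
        gaps = [b - a for a, b in zip(xs, xs[1:])]
        mean_gap = sum(gaps) / len(gaps)
        if all(abs(g - mean_gap) <= max_deviation for g in gaps):
            # (706) the combination is treated as a guardrail-type object
            to_remove.update(id(m) for m in members)
    # (708) output the sensor data with the guardrail-type detections removed
    return [d for d in detections if id(d) not in to_remove]
```

A regularly spaced line of detections at a constant lateral offset is removed, while isolated stationary detections and all non-stationary detections pass through unchanged.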
The disclosed techniques allow the sensed detections of guardrails, roadside barriers, and the like to be removed in a reliable and time-saving manner during a preprocessing stage, which reduces the likelihood of misclassifying other objects that might generate similar sensor detections. Such similar detections include radar detections of temporarily stationary objects, for example a queue of vehicles, which may extend along a lane of a road in a similar manner, in particular at a junction on a relatively straight road segment. The disclosed techniques reduce the amount of detection data during the preprocessing phase; otherwise, the processing components of the perception system and/or the control system would have to analyze the detections in more detail to distinguish temporarily stationary objects from truly stationary objects in order to better understand the environment of the autonomous vehicle.

Background

Autonomous or driver-assisting vehicles are provided with sophisticated sensor systems that allow their control systems to learn about the environment in which the vehicle is located, for example so that the vehicle knows its location and lane position and can detect and avoid potential obstacles. An autonomous vehicle travelling along a highway lane (or standing still) faces the challenge of reliably distinguishing moving sensed objects from stationary sensed objects in the stream of detections generated by its sensor system. Moving objects, which may represent other vehicles, pedestrians, or animals, can be monitored and tracked, and predicted paths can be generated for them to avoid collisions. Stationary objects, on the other hand, may represent buildings, lane dividers, branches, bridges, broken-down vehicles, side walls, guardrails, and the like, and are generally not tracked in the same manner as moving objects.
One problem the perception system must deal with is that a line of stationary vehicles may generate sensor detections that are easily confused with the detections of elongated structures such as barriers or guardrails. Because barriers and guardrails run along the roadway, they may generate a multitude of detections from remote sensors (such as radar sensor systems) in much the same way as a queue of vehicles. It is therefore desirable to distinguish, as quickly as possible, the many detections of large stationary objects (e.g. elongated structures such as guardrails or roadside barriers) from those of temporarily stationary objects (e.g. queues of stationary vehicles). However, achieving this in a safe and reliable manner is difficult because of the number of such detections that can be sensed in a single sensing cycle. Known techniques for removing barrier and guardrail detections, in order to reduce the amount of sensed object detections requiring further processing and classification by the perception system and/or control syst