CN-121994256-A - Method and device for lane fusion, controller and computer program product

CN 121994256 A

Abstract

Embodiments of the present disclosure relate to methods and apparatus, controllers, and computer program products for lane fusion. The method includes determining a map boundary of at least one lane corresponding to a target road segment based on map data. The method also includes acquiring a sensing boundary for the at least one lane based on sensed data from at least one sensor. The method also includes identifying an abnormal boundary of the target road segment based on a match of the map boundary and the sensing boundary. In this way, not only can matching between map data and sensed data having different data structures or representations be handled effectively, but abnormal boundaries that may exist in the map data or sensed data can also be identified from the matching results, thereby improving driving functionality and helping the vehicle make more reasonable and safer driving decisions.
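The three-step method summarized above can be sketched as a minimal pipeline. This is purely illustrative: the function names, the scalar "lateral offset" representation of a boundary, and the `max_gap` parameter are assumptions for the sketch, not details from the patent.

```python
def determine_map_boundaries(map_data, road_segment):
    # Step 1: look up the lane boundaries recorded in the map for the
    # target road segment (boundaries simplified to lateral offsets here).
    return [lane["boundary"] for lane in map_data.get(road_segment, [])]

def acquire_sensing_boundaries(sensor_frames):
    # Step 2: collect detected lane boundaries from one or more sensors.
    return [b for frame in sensor_frames for b in frame["boundaries"]]

def identify_abnormal_boundaries(map_boundaries, sensing_boundaries, max_gap):
    # Step 3: flag a map boundary as abnormal when no sensed boundary
    # lies within max_gap of it, i.e. map and sensing disagree there.
    return [mb for mb in map_boundaries
            if all(abs(mb - sb) > max_gap for sb in sensing_boundaries)]
```

A real implementation would operate on polyline boundaries in a common coordinate frame rather than scalar offsets, but the control flow follows the three claimed steps.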

Inventors

  • WANG LEICHEN
  • LI XINRUN

Assignees

  • Robert Bosch GmbH

Dates

Publication Date
2026-05-08
Application Date
2024-11-01

Claims (15)

  1. A method (200) for lane fusion, comprising: determining (210) a map boundary of at least one lane corresponding to a target road segment based on map data; acquiring (220) a sensing boundary for the at least one lane based on sensed data from at least one sensor; and identifying (230) an abnormal boundary of the target road segment based on a match of the map boundary and the sensing boundary.
  2. The method (200) of claim 1, wherein the map boundary is indicated by a super boundary of a side of the at least one lane corresponding to the target road segment, the method comprising: searching the map data for a target road segment corresponding to the sensing boundary, the map data including road segments, lanes, and boundaries; and obtaining the super boundary by at least one of: in response to a distance between a first boundary and a second boundary of the searched target road segment being smaller than a first distance threshold, forming the super boundary by stitching the first boundary and the second boundary in the vehicle traveling direction; and in response to a total length of the searched target road segment being greater than a second distance threshold, forming the super boundary by cropping a portion of the searched target road segment that exceeds the second distance threshold.
  3. The method (200) of claim 2, wherein the target road segment corresponding to the sensing boundary comprises at least one of: a first road segment overlapping the sensing boundary, or a predetermined number of second road segments upstream and downstream of the first road segment.
  4. The method (200) of claim 1, wherein the matching of the map boundary to the sensing boundary comprises: determining a similarity between a super boundary of the map data and the sensing boundary of the sensed data; and constructing, based on the determined similarity, a boundary topology graph indicating a matching relationship between the super boundary and the sensing boundary.
  5. The method (200) of claim 4, wherein determining the similarity comprises: determining a point distance between each sample point on the super boundary and a nearest point on the sensing boundary; and obtaining a boundary distance between the super boundary and the sensing boundary by weighting the determined point distances.
  6. The method (200) of claim 4, wherein the boundary topology graph is constructed by: creating a starting node and a terminating node; creating a super boundary node corresponding to each super boundary and a sensing boundary node corresponding to each sensing boundary; and matching together the super boundary node and the sensing boundary node having the smallest node distance, the node distance indicating the boundary distance between the super boundary and the sensing boundary.
  7. The method (200) of claim 6, further comprising: in response to a boundary distance between an unmatched first super boundary and a first sensing boundary being greater than a third distance threshold corresponding to a lane width, pre-matching the first super boundary and the first sensing boundary together as an anomalous match.
  8. The method (200) of claim 7, further comprising: obtaining a matching result between the super boundary and the sensing boundary by filtering anomalous matches between the super boundary and the sensing boundary.
  9. The method (200) of claim 8, wherein filtering the anomalous matches comprises: in response to a second super boundary matching a plurality of second sensing boundaries, preserving the match with the one second sensing boundary having the smallest boundary distance from the second super boundary; and in response to a boundary distance between a matched third super boundary and a third sensing boundary being greater than the third distance threshold, deleting the pre-match between the third super boundary and the third sensing boundary.
  10. The method (200) of claim 1, wherein the sensed data comprises first sensed data from a first sensor and second sensed data from a second sensor, the method (200) further comprising: performing a first fusion based on a first match of the map data and the first sensed data; performing a second fusion based on a second match of the map data and the second sensed data; and performing adjustment on the fusion results of the first fusion and the second fusion, respectively.
  11. The method (200) of claim 10, further comprising: assigning a confidence score for the map data, the first sensed data, or the second sensed data based on the first match and the second match.
  12. The method (200) of claim 11, wherein the abnormal boundary comprises a false negative boundary, and identifying the abnormal boundary comprises: in response to a fourth sensing boundary from the first sensed data and a fifth sensing boundary associated with the fourth sensing boundary from the second sensed data not matching any super boundary, assigning a high confidence score to the map data having a false negative boundary at the locations of the fourth sensing boundary and the fifth sensing boundary; and in response to a fourth super boundary matching only a sixth sensing boundary from the first sensed data or the second sensed data, assigning a high confidence score to the unmatched sensed data having a false negative boundary at the locations of the fourth super boundary and the sixth sensing boundary.
  13. An apparatus (1000) for lane fusion, comprising: a map boundary determination module (1010) configured to determine a map boundary of at least one lane corresponding to a target road segment based on map data; a sensing boundary acquisition module (1020) configured to acquire a sensing boundary for the at least one lane based on sensed data from at least one sensor; and a boundary matching module (1030) configured to identify an abnormal boundary of the target road segment based on a match of the map boundary with the sensing boundary.
  14. A controller (1100), comprising: at least one processor (1101); and a memory (1102) coupled to the at least one processor (1101) and having instructions stored thereon that, when executed by the at least one processor (1101), cause the controller (1100) to perform the method according to any of claims 1-11.
  15. A computer program product tangibly stored on a computer-readable medium (1102) and comprising computer-executable instructions which, when executed by a processor (1101) of a computer, cause the computer to perform the method according to any of claims 1-12.
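The boundary-distance and matching steps in claims 5-9 can be sketched as follows. This is a simplified, greedy reading rather than the patent's full topology-graph construction; the function names, the nearest-vertex approximation of the point distance, and the uniform default weights are all assumptions made for illustration.

```python
import math

def point_distance(p, boundary):
    # Distance from a sample point p to its nearest point on a boundary,
    # approximated here by the nearest polyline vertex (claim 5's "point distance").
    return min(math.dist(p, q) for q in boundary)

def boundary_distance(super_boundary, sensing_boundary, weights=None):
    # Weighted average of per-sample-point distances (claim 5's "boundary distance").
    if weights is None:
        weights = [1.0] * len(super_boundary)
    total = sum(w * point_distance(p, sensing_boundary)
                for w, p in zip(weights, super_boundary))
    return total / sum(weights)

def match_boundaries(super_boundaries, sensing_boundaries, lane_width_threshold):
    # Pair each super boundary with its nearest sensing boundary (claim 6);
    # pairs farther apart than the lane-width threshold are kept aside as
    # anomalous matches to be filtered out (claims 7-9).
    matches, anomalies = [], []
    for i, sb in enumerate(super_boundaries):
        d, j = min((boundary_distance(sb, pb), j)
                   for j, pb in enumerate(sensing_boundaries))
        (anomalies if d > lane_width_threshold else matches).append((i, j, d))
    return matches, anomalies
```

Taking the minimum per super boundary also covers claim 9's rule of keeping only the smallest-distance match when several sensing boundaries compete for one super boundary.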

Description

Method and device for lane fusion, controller and computer program product

Technical Field

Embodiments of the present disclosure relate generally to the field of driving and, in particular, to a method and apparatus, a controller, and a computer program product for lane fusion.

Background

Accurate modeling of complex traffic scenes is not only key to the accurate planning and effective control of driving systems such as Automated Driving (AD) systems and Advanced Driver Assistance Systems (ADAS), but also an important basis for improving road safety and reducing the risk of traffic accidents. To keep the vehicle traveling in the correct position, lane detection functionality is provided, which aims at identifying and tracking markings in the road for accurate navigation and safe driving of the vehicle. A variety of detection and sensing approaches exist, such as camera-based lane detection and LiDAR-based lane detection.

Disclosure of Invention

Embodiments of the present disclosure provide a method and apparatus, a controller, and a computer program product for lane fusion.

According to a first aspect of the present disclosure, a method for lane fusion is provided. The method includes determining a map boundary of at least one lane corresponding to a target road segment based on map data. The method also includes acquiring a sensing boundary for the at least one lane based on sensed data from at least one sensor. The method also includes identifying an abnormal boundary of the target road segment based on a match of the map boundary and the sensing boundary.

According to a second aspect of the present disclosure, an apparatus for lane fusion is provided. The apparatus includes a map boundary determination module configured to determine a map boundary of at least one lane corresponding to a target road segment based on map data. The apparatus further includes a sensing boundary acquisition module configured to acquire a sensing boundary for the at least one lane based on sensed data from at least one sensor. The apparatus also includes a boundary matching module configured to identify an abnormal boundary of the target road segment based on a match of the map boundary with the sensing boundary.

According to a third aspect of the present disclosure, a controller is provided. The controller includes at least one processor and a memory coupled to the at least one processor, the memory having instructions stored thereon that, when executed by the at least one processor, cause the controller to perform the steps of the method of the first aspect.

According to a fourth aspect of the present disclosure, a vehicle is provided. The vehicle includes the controller of the third aspect.

According to a fifth aspect of the present disclosure, there is provided a computer program product tangibly stored on a computer-readable medium and comprising computer-executable instructions which, when executed by a processor of a computer, cause the computer to perform the steps of the method of the first aspect.

According to a sixth aspect of the present disclosure, there is provided a machine-readable storage medium having stored thereon instructions which, when executed by a processor, cause a machine to perform the steps of the method of the first aspect.

Drawings

The foregoing and other objects, features, and advantages of the disclosure will become more apparent from the following more particular description of exemplary embodiments, as illustrated in the accompanying drawings, in which the same or similar reference numerals generally represent the same or similar parts.

FIG. 1 illustrates a schematic diagram of an example environment in which methods and/or devices according to embodiments of the present disclosure may be implemented;

FIG. 2 illustrates a flow chart of a method for lane fusion according to an embodiment of the present disclosure;

FIG. 3 illustrates a diagram of a lane fusion process according to an embodiment of the present disclosure;

FIG. 4 illustrates a diagram of an example data representation of map data and sensed data utilizing a super boundary, according to an embodiment of the present disclosure;

FIG. 5 illustrates a diagram of a super boundary acquisition process according to an embodiment of the present disclosure;

FIG. 6 illustrates a diagram of a boundary topology process according to an embodiment of the present disclosure;

FIG. 7 illustrates a diagram of a constructed boundary topology graph according to an embodiment of the present disclosure;

FIG. 8 illustrates a diagram of a multimodal fusion process according to an embodiment of the disclosure;

FIG. 9 illustrates a diagram of an anomaly boundary identification pr
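The multimodal false-negative logic of claims 10-12 might be read as follows. This is an illustrative sketch only: the function name, the dict-based match representation, and the pre-associated camera/LiDAR boundary pairs are assumptions, not structures from the patent.

```python
def identify_false_negatives(matches_cam, matches_lidar, associated_pairs):
    # matches_cam / matches_lidar: dicts mapping a super-boundary id to the
    # sensing-boundary id it matched in each sensor's data (hypothetical shape).
    # associated_pairs: (camera_id, lidar_id) pairs believed to describe the
    # same physical boundary.
    matched_cam = set(matches_cam.values())
    matched_lidar = set(matches_lidar.values())
    reports = []
    # Both sensors observe a boundary that no super boundary explains:
    # the map likely has a false negative there (first branch of claim 12).
    for cam_id, lidar_id in associated_pairs:
        if cam_id not in matched_cam and lidar_id not in matched_lidar:
            reports.append(("map", cam_id, lidar_id))
    # A super boundary confirmed by only one sensor: the other sensor's
    # data likely has a false negative (second branch of claim 12).
    for sup in matches_cam.keys() - matches_lidar.keys():
        reports.append(("lidar", sup, matches_cam[sup]))
    for sup in matches_lidar.keys() - matches_cam.keys():
        reports.append(("camera", sup, matches_lidar[sup]))
    return reports
```

Each report names the data source suspected of missing a boundary, which corresponds to assigning that source a high false-negative confidence score in claim 12.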