EP-4089648-B1 - LANE EDGE EXTRACTION METHOD AND APPARATUS, AUTONOMOUS DRIVING SYSTEM, VEHICLE, AND STORAGE MEDIUM
Inventors
- LIN, Binbin
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2022-03-28
Claims (8)
- A computer-implemented lane edge extraction method (10), comprising: receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence (S102); determining observation edge points, about the lane edges, of a current frame of the edge image sequence (S104); obtaining temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame (S106); fitting a lane edge curve based on the temporary tracking edge points (S108); and excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame, wherein it is determined that a point is a part of the outliers if a distance between the point in the temporary tracking edge points and the lane edge curve is greater than a preset value (S110), wherein tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame, wherein the obtaining temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame (S106) comprises: determining current positions of the tracking edge points of the immediately preceding frame and the observation edge points of the current frame in a vehicle rectangular coordinate system; continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into a vehicle polar coordinate system, and correcting the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; mapping the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system; and combining the part of the temporary tracking edge points and the remaining part of the temporary tracking edge points to form the temporary tracking edge points.
- The method (10) according to claim 1, wherein the determining current positions of the tracking edge points of the immediately preceding frame in the vehicle rectangular coordinate system comprises: determining the current positions of the tracking edge points of the immediately preceding frame based on a speed and a yaw angle of a vehicle and a time difference between the immediately preceding frame and the current frame.
- The method (10) according to claim 1 or 2, wherein, where an angle of orientation of a first point in the tracking edge points of the immediately preceding frame in relation to the pole in the vehicle polar coordinate system is between angles of orientation of a second point and a third point of the observation edge points of the current frame in relation to the pole in the vehicle polar coordinate system, linear interpolation is performed by using radial distances of the second point and the third point, to obtain a radial distance of a fourth point whose angle of orientation in relation to the pole in the vehicle polar coordinate system is the same as that of the first point; and a radial distance of the first point is corrected by means of filtering based on the radial distance of the first point and the radial distance of the fourth point.
- The method (10) according to claim 3, wherein the lane edge curve is fitted by means of a least squares method based on the temporary tracking edge points; and it is determined that the fourth point is a part of the outliers if a distance between the fourth point in the temporary tracking edge points and the lane edge curve is greater than a preset value.
- A computer-readable storage medium storing instructions, wherein the instructions, when executed by a processor, cause the processor to perform a lane edge extraction method, preferably the method (10) of any one of claims 1 to 4, the method comprising: receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence (S102); determining observation edge points, about the lane edges, of a current frame of the edge image sequence (S104); obtaining temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame (S106); fitting a lane edge curve based on the temporary tracking edge points (S108); and excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame, wherein it is determined that a point is a part of the outliers if a distance between the point in the temporary tracking edge points and the lane edge curve is greater than a preset value (S110), wherein tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame.
- A lane edge extraction apparatus (20), characterized in that it comprises: an image obtaining apparatus (202) configured to obtain an edge image sequence; a calculation apparatus (204) configured to: receive tracking edge points, about lane edges, of an immediately preceding frame of the edge image sequence; determine observation edge points, about the lane edges, of a current frame of the edge image sequence; obtain temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correct the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame; fit a lane edge curve based on the temporary tracking edge points; and exclude outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame, wherein it is determined that a point is a part of the outliers if a distance between the point in the temporary tracking edge points and the lane edge curve is greater than a preset value, wherein tracking edge points of a first frame of the edge image sequence are observation edge points of the first frame; and an edge generation unit (206) configured to output the lane edge curve, wherein the calculation apparatus (204) is configured to: determine current positions of the tracking edge points of the immediately preceding frame and the observation edge points of the current frame in a vehicle rectangular coordinate system; continue, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; map a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into a vehicle polar
coordinate system, and correct the tracking edge points of the immediately preceding frame by using the observation edge points of the current frame in the coincident part, to obtain a remaining part of the temporary tracking edge points; map the remaining part of the temporary tracking edge points into the vehicle rectangular coordinate system; and combine the part of the temporary tracking edge points and the remaining part of the temporary tracking edge points to form the temporary tracking edge points.
- An autonomous driving system, characterized in that the autonomous driving system comprises the lane edge extraction apparatus (20) of claim 6.
- A vehicle, characterized in that the vehicle comprises the lane edge extraction apparatus (20) of claim 6 or the autonomous driving system of claim 7.
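The claimed pipeline (predict the tracked points with ego motion per claim 2, continue the non-coincident observations, correct the coincident part in polar coordinates per claim 3, fit by least squares and prune outliers per claims 1 and 4) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the function names, the fixed blending gain, the angle-range test for the coincident part, the cubic polynomial degree, and the use of the vertical residual as a stand-in for point-to-curve distance are all assumptions made for the example.

```python
import numpy as np

def ego_motion_update(points, speed, yaw_rate, dt):
    """Advance tracked points into the current vehicle frame (cf. claim 2):
    translate by the distance travelled and rotate by the yaw change.
    Assumes x forward, y left; sign conventions are illustrative."""
    dyaw = yaw_rate * dt
    c, s = np.cos(-dyaw), np.sin(-dyaw)
    rot = np.array([[c, -s], [s, c]])
    shifted = points - np.array([speed * dt, 0.0])
    return shifted @ rot.T

def to_polar(points):
    """Map (x, y) to (angle of orientation, radial distance) about the pole
    at the vehicle origin."""
    return np.arctan2(points[:, 1], points[:, 0]), np.hypot(points[:, 0], points[:, 1])

def correct_in_polar(track_pts, obs_pts, gain=0.5):
    """For each tracked point whose polar angle lies between two observed
    angles, linearly interpolate an observed radius at that angle (the
    'fourth point' of claim 3) and blend it with the tracked radius via a
    simple fixed-gain filter (an assumed stand-in for the claimed filtering)."""
    t_ang, t_r = to_polar(track_pts)
    o_ang, o_r = to_polar(obs_pts)
    order = np.argsort(o_ang)
    o_ang, o_r = o_ang[order], o_r[order]
    corrected = []
    for ang, r in zip(t_ang, t_r):
        if o_ang[0] <= ang <= o_ang[-1]:
            r4 = np.interp(ang, o_ang, o_r)   # radius of the 'fourth point'
            r = (1.0 - gain) * r + gain * r4  # filter tracked vs observed radius
        corrected.append([r * np.cos(ang), r * np.sin(ang)])
    return np.asarray(corrected)

def fit_and_prune(points, deg=3, max_dist=0.5):
    """Least-squares curve fit (claim 4) and outlier exclusion (claim 1):
    drop any point whose vertical residual from the fitted curve exceeds
    max_dist (the 'preset value')."""
    coeffs = np.polyfit(points[:, 0], points[:, 1], deg)
    resid = np.abs(points[:, 1] - np.polyval(coeffs, points[:, 0]))
    return coeffs, points[resid <= max_dist]

def update_frame(prev_track, obs, speed, yaw_rate, dt):
    """One iteration of the claimed method: predict, continue, correct,
    fit, prune. Returns the curve coefficients and the new track."""
    pred = ego_motion_update(prev_track, speed, yaw_rate, dt)
    # Split observations into the part overlapping the track (coincident)
    # and the part extending beyond it (non-coincident), here by angle range.
    t_ang, _ = to_polar(pred)
    o_ang, _ = to_polar(obs)
    coincident = obs[(o_ang >= t_ang.min()) & (o_ang <= t_ang.max())]
    extension = obs[(o_ang < t_ang.min()) | (o_ang > t_ang.max())]
    corrected = correct_in_polar(pred, coincident) if len(coincident) else pred
    temp = np.vstack([corrected, extension]) if len(extension) else corrected
    return fit_and_prune(temp)
```

For instance, feeding a tracked edge along y = 2 and observations along y = 2.2 through `update_frame` yields a fitted curve between the two, with the observations beyond the tracked angle range appended as the continued part.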
Description
Technical Field
This application relates to the field of visual control for vehicles, and in particular, to a lane edge extraction method, a lane edge extraction apparatus, an autonomous driving system, a vehicle, and a computer-readable storage medium.
Background Art
Computer vision processing technology is increasingly applied in the field of vehicle driving. At present, functions such as lateral control for driver assistance are highly dependent on the quality of lane lines on a road. When information about lane lines is inaccurate due to blurred road markings, accumulated water and snow on a road, etc., driver assistance can hardly control the direction of a vehicle correctly. These common scenarios limit the application range of vehicle driving assistance and can easily lead to danger during use. Existing visual identification can provide the drivable space of a vehicle. However, such data is relatively primitive in structure, noisy, and subject to large errors. Therefore, although adequate for function suppression and warning, the data raises many risks if used directly for route planning and control. CN 109 977 776 A is directed to a lane detection method, which includes the steps of: fitting lane line pixel points extracted from images acquired by an image sensor to obtain a lane line; predicting a lane line of the image based on the tracked lane line and the vehicle information; matching the fitted lane lines to the predicted lane lines; and, if matched, splicing the fitted lane line with the tracked lane line. "Research on Lane Detection Based on Global Search of Dynamic Region of Interest (DROI)" by Hu Jianjun et al., Applied Sciences, relates to a lane detection approach based on dynamic region of interest (DROI) selection in the horizontal and vertical safety vision.
WO 2022/154967 A1 relates to a system that identifies a road to be navigated by an ADV, the road being captured in one or more point clouds from one or more LIDAR sensors. CN 106 447 730 B relates to a parameter estimation method, a parameter estimation apparatus and electronic equipment for estimating, with higher precision, the position coordinates of the vanishing line in the current frame of an image acquired by an imaging device.
Summary of the Invention
Embodiments of this application provide a lane edge extraction method, a lane edge extraction apparatus, an autonomous driving system, a vehicle, and a computer-readable storage medium, which are used for improving the stability and accuracy of lane edge extraction. The subject-matter of the present invention is defined by the features of the independent claims. Further preferred embodiments of the present invention are defined in the dependent claims. In particular, according to an aspect of this application, a lane edge extraction method is provided, the method including: receiving tracking edge points, about lane edges, of an immediately preceding frame of an edge image sequence; determining observation edge points, about the lane edges, of a current frame of the edge image sequence; obtaining temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame; fitting a lane edge curve based on the temporary tracking edge points; and excluding outliers from the temporary tracking edge points based on the lane edge curve, to form tracking edge points of the current frame, wherein it is determined that a point is a part of the outliers if a distance between the point in the temporary tracking edge points and the lane edge curve is greater than a preset value, wherein tracking
edge points of a first frame of the edge image sequence are observation edge points of the first frame. The obtaining temporary tracking edge points of the current frame by continuing the tracking edge points of the immediately preceding frame with the observation edge points of the current frame and correcting the tracking edge points of the immediately preceding frame based on the observation edge points of the current frame comprises: determining current positions of the tracking edge points of the immediately preceding frame and the observation edge points of the current frame in the vehicle rectangular coordinate system; continuing, in the vehicle rectangular coordinate system, a non-coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame, to obtain a part of the temporary tracking edge points; mapping a coincident part of the observation edge points of the current frame and the tracking edge points of the immediately preceding frame into the vehicle polar coordinate system, and correcting the tracking edge points of the immediately preced