KR-20260063099-A - VEHICLE SPEED ESTIMATION SYSTEM AND ESTIMATION METHOD THEREOF
Abstract
A method for estimating vehicle speed according to one embodiment comprises: (a) calculating an embedded wheel feature by embedding wheel speed data measured for at least one wheel mounted on a vehicle; (b) calculating a fusion feature by combining features of multiple sensor data obtained from multiple sensors; (c) calculating a correction coefficient for correcting an error in the vehicle speed calculated based on the wheel speed data, using the embedded wheel feature and the fusion feature; and (d) estimating a corrected vehicle speed based on a value obtained by multiplying the correction coefficient and the embedded wheel feature.
Inventors
- 박건우
- 김병주
- 김임수
- 오종환
Assignees
- 주식회사 베이리스
Dates
- Publication Date: 2026-05-07
- Application Date: 2024-10-30
Claims (16)
- A vehicle speed estimation system comprising a computing device, wherein the computing device: calculates an embedded wheel feature by embedding wheel speed data measured for at least one wheel mounted on a vehicle; calculates a fusion feature by combining features of multiple sensor data obtained from multiple sensors; calculates a correction coefficient for correcting an error in the vehicle speed calculated based on the wheel speed data, using the embedded wheel feature and the fusion feature; and estimates a corrected vehicle speed based on a value obtained by multiplying the correction coefficient and the embedded wheel feature.
- The vehicle speed estimation system of claim 1, wherein the computing device calculates, from camera data, an encoded texture feature in which ground texture information is encoded and an encoded optical flow feature in which optical flow information is encoded; calculates, from IMU sensor data, an encoded IMU feature in which features of the IMU sensor data are encoded; and calculates, from radar sensor data, an encoded radar feature in which features of the radar sensor data are encoded.
- The vehicle speed estimation system of claim 2, wherein the computing device uses the encoded texture feature as the key and value of a first transformer decoder and the encoded IMU feature as the query of the first transformer decoder to calculate a corrected final IMU feature from the first transformer decoder, and uses the encoded optical flow feature as the key and value of a second transformer decoder and the encoded radar feature as the query of the second transformer decoder to calculate, from the second transformer decoder, a final radar feature in which the features of the radar sensor data are corrected.
- The vehicle speed estimation system of claim 3, wherein the computing device calculates the fusion feature by combining the final IMU feature and the final radar feature.
- The vehicle speed estimation system of claim 4, wherein the computing device combines the final IMU feature and the final radar feature by adaptively setting weights to be applied to each of the final IMU feature and the final radar feature.
- The vehicle speed estimation system of claim 2, wherein the computing device calculates frame-by-frame camera features from the camera data frame by frame, calculates the encoded texture feature using the frame-by-frame camera features for a plurality of frames, and calculates the encoded optical flow feature using the frame-by-frame camera features for the plurality of frames.
- The vehicle speed estimation system of claim 2, wherein the computing device normalizes the IMU sensor data frame by frame, embeds the normalized frame-by-frame IMU sensor data, and encodes the embedded frame-by-frame IMU sensor data to calculate encoded frame-by-frame IMU features, the encoded IMU feature comprising the encoded frame-by-frame IMU features for the plurality of frames.
- The vehicle speed estimation system of claim 2, wherein the computing device calculates frame-by-frame radar features from the radar sensor data frame by frame, embeds the frame-by-frame radar features for the plurality of frames, and encodes the embedded frame-by-frame radar features for the plurality of frames to calculate the encoded radar feature.
- A method for estimating vehicle speed, comprising: (a) calculating an embedded wheel feature by embedding wheel speed data measured for at least one wheel mounted on a vehicle; (b) calculating a fusion feature by combining features of multiple sensor data obtained from multiple sensors; (c) calculating a correction coefficient for correcting an error in the vehicle speed calculated based on the wheel speed data, using the embedded wheel feature and the fusion feature; and (d) estimating a corrected vehicle speed based on a value obtained by multiplying the correction coefficient and the embedded wheel feature.
- The method of claim 9, wherein step (b) comprises: (b-1) generating, from camera data, an encoded texture feature in which ground texture information is encoded and an encoded optical flow feature in which optical flow information is encoded; (b-2) calculating, from IMU sensor data, an encoded IMU feature in which features of the IMU sensor data are encoded; and (b-3) calculating, from radar sensor data, an encoded radar feature in which features of the radar sensor data are encoded.
- The method of claim 10, wherein step (b) further comprises: (b-4) using the encoded texture feature as the key and value of a first transformer decoder and the encoded IMU feature as the query of the first transformer decoder to calculate a corrected final IMU feature from the first transformer decoder; and (b-5) using the encoded optical flow feature as the key and value of a second transformer decoder and the encoded radar feature as the query of the second transformer decoder to calculate, from the second transformer decoder, a final radar feature in which the features of the radar sensor data are corrected.
- The method of claim 11, wherein step (b) further comprises: (b-6) combining the final IMU feature and the final radar feature to produce the fusion feature.
- The method of claim 12, wherein, in step (b-6), weights to be applied to each of the final IMU feature and the final radar feature are adaptively set, and the final IMU feature and the final radar feature are combined accordingly.
- The method of claim 10, wherein step (b-1) comprises: (b-1-1) calculating frame-by-frame camera features from the camera data frame by frame; (b-1-2) calculating the encoded texture feature using the frame-by-frame camera features for a plurality of frames; and (b-1-3) calculating the encoded optical flow feature using the frame-by-frame camera features for the plurality of frames.
- The method of claim 10, wherein step (b-2) comprises: (b-2-1) normalizing the IMU sensor data frame by frame; (b-2-2) embedding the frame-by-frame IMU sensor data normalized in step (b-2-1); and (b-2-3) encoding the frame-by-frame IMU sensor data embedded in step (b-2-2) to calculate encoded frame-by-frame IMU features, the encoded IMU feature comprising the encoded frame-by-frame IMU features for the plurality of frames.
- The method of claim 10, wherein step (b-3) comprises: (b-3-1) calculating frame-by-frame radar features from the radar sensor data frame by frame; (b-3-2) embedding the frame-by-frame radar features for the plurality of frames; and (b-3-3) encoding the frame-by-frame radar features embedded in step (b-3-2) to calculate the encoded radar feature.
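The correction step of claims 1 and 9 can be illustrated numerically. This is a minimal sketch with hypothetical numbers: in the patent, the correction coefficient is produced by the learned model from the embedded wheel feature and the fusion feature, not fixed by hand as it is here.

```python
import numpy as np

# Hypothetical wheel parameters for illustration only.
wheel_radius_m = 0.33            # assumed wheel radius
wheel_omega_rad_s = 30.0         # measured wheel angular speed
wheel_speed = wheel_radius_m * wheel_omega_rad_s  # wheel-derived speed, ~9.9 m/s

# Under slip, the wheel-derived speed overestimates the true speed.
# The system predicts a multiplicative correction coefficient; 0.92 is
# a made-up stand-in for the model output.
correction_coefficient = 0.92
corrected_speed = correction_coefficient * wheel_speed
print(round(corrected_speed, 3))  # 9.108
```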
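The transformer-decoder usage in claims 3 and 11 amounts to cross-attention: one modality supplies the key and value, the other supplies the query. A single-head scaled dot-product sketch in numpy, with assumed shapes `(frames, feature_dim)` and random placeholder features, looks like this:

```python
import numpy as np

def cross_attention(query, key_value):
    """Single-head scaled dot-product cross-attention (sketch).

    query     : (Tq, d) array, e.g. the encoded IMU feature (decoder query)
    key_value : (Tk, d) array, e.g. the encoded texture feature (key and value)
    """
    d = query.shape[-1]
    scores = query @ key_value.T / np.sqrt(d)        # (Tq, Tk) similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ key_value                       # (Tq, d) attended output

rng = np.random.default_rng(0)
encoded_imu = rng.normal(size=(8, 16))       # placeholder query features
encoded_texture = rng.normal(size=(32, 16))  # placeholder key/value features
final_imu = cross_attention(encoded_imu, encoded_texture)
print(final_imu.shape)  # (8, 16)
```

The second decoder of the claims follows the same pattern with the encoded optical flow feature as key/value and the encoded radar feature as query.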
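One common way to realize the adaptive weighting of claims 5 and 13 is a learned gate; the patent does not specify the mechanism, so the sigmoid gate below is an assumption. The gate, computed from both features, mixes them per dimension as a convex combination:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_fuse(imu_feat, radar_feat, w, b):
    """Gated fusion sketch: a gate in (0, 1) weights the two features."""
    gate = sigmoid(np.concatenate([imu_feat, radar_feat]) @ w + b)  # (d,)
    return gate * imu_feat + (1.0 - gate) * radar_feat              # (d,)

rng = np.random.default_rng(1)
d = 16
imu = rng.normal(size=d)
radar = rng.normal(size=d)
w = rng.normal(size=(2 * d, d)) * 0.1  # hypothetical learned parameters
b = np.zeros(d)
fused = adaptive_fuse(imu, radar, w, b)
print(fused.shape)  # (16,)
```

Because the gate lies strictly in (0, 1), each fused element stays between the corresponding IMU and radar elements, so neither modality can be amplified beyond its own contribution.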
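The IMU pipeline of claims 7 and 15 (normalize per frame, embed, encode) can be sketched with toy linear layers. The per-frame statistics normalization and the single-layer tanh encoder are assumptions; the patent leaves the concrete layers unspecified.

```python
import numpy as np

def encode_imu(imu_frames, embed_w, enc_w):
    """Steps (b-2-1) to (b-2-3) as toy linear layers."""
    # (b-2-1) per-frame normalization over the sensor channels
    mu = imu_frames.mean(axis=1, keepdims=True)
    sd = imu_frames.std(axis=1, keepdims=True) + 1e-6
    normed = (imu_frames - mu) / sd
    # (b-2-2) per-frame linear embedding
    embedded = normed @ embed_w
    # (b-2-3) per-frame encoding (single tanh layer as a stand-in)
    return np.tanh(embedded @ enc_w)

rng = np.random.default_rng(2)
frames = rng.normal(size=(10, 6))        # 10 frames: 3-axis accel + 3-axis gyro
embed_w = rng.normal(size=(6, 16)) * 0.1  # hypothetical embedding weights
enc_w = rng.normal(size=(16, 16)) * 0.1   # hypothetical encoder weights
encoded = encode_imu(frames, embed_w, enc_w)
print(encoded.shape)  # (10, 16)
```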
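Claims 8 and 16 apply the same pattern to radar: per-frame features, embedding, then encoding across frames. The choice of per-frame statistics and the cumulative-average temporal encoder below are illustrative assumptions, since the patent does not fix these operations.

```python
import numpy as np

def encode_radar(frames, embed_w):
    """Steps (b-3-1) to (b-3-3) as a toy pipeline."""
    # (b-3-1) frame-by-frame features: simple statistics of the per-frame
    # radar returns (hypothetical choice of summary statistics)
    per_frame = np.stack([
        np.array([f.mean(), f.std(), f.min(), f.max()]) for f in frames
    ])                                   # (T, 4)
    # (b-3-2) embed each frame's feature vector
    embedded = per_frame @ embed_w       # (T, d)
    # (b-3-3) encode across frames: causal cumulative average as a stand-in
    counts = np.arange(1, len(frames) + 1)[:, None]
    return np.cumsum(embedded, axis=0) / counts

rng = np.random.default_rng(3)
# 10 frames with a varying number of radar returns each
frames = [rng.normal(size=int(rng.integers(5, 20))) for _ in range(10)]
embed_w = rng.normal(size=(4, 16)) * 0.1  # hypothetical embedding weights
encoded = encode_radar(frames, embed_w)
print(encoded.shape)  # (10, 16)
```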
Description
Vehicle Speed Estimation System and Estimation Method Thereof

The embodiments disclosed in this specification relate to a vehicle speed estimation system and a method for estimating vehicle speed. Accurate measurement of a vehicle's speed is crucial for smooth driving planning in autonomous vehicles. For vehicles driving through construction sites or rough terrain, the accuracy of GNSS cannot be guaranteed, so the vehicle's speed must be inferred solely from its onboard sensors. In such cases, however, it is difficult to predict the precise speed from wheel speed alone because the wheels slip against the ground. To address this, there are methods that use additional IMU sensors, but the noise and errors present in each sensor limit the precision of speed prediction. Furthermore, if additional sensors are fused to increase precision, the computational cost increases, which slows down inference.

- FIG. 1 is a configuration diagram of a vehicle speed estimation system according to one embodiment.
- FIG. 2 is a configuration diagram of a computing device according to one embodiment.
- FIG. 3 is an explanatory diagram of the overall operation of a computing device according to one embodiment.
- FIG. 4 is an explanatory diagram of the inference step of a computing device according to one embodiment.
- FIG. 5 is an explanatory diagram of a method for calculating an embedded wheel feature according to one embodiment.
- FIG. 6 is an explanatory diagram of a method for calculating encoded texture features and encoded optical flow features according to one embodiment.
- FIG. 7 is an explanatory diagram of a method for calculating encoded IMU features according to one embodiment.
- FIG. 8 is an explanatory diagram of a method for calculating encoded radar features according to one embodiment.
- FIG. 9 is an explanatory diagram of a method for calculating final IMU features according to one embodiment.
- FIG. 10 is an explanatory diagram of a method for calculating a final radar feature according to one embodiment.
- FIG. 11 is an explanatory diagram of a method for calculating fusion features according to one embodiment.
- FIG. 12 is a flowchart of a method for estimating vehicle speed according to one embodiment.

Hereinafter, a vehicle speed estimation system and an estimation method according to embodiments of the present disclosure will be described in detail with reference to the attached drawings. It should be understood that the following embodiments are merely for the purpose of embodying the present disclosure and do not limit or restrict its scope of rights. Anything that a person skilled in the art can readily infer from the detailed description and embodiments of the present disclosure is interpreted as falling within that scope.

FIG. 1 shows a configuration diagram of a vehicle speed estimation system (100) according to one embodiment. As can be seen from FIG. 1, the vehicle speed estimation system (100) may be configured to include a sensor unit (10) and a computing device (20). The sensor unit (10) includes at least one type of sensor mounted on the vehicle; that is, it may be configured to include a camera (11), an IMU (Inertial Measurement Unit) sensor (12), and a radar sensor (13).

FIG. 2 is a configuration diagram of a computing device (20) according to one embodiment. As can be seen from FIG. 2, the computing device (20) may be configured to include at least one processor (P), memory (M), and a communication device (C). The memory (M) contains instructions executable by the at least one processor (P), and the at least one processor (P) executes those instructions.
The communication device (C) is a device for the computing device (20) to communicate with an external device. In addition, although not shown in FIG. 2, the computing device (20) may further include various input devices, such as a keyboard for receiving data from a user, and various output devices, such as a display for displaying data on a screen. Each operation of the computing device (20) described below may be performed by at least one processor (P) included in the computing device (20). The computing device (20) may be a terminal mounted on a vehicle. FIG. 3 shows an explanatory diagram of the overall operation of a computing device (20) according to one embodiment. First, the computing device (20) calculates an embedded wheel feature by embedding wheel speed data measured for at least one wheel mounted on the vehicle. The wheel speed data includes speed data and angle data for each of at least one wheel. In addition, the computing device (20) calculates a fusion feature by combining features of multiple sensor data obtained from multiple sensors. Here, the multiple sensors may include a camera (11), an