EP-4372316-B1 - ROUTE GUIDANCE DEVICE AND ROUTE GUIDANCE SYSTEM BASED ON AUGMENTED REALITY AND MIXED REALITY
Inventors
- LEE, Kihyung
- JANG, Yujung
- CHOI, Sunghwan
- KIM, Seungman
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2022-07-12
Claims (15)
- A route guidance device comprising: a communication unit (1310) that communicates with a cloud server; an interface unit (1320) that receives, from at least one sensor disposed in a vehicle, an environment image around the vehicle including an image of a road on which the vehicle is traveling, and sensing information obtained by sensing a traveling state of the vehicle; an Augmented Reality, AR, module (800) that renders AR information including at least one of the sensing information and Point Of Interest (POI) information received from the cloud server; a Mixed Reality, MR, module (900) that renders MR information including at least one virtual object, based on the sensing information and map information received from the cloud server; and a processor (1330) that controls the interface unit so that a first view image, which is one of an AR view image including the AR information and an MR view image including the MR information, is displayed on a display of the vehicle, and controls the interface unit so that a second view image, which is the other one of the AR view image and the MR view image, is displayed on a portion of an area of the display where the first view image is displayed, based on whether the vehicle enters a specific situation, and wherein the second view image is aligned in such a manner that a point of attention of a Field of View (FOV) of the second view image is directed to a point of attention of a FOV of the first view image.
- The route guidance device of claim 1, wherein the processor (1330) determines whether the vehicle has entered the specific situation according to at least one of the traveling state of the vehicle, a state of the road on which the vehicle travels, a weather condition around the vehicle, and time at which the vehicle travels, and wherein the traveling state of the vehicle includes whether the vehicle is in a stopped state or is traveling, and the state of the road on which the vehicle travels includes a curvature variation of the road and a slope variation of the road.
- The route guidance device of claim 2, wherein the processor (1330) calculates the curvature variation of the road according to a sensing value of a gyro sensor, which senses an inclination of the vehicle, and a detection result of a lane detector, which detects a lane of the road on which the vehicle travels, calculates the slope variation of the road according to a road shape detected through a vertical profile and a high-definition map (HD MAP) detected based on map information related to the road on which the vehicle is currently traveling, and determines whether the vehicle has entered the specific situation based on an AR fitting rate between a real image ahead of the vehicle acquired through a camera of the vehicle and an AR object displayed on the real image according to the calculated curvature variation and slope variation.
- The route guidance device of claim 1, wherein the processor (1330) determines a location of the vehicle on a travel route based on the sensing information, and determines whether the vehicle has entered the specific situation by determining whether the vehicle has departed from a route on which the vehicle is capable of traveling or whether the vehicle is located at a junction on the route or adjacent to an exit or destination within a predetermined distance.
- The route guidance device of claim 1, wherein the processor (1330) controls the interface unit to display warning information for warning a dangerous area located around the vehicle or a possibility of a collision detected from around the vehicle, based on the sensing information, and determines whether the vehicle has entered the specific situation depending on whether the displayed warning information is exposed ahead of the vehicle.
- The route guidance device of claim 1, wherein the processor determines objects detected from around the vehicle based on the sensing information, and determines whether the vehicle has entered the specific situation based on at least one of a number of the determined objects and sizes of the objects.
- The route guidance device of claim 1, wherein the processor (1330) controls the interface unit to split the display into a first display area and a second display area when the vehicle enters the specific situation, and controls the interface unit to display the first view image and the second view image on the first display area and the second display area, respectively.
- The route guidance device of claim 7, wherein the first display area is a display area where an image of an area, in which a distance from the vehicle is within a preset distance, is displayed, and the second display area is a display area where an image of an area, in which the distance from the vehicle exceeds the preset distance, is displayed.
- The route guidance device of claim 8, wherein the processor (1330) controls the interface unit to display the AR view image and the MR view image on the first display area and the second display area, respectively.
- The route guidance device of claim 7, wherein the processor (1330), when the vehicle enters the specific situation, generates a second view image having the same point of attention as that of the first view image by changing a camera calibration of the second view image according to a camera calibration of the first view image.
- The route guidance device of claim 10, wherein the second view image is an image with the same size and ratio as the first view image, based on a Field of View (FOV) of the first view image.
- The route guidance device of claim 10, wherein the first view image and the second view image are connected to each other through a boundary surface of a view image to enable movement of an object between one view image and another view image.
- The route guidance device of claim 7, wherein the first display area is an area where a first view image providing route information related to a travel route on which the vehicle is currently traveling is displayed, and the second display area is an area where a second view image providing route information related to a route on which the vehicle has not traveled yet or has already traveled is displayed.
- A route guidance system comprising: a route guidance device that is mounted on a vehicle and displays, on a display of the vehicle, an Augmented Reality (AR) view image including AR information rendered based on received Point Of Interest (POI) information, or a Mixed Reality (MR) view image including MR information rendered based on Three-Dimensional (3D) map information; and a cloud server that provides the route guidance device with POI information or 3D map information corresponding to a current, past, or predicted future location of the vehicle, in response to a request of the route guidance device, wherein the route guidance device comprises a processor that controls the display to display a first view image which is one of the AR view image and the MR view image, and controls the display to display a second view image which is the other one of the AR view image and the MR view image on a portion of an area of the display where the first view image is displayed, based on whether the vehicle enters a specific situation, and wherein the second view image is aligned in such a manner that a point of attention of a Field of View (FOV) of the second view image is directed to a point of attention of a FOV of the first view image.
- The route guidance system of claim 14, wherein the cloud server comprises: a Digital Twin as a Service (DTaaS) server providing digital-twin 3D map information including virtual objects corresponding to respective buildings included in a map area; an MR server that performs communication connection with the route guidance device, provides location information related to the vehicle, collected from the route guidance device, to the DTaaS server, and provides the digital-twin 3D map information provided by the DTaaS server to the route guidance device; and an AR server that receives the location information related to the vehicle, provided from the route guidance device and the sensing information, and provides POI information corresponding to the received information to the route guidance device.
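For illustration only (not part of the claimed subject matter), the view-selection and alignment logic recited in claims 1, 3, 4, and 10 can be sketched as follows. This is a minimal sketch under stated assumptions: the class `Calibration`, the function names, the boolean signals, and the threshold value are all hypothetical and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class Calibration:
    """Hypothetical camera calibration: direction of the point of
    attention (yaw/pitch of the optical axis) and FOV, in degrees."""
    yaw: float
    pitch: float
    fov: float

# Hypothetical threshold below which the AR overlay is treated as
# unreliable (the patent's "AR fitting rate" of claim 3).
AR_FITTING_THRESHOLD = 0.7

def entered_specific_situation(ar_fitting_rate: float,
                               at_junction: bool,
                               off_route: bool) -> bool:
    """Claims 3 and 4: the vehicle enters the 'specific situation' when
    the AR object no longer fits the real image, or when the vehicle is
    at a junction or has departed from its drivable route."""
    return ar_fitting_rate < AR_FITTING_THRESHOLD or at_junction or off_route

def align_second_view(first: Calibration, second: Calibration) -> Calibration:
    """Claim 10: change the camera calibration of the second view image
    according to that of the first, so both views share the same point
    of attention and FOV."""
    return Calibration(yaw=first.yaw, pitch=first.pitch, fov=first.fov)

# Example: AR fitting degrades on a curved ramp, so the MR view is
# overlaid on the display, aligned to the AR view's point of attention.
ar_view = Calibration(yaw=2.0, pitch=-1.5, fov=60.0)
mr_view = Calibration(yaw=0.0, pitch=0.0, fov=90.0)
if entered_specific_situation(ar_fitting_rate=0.45, at_junction=False, off_route=False):
    mr_view = align_second_view(ar_view, mr_view)
print(mr_view)  # Calibration(yaw=2.0, pitch=-1.5, fov=60.0)
```

In an actual device the fitting rate would be derived from the curvature and slope variations of claim 3, and the aligned second view would be rendered into the split display area of claim 7; the sketch only shows the decision and alignment steps.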
Description
Technical Field

The present invention relates to a route guidance device and a route guidance system for guiding a route for a vehicle to travel.

Background Art

Recently, Augmented Reality (AR), which outputs a graphic object through a windshield or a Head-Up Display (HUD) of a vehicle, or additionally outputs a virtual object to the real world by overlaying a graphic object on an image captured by a camera, has appeared. Vehicles currently provide a driver with additional information related to the environment around the vehicle, the vehicle status, and the driving route (travel route) of the vehicle through this AR technology, so that the driver can intuitively recognize the vehicle and its traveling environment. Therefore, traveling efficiency and convenience can be further improved.

Meanwhile, when using such AR technology, various types of information necessary for driving a vehicle may be provided based on the real world. In other words, the AR technology uses images of the real world acquired through a camera, and requires acquisition of clear images of the real world. However, since the sensor that acquires images of the real world, namely a camera, senses the real-time environment around the vehicle, route guidance information may not be accurately identified from the acquired images due to obstacles such as rain, snow, shadows of street trees, or vehicles ahead, for example in bad weather such as rain or snow, or in a complex traffic situation such as a traffic jam. As one example, the camera may not be able to recognize the lane in which the vehicle is currently traveling due to snow, rain, shadows, or a vehicle ahead. Additionally, on a road with varying height, such as a ramp, or a road with complex curves, the slope or curvature of the road may not be recognized.
In this case, there is a problem that AR objects related to lanes may not be displayed, or that incorrect AR objects may be displayed. In other words, a discrepancy may occur between the AR object and the real environment, depending on the complexity of the real world captured by the camera or the state of the obtained image.

Meanwhile, following this AR technology, a technology related to Mixed Reality (MR), which can provide various simulation information related to a vehicle by applying Digital Twin (DT) technology, is being actively developed. As part of this effort, methods of providing route guidance information to a driver using MR are being actively researched. Route guidance using MR has the advantage of providing the driver with various types of information that a driver in the cockpit cannot otherwise check, such as displaying a graphic object corresponding to the vehicle on a 3D map digitized through digital twinning, providing information on a driving route that the driver has not yet driven through the map and a graphic object, or providing a field of view (viewing angle) such as a bird's-eye view.

MR provides vehicle-related information through virtual objects displayed on a digitized 3D map, and may provide information regardless of the images of the real world obtained through a camera. Therefore, the problem that a discrepancy may occur between the provided information and the actual environment, depending on the complexity of the real world captured by the camera or the state of the obtained image, can be avoided. However, since MR provides information through images of a digitized 3D map, a discrepancy may occur between a graphic object provided through MR, that is, an MR object, and the real environment, depending on the degree of correspondence between the 3D map image and the real world around the vehicle.
However, it is very difficult to provide a 3D map that is completely identical to the real world, so only information related to stationary objects such as buildings, or to objects of a certain size or larger such as vehicles, can be provided; it is difficult to display objects that are small or hard to sense, such as people or animals around the vehicle, through MR using 3D map images. Due to this problem, it is difficult to completely replace AR, which directly uses images of the real world, with MR. Accordingly, effective ways to use both AR and MR together are being actively researched. US 2021/125411 A1 discloses a method of controlling augmented reality (AR) mobility which includes: generating, by a camera, image data by photographing one or more users; extracting information about the one or more users from the image data, wherein the information about the one or more users may include location information about the users; calculating a reference point for projection of an AR object based on the location information about the users; and displaying the AR object on