JP-2026075365-A - vehicle

JP 2026075365 A

Abstract

[Problem] To make occupants aware that the vehicle is recognizing targets even when it is in manual driving mode. [Solution] When the vehicle is in driving assistance mode, the support unit performs support processing (e.g., ACC, LKA) to assist at least one of the vehicle's acceleration/deceleration and steering based on the target recognition result from the target recognition unit. When the vehicle is in manual driving mode, in which the support unit does not perform support processing, the display control unit displays the targets recognized by the target recognition unit (leading target image 80 and surrounding target image 82) on the display unit. [Selection Diagram] Figure 4

Inventors

  • 高田 新
  • 服部 勉
  • 小西 翔吾
  • 西田 喬士

Assignees

  • トヨタ自動車株式会社 (Toyota Motor Corporation)

Dates

Publication Date
2026-05-08
Application Date
2024-10-22

Claims (4)

  1. A vehicle comprising: a support unit that, when the vehicle is in a driving assistance mode, performs support processing to assist at least one of acceleration/deceleration and steering of the vehicle based on a target recognition result from a target recognition unit; and a display control unit that, when the vehicle is in a manual driving mode in which the support unit does not perform the support processing, displays the target recognized by the target recognition unit on a display unit.
  2. The vehicle according to claim 1, wherein the target displayed on the display unit when the vehicle is in the manual driving mode is a surrounding target located in a lane other than the lane in which the vehicle is traveling.
  3. A vehicle comprising: a support unit that, when the vehicle is in a driving assistance mode, performs support processing to assist at least one of acceleration/deceleration and steering of the vehicle based on a target recognition result from a target recognition unit; and a display control unit that switches the screen displayed on a display unit between a surrounding monitoring screen and another screen in response to an input instruction, and that, when the surrounding monitoring screen is displayed on the display unit, displays the target recognized by the target recognition unit on the display unit even when the vehicle is in a manual driving mode in which the support unit does not perform the support processing.
  4. The vehicle according to claim 3, wherein, while the other screen is displayed on the display unit, the display control unit displays information including the operating status of the support processing on the display unit but does not display the target recognized by the target recognition unit.
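As a reading aid, the display-control behavior of claims 1, 3, and 4 can be sketched in code. This is a minimal, hypothetical illustration, not an implementation from the patent; all names (`DrivingMode`, `Screen`, `DisplayController`, `Target`, and their fields) are invented for this sketch.

```python
from dataclasses import dataclass
from enum import Enum, auto


class DrivingMode(Enum):
    ASSISTANCE = auto()  # support unit performs support processing (e.g., ACC, LKA)
    MANUAL = auto()      # support unit does not perform support processing


class Screen(Enum):
    SURROUND_MONITOR = auto()  # the surrounding monitoring screen
    OTHER = auto()             # another screen, e.g. a meter screen


@dataclass
class Target:
    kind: str         # e.g. "leading" or "surrounding"
    lane_offset: int  # 0 = own lane, non-zero = another lane


class DisplayController:
    def __init__(self) -> None:
        self.screen = Screen.SURROUND_MONITOR

    def switch_screen(self, screen: Screen) -> None:
        # Claim 3: the displayed screen is switched in response
        # to an input instruction.
        self.screen = screen

    def targets_to_display(self, mode: DrivingMode,
                           targets: list[Target]) -> list[Target]:
        if self.screen is Screen.OTHER:
            # Claim 4: the other screen shows support-process status
            # information, not the recognized targets.
            return []
        # Claims 1 and 3: on the surrounding monitoring screen, recognized
        # targets are displayed even in manual driving mode.
        return list(targets)
```

The point the sketch makes explicit is that the target display depends on which screen is shown, not on the driving mode: manual mode alone does not suppress the target display.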

Description

This disclosure pertains to vehicles.

Patent Document 1 (Japanese Patent Publication No. 2022-41288) describes displaying targets such as other vehicles on a display unit during the execution of automated driving (assistance processing), which automatically performs some or all of the accelerator, brake, turn-signal, and steering operations.

Brief description of the drawings:

  • A block diagram showing the schematic configuration of an in-vehicle system according to an embodiment.
  • A flowchart illustrating an example of screen display processing.
  • An illustrative diagram showing an example of the surrounding monitoring screen displayed on the display unit when the vehicle is in driving assistance mode.
  • An illustrative diagram showing an example of the surrounding monitoring screen displayed on the display unit when the vehicle is in manual driving mode.
  • An illustrative diagram showing an example of the meter screen displayed on the display unit when the vehicle is in manual driving mode.
  • An illustrative diagram showing an example of the driving mode display screen displayed on the display unit when the vehicle is in manual driving mode.

Hereinafter, an example of an embodiment of this disclosure will be described in detail with reference to the drawings. As shown in Figure 1, the in-vehicle system 10 according to this embodiment is mounted on a vehicle 11 and is equipped with a communication bus 12. The communication bus 12 connects a surrounding situation acquisition device group 14, a vehicle driving state detection sensor group 26, a target recognition ECU (Electronic Control Unit) 60, an ADAS (Advanced Driver-Assistance Systems) ECU 34, and a display control ECU 42. Note that Figure 1 shows only a part of the in-vehicle system 10. The vehicle 11 on which the in-vehicle system 10 is mounted is referred to below as "the vehicle 11".
The vehicle 11 is an example of a vehicle according to this disclosure. The surrounding situation acquisition device group 14 includes, as devices that acquire information representing the surroundings of the vehicle 11, a GNSS (Global Navigation Satellite System) device 16, an in-vehicle communication device 18, a navigation system 20, a radar device 22, and a camera unit 24, among others. The GNSS device 16 receives GNSS signals from multiple GNSS satellites to determine the position of the vehicle 11. The in-vehicle communication device 18 performs at least one of vehicle-to-vehicle communication with other vehicles and vehicle-to-infrastructure communication with roadside units. The navigation system 20 includes a map information storage unit 20A that stores map information. Based on the position information obtained from the GNSS device 16 and the map information stored in the map information storage unit 20A, it displays the position of the vehicle 11 on a map and determines and guides a route to the destination. The radar device 22 detects objects such as other vehicles and pedestrians present around the vehicle 11 as point cloud information, and acquires the relative position and relative speed of the detected objects with respect to the vehicle 11. Furthermore, based on changes in the relative position and relative speed of individual objects, the radar device 22 excludes noise and roadside objects such as guardrails from the monitoring targets, and outputs information such as the relative position and relative speed of the monitored objects (targets), such as other vehicles and pedestrians. The camera unit 24 captures images of the area around the vehicle 11 with multiple cameras and outputs the captured images.
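The kind of filtering the radar device 22 is described as performing can be sketched as follows. This is an illustrative assumption, not the patent's method: a detection whose relative speed approximately cancels the ego vehicle's speed is stationary relative to the road (e.g., a guardrail) and is dropped from the monitoring targets. All names (`Detection`, `monitored_targets`, the tolerance parameter) are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    range_m: float        # distance from the ego vehicle, in metres
    rel_speed_mps: float  # relative speed (object speed minus ego speed), m/s


def monitored_targets(detections: list[Detection],
                      ego_speed_mps: float,
                      tol_mps: float = 0.5) -> list[Detection]:
    """Keep moving targets; drop detections that appear stationary
    relative to the road, i.e. rel_speed is approximately -ego_speed."""
    return [d for d in detections
            if abs(d.rel_speed_mps + ego_speed_mps) > tol_mps]
```

For example, with the ego vehicle at 20 m/s, a guardrail returns a relative speed near -20 m/s and is excluded, while a slightly slower preceding vehicle (relative speed near -2 m/s) remains a monitored target.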
Furthermore, the vehicle driving state detection sensor group 26 includes, as multiple sensors for acquiring the driving state of the vehicle 11, a steering angle sensor 28 that detects the steering angle of the vehicle 11, a vehicle speed sensor 30 that detects the driving speed of the vehicle 11, and an acceleration sensor 32 that detects the acceleration applied to the vehicle 11. The target recognition ECU 60, although not shown in the diagram, incorporates a CPU (Central Processing Unit), memory such as ROM (Read Only Memory) and RAM (Random Access Memory), and storage such as an HDD (Hard Disk Drive) or SSD (Solid State Drive). The storage contains a predetermined program that causes the CPU of the target recognition ECU 60 to function as a target recognition unit 62. The target recognition unit 62 recognizes the boundary lines of the lane in which the vehicle 11 is traveling (the own lane) from the image of the area ahead of the vehicle 11 captured by the camera unit 24. Furthermore, regardless of whether the vehicle 11 is in manual driving mode or driving assistance mode, the target recognition unit 62 recognizes targets present around the vehicle 11 (such as a preceding vehicle traveling in the own lane or a vehicle traveling in an adjacent lane) from the images captured by the camera unit 24