JP-2026075911-A - Position estimation device and position estimation method


Abstract

[Problem] To provide a technology for easily estimating the position and orientation of a vehicle. [Solution] A position estimation device for estimating the position and orientation of a vehicle in a space where a wall is installed comprises: an acquisition unit that acquires detection results including the distance and angle to the wall from a range sensor mounted on the vehicle; a storage unit that stores target position information including the distance and angle to the wall at a target position in the space; and an estimation unit that estimates the position and orientation of the vehicle relative to the target position using the detection results from the range sensor and the target position information. [Selection Diagram] Figure 2

Inventors

  • 山本 拓也
  • 上田 晃宏

Assignees

  • トヨタ自動車株式会社 (Toyota Motor Corporation)

Dates

Publication Date
2026-05-11
Application Date
2024-10-23

Claims (5)

  1. A position estimation device for estimating the position and orientation of a vehicle within a space where walls are installed, the device comprising: an acquisition unit that acquires detection results, including the distance and angle to a wall, from a range sensor mounted on the vehicle; a storage unit that stores target position information including the distance and angle to the wall at a target position in the space; and an estimation unit that estimates the position and orientation of the vehicle relative to the target position using the detection results of the range sensor and the target position information.
  2. The position estimation device according to claim 1, wherein the estimation unit uses the previous estimation result and the detection result of a vehicle speed sensor mounted on the vehicle to determine, as a usable range, the portion of the range sensor's detection results that contains the distance and angle to the wall, since the detection results may also contain distances and angles to obstacles other than the wall, and estimates the position and orientation of the vehicle relative to the target position using the usable range and the target position information.
  3. The position estimation device according to claim 2, wherein the estimation unit calculates an approximate straight line of the wall using the usable range and estimates the position and orientation of the vehicle from the distance and angle between the range sensor and the approximate straight line.
  4. The position estimation device according to claim 1, further comprising a control unit that drives the vehicle to the target position using the estimation results of the estimation unit and the target position information.
  5. A position estimation method for estimating the position and orientation of a vehicle within a space where walls are installed, the method comprising: acquiring detection results, including the distance and angle to a wall, from a range sensor mounted on the vehicle; and estimating the position and orientation of the vehicle relative to the target position using the detection results of the range sensor and target position information including the distance and angle to the wall at a target position in the space.
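The technique in claim 3 (fit an approximate straight line to the wall points, then read off the distance and angle between the range sensor and that line) can be sketched as follows. This is a minimal illustration, not the patented implementation: the total-least-squares fit via SVD, the sensor-at-origin convention, and all function names are assumptions introduced here.

```python
import numpy as np

def fit_wall_line(points):
    """Total-least-squares line fit to 2-D wall points (via SVD/PCA).
    Returns a point on the line (the centroid) and a unit direction vector."""
    centroid = points.mean(axis=0)
    # The principal direction of the point cloud approximates the wall direction.
    _, _, vt = np.linalg.svd(points - centroid)
    direction = vt[0]
    return centroid, direction

def pose_relative_to_wall(points):
    """Perpendicular distance from the sensor origin (0, 0) to the fitted
    wall line, and the wall's angle in the sensor frame."""
    centroid, direction = fit_wall_line(points)
    normal = np.array([-direction[1], direction[0]])   # unit normal to the line
    distance = abs(np.dot(centroid, normal))
    angle = np.arctan2(direction[1], direction[0])
    return distance, angle

def pose_error(points, target_distance, target_angle):
    """Offsets between the current wall measurement and the stored target
    position information (distance and angle to the wall at the target)."""
    d, a = pose_relative_to_wall(points)
    return d - target_distance, a - target_angle
```

Comparing the fitted distance and angle against the stored target values yields a lateral offset and heading error that a control unit (claim 4) could drive toward zero.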

Description

This disclosure relates to a position estimation device and a position estimation method for estimating the position and orientation of a vehicle. An automated guided vehicle (AGV) that navigates while estimating its own position using sensors that detect the distance to objects and a three-dimensional map is known (for example, Patent Document 1: Japanese Patent Publication No. 2023-084219).

The drawings are as follows:

  • An explanatory diagram showing the configuration of the vehicle according to the first embodiment.
  • An explanatory diagram showing the vehicle of the first embodiment driving inside a garage.
  • A flowchart showing the processing procedure for vehicle driving control in the first embodiment.
  • An explanatory diagram showing the vehicle position estimation method of the first embodiment.
  • An explanatory diagram showing the vehicle position estimation method of the second embodiment.

A. First embodiment: Figure 1 is an explanatory diagram showing the configuration of a vehicle 100 equipped with a control device 150 as a position estimation device in the first embodiment. Figure 2 is an explanatory diagram showing the vehicle 100 traveling inside a garage 200. The vehicle 100 shown in Figure 1 is configured to travel autonomously while estimating its own position and orientation. The vehicle 100 is used, for example, to transport automobiles VH in an automobile manufacturing plant. The vehicle 100 is equipped with two range sensors 110F and 110L, a vehicle speed sensor 120, a GNSS (Global Navigation Satellite System) receiver 130, an actuator group 140, and the control device 150. However, the vehicle 100 does not necessarily have to be equipped with the GNSS receiver 130. The first range sensor 110F is fixed to the front of the vehicle 100, facing forward. The second range sensor 110L is fixed to the left side of the vehicle 100, facing left.
In the following description, the first range sensor 110F will be referred to as the front range sensor 110F, and the second range sensor 110L as the left range sensor 110L. When the two are not specifically distinguished, they will simply be referred to as the range sensor 110. The range sensor 110 scans the surroundings and detects objects present there. In this embodiment, the range sensor 110 is a two-dimensional LiDAR that scans a 180-degree range in 0.25-degree increments and outputs two-dimensional distance information as its detection result. The two-dimensional distance information contains multiple angle-distance pairs: the angle represents the azimuth of a detection point on an object as viewed from the range sensor 110, and the distance represents the distance from the range sensor 110 to that detection point.

The vehicle speed sensor 120 detects the vehicle speed, i.e., the speed of the vehicle 100. For example, the vehicle speed sensor 120 can be a wheel speed sensor that derives the vehicle speed from the rotational speed and size of the wheels. The GNSS receiver 130 detects the current position of the vehicle 100 using signals transmitted from GNSS satellites; for example, a GPS receiver can be used as the GNSS receiver 130. The actuator group 140 includes actuators for the drive system that generates the propulsion force of the vehicle 100, actuators for the steering system that changes the steering angle of the vehicle 100, and actuators for the braking system that generates the braking force of the vehicle 100. In this embodiment, the vehicle 100 is configured as a battery electric vehicle (BEV), and the drive system is driven by battery power.
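As a rough illustration of the sensor model described above, the angle-distance pairs of the two-dimensional distance information can be converted to Cartesian points in the sensor frame, and a wheel-speed-based vehicle speed can be derived from wheel rotation and wheel size. The function names, the sensor-centred frame, and the exact sweep limits are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def scan_to_points(angles_deg, ranges_m):
    """Convert the LiDAR's angle-distance pairs into Cartesian (x, y)
    points in the sensor frame (x axis along the 0-degree beam)."""
    angles = np.radians(np.asarray(angles_deg, dtype=float))
    ranges = np.asarray(ranges_m, dtype=float)
    return np.column_stack((ranges * np.cos(angles), ranges * np.sin(angles)))

def wheel_speed(rev_per_s, wheel_diameter_m):
    """Vehicle speed from wheel rotational speed and wheel size, as a
    wheel speed sensor might compute it: v = pi * d * n."""
    return np.pi * wheel_diameter_m * rev_per_s

# A 180-degree sweep in 0.25-degree increments yields 721 beams
# (assuming a symmetric -90..+90 degree sweep, inclusive of both ends).
sweep_angles = np.arange(-90.0, 90.0 + 0.25, 0.25)
```

One full sweep therefore produces 721 angle-distance pairs, each convertible to an (x, y) point for downstream line fitting.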
As shown in Figure 2, the vehicle 100 has front wheels 101 arranged symmetrically about the longitudinal axis CL that passes through the center of the vehicle 100 in plan view, and rear wheels 102 arranged on the longitudinal axis CL in plan view. The vehicle 100 can rotate around the rear wheels 102 with a steering angle of +90 degrees or -90 degrees.

As shown in Figure 1, the control device 150 is a computer comprising a processor 151, a memory 152, an input/output interface 153, and an internal bus 154. The processor 151, the memory 152, and the input/output interface 153 are connected via the internal bus 154 for bidirectional communication. The input/output interface 153 is connected via signal lines to the front range sensor 110F, the left range sensor 110L, the vehicle speed sensor 120, the GNSS receiver 130, and the actuator group 140. The memory 152 pre-stores the driving program PG, a computer program for autonomous driving of the vehicle 100, and the target position information TZ, which relates to the target position of the vehicle 100. By executing the driving program PG, the processor 151 functions as an acquisition unit 155 that acquires detection results from the range sen