US-12617430-B2 - Method and apparatus for trajectory shape generation for autonomous vehicles
Abstract
An apparatus for controlling a direction and speed of travel of an autonomous vehicle or driver assisted autonomous vehicle (AV). A GPS and map module receives a start location and a destination location for the AV and generates a plan to move the AV from the start location to the destination location. A trajectory profile generator module receives the plan and calculates a path to move the AV from the start location to the destination location. A supervisory control module receives the calculated path and selects a speed for the AV based on a geometry of the calculated path, inner ear constraints comprising a level of frequencies in a human's inner ear crossover spectrum and a decay time of inner ear disturbance history, the speed limit, and environmental information. A steering control module receives the calculated path and selected speed and controls acceleration of the AV based thereon.
Inventors
- David Arthur Bailey
- Daniel Kee Young Kim
Assignees
- LIT MOTORS CORPORATION
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2020-06-10
Claims (2)
- 1 . An apparatus for controlling a direction and speed of travel of an autonomous vehicle or driver assisted autonomous vehicle (AV), comprising: a GPS and map module configured to receive a start location and a destination location for the AV, and a speed limit, and generate a plan for moving the vehicle from the start location to the destination location; a trajectory profile generator module configured to receive the plan and calculate in real-time a path including a sequence of changes in course or speed of the vehicle to move the vehicle from the start location to the destination location, wherein to calculate the path in real-time comprises: tracking each lateral acceleration event with an amplitude above a human sensitivity level, and a corresponding severity; calculating a time weighted severity sum based on the corresponding severity of the tracked lateral acceleration events; and selecting a lateral acceleration level below a maximum lateral acceleration level that can induce motion sickness in a human for an upcoming one of the sequence of changes in course or speed of the vehicle based on the time weighted severity sum; a supervisory control module configured to receive the calculated path and the selected lateral acceleration level and select a speed for the AV based on a geometry of the calculated path, the selected lateral acceleration level, inner ear constraints comprising a level of frequencies in a human's inner ear crossover spectrum and a decay time of inner ear disturbance history, the speed limit, and environmental information; and a steering control module configured to receive the calculated path and the selected speed and control a lateral acceleration and a change in lateral acceleration of the vehicle based on the calculated path and the selected speed.
- 2 . The apparatus of claim 1 , wherein the trajectory profile generator module configured to receive the plan and calculate in real-time the path including the sequence of changes in course or speed of the vehicle to move the vehicle from the start location to the destination location, comprises the trajectory profile generator module configured to generate a curve that has C3 characteristics or greater.
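The real-time path calculation recited in claim 1 (track lateral acceleration events above a sensitivity level, compute a time weighted severity sum, then select a lateral acceleration level below the motion-sickness threshold) can be illustrated with a minimal sketch. The function names, the exponential weighting, and all numeric constants here are hypothetical; the patent does not specify a particular severity weighting or decay model.

```python
import math

# Hypothetical constants (the patent does not fix these values):
DECAY_TIME = 30.0    # decay time of inner ear disturbance history, seconds
SENSITIVITY = 0.3    # human sensitivity level for lateral acceleration, m/s^2
MAX_LAT = 3.0        # lateral acceleration that can induce motion sickness, m/s^2

def time_weighted_severity(events, now):
    """Sum severities of tracked lateral-acceleration events, each weighted
    by an exponential decay of its age (a stand-in for the claimed
    'time weighted severity sum'). events: list of (time_s, amplitude)."""
    total = 0.0
    for t, amplitude in events:
        if amplitude <= SENSITIVITY:
            continue  # below human sensitivity: not tracked
        severity = amplitude - SENSITIVITY
        total += severity * math.exp(-(now - t) / DECAY_TIME)
    return total

def select_lateral_accel(events, now):
    """Select a lateral acceleration level for the upcoming maneuver that
    shrinks as the recent disturbance history grows, always below MAX_LAT."""
    s = time_weighted_severity(events, now)
    return MAX_LAT / (1.0 + s)

# Example: two recent hard turns reduce the level allowed for the next one.
history = [(0.0, 2.5), (20.0, 2.0)]
print(select_lateral_accel(history, 40.0))
```

With an empty history the full `MAX_LAT` budget is available; each tracked event above the sensitivity level temporarily lowers the budget, recovering as the event ages past the decay time.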
Description
CLAIM OF PRIORITY This patent application is related, and claims priority, to provisional patent application No. 62/859,649 filed Jun. 10, 2019, entitled “Optimal Trajectory Shape Generation for the Minimization of Motion Sickness in Autonomous Vehicles”, the contents of which are incorporated herein by reference. TECHNICAL FIELD Embodiments of the invention relate to autonomous vehicles, in particular, controlling the trajectory of an autonomous vehicle to minimize motion sickness of passengers in the autonomous vehicle. BACKGROUND A self-driving car, also known as an autonomous vehicle (AV), a connected and autonomous vehicle (CAV), a driverless car, or a robotic car (robo-car), is a vehicle that is capable of sensing its environment and moving safely with little or no human input. Self-driving cars combine a variety of sensors to perceive their surroundings, such as video, radar, lidar, sonar, GPS, odometry and inertial measurement units. Control systems interpret sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. Autonomous vehicles can make their passengers motion sick. Motion sickness has been linked to the frequency, level, and direction of external accelerations imparted upon the passenger, as well as a passenger's lack of anticipation of a maneuver performed by the autonomous vehicle. Part of the passenger's lack of anticipation is based on how an autonomous vehicle performs a maneuver, i.e., the shape of the velocity vs. time graph forming the autonomous vehicle's acceleration and the attitude (i.e., the three-dimensional orientation) of the autonomous vehicle. Time optimal paths and constant curvature turns, both used in autonomous vehicles, are not the way people maneuver vehicles. What is needed is a “passenger comfort” optimal path in which reduced probability of motion sickness is the property to be optimized. 
BRIEF DESCRIPTION OF THE DRAWINGS Embodiments are illustrated by way of example, and not by way of limitation, and will be more fully understood with reference to the following detailed description when considered in connection with the figures in which: FIG. 1 is a functional block diagram of embodiments of the invention; and FIG. 2 is a depiction of the geometric constraints used when generating a trajectory. DETAILED DESCRIPTION Embodiments of the invention control an autonomous vehicle trajectory and speed to reduce the likelihood of a passenger getting motion sickness. In particular, embodiments of the invention control the transitioning trajectory between the dynamic states of position (direction), velocity and acceleration of the autonomous vehicle. Dynamic states of an autonomous vehicle in this context are constituted by changes in the vehicle's embedded longitudinal axis (back to front), controlled by propulsion and braking, and the vehicle's lateral, or cross, axis, controlled by steering. Multiple interdependent controls for vehicle speed and direction are used to create the desired motion sickness reducing trajectory. Controls include, but are not limited to, controlling the jerk (i.e., the change in acceleration) component of the dynamic states, including lateral and forward motions, within the path constraints of a desired direction of travel. The speed and direction controls are managed by a supervisory control module that constrains the time-dependent lateral accelerations and jerk in the frequency range that is potentially disturbing to the inner ear of a passenger in the autonomous vehicle. With reference to FIG. 2, examples of autonomous vehicle maneuvers to be controlled in this context include actions such as turning from one street to another or changing from one Lane 220 to another.
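One way the supervisory constraint between path geometry and speed can be sketched (a simplified illustration only, not the patent's implementation): on a path segment of curvature κ, lateral acceleration at speed v is v²κ, so a lateral-acceleration cap bounds the allowable speed at √(a_max/κ), further clipped by the posted speed limit. The function name and values below are hypothetical.

```python
import math

def speed_for_curvature(curvature, a_lat_max, speed_limit):
    """Largest speed (m/s) whose lateral acceleration v^2 * curvature
    stays under a_lat_max (m/s^2), clipped to the posted speed limit.
    curvature is 1/radius in 1/m; zero curvature means a straight segment."""
    if curvature <= 0.0:
        return speed_limit  # straight segment: bounded only by the posting
    return min(speed_limit, math.sqrt(a_lat_max / curvature))

# Example: a 50 m radius turn (curvature 0.02 1/m) with a 2 m/s^2 cap
# allows at most 10 m/s, even if the posted limit is higher.
print(speed_for_curvature(0.02, 2.0, 25.0))  # -> 10.0
```

A supervisory module of the kind described could evaluate this bound along the calculated path and take the minimum over the upcoming segments.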
A Lane 220 is defined herein as the width of an area on a road or surface over which an autonomous vehicle can be driven. More broadly a maneuver is any change in course or speed and the trajectory is the path taken between one or both of those states. Embodiments are primarily for autonomous vehicles but may be applicable to driver assisted autonomous vehicles as well. With reference to FIG. 1, an embodiment of the invention 100 receives input from three sources: a GPS and map module 101 that provides a current, or a selected or chosen starting location, and a selected destination location, for the autonomous vehicle, with reference to a digitized map; a database of baseline maneuver profiles 103 for the autonomous vehicle, which provides basic information for controlling the direction, speed and acceleration of the autonomous vehicle, for example, for changing lanes, making left or right hand turns, or entering or exiting a freeway; and sensor input 104 from sensors, e.g., forward-looking sensors, and, optionally, lateral-looking sensors, that identify the autonomous vehicle's current lane and turn environment, as well as the autonomous vehicle's proposed lane,
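Claim 2 recites generating a curve with "C3 characteristics or greater," i.e., a trajectory whose position, velocity, acceleration, and jerk are all continuous. A minimal, hypothetical way to obtain such a curve (not necessarily the patent's method) is a seventh-order smoothstep blend, whose first three derivatives vanish at both ends, so it splices onto straight segments with continuous jerk:

```python
def c3_blend(x):
    """Seventh-order smoothstep: rises 0 -> 1 on [0, 1] with first, second,
    and third derivatives equal to zero at both ends, so splicing it between
    straight segments yields a C3 (jerk-continuous) profile."""
    x = min(max(x, 0.0), 1.0)
    return x**4 * (35.0 - 84.0 * x + 70.0 * x * x - 20.0 * x**3)

def lane_change_offset(t, duration, lane_width):
    """Lateral offset (m) at time t during a lane change of the given
    duration, moving smoothly from 0 to the full lane width."""
    return lane_width * c3_blend(t / duration)

# Offset starts at 0 and ends at the full 3.5 m lane width.
print(lane_change_offset(0.0, 4.0, 3.5), lane_change_offset(4.0, 4.0, 3.5))
```

Lower-order polynomials (e.g., the common cubic smoothstep) give only C1 continuity at the splice points, producing a step in acceleration, which is exactly the kind of abrupt inner-ear stimulus the described embodiments seek to avoid.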