US-20260126810-A1 - Remote Operator to Observe and Rectify On-Vehicle Safety Systems on Autonomous Work Vehicles
Abstract
Systems and methods are disclosed for a remote operator to observe and rectify safety issues on an autonomous work vehicle. For example, an autonomous work vehicle may detect an obstacle on or near a path within a work zone. The obstacle may be detected using a camera, a LiDAR, or a RADAR. The autonomous work vehicle may stop prior to contacting the obstacle. The autonomous work vehicle may also image the obstacle and send the image to a mobile device, either directly or via a remote server. The user may determine that the obstacle is not a safety issue or may remove the obstacle from the path. In either case, the user may send an indication through the user interface that the path is clear and the autonomous work vehicle is safe to proceed along the path.
Inventors
- Taylor Bybee
- Mckay Colleni
- Mckord Harris
- James Yonk
- Mike Hornberger
- Mitch Torrie
- Jeff Ferrin
- Bret Turpin
Assignees
- Autonomous Solutions, Inc.
Dates
- Publication Date: 2026-05-07
- Application Date: 2025-10-22
Claims (20)
- 1 . An autonomous work vehicle comprising: a transceiver; an obstacle detection system; a digital storage medium; a steering control system; a speed control system; a camera; and a controller in communication with the camera, the speed control system, the steering control system, the digital storage medium, the obstacle detection system, and the transceiver, the controller: retrieves map data from the digital storage medium, the map data defining features and a path within a work site; directs the autonomous work vehicle to drive on or near the path using the steering control system and the speed control system; detects an obstacle along the path using the obstacle detection system; in response to detecting the obstacle in the path, stops the autonomous work vehicle using the steering control system and/or the speed control system; records an image of the obstacle using the camera; sends the image of the obstacle to a user via the transceiver; receives a confirmation via the transceiver that the obstacle has been removed; and in response to receiving the confirmation, drives the autonomous work vehicle along the path using the steering control system and the speed control system.
- 2 . The autonomous work vehicle according to claim 1 , wherein the autonomous work vehicle includes one or more lights, and wherein the controller: turns on the lights in a first color when the autonomous work vehicle is following the path; and turns on the lights in a second color when the autonomous work vehicle is stopped after detecting an obstacle.
- 3 . The autonomous work vehicle according to claim 1 , wherein the autonomous work vehicle includes one or more speakers or sirens, and wherein the controller: transmits an audible alarm via the one or more speakers or sirens when the autonomous work vehicle is following the path; and stops transmitting the audible alarm via the one or more speakers or sirens when the autonomous work vehicle is stopped after detecting an obstacle.
- 4 . The autonomous work vehicle according to claim 1 , wherein the obstacle detection system comprises a LiDAR or a radar.
- 5 . An autonomous work vehicle control system comprising: an autonomous work vehicle that autonomously follows a path by viewing its surroundings with one or more sensors; and a mobile device with a graphical user interface that is in communication with the autonomous work vehicle, wherein the mobile device: receives sensor data from the autonomous work vehicle when the autonomous work vehicle encounters an obstacle or has a blind spot; displays at least a portion of the sensor data to a user via the graphical user interface; receives one or more responses from the user via the graphical user interface whether to allow the autonomous work vehicle to do one or more of the following: continue along the path, stop, or wait until the user removes the obstacle; and sends instructions to the autonomous work vehicle based on the received response from the user.
- 6 . The autonomous work vehicle control system according to claim 5 , wherein the sensor data comprises a photographic image of a field of view from the perspective of the autonomous work vehicle.
- 7 . The autonomous work vehicle control system according to claim 5 , further comprising displaying one or more buttons to the user via the graphical user interface, wherein each button allows the user to select one of continue, stop, or wait.
- 8 . The autonomous work vehicle control system according to claim 5 , further comprising displaying a button to the user via the graphical user interface, wherein the button allows the user to instruct the autonomous work vehicle to proceed despite any obstacles, and the mobile device sends instructions to the autonomous work vehicle to proceed along the path despite the presence of an obstacle.
- 9 . The autonomous work vehicle control system according to claim 5 , further comprising displaying one or more buttons to the user via the graphical user interface, wherein the buttons allow the user to indicate whether an obstacle has been visually detected by the user.
- 10 . (canceled)
- 11 . (canceled)
- 12 . (canceled)
- 13 . (canceled)
- 14 . (canceled)
- 15 . (canceled)
- 16 . (canceled)
- 17 . (canceled)
- 18 . An autonomous work vehicle comprising: a transceiver; an obstacle detection system; a digital storage medium; a steering control system; a speed control system; a camera; and a controller in communication with the camera, the speed control system, the steering control system, the digital storage medium, the obstacle detection system, and the transceiver, the controller: retrieves map data from the digital storage medium, the map data defining features and a path within a work site; directs the autonomous work vehicle to drive on or near the path using the steering control system and the speed control system; detects a blind spot on or near the path using the obstacle detection system, wherein a blind spot is an area on the map that cannot be viewed by the obstacle detection system; in response to detecting the blind spot, stops the autonomous work vehicle using the steering control system and/or the speed control system; records an image of the blind spot using the camera; sends the image of the blind spot to a user via the transceiver; receives a confirmation via the transceiver that the autonomous work vehicle can safely proceed into the blind spot; and in response to receiving the confirmation, drives the autonomous work vehicle along the path and into the blind spot using the steering control system and the speed control system.
- 19 . The autonomous work vehicle according to claim 18 , wherein the autonomous work vehicle includes one or more lights, and wherein the controller: turns on the lights in a first color when the autonomous work vehicle is following the path; and turns on the lights in a second color when the autonomous work vehicle is stopped after detecting a blind spot.
- 20 . The autonomous work vehicle according to claim 18 , wherein the autonomous work vehicle includes one or more speakers or sirens, and wherein the controller: transmits an audible alarm via the one or more speakers or sirens when the autonomous work vehicle is following the path; and stops transmitting the audible alarm via the one or more speakers or sirens when the autonomous work vehicle is stopped after detecting a blind spot.
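Taken together, independent claims 1 and 18 recite a stop, capture, confirm, and resume control loop. The sketch below illustrates that loop in Python; every class name, method signature, and stub subsystem here is an illustrative assumption for exposition, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """Record of one stop-capture-confirm cycle (hypothetical helper)."""
    kind: str           # "obstacle" or "blind_spot"
    image: bytes
    confirmed: bool = False

class Controller:
    """Sketch of the controller loop recited in claims 1 and 18.

    The detector, camera, transceiver, and drive objects stand in for the
    claimed obstacle detection system, camera, transceiver, and
    steering/speed control systems; their interfaces are assumptions.
    """
    def __init__(self, detector, camera, transceiver, drive):
        self.detector = detector
        self.camera = camera
        self.transceiver = transceiver
        self.drive = drive
        self.log = []

    def step(self):
        """Advance one control cycle along the mapped path."""
        hazard = self.detector.scan()         # "obstacle", "blind_spot", or None
        if hazard is None:
            self.drive.follow_path()          # keep following the mapped path
            return "driving"
        self.drive.stop()                     # stop before reaching the hazard
        image = self.camera.capture()         # record an image of the hazard
        self.transceiver.send(hazard, image)  # forward it to the remote user
        event = Event(hazard, image)
        self.log.append(event)
        # Block until the user indicates the path is clear, then resume.
        if self.transceiver.await_confirmation():
            event.confirmed = True
            self.drive.follow_path()
            return "resumed"
        return "stopped"

# Stub subsystems so the sketch is runnable end to end.
class StubDetector:
    def __init__(self, hazards):
        self.hazards = list(hazards)
    def scan(self):
        return self.hazards.pop(0) if self.hazards else None

class StubCamera:
    def capture(self):
        return b"jpeg-bytes"

class StubTransceiver:
    def __init__(self):
        self.sent = []
    def send(self, kind, image):
        self.sent.append((kind, image))
    def await_confirmation(self):
        return True  # remote user indicates "path is clear"

class StubDrive:
    def follow_path(self):
        pass
    def stop(self):
        pass

ctrl = Controller(StubDetector([None, "obstacle"]),
                  StubCamera(), StubTransceiver(), StubDrive())
print(ctrl.step())  # driving
print(ctrl.step())  # resumed
```

The blocking `await_confirmation` call mirrors the claim language: the vehicle remains stopped until the confirmation arrives via the transceiver, and only then resumes steering and speed control along the path.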
Description
BACKGROUND

Autonomous work vehicles can utilize onboard digital maps to guide steering and speed control systems along predefined routes within work sites. In many implementations, lidar, radar, or ultrasonic obstacle detection systems monitor the vehicle's path and trigger automatic slowing or full stops when obstructions are encountered. While effective for basic safety, these implementations typically lack integrated mechanisms for capturing detailed visual information about detected obstacles or for transmitting such information to remote operators in real time.

To improve situational awareness, certain systems have incorporated cameras to record images or video along the planned path. In these configurations, image processing algorithms may identify potential hazards, but the visual data is generally processed and stored locally. Remote users receive only summary alerts or processed detection flags, rather than raw or annotated images of the obstacles themselves, limiting the operator's ability to assess the nature and severity of the blockage.

Wireless transceivers have been added to some heavy machinery platforms to relay telemetry, status updates, and simple admission or clearance signals back to a central control station. These solutions enable remote monitoring of vehicle position, speed, and basic sensor alarms, yet they seldom support user-initiated confirmation workflows based on visual evidence. As a result, vehicles may remain halted until manual intervention occurs, without a structured mechanism for a user to confirm obstacle removal and trigger resumption of autonomous operation.

Other approaches have attempted to merge path guidance, obstacle detection, and remote communication by issuing generic stop alerts when an obstruction is detected. The alerts often contain minimal contextual information, such as a timestamp and sensor classification, omitting precise map references and clear images of the obstacle.
Consequently, operators receive insufficient detail to verify that an obstruction has been cleared, and vehicles may await arbitrary timeout intervals or manual overrides before proceeding.

SUMMARY

Some examples disclosed in this document relate to an autonomous work vehicle including: a transceiver; an obstacle detection system; a digital storage medium; a steering control system; a speed control system; a camera; and a controller in communication with the camera, the speed control system, the steering control system, the digital storage medium, the obstacle detection system, and the transceiver, the controller: retrieves map data from the digital storage medium, the map data defining features and a path within a work site; directs the autonomous work vehicle to drive on or near the path using the steering control system and the speed control system; detects an obstacle along the path using the obstacle detection system; in response to detecting the obstacle in the path, stops the autonomous work vehicle using the steering control system and/or the speed control system; records an image of the obstacle using the camera; sends the image of the obstacle to a user via the transceiver; receives a confirmation via the transceiver that the obstacle has been removed; and in response to receiving the confirmation, drives the autonomous work vehicle along the path using the steering control system and the speed control system.

Some examples disclosed in this document relate to an autonomous work vehicle, wherein the autonomous work vehicle includes one or more lights, and wherein the controller: turns on the lights in a first color when the autonomous work vehicle is following the path; and turns on the lights in a second color when the autonomous work vehicle is stopped after detecting an obstacle.
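The status-signaling behavior described in the summary amounts to a small state-to-output mapping. The sketch below illustrates it in Python, assuming concrete colors for the claimed "first" and "second" light colors and assuming the audible alarm sounds only while the vehicle is moving along the path; all names and values here are illustrative, not part of the disclosure.

```python
def status_signals(vehicle_state):
    """Map the vehicle state to light color and alarm output.

    The claims recite only a "first" and a "second" light color; green
    and red are illustrative assumptions, as is sounding the alarm only
    while the vehicle is actively following the path.
    """
    if vehicle_state == "following_path":
        return {"light": "green", "alarm_on": True}
    if vehicle_state == "stopped_for_hazard":
        return {"light": "red", "alarm_on": False}
    return {"light": "off", "alarm_on": False}  # e.g. idle or parked

print(status_signals("following_path"))     # {'light': 'green', 'alarm_on': True}
print(status_signals("stopped_for_hazard")) # {'light': 'red', 'alarm_on': False}
```

Keeping this mapping in one pure function makes the bystander-facing signals easy to audit: every reachable vehicle state yields exactly one light-and-alarm combination.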
Some examples disclosed in this document relate to an autonomous work vehicle, wherein the autonomous work vehicle includes one or more speakers or sirens, and wherein the controller: transmits an audible alarm via the one or more speakers or sirens when the autonomous work vehicle is following the path; and stops transmitting the audible alarm via the one or more speakers or sirens when the autonomous work vehicle is stopped after detecting an obstacle.

Some examples disclosed in this document relate to an autonomous work vehicle, wherein the obstacle detection system includes a LiDAR or a radar.

Some examples disclosed in this document relate to an autonomous work vehicle control system including: an autonomous work vehicle that autonomously follows a path by viewing its surroundings with one or more sensors; and a mobile device with a graphical user interface that is in communication with the autonomous work vehicle, wherein the mobile device: receives sensor data from the autonomous work vehicle when the autonomous work vehicle encounters an obstacle or has a blind spot; displays at least a portion of the sensor data to a user via the graphical user interface; receives one or more responses from the user via the graphical user in