US-20260126795-A1 - AUTONOMOUS ROBOTICS PLATFORM
Abstract
A robotic system includes an autonomous mobile robot (AMR) and one or more processing circuits. The AMR includes a tractive assembly configured to facilitate movement of the AMR and a sensor system configured to facilitate acquiring data regarding (a) surroundings of the AMR and (b) a location of the AMR. The one or more processing circuits are configured to acquire images and location information via the sensor system at each of a plurality of predefined locations each time the AMR navigates to each of the plurality of predefined locations over a period of time, and organize the images in a location-based, time-shifting arrangement such that (a) first images associated with a first location are arranged together in a time sequential arrangement over the period of time and (b) second images associated with a second location are arranged together in the time sequential arrangement over the period of time.
Inventors
- Lana Graf
- Alex Rand
- Eric J. Cushman
- Thomas Freeman Gilbane, Jr.
Assignees
- OSHKOSH CORPORATION
Dates
- Publication Date: 2026-05-07
- Application Date: 2025-12-31
Claims (20)
- 1 . A robotic system comprising: an autonomous mobile robot (AMR) including: a tractive assembly configured to facilitate movement of the AMR; and a sensor system configured to facilitate acquiring data regarding (a) surroundings of the AMR and (b) a location of the AMR; one or more processing circuits configured to: control the tractive assembly to autonomously navigate the AMR between a plurality of predefined locations within a site multiple times over a period of time, the plurality of predefined locations including at least a first location and a second location; acquire images and location information via the sensor system at each of the plurality of predefined locations each time the AMR navigates to each of the plurality of predefined locations over the period of time; and organize the images in a location-based, time-shifting arrangement such that (a) first images of the images associated with the first location are arranged together in a time sequential arrangement over the period of time and (b) second images of the images associated with the second location are arranged together in the time sequential arrangement over the period of time.
- 2 . The robotic system of claim 1 , wherein the one or more processing circuits include at least one of (a) a first processing circuit located on the AMR or (b) a second processing circuit located remote from the AMR.
- 3 . The robotic system of claim 1 , wherein the one or more processing circuits are configured to: receive a location input from a user device, the location input designating the first location; retrieve the first images associated with the first location; and provide the first images for display on the user device in the time sequential arrangement.
- 4 . The robotic system of claim 1 , wherein the one or more processing circuits are configured to: evaluate a most-recent image of the first images associated with the first location; and determine a completion percentage of a task being performed at the first location based on the most-recent image.
- 5 . The robotic system of claim 1 , wherein the one or more processing circuits are configured to: evaluate the data to determine whether a warning condition is detected proximate the AMR as the AMR is navigating through the site; and provide an alert in response to the warning condition being detected.
- 6 . The robotic system of claim 5 , wherein the AMR includes a siren, and wherein the alert includes an audible sound emitted from the siren.
- 7 . The robotic system of claim 5 , wherein the AMR includes a strobe or a warning light, and wherein the alert includes light emitted from the strobe or the warning light.
- 8 . The robotic system of claim 5 , wherein the alert includes a notification to an entity remote from the AMR, and wherein the notification includes the location of the AMR where the warning condition was detected.
- 9 . The robotic system of claim 5 , wherein the warning condition includes an indication of a fire.
- 10 . The robotic system of claim 5 , wherein the warning condition includes an indication of a flood.
- 11 . The robotic system of claim 5 , wherein the warning condition includes an indication of theft or burglary.
- 12 . The robotic system of claim 5 , wherein the warning condition includes an indication of explosion hazards.
- 13 . The robotic system of claim 5 , wherein the warning condition includes an indication of excessive noise levels.
- 14 . The robotic system of claim 5 , wherein the warning condition includes an indication of excessive pollution levels.
- 15 . The robotic system of claim 5 , wherein the warning condition includes an indication of a lack of personal safety equipment being worn by one or more personnel at the site.
- 16 . The robotic system of claim 5 , wherein the warning condition includes an indication of a lack of use of site safety equipment at the site.
- 17 . The robotic system of claim 1 , wherein the one or more processing circuits are configured to: evaluate the data to determine whether a cleaning condition is detected proximate the AMR as the AMR is navigating through the site; and provide an alert in response to the cleaning condition being detected; wherein the cleaning condition includes at least one of an indication of debris that needs to be cleaned up, a spill that needs to be cleaned up, tools or equipment that need to be picked up, or a trash receptacle that needs to be emptied.
- 18 . The robotic system of claim 1 , wherein the tractive assembly includes at least one of wheels, tracks, or legs.
- 19 . A robotic system comprising: a non-transitory computer-readable medium having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to: control a tractive assembly of an autonomous mobile robot (AMR) to autonomously navigate the AMR between a plurality of predefined locations within a site multiple times over a period of time, the plurality of predefined locations including at least a first location and a second location; acquire images and location information via a sensor system of the AMR at each of the plurality of predefined locations each time the AMR navigates to each of the plurality of predefined locations over the period of time; and organize the images in a location-based, time-shifting arrangement such that (a) first images of the images associated with the first location are arranged together in a time sequential arrangement over the period of time and (b) second images of the images associated with the second location are arranged together in the time sequential arrangement over the period of time.
- 20 . A robotic system comprising: a non-transitory computer-readable medium having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to: control one or more mobile robots to navigate to a plurality of predefined locations within a site; acquire data via one or more sensors of the one or more mobile robots as the one or more mobile robots navigate the site; provide an alert in response to the data indicating that a warning condition or a cleaning condition is present; acquire images and location information via the one or more sensors of the one or more mobile robots each time one of the one or more mobile robots navigates to a respective location of the plurality of predefined locations over a period of time; and organize the images in a location-based, time-shifting arrangement such that the images associated with the respective location are arranged together in a time sequential arrangement.
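The "location-based, time-shifting arrangement" recited in claims 1, 19, and 20 amounts to a per-location, time-ordered index of captured images. The sketch below is one minimal way such an arrangement could be implemented; the `Capture` and `LocationTimeline` names and structure are assumptions for illustration, not part of the disclosure.

```python
from collections import defaultdict
from dataclasses import dataclass, field
from bisect import insort

@dataclass(order=True)
class Capture:
    # Ordering compares timestamps only; the image path is excluded
    timestamp: float
    image_path: str = field(compare=False)

class LocationTimeline:
    """Groups captures by predefined location, each kept in time order."""

    def __init__(self):
        self._by_location = defaultdict(list)

    def add(self, location_id, timestamp, image_path):
        # insort keeps each location's list sorted by timestamp,
        # even if captures arrive out of order
        insort(self._by_location[location_id], Capture(timestamp, image_path))

    def timeline(self, location_id):
        # Time-sequential arrangement of one location's images
        return [c.image_path for c in self._by_location[location_id]]
```

For example, adding captures for two locations out of order still yields each location's images grouped together and time-sequential, matching the claimed arrangement.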
Description
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS
This application is a continuation of U.S. Ser. No. 18/298,039, filed Apr. 10, 2023, which claims the benefit of and priority to U.S. Provisional Application No. 63/328,993, filed Apr. 8, 2022, the entire contents of which are incorporated herein by reference.
TECHNICAL FIELD
This disclosure relates to robots and, more particularly, to autonomous robots.
BACKGROUND
Autonomous mobile robots (AMRs) are robots that can move around and perform tasks without human guidance or control. Their development has been driven by advances in robotics, artificial intelligence, and computer vision. The concept of autonomous robots has existed for several decades, but it was not until the late 20th century that the technology became advanced enough to make it a reality. In the early days, autonomous robots were limited to industrial applications, such as manufacturing and assembly-line tasks. With advances in computer processing power and sensors, however, autonomous robots have become more sophisticated and can now perform a wide range of tasks. Today, AMRs are used in a variety of applications, including warehousing and logistics, agriculture, healthcare, and even military and defense. Their adoption has been driven by the need for more efficient and cost-effective solutions: AMRs can operate around the clock, without breaks or rest, making them ideal for repetitive tasks that would otherwise require human intervention.
SUMMARY
Progress Tracking
In one implementation, a computer-implemented method is executed on a computing device and includes: navigating an autonomous mobile robot (AMR) within a defined space; acquiring imagery at one or more defined locations within the defined space; processing the imagery using an ML model to define a completion percentage for the one or more defined locations within the defined space; and reporting the completion percentage of the one or more defined locations within the defined space to a user.
One or more of the following features may be included. The defined space may be a construction site. The imagery may include one or more of: flat images; 360° images; and videos. Navigating an autonomous mobile robot (AMR) within a defined space may include one or more of: navigating an autonomous mobile robot (AMR) within a defined space via a predefined navigation path; navigating an autonomous mobile robot (AMR) within a defined space via GPS coordinates; and navigating an autonomous mobile robot (AMR) within a defined space via a machine vision system. The machine vision system may include one or more of: a LIDAR system; and a plurality of discrete machine vision cameras. The plurality of defined locations may include one or more of: at least one human-defined location; and at least one machine-defined location. Processing the imagery using an ML model to define a completion percentage for the one or more defined locations within the defined space may include one or more of: comparing the imagery to visual training data to define the completion percentage for the one or more defined locations within the defined space; and comparing the imagery to user-defined completion content to define the completion percentage for the one or more defined locations within the defined space.
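The navigate/capture/estimate/report loop summarized above can be sketched as follows. The `amr`, `model`, and `notify` interfaces are hypothetical placeholders for illustration; the disclosure does not specify these APIs.

```python
def report_progress(amr, locations, model, notify):
    """Sketch of the progress-tracking method: navigate an AMR to each
    defined location, acquire imagery, run the ML model to estimate a
    completion percentage, and report it to the user.

    `amr`, `model`, and `notify` are assumed interfaces, not part of
    the disclosure.
    """
    report = {}
    for loc in locations:
        amr.navigate_to(loc)           # via predefined path, GPS, or machine vision
        image = amr.capture_image()    # flat image, 360° image, or video frame
        pct = model.estimate_completion(image)  # ML inference on the imagery
        report[loc] = pct
        notify(f"{loc}: {pct:.0%} complete")    # report to the user
    return report
```

In practice `model.estimate_completion` would wrap a trained vision model, and `notify` could post to a dashboard or messaging channel rather than print.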
The ML model may be trained using visual training data that identifies construction projects or portions thereof in various levels of completion so that the ML model may associate various completion percentages with visual imagery. Training the ML model using visual training data that identifies construction projects or portions thereof in various percentages of completion may include: having the ML model make an initial estimate concerning the completion percentage of a specific visual image within the visual training data; and providing the specific visual image and the initial estimate to a human trainer for confirmation and/or adjustment.
In another implementation, a computer program product resides on a computer-readable medium and has a plurality of instructions stored on it. When executed by a processor, the instructions cause the processor to perform operations including: navigating an autonomous mobile robot (AMR) within a defined space; acquiring imagery at one or more defined locations within the defined space; processing the imagery using an ML model to define a completion percentage for the one or more defined locations within the defined space; and reporting the completion percentage of the one or more defined locations within the defined space to a user.
One or more of the following features may be included. The defined space may be a construction site. The imagery may include one or more of: flat images; 360° images; and videos. Navigating an autonomous mobile robot (AMR) within a defined space may include one or more of: navigating an auto