
KR-20260066732-A - Driving robot and its control method


Abstract

The present disclosure relates to a driving robot capable of driving to a destination without map or location information, and a method for controlling the same. The robot comprises a sensor unit that acquires surrounding environment information, an input unit that receives movement commands, a driving motor unit that moves the robot, and a control unit that controls the driving motor unit based on the surrounding environment information and the movement commands. The control unit recognizes a path structure from the surrounding environment information to generate path structure information, selects a movement target point and an action of the robot based on the path structure information and the movement commands, inputs selection information, including the surrounding environment information and the selected target point and action, into a pre-trained neural network model to determine the driving motion of the robot, and controls the driving motor unit so that the robot moves according to the determined driving motion.

Inventors

  • 황금성
  • 최원수
  • 남희진
  • 우성호

Assignees

  • Bear Robotics Korea, Inc. (주식회사 베어로보틱스코리아)

Dates

Publication Date
2026-05-12
Application Date
2023-09-19

Claims (15)

  1. A driving robot comprising: a sensor unit configured to acquire surrounding environment information; an input unit configured to receive a movement command; a driving motor unit configured to move the robot; and a control unit configured to control the driving motor unit based on the surrounding environment information and the movement command, wherein the control unit recognizes a path structure based on the surrounding environment information to generate path structure information, selects a movement target point and an action of the robot based on the path structure information and the movement command, inputs selection information, including the surrounding environment information and the movement target point and action of the robot, into a pre-trained neural network model to determine a driving direction and speed of the robot, and controls the driving motor unit so that the robot moves based on the determined driving direction and speed.
  2. The driving robot of claim 1, wherein, when generating the path structure information, the control unit collects the surrounding environment information from the sensor unit, generates a measurement distance map based on the surrounding environment information, and inputs the measurement distance map into a pre-trained neural network model to recognize the path structure and generate the path structure information.
  3. The driving robot of claim 2, wherein, when generating the path structure information, the control unit scales one-dimensional sensor input data, including the measurement distance map, to a preset size, classifies the scaled input data by feature extraction item, and inputs the classified feature extraction items into a plurality of pre-trained neural network models, respectively, to generate path structure information for each feature extraction item.
  4. The driving robot of claim 2, wherein, when generating the path structure information, the control unit recognizes a detected straight line of at least a certain length as a corridor wall, and calculates a path direction and a corridor direction based on the straight line of the wall when the wall is located to the left or right of the robot.
  5. The driving robot of claim 2, wherein, when generating the path structure information, the control unit recognizes a straight line of at least a certain length as a corridor wall, and recognizes the location as an intersection when the straight line of the wall is positioned facing the path.
  6. The driving robot of claim 2, wherein, when generating the path structure information, the control unit recognizes a straight line of at least a certain length as a corridor wall, and recognizes the location as a vertex or corner when the straight line of the wall meets the path at a right angle.
  7. The driving robot of claim 1, wherein, when selecting the movement target point and action of the robot, the control unit checks the type of the movement command when the movement command is input, generates a movement command script corresponding to the type of the movement command, and selects the movement target point and action of the robot based on the movement command script and the path structure information.
  8. The driving robot of claim 7, wherein, when checking the type of the movement command, the control unit uses a script containing the content of a movement command as the movement command script if the movement command is a script type containing the content of a movement command from a starting point to a destination.
  9. The driving robot of claim 7, wherein, when checking the type of the movement command, the control unit generates the movement command script based on analysis information if the movement command is an analysis-information type obtained by analyzing a map and path given before the robot's position recognition failed.
  10. The driving robot of claim 7, wherein, when checking the type of the movement command, the control unit generates the movement command script based on a map containing movement indicators if the movement command is a map type containing movement indicators from a starting point to a destination.
  11. The driving robot of claim 10, wherein the control unit checks, based on a block-structure diagram containing the movement indicators, whether the straight-line distance of the straight path and the straight-line distance between the destination and the intersection are set within the diagram, and generates the movement command script if both distances are set.
  12. The driving robot of claim 7, wherein, when checking the type of the movement command, the control unit generates the movement command script based on a map-editing map if the movement command is a map-editing-map type.
  13. The driving robot of claim 12, wherein the control unit generates the map-editing map by providing a plan view of the map, adjusting the transparency of the plan view, inputting the scale and information of the map, inputting the path information of the map, inputting the destination of the map, and performing a test that displays a path by selecting a starting point and a destination on the map.
  14. The driving robot of claim 1, wherein the control unit comprises: a movement command generator that generates a movement command script; a path structure recognizer that recognizes the path structure based on the surrounding environment information and generates the path structure information; an action controller that selects the movement target point and action of the robot based on the path structure information and the movement command; and a driving controller that determines a driving motion of the robot by inputting selection information, including the surrounding environment information and the movement target point and action of the robot, into a pre-trained neural network model, and controls the driving motor unit so that the robot moves based on the determined driving motion.
  15. A control method for a driving robot including a driving motor unit, the method comprising: obtaining surrounding environment information and a movement command; recognizing a path structure based on the surrounding environment information and generating path structure information; selecting a movement target point and an action of the robot based on the path structure information and the movement command; determining a driving motion of the robot by inputting selection information, including the surrounding environment information and the movement target point and action of the robot, into a pre-trained neural network model; and controlling the driving motor unit so that the robot moves based on the determined driving motion.
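Claims 2 and 4 to 6 describe recognizing corridor walls as straight lines of at least a certain length in the measured distance data, but the patent does not disclose a concrete algorithm. The sketch below is one minimal way such a step could look, assuming a 1-D lidar range scan as the "surrounding environment information"; the function names, the greedy split heuristic, and all thresholds are illustrative assumptions, not the patented method.

```python
import math

def scan_to_points(ranges, angle_min=-math.pi, angle_inc=None):
    """Convert a 1-D lidar range scan into 2-D points in the robot frame."""
    if angle_inc is None:
        angle_inc = 2 * math.pi / len(ranges)
    return [(r * math.cos(angle_min + i * angle_inc),
             r * math.sin(angle_min + i * angle_inc))
            for i, r in enumerate(ranges) if math.isfinite(r)]

def _max_deviation(seg):
    """Largest perpendicular distance of any point in seg from the chord
    joining its first and last points."""
    (x0, y0), (x1, y1) = seg[0], seg[-1]
    length = math.hypot(x1 - x0, y1 - y0) or 1e-9
    return max(abs((y1 - y0) * (x - x0) - (x1 - x0) * (y - y0)) / length
               for x, y in seg)

def fit_walls(points, min_length=1.0, tol=0.05):
    """Greedily group consecutive scan points into straight segments and
    keep those of at least min_length meters as candidate corridor walls."""
    walls, start = [], 0
    for end in range(1, len(points) + 1):
        # Try to extend the current segment by one more point.
        if end < len(points) and _max_deviation(points[start:end + 1]) <= tol:
            continue
        seg = points[start:end]
        (x0, y0), (x1, y1) = seg[0], seg[-1]
        if math.hypot(x1 - x0, y1 - y0) >= min_length:
            walls.append((seg[0], seg[-1]))  # endpoints of the wall line
        start = end
    return walls
```

A wall to the robot's left or right would then give the corridor direction (claim 4), a wall facing the path would flag an intersection (claim 5), and two walls meeting at a right angle would flag a corner (claim 6).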
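Claims 7 to 13 repeatedly refer to a "movement command script" that drives target and action selection, but no script format is disclosed. As a hedged illustration only, a script could be a list of (operation, argument) steps that the action-selection stage advances through as the path structure recognizer reports corridors and intersections; every name and the step vocabulary below are assumptions.

```python
# Hypothetical movement-command script: drive along the corridor, turn left
# at the next intersection, then continue to the destination.
SCRIPT = [
    ("follow_corridor", None),
    ("turn", "left"),
    ("follow_corridor", None),
    ("stop", "destination"),
]

def next_action(script, step, structure):
    """Return the action to perform now and the next script index.

    A 'turn' step is held (the robot keeps following the corridor) until
    the recognizer reports an 'intersection' in the path structure info.
    """
    op, arg = script[step]
    if op == "turn" and structure != "intersection":
        return ("follow_corridor", None), step  # wait for the junction
    advance = step + 1 if step + 1 < len(script) else step
    return (op, arg), advance
```

Under this sketch, the selected action (plus the raw sensor data and target point) would form the "selection information" fed to the pre-trained neural network model of claim 1.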

Description

Driving Robot and Its Control Method

The present disclosure relates to a driving robot capable of driving to a destination based on artificial intelligence, without map or location information, and a method for controlling the same.

Generally, a robot is a machine that automatically processes or performs a given task using its own capabilities, and its applications are typically classified into fields such as industrial, medical, space, and underwater robotics. Recently, there has been an increasing number of robots capable of communicating or interacting with humans through voice or gestures. These include guide robots placed in specific locations to provide users with various information, and home robots installed in homes.

For a robot to drive, it must find its location on a map, there must be a path to the destination on that map, and the robot must be able to control its motors to follow the path. The robot's position on the map can be determined through a Simultaneous Localization and Mapping (SLAM) function, which compares the robot's sensor information (including camera images, lidar, and distances to surrounding obstacles) with map information. The process of generating a path to a destination is divided into a global path, which runs from the robot's position on the map to the destination, and a local path, which is generated to avoid obstacles around the robot. The robot then controls its motors through a motion controller to follow the resulting local path. However, if the robot fails to recognize its own location or to generate the global and local paths, it cannot determine a route to the destination and stops, for safety reasons, until it receives outside assistance.
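For context on the conventional global-path step described above, a minimal sketch of grid-based global path planning, assuming an occupancy-grid map (0 = free, 1 = obstacle); breadth-first search is used here purely as the simplest illustration, and all names are illustrative:

```python
from collections import deque

def global_path(grid, start, goal):
    """Breadth-first search over an occupancy grid.

    Returns a list of (row, col) cells from start to goal, or None if
    the goal is unreachable (the failure case the disclosure addresses).
    """
    prev, queue = {start: None}, deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:     # walk back through predecessors
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

The point of the disclosure is precisely that this style of planning requires a map and a known robot position, and returns nothing when either is unavailable.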
In this case, services such as robot delivery and guidance were suspended indefinitely until external assistance became available, or the robot had to be moved to another location and restarted in order to re-recognize its position. Therefore, there is a need to develop driving robots capable of driving to a destination based on given path movement commands and path structure recognition, even without maps or location information.

FIG. 1 is a drawing for explaining a driving robot according to one embodiment of the present disclosure.
FIG. 2 is a diagram illustrating the process of acquiring surrounding environment information of a driving robot according to one embodiment of the present disclosure.
FIGS. 3 and 4 are drawings for explaining the process of generating a measurement distance map of a driving robot according to one embodiment of the present disclosure.
FIG. 5 is a diagram illustrating the process of generating path structure information of a driving robot according to one embodiment of the present disclosure.
FIG. 6 is a drawing for explaining the process of calculating the intersection collision area of a driving robot according to one embodiment of the present disclosure.
FIGS. 7 to 9 are drawings for explaining path structure information of a driving robot according to one embodiment of the present disclosure.
FIGS. 10 and 11 are drawings for explaining a control unit of a driving robot according to one embodiment of the present disclosure.
FIGS. 12 and 13 are drawings for explaining the global map analysis process of a driving robot according to one embodiment of the present disclosure.
FIG. 14 is a diagram illustrating the global path analysis process of a driving robot according to one embodiment of the present disclosure.
FIG. 15 is a diagram illustrating a movement process based on a movement command script of a driving robot according to one embodiment of the present disclosure.
FIG. 16 is a diagram illustrating the process of generating a map-based movement command script for a driving robot according to one embodiment of the present disclosure.
FIG. 17 is a flowchart illustrating the control process of a driving robot according to one embodiment of the present disclosure.

Hereinafter, embodiments disclosed in this specification will be described in detail with reference to the attached drawings. Identical or similar components are assigned the same reference numbers regardless of drawing symbols, and redundant descriptions of them are omitted. The suffixes "module" and "part" used for components in the following description are assigned or used interchangeably solely for ease of drafting the specification and do not in themselves carry distinct meanings or roles. Furthermore, where a detailed description of related prior art could obscure the essence of the embodiments disclosed in this specification, that description is omitted. Additionally, the attached drawings are intended only to facilitate understanding of the embodiments disclosed in this specification.