
KR-102961895-B1 - The Method and System for Process Automation Based on Imitation Learning Using a Robot Hand Apparatus for Both Teaching and Motion

KR 102961895 B1

Abstract

The present invention relates to a method and system for process automation based on imitation learning using a robot hand apparatus for both teaching and motion, in which learning data is acquired through direct teaching by an operator while the lower body, which is the driving part of the robot hand apparatus, is detached; a control artificial intelligence model is trained on the learning data; and an automated process is performed by autonomously controlling a robot arm and the robot hand apparatus through the trained model while the lower body is attached.

Inventors

  • 서형주
  • 서승지
  • 이승준
  • 박대선
  • 노태경

Assignees

  • 주식회사 카본식스코리아

Dates

Publication Date
2026-05-07
Application Date
2025-09-23

Claims (11)

  1. A process automation method based on imitation learning using a robot hand apparatus for both teaching and motion, wherein the robot hand apparatus comprises: an upper body including an encoder module for measuring motion and a camera module for capturing images; a first link module and a second link module, each shaped to receive a user's finger for direct teaching and each rotatably coupled to the upper body; and a lower body including a motor that operates according to an external input, the power of the motor being transmittable to the first link module and the second link module when the lower body is coupled to the upper body, the method comprising: a learning data acquisition step of performing direct teaching, with the lower body detached, by inserting the user's fingers into the first link module and the second link module, and acquiring, during the direct teaching, learning data including operation information from the encoder module, image information from the camera module, and pose information of the upper body measured by a pose measurement system; a model training step of training a control artificial intelligence model on the learning data; and a process execution step of coupling the upper body and the lower body to each other and, with the robot hand apparatus mounted on a robot arm, controlling the robot arm and the motor to perform an automated process, wherein the upper body includes: a first inner shaft rod serving as the rotation axis of the first link module; a second inner shaft rod serving as the rotation axis of the second link module; a first gear coupled to the outer surface of the first inner shaft rod; and a second gear coupled to the outer surface of the second inner shaft rod, and wherein the first gear and the second gear mesh with each other so that the motions of the first link module and the second link module are interconnected.
  2. The method of claim 1, wherein the learning data forms a time-series dataset in which operation information from the encoder module, including the operation state or position state of at least one of the first link module and the second link module, image information from the camera module capturing the area in front of the upper body, and pose information from the pose measurement system, including position information and orientation information of the upper body, are paired along a time series.
  3. The method of claim 1, wherein the pose measurement system includes a pose calculation unit that receives joint angle information, including the rotation angle of each joint, measured by a joint encoder provided at each of the plurality of joints of the robot arm, and wherein the pose calculation unit inputs the joint angle information into a forward kinematics model to calculate the pose information, including the position information and orientation information of the upper body.
  4. A process automation method based on imitation learning using a robot hand apparatus for both teaching and motion, wherein the robot hand apparatus comprises: an upper body including an encoder module for measuring motion and a camera module for capturing images; a first link module and a second link module, each shaped to receive a user's finger for direct teaching and each rotatably coupled to the upper body; and a lower body including a motor that operates according to an external input, the power of the motor being transmittable to the first link module and the second link module when the lower body is coupled to the upper body, the method comprising: a learning data acquisition step of performing direct teaching, with the lower body detached, by inserting the user's fingers into the first link module and the second link module, and acquiring, during the direct teaching, learning data including operation information from the encoder module, image information from the camera module, and pose information of the upper body measured by a pose measurement system; a model training step of training a control artificial intelligence model on the learning data; and a process execution step of coupling the upper body and the lower body to each other and, with the robot hand apparatus mounted on a robot arm, controlling the robot arm and the motor to perform an automated process, wherein the control artificial intelligence model inputs the pose information into a vector encoder to generate a first observation vector in token form, inputs the image information into a vision encoder to generate an image token, encodes the pose information and the image token into a first latent sequence through a transformer encoder, and derives, through a transformer decoder, a first control command for the operation of the robot arm at the next time step based on the first latent sequence.
  5. A process automation method based on imitation learning using a robot hand apparatus for both teaching and motion, wherein the robot hand apparatus comprises: an upper body including an encoder module for measuring motion and a camera module for capturing images; a first link module and a second link module, each shaped to receive a user's finger for direct teaching and each rotatably coupled to the upper body; and a lower body including a motor that operates according to an external input, the power of the motor being transmittable to the first link module and the second link module when the lower body is coupled to the upper body, the method comprising: a learning data acquisition step of performing direct teaching, with the lower body detached, by inserting the user's fingers into the first link module and the second link module, and acquiring, during the direct teaching, learning data including operation information from the encoder module, image information from the camera module, and pose information of the upper body measured by a pose measurement system; a model training step of training a control artificial intelligence model on the learning data; and a process execution step of coupling the upper body and the lower body to each other and, with the robot hand apparatus mounted on a robot arm, controlling the robot arm and the motor to perform an automated process, wherein the control artificial intelligence model inputs the operation information into a vector encoder to generate a second observation vector in token form, inputs the image information into a vision encoder to generate an image token, encodes the operation information and the image token into a second latent sequence through a transformer encoder, and derives, through a transformer decoder, a second control command for the operation of the motor at the next time step based on the second latent sequence.
  6. The method of claim 1, wherein the control artificial intelligence model comprises one or more artificial neural networks and continuously generates a first control command for the operation of the robot arm and a second control command for the operation of the motor based on the image information input along a time series.
  7. A process automation method based on imitation learning using a robot hand apparatus for both teaching and motion, wherein the robot hand apparatus comprises: an upper body including an encoder module for measuring motion and a camera module for capturing images; a first link module and a second link module, each shaped to receive a user's finger for direct teaching and each rotatably coupled to the upper body; and a lower body including a motor that operates according to an external input, the power of the motor being transmittable to the first link module and the second link module when the lower body is coupled to the upper body, the method comprising: a learning data acquisition step of performing direct teaching, with the lower body detached, by inserting the user's fingers into the first link module and the second link module, and acquiring, during the direct teaching, learning data including operation information from the encoder module, image information from the camera module, and pose information of the upper body measured by a pose measurement system; a model training step of training a control artificial intelligence model on the learning data; and a process execution step of coupling the upper body and the lower body to each other and, with the robot hand apparatus mounted on a robot arm, controlling the robot arm and the motor to perform an automated process, wherein the process execution step includes: a similar-situation detection step of detecting, through the control artificial intelligence model, the teaching image information most similar to the currently acquired image information among the plurality of image information items included in the learning data; and a motor control step of controlling, through the control artificial intelligence model, the operation of the robot arm and the operation of the motor based on the motion information and pose information recorded in the learning data in correspondence with the image information that follows the detected teaching image information in the time series.
  8. The method of claim 1, wherein the first link module includes a first inner link module and a first outer link module, one end of each of the first inner link module and the first outer link module is rotatably coupled to the upper body, and an internal space for accommodating a user's finger is formed between the first inner link module and the first outer link module.
  9. (Deleted)
  10. The method of claim 1, wherein the robot hand apparatus further includes a control unit, and the control unit stores the measurement values of the encoder module, the image information of the camera module, and the pose information of the pose measurement system as learning data along a time series.
  11. A process automation system based on imitation learning that performs a process automation method based on imitation learning using a robot hand apparatus for both teaching and motion, wherein the robot hand apparatus comprises: an upper body including an encoder module for measuring motion and a camera module for capturing images; a first link module and a second link module, each shaped to receive a user's finger for direct teaching and each rotatably coupled to the upper body; and a lower body including a motor that operates according to an external input, the power of the motor being transmittable to the first link module and the second link module when the lower body is coupled to the upper body, the system comprising: a learning data acquisition unit that, with the lower body detached, performs direct teaching with the user's fingers inserted into the first link module and the second link module and acquires, during the direct teaching, learning data including operation information from the encoder module, image information from the camera module, and pose information of the upper body measured by a pose measurement system; a model training unit that trains a control artificial intelligence model on the learning data; and a process execution unit that couples the upper body and the lower body to each other and, with the robot hand apparatus mounted on a robot arm, controls the robot arm and the motor to perform an automated process, wherein the upper body includes: a first inner shaft rod serving as the rotation axis of the first link module; a second inner shaft rod serving as the rotation axis of the second link module; a first gear coupled to the outer surface of the first inner shaft rod; and a second gear coupled to the outer surface of the second inner shaft rod, and wherein the first gear and the second gear mesh with each other so that the motions of the first link module and the second link module are interconnected.
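Claim 2 describes pairing the encoder readings, camera frames, and pose samples into a single time-series dataset. The sketch below shows one plausible way to do this pairing by nearest timestamp; the log format, the field names, and the 20 ms tolerance are illustrative assumptions, not details from the patent.

```python
from bisect import bisect_left

def pair_time_series(encoder_log, image_log, pose_log, tol=0.02):
    """Pair encoder, image, and pose samples into one time-series dataset.

    Each log is a list of (timestamp_s, value) tuples sorted by time; image and
    pose samples are matched to the nearest encoder timestamp within `tol`
    seconds (an assumed tolerance, not specified by the patent).
    """
    def nearest(log, t):
        times = [ts for ts, _ in log]
        i = bisect_left(times, t)
        cands = [j for j in (i - 1, i) if 0 <= j < len(log)]
        j = min(cands, key=lambda j: abs(times[j] - t))
        return log[j] if abs(times[j] - t) <= tol else None

    dataset = []
    for t, enc in encoder_log:          # the encoder stream drives the pairing
        img = nearest(image_log, t)
        pose = nearest(pose_log, t)
        if img is not None and pose is not None:
            dataset.append({"t": t, "encoder": enc,
                            "image": img[1], "pose": pose[1]})
    return dataset
```

Frames with no matching pose or image within the tolerance are simply dropped, which keeps every stored sample complete in all three modalities.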
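Claim 3 computes the upper body's pose by feeding the robot arm's joint angles into a forward kinematics model. A minimal NumPy sketch of such a model, assuming standard Denavit-Hartenberg parameters and a toy two-link planar arm (the patent does not specify the arm's geometry):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint from Denavit-Hartenberg parameters."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_params):
    """Chain per-joint transforms to get the end-effector (upper-body) pose.

    joint_angles: one rotation (rad) per joint, as read from the joint encoders.
    dh_params: list of (d, a, alpha) tuples, one per joint (assumed geometry).
    Returns a 4x4 homogeneous matrix: rotation (orientation) + translation (position).
    """
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Toy two-link planar arm: both links 1.0 m long, rotating about z.
dh = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
pose = forward_kinematics([np.pi / 2, -np.pi / 2], dh)
position = pose[:3, 3]  # x, y, z of the upper body
```

The rotation block of the returned matrix carries the orientation information and the last column the position information that claim 3 names as the pose.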
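Claims 4 and 5 describe a control model that tokenizes a pose (or encoder-operation) vector and a camera image, fuses the tokens into a latent sequence with a transformer encoder, and decodes the next-step control command with a transformer decoder. The PyTorch sketch below mirrors that pipeline; all layer sizes, the single learned decoder query, and the toy vision encoder are illustrative assumptions, not the patent's architecture.

```python
import torch
import torch.nn as nn

class ControlPolicy(nn.Module):
    """Sketch of the claimed pipeline: an observation vector and a camera frame
    are tokenized, fused by a transformer encoder into a latent sequence, and a
    transformer decoder emits the next-step control command."""

    def __init__(self, pose_dim=7, cmd_dim=7, d_model=64):
        super().__init__()
        self.vector_encoder = nn.Linear(pose_dim, d_model)   # pose -> observation token
        self.vision_encoder = nn.Sequential(                 # image -> image token
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, d_model),
        )
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2)
        self.query = nn.Parameter(torch.zeros(1, 1, d_model))  # learned decoder query
        self.head = nn.Linear(d_model, cmd_dim)                # latent -> control command

    def forward(self, pose, image):
        obs_tok = self.vector_encoder(pose).unsqueeze(1)         # (B, 1, d)
        img_tok = self.vision_encoder(image).unsqueeze(1)        # (B, 1, d)
        latent = self.encoder(torch.cat([obs_tok, img_tok], 1))  # latent sequence
        out = self.decoder(self.query.expand(pose.size(0), -1, -1), latent)
        return self.head(out.squeeze(1))                         # next-step command

policy = ControlPolicy()
cmd = policy(torch.randn(2, 7), torch.randn(2, 3, 64, 64))  # batch of 2 observations
```

The same skeleton covers both claims: feeding the pose vector yields the claim 4 path (robot-arm command), while feeding the encoder-operation vector yields the claim 5 path (motor command).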
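Claim 7's process execution step retrieves the teaching frame most similar to the current camera frame and replays the action recorded for the frame that follows it in the demonstration. As a stand-in for the learned similarity the patent attributes to the control AI model, the sketch below uses plain cosine similarity over flattened images; the function name and data layout are assumptions.

```python
import numpy as np

def retrieve_next_action(current_image, teach_images, teach_actions):
    """Find the teaching frame most similar to the current camera frame and
    return the action recorded for the *next* frame in the time series.

    teach_images: array of shape (N, H, W[, C]); teach_actions: length-N list.
    Cosine similarity here is an illustrative stand-in for the model's learned
    similarity measure.
    """
    cur = current_image.ravel().astype(float)
    cur /= np.linalg.norm(cur) + 1e-12
    flat = teach_images.reshape(len(teach_images), -1).astype(float)
    flat /= np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12
    best = int(np.argmax(flat @ cur))             # most similar teaching frame
    nxt = min(best + 1, len(teach_actions) - 1)   # step to the following frame
    return teach_actions[nxt]
```

Replaying the action of the *following* frame, rather than the matched frame itself, is what advances the automated process along the demonstrated trajectory.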

Description

The Method and System for Process Automation Based on Imitation Learning Using a Robot Hand Apparatus for Both Teaching and Motion

The present invention relates to a method and system for process automation based on imitation learning using a robot hand apparatus for both teaching and motion, in which learning data is acquired through direct teaching by an operator while the lower body, which is the driving part of the robot hand apparatus, is detached; a control artificial intelligence model is trained on the learning data; and an automated process is performed by autonomously controlling a robot arm and the robot hand apparatus through the trained model while the lower body is attached.

Recently, demand has surged in industry to automate complex assembly and handling processes that require delicate human hand movements and judgment, moving beyond standardized, repetitive tasks. Traditional robot automation has relied primarily on experts manually programming every movement of the robot. While this approach enables highly precise work, it has limitations: the entire program must be modified whenever the work environment or the workpiece changes even slightly, and development demands enormous time and cost. To overcome these limitations, imitation learning, also called learning from demonstration, in which robots learn by observing human behavior, is emerging as a key element of next-generation robotic automation. Imitation learning is drawing attention as a technology well suited to modern industrial environments because it enables even non-experts to teach robots new tasks easily.

Conventional robot teaching technology, such as Korean Registered Patent No. 10-2330992, involves a teaching device and control method for manipulating a dual-arm robot, in which the robot operates and performs teaching according to control signals transmitted from a controller. However, because such operation is unintuitive, the resulting robot motions tend to produce unnatural and inefficient data, and it is difficult to digitize the precise hand motions (such as grasping) that are the core of the actual task. As a result, robot motion data generated through teleoperation differs from natural human movement, reducing work efficiency or frequently producing motion along inefficient paths. There is therefore a need for an imitation learning-based process automation method that solves these problems of the conventional technology and can reproduce operations with a high success rate in actual processes.

FIG. 1 schematically illustrates the concept of use of a robot hand apparatus under direct teaching according to one embodiment of the present invention.
FIG. 2 schematically illustrates a perspective view of a robot hand apparatus according to one embodiment of the present invention.
FIG. 3 schematically illustrates a plan view of a robot hand apparatus according to one embodiment of the present invention.
FIG. 4 schematically illustrates the structure of a first link module according to one embodiment of the present invention.
FIG. 5 schematically illustrates the structure of a first inner link module according to one embodiment of the present invention.
FIG. 6 schematically illustrates the data acquisition process of a robot hand apparatus according to one embodiment of the present invention.
FIG. 7 schematically illustrates an exploded perspective view of the coupling relationship between an upper body and a link module according to one embodiment of the present invention.
FIG. 8 schematically illustrates an exploded perspective view of the coupling relationship between a link module and a fingertip according to one embodiment of the present invention.
FIG. 9 schematically illustrates a bottom view of a torsion spring coupling structure according to one embodiment of the present invention.
FIG. 10 schematically illustrates a torsion spring coupling structure according to one embodiment of the present invention.
FIG. 11 schematically illustrates the interlocking structure of a first inner link module and a second inner link module according to an embodiment of the present invention.
FIG. 12 schematically illustrates the arrangement structure of a first gear and a second gear according to an embodiment of the present invention.
FIG. 13 schematically illustrates the arrangement structure of an encoder module according to one embodiment of the present invention.
FIG. 14 schematically illustrates an exploded perspective view for explaining the upper body structure according to one embodiment of the present invention.
FIG. 15 schematically illustrates the process of storing training data according to one embodiment of the present invention.
FIG. 16 schematically illustrates a plan view of a lower body according to one embodiment of the present invention.
FIG. 17