
CN-121973196-A - Remote operation and time reversal based robot assembly data generation device and method

CN121973196A

Abstract

The invention discloses a device and a method for generating robot assembly data based on teleoperation and time reversal. The generation device comprises a workbench, a multi-view camera, a mechanical arm, an electric suction cup, an assembly base, a part to be assembled, a user, VR glasses and a terminal processor. In the method, the terminal processor converts the received three-dimensional coordinates and quaternion attitude of the wrist relative to the centre point of the human body into three-dimensional coordinates and a quaternion attitude of the mechanical arm's end coordinate system relative to the assembly base according to the coordinate-system orientation, calculates target angle values of the six joints of the mechanical arm through inverse kinematics, generates a six-dimensional joint angle instruction in radians, and finally generates a robot assembly data set with large sample scale and physical fidelity. The invention avoids the difficulty of forward alignment in precision assembly and solves the bottleneck in data acquisition efficiency and the scarcity of samples caused by the low success rate of forward teaching in micro-clearance assembly tasks.
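The pose-mapping step described above (wrist pose in the human-body frame mapped into the arm-base frame before inverse kinematics) can be sketched as follows. This is not part of the patent; it is a minimal illustration assuming a fixed alignment quaternion `q_align` and translation offset `p_offset` between the two frames, with quaternions in (w, x, y, z) order. The inverse-kinematics step itself is omitted.

```python
import math

def quat_mul(q1, q2):
    # Hamilton product of two quaternions in (w, x, y, z) order
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

def quat_rotate(q, v):
    # Rotate vector v by unit quaternion q: q * (0, v) * q_conjugate
    qc = (q[0], -q[1], -q[2], -q[3])
    w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), qc)
    return (x, y, z)

def map_wrist_to_arm(p_wrist, q_wrist, q_align, p_offset):
    """Map a wrist pose (relative to the human body centre) into the
    arm-base frame via an assumed fixed rotation q_align and offset
    p_offset; the result would then be passed to inverse kinematics."""
    p = quat_rotate(q_align, p_wrist)
    p_base = tuple(pi + oi for pi, oi in zip(p, p_offset))
    q_base = quat_mul(q_align, q_wrist)
    return p_base, q_base
```

With the identity alignment the pose passes through unchanged; with a 90° rotation about z, a point on the x-axis maps onto the y-axis, which gives a quick sanity check of the frame conversion.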

Inventors

  • WANG LIN
  • LOU WENZHONG
  • FENG HENGZHEN
  • MA WENLONG
  • LI CHENGLONG
  • YU DAZHONG
  • DING NANXI

Assignees

  • Beijing Institute of Technology (北京理工大学)

Dates

Publication Date
2026-05-05
Application Date
2026-01-23

Claims (10)

  1. A robot assembly data generation device based on teleoperation and time reversal, characterized by comprising a workbench (1), a multi-view camera (2), a mechanical arm (3), an electric suction cup (4), an assembly base (5), a part to be assembled (6), a user (7), VR glasses (8) and a terminal processor (9); the terminal processor (9) comprises a mechanical arm teleoperation module (91), a suction cup teleoperation module (92), a cooperative control module (93), a mechanical arm state data acquisition module (94), a mechanical arm joint angle acquisition module (95), a mechanical arm joint angular velocity acquisition module (96), a mechanical arm joint moment acquisition module (97), a reverse disassembly data acquisition module (98), a forward assembly data acquisition module (99) and a track time reversal module (910); the mechanical arm teleoperation module (91) converts the parsed wrist key-point pose of the user into joint angle commands for the mechanical arm (3); the suction cup teleoperation module (92) calculates the suction cup action commands generated by the user's operation; the cooperative control module (93) synchronously transmits the execution commands generated by the mechanical arm teleoperation module (91) and the suction cup teleoperation module (92) to the mechanical arm (3) and the electric suction cup (4); the mechanical arm state data acquisition module (94) reads the low-level sensor data stream fed back by the mechanical arm (3); the mechanical arm joint angle acquisition module (95) extracts joint encoder values from the low-level sensor data stream and outputs the six-dimensional joint angle vector of the mechanical arm; the mechanical arm joint angular velocity acquisition module (96) extracts velocity register values from the low-level sensor data stream and outputs the six-dimensional joint angular velocity vector; the mechanical arm joint moment acquisition module (97) extracts moment register values from the low-level sensor data stream and outputs the six-dimensional joint moment vector; the reverse disassembly data acquisition module (98) records time-stamp-aligned multi-view images, joint angles and joint angular velocities during reverse disassembly to generate an original reverse data set; the forward assembly data acquisition module (99) records time-stamp-aligned multi-view images, joint angles, joint angular velocities and joint moments during forward assembly to generate a forward expert data set; and the track time reversal module (910) carries out time-sequence reversal processing on the reverse disassembly data.
  2. The robot assembly data generation device based on teleoperation and time reversal according to claim 1, wherein the multi-view camera (2) comprises a side-view camera (21), a wrist camera (22) and a global camera (23); the side-view camera (21) is located at the side of the workbench (1) and collects side-view image data reflecting z-axis depth, the wrist camera (22) is located at the tail end of the mechanical arm (3) and collects first-person-view local image data, and the global camera (23) is located above the workbench (1).
  3. The robot assembly data generation device based on teleoperation and time reversal according to claim 1, wherein the mechanical arm teleoperation module (91) converts the received three-dimensional coordinates and quaternion attitude of the wrist relative to the centre point of the human body into three-dimensional coordinates and a quaternion attitude of the mechanical arm's end coordinate system relative to the assembly base (5) according to the coordinate-system orientation, calculates target angle values of the six joints of the mechanical arm through inverse kinematics, and generates a six-dimensional joint angle instruction in radians.
  4. A method for generating robot assembly data based on teleoperation and time reversal, characterized by comprising the following steps: 1) arranging the multi-view camera (2) around the mechanical arm (3) and the workbench (1), and placing the assembly base (5) and the part to be assembled (6) on the workbench; 2) the user (7) wears the VR glasses (8) to carry out reverse teaching, the VR glasses (8) output human motion data to the terminal processor (9), the mechanical arm teleoperation module (91) in the terminal processor (9) maps the received human motion data into the mechanical arm base coordinate system through a coordinate-system transformation matrix and calculates, using an inverse kinematics algorithm, the six-dimensional joint angle instruction vector required for the mechanical arm to reach the target pose; meanwhile, the suction cup teleoperation module (92) receives the spatial coordinates of the finger key points, compares the calculated Euclidean distance between the tips of the thumb and index finger with a preset threshold, and generates a high/low-level control instruction for opening or closing the suction cup; 3) the cooperative control module (93) receives the joint angle instruction output by the mechanical arm teleoperation module (91) and the adsorption instruction output by the suction cup teleoperation module (92), and encapsulates the motion instruction of the mechanical arm and the motion instruction of the suction cup into a cooperative motion instruction frame containing the same time stamp through multithreaded synchronous processing, so as to plan the time-sequence-aligned cooperative motion of the mechanical arm (3) and the electric suction cup (4); 4) the mechanical arm state data acquisition module (94) receives the cooperative action instruction output by the cooperative control module (93), drives the mechanical arm (3) to move above the assembly base (5), and controls the electric suction cup (4) to adsorb the part to be assembled (6) for the initial grabbing and vertical pulling-out actions; 5) the mechanical arm (3) and the electric suction cup (4) horizontally convey the pulled-out part to be assembled to the workbench (1) for inverted placement; 6) the mechanical arm state data acquisition module (94) sends the real-time encoder values, velocity register values and moment register values of the six joints, fed back by the low-level sensors of the mechanical arm (3), to the mechanical arm joint angle acquisition module (95), the mechanical arm joint angular velocity acquisition module (96) and the mechanical arm joint moment acquisition module (97); 7) during reverse disassembly, the real-time joint angles output by the mechanical arm joint angle acquisition module (95) and the real-time joint angular velocities output by the mechanical arm joint angular velocity acquisition module (96) are transmitted to the reverse disassembly data acquisition module (98); 8) the reverse disassembly data acquisition module (98) performs time-stamp alignment on the image data recorded by the multi-view camera (2) and the real-time joint angles and joint angular velocities of the six joints obtained in step 7), generating an original reverse data set; 9) the track time reversal module (910) reads the original reverse data set, rearranges the multi-view image sequence and the joint angle sequence in reverse time order, rearranges the joint angular velocity sequence in reverse time order and negates its values, and generates forward assembly training data containing kinematic information; 10) the user (7) controls the mechanical arm (3) and the electric suction cup (4) through the VR glasses (8) to carry out forward assembly, inserting the part to be assembled (6) into the assembly base (5); 11) during forward assembly, the real-time joint angles output by the mechanical arm joint angle acquisition module (95), the real-time joint angular velocities output by the mechanical arm joint angular velocity acquisition module (96) and the real-time joint moments output by the mechanical arm joint moment acquisition module (97) are transmitted to the forward assembly data acquisition module (99); 12) the forward assembly data acquisition module (99) performs time-stamp alignment on the received multi-view camera image data and the real-time joint angles, joint angular velocities and joint moments of the six joints obtained in step 11), generating a forward expert data set; 13) repeating steps 1) to 9) to collect reverse data and steps 10) to 12) to collect forward expert data, the terminal processor (9) fusing the data output by the track time reversal module (910) with the data output by the forward assembly data acquisition module (99) to generate a robot assembly data set with large sample scale and physical fidelity.
  5. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 2), the human motion data comprise the three-dimensional position coordinates and quaternion attitude parameters of the wrist coordinate system and the spatial coordinates of the finger key points.
  6. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 3), the cooperative control module (93) establishes a unified time reference through multithreaded synchronous processing, and encapsulates the motion instruction of the mechanical arm and the motion instruction of the suction cup into a cooperative motion instruction frame containing the same time stamp, so as to plan the time-sequence-aligned cooperative motion of the mechanical arm (3) and the electric suction cup (4).
  7. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 8), the original reverse data set comprises time-ordered samples of the mechanical arm disassembling the part to be assembled (6) from the assembly base (5), each sample consisting of multi-view RGB images and the corresponding six-dimensional joint angle vector and six-dimensional joint angular velocity vector.
  8. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 9), the forward assembly training data set comprises reverse-time-ordered samples of the mechanical arm, the assembly base (5) and the part to be assembled (6), each sample consisting of multi-view RGB images arranged in reverse time order, the corresponding six-dimensional joint angles, and the corresponding six-dimensional joint angular velocities whose values are negated after reverse-time ordering.
  9. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 12), the forward assembly data acquisition module (99) performs time-stamp alignment on the image data recorded by the multi-view camera (2) and the real-time joint angles, joint angular velocities and joint moments of the six joints obtained in step 11), i.e. matches each frame of image with the corresponding joint angles, angular velocities and joint moments of the mechanical arm, so as to generate the forward expert data set.
  10. The method for generating robot assembly data based on teleoperation and time reversal according to claim 4, wherein in step 12), the forward expert data set comprises time-ordered samples of the mechanical arm assembling the part to be assembled (6) into the assembly base (5), each sample consisting of multi-view RGB images and the corresponding six-dimensional joint angles, six-dimensional joint angular velocities and six-dimensional joint moments.
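The trajectory time reversal at the heart of claims 4, 8 and 9 — reverse the time order of images and joint angles, and reverse *and negate* the joint angular velocities so the disassembly motion reads as a physically consistent assembly motion — can be sketched in a few lines. This is an illustrative sketch, not the patent's implementation; the function name and the list-of-lists data layout (one 6-element vector per frame) are assumptions.

```python
def time_reverse_episode(images, joint_angles, joint_velocities):
    """Convert a recorded reverse-disassembly episode into forward-assembly
    training data: reverse the time order of the image and joint-angle
    sequences, and reverse the joint-angular-velocity sequence while
    negating each value (a joint unscrewing at +v becomes one screwing
    in at -v when time runs backwards)."""
    rev_images = images[::-1]
    rev_angles = joint_angles[::-1]
    rev_velocities = [[-v for v in frame] for frame in joint_velocities[::-1]]
    return rev_images, rev_angles, rev_velocities
```

Note that joint angles are only reordered, not negated: the arm passes through the same configurations in either direction, whereas velocities are signed rates and must flip.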

Description

Remote operation and time reversal based robot assembly data generation device and method

Technical Field

The invention relates to the technical field of intelligent manufacturing and robot automation, in particular to a device and a method for generating robot assembly data based on teleoperation and time reversal.

Background

With the advancement of intelligent manufacturing technology, robot control based on imitation learning has become a key technical path to complex precision assembly tasks; its core is to train a policy model on a large amount of high-quality expert teaching data. Currently, data collection for robot assembly tasks mainly relies on operators using Virtual Reality (VR) equipment or a teach pendant to strictly follow the forward assembly process flow, i.e. controlling a mechanical arm through the full demonstration from grabbing and carrying to alignment and insertion. However, in precision hole-shaft assembly scenarios with micron-scale clearances, existing forward teaching data acquisition approaches face significant technical bottlenecks.

First, traditional forward teaching is limited by the operator's physiology and the feedback precision of the equipment. For precision assembly with micron-scale fit clearances, depth-perception errors in remote visual feedback and unavoidable physiological hand tremor make forward alignment and insertion under extremely narrow tolerances exceedingly difficult. This drives the success rate of forward teaching extremely low, and acquisition is frequently accompanied by high-frequency rigid collisions and jamming. As a result, the time cost of obtaining a single valid forward sample is very high, and the precision force sensor and workpiece are prone to physical damage, making the construction of large-scale forward assembly data sets expensive and inefficient.

Secondly, existing synthetic data generation methods struggle with the physical gap between simulation and reality. Although a physics simulation engine can quickly generate large-scale synthetic data, simulation modelling struggles to accurately reproduce non-rigid end-effector characteristics such as suction cup adsorption deformation and complex contact dynamics such as friction, air damping and micro-deformation. A policy model trained solely on simulation data therefore generalises poorly in real physical environments and cannot be migrated directly.

Finally, existing automated acquisition schemes lack efficient integration of multimodal data. Conventional visual servoing assembly schemes often ignore force feedback during assembly and fail to generate multimodal data sets containing vision, kinematics and dynamics for end-to-end learning. A method is therefore needed that generates robot assembly data with both large sample scale and high-fidelity physical characteristics at low cost and high efficiency, solving the problems of low data acquisition efficiency and poor sample quality in the prior art.

Disclosure of Invention

The aim of the invention: to solve the technical problems of low forward teaching success rate and difficult acquisition of high-fidelity data in micro-clearance precision assembly tasks, the invention provides a device and a method for generating robot assembly data based on teleoperation and time reversal, suitable for data acquisition in industrial precision assembly tasks and for training robot imitation learning policies. A VR teleoperation platform records the high-success-rate reverse disassembly process; a track time reversal technique converts the reverse trajectory into forward assembly kinematic data; this is combined with a small amount of forward expert teaching data containing real contact information, finally generating a robot assembly training data set with large sample scale and high-fidelity physical characteristics.

The technical scheme: the teleoperation and time reversal based robot assembly data generation device of the invention comprises a workbench, a multi-view camera, a mechanical arm, an electric suction cup, an assembly base, a part to be assembled, a user, VR glasses and a terminal processor. The terminal processor serves as the computation and storage centre for data generation: it maps the user's hand actions into motion instructions for the mechanical arm, synchronously collects and records multi-view images and the arm's kinematic and dynamic state data, and runs the time reversal and data fusion algorithms to generate the final assembly data set.
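One of the hand-to-robot mappings described above — deriving the suction cup's open/close command from the pinch gesture, by comparing the thumb-to-index fingertip Euclidean distance against a preset threshold (step 2 of the method) — is simple enough to sketch directly. This is an illustrative sketch, not the patent's implementation; the 3 cm threshold and the 1/0 high/low-level encoding are assumed values.

```python
import math

PINCH_THRESHOLD_M = 0.03  # assumed pinch threshold in metres; tune per user

def suction_command(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Return a high/low-level control value for the electric suction cup
    from the Euclidean distance between the thumb and index fingertip
    coordinates (x, y, z in metres): 1 = suction on, 0 = suction off."""
    dist = math.dist(thumb_tip, index_tip)
    return 1 if dist < threshold else 0
```

A pinched hand (fingertips ~1 cm apart) switches adsorption on; an open hand (fingertips ~10 cm apart) releases it. A real controller would likely add hysteresis around the threshold to avoid chattering when the distance hovers near it.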