CN-121544132-B - Auxiliary assembly ergonomic efficiency evaluation method and system based on hand trajectory in an augmented reality environment
Abstract
The invention discloses an auxiliary assembly ergonomic efficiency evaluation method and system based on hand trajectories in an augmented reality environment. The acquired three-dimensional coordinates of an operator's hand joint points are converted into a coordinate system with the operator's head as the origin to obtain the coordinates of the hand centroid, and the hand motion trajectory is divided into a plurality of discrete action segments based on the motion speed and the change of motion direction of the hand centroid at each moment. The operation visibility, reachability and posture load scores of each action segment of the operator, together with the overall posture load of the task, are then comprehensively calculated from hand motion feature indexes to perform real-time quantitative ergonomic evaluation. The invention can provide technical support for objective real-time evaluation of the ergonomics of assembly operations, analysis of action characteristics and optimization of process flows.
Inventors
- ZHANG LANYUN
- ZHAO ZERUI
Assignees
- 南京航空航天大学 (Nanjing University of Aeronautics and Astronautics)
Dates
- Publication Date
- 20260512
- Application Date
- 20260116
Claims (8)
- 1. An auxiliary assembly ergonomic efficiency evaluation method based on the hand trajectory in an augmented reality environment, characterized by comprising the following steps: collecting the three-dimensional coordinates of the head, the three-dimensional coordinates of the hand joint points and the triangular patches of object surfaces in the environment while an operator wearing augmented reality equipment performs an assembly task; converting the three-dimensional coordinates of the hand joint points into a coordinate system with the operator's head as the origin, and preprocessing them to obtain the coordinates of the hand centroid; calculating the motion speed and the change of motion direction of the hand centroid at each moment, and dividing the hand motion trajectory into a plurality of discrete action segments by setting a speed threshold and a direction-change angle threshold; for each action segment, calculating hand motion feature indexes including the eye-hand unit vector, the actual extension distance, the motion amplitude and the motion smoothness; based on the hand motion feature indexes, comprehensively calculating the operation visibility, reachability and posture load scores of each action segment of the operator and the overall posture load of the task, and carrying out real-time quantitative ergonomic evaluation, wherein the operation visibility is determined by counting, over the course of the operation, the proportion of time within an action segment during which the unit sight vector from the operator's eyes to the hand is occluded by triangular patches of environmental object surfaces, the reachability is determined by calculating the ratio between the actual extension distance and the operator's arm length, and the overall posture load of the task is determined by a weighted average of the posture load scores of all action segments, the durations of the corresponding action segments and preset weight factors, wherein
the posture load score of each action segment is determined by normalizing the motion amplitude, the actual extension distance and the motion smoothness features within the segment and then weighting them; the motion amplitude is determined by a weighted combination of the length of the hand motion trajectory $L_k$, the triaxial peak-to-peak values $R_{k,x}$, $R_{k,y}$, $R_{k,z}$ and the path volume $V_k$, wherein:
$L_k = \sum_{n=1}^{N_k-1} d(p_{k,n}, p_{k,n+1})$;
$R_{k,x} = \max_n x_{k,n} - \min_n x_{k,n}$, $R_{k,y} = \max_n y_{k,n} - \min_n y_{k,n}$, $R_{k,z} = \max_n z_{k,n} - \min_n z_{k,n}$;
$V_k = R_{k,x} \cdot R_{k,y} \cdot R_{k,z}$;
wherein $N_k$ represents the number of discrete points within action segment $k$; $p_{k,n}$ and $p_{k,n+1}$ respectively represent the preprocessed hand centroid coordinates at sampling times $n$ and $n+1$; $d(\cdot,\cdot)$ represents the Euclidean distance; $x_{k,n}$, $y_{k,n}$, $z_{k,n}$ represent the three coordinate components of the hand centroid in three dimensions at the $n$-th sampling time within action segment $k$; $\max_n x_{k,n}$, $\max_n y_{k,n}$, $\max_n z_{k,n}$ respectively represent their maximum values, and $\min_n x_{k,n}$, $\min_n y_{k,n}$, $\min_n z_{k,n}$ their minimum values; the overall posture load of the task $P$ is calculated as:
$P = \dfrac{\sum_{k=1}^{K} w_k T_k S_k}{\sum_{k=1}^{K} w_k T_k}$, with $S_k = S \cdot (\alpha_1 \hat{A}_k + \alpha_2 \hat{D}_k + \alpha_3 \hat{J}_k)$,
wherein $S_k$ represents the posture load score of action segment $k$, $T_k$ represents the duration of action segment $k$, $K$ represents the number of action segments, $w_k$ represents the fatigue weight of action segment $k$, $\alpha_1$, $\alpha_2$, $\alpha_3$ represent the weight coefficients, $\hat{A}_k$, $\hat{D}_k$, $\hat{J}_k$ represent the normalized sub-scores of the motion amplitude, the actual extension distance and the motion smoothness of action segment $k$, and $S$ represents the full-mark value.
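The weighted-average scheme of claim 1 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the weight coefficients, the fatigue weights and the full-mark value of 100 are all assumptions for the example.

```python
def segment_load_score(a_norm, d_norm, j_norm,
                       alpha=0.4, beta=0.3, gamma=0.3, full=100.0):
    """Posture load score of one action segment: a weighted combination
    of the normalized motion amplitude, actual extension distance and
    motion smoothness sub-scores, scaled by a full-mark value.
    The weights and full-mark value here are illustrative assumptions."""
    return full * (alpha * a_norm + beta * d_norm + gamma * j_norm)

def overall_posture_load(scores, durations, fatigue_weights):
    """Overall posture load of the task: the segment scores averaged with
    weights given by each segment's duration and fatigue weight."""
    num = sum(w * t * s for s, t, w in zip(scores, durations, fatigue_weights))
    den = sum(w * t for t, w in zip(durations, fatigue_weights))
    return num / den

# Two illustrative action segments with made-up normalized sub-scores.
scores = [segment_load_score(0.8, 0.5, 0.2),   # large-amplitude segment
          segment_load_score(0.3, 0.9, 0.6)]   # far-reach, jerky segment
P = overall_posture_load(scores, durations=[2.0, 1.0], fatigue_weights=[1.0, 1.5])
print(round(P, 2))  # overall load of the two-segment task
```

Longer or more fatiguing segments pull the overall load toward their own score, which matches the claim's intent of weighting by duration and a preset fatigue factor.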
- 2. The auxiliary assembly ergonomic efficiency evaluation method based on the hand trajectory in an augmented reality environment according to claim 1, wherein dividing the hand motion trajectory into a plurality of discrete action segments by setting a speed threshold and a direction-change angle threshold comprises: judging the motion speed and the change of motion direction of the hand centroid at each moment; if the current speed is smaller than the speed threshold, directly classifying the moment into a first state; if the current speed is not smaller than the speed threshold, judging whether the change of motion direction is smaller than the direction-change angle threshold, classifying the moment into a second state if it is, and into a third state if it is not; and taking each consecutive interval in the same state as one action segment.
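The three-state segmentation logic of claim 2 can be sketched as follows. The threshold values, function name and state encoding are illustrative assumptions; the patent fixes only the decision structure, not the numbers.

```python
import numpy as np

def segment_trajectory(centroids, dt, v_thresh=0.05, angle_thresh=np.deg2rad(45)):
    """Label each step as state 1 (speed below threshold), state 2 (moving,
    direction change below the angle threshold) or state 3 (abrupt direction
    change), then group consecutive equal states into action segments.
    Returns a list of (state, start_index, end_index) tuples."""
    vel = np.diff(centroids, axis=0) / dt          # per-step velocity vectors
    speed = np.linalg.norm(vel, axis=1)
    states = []
    for i, v in enumerate(speed):
        if v < v_thresh:
            states.append(1)                       # first state: near-still
        elif i == 0:
            states.append(2)                       # no previous step to compare
        else:                                      # angle between successive steps
            cosang = np.dot(vel[i], vel[i - 1]) / (
                np.linalg.norm(vel[i]) * np.linalg.norm(vel[i - 1]) + 1e-12)
            ang = np.arccos(np.clip(cosang, -1.0, 1.0))
            states.append(2 if ang < angle_thresh else 3)
    segments, start = [], 0                        # group equal consecutive states
    for i in range(1, len(states)):
        if states[i] != states[start]:
            segments.append((states[start], start, i - 1))
            start = i
    segments.append((states[start], start, len(states) - 1))
    return segments

# A straight, steady reach collapses into a single smooth-motion segment.
pts = np.array([[0, 0, 0], [0.1, 0, 0], [0.2, 0, 0], [0.3, 0, 0]])
print(segment_trajectory(pts, dt=0.1))
```

Tuning `v_thresh` and `angle_thresh` trades off how aggressively dwell periods and direction reversals are split out of the trajectory.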
- 3. The auxiliary assembly ergonomic evaluation method based on hand trajectories in an augmented reality environment according to claim 1, wherein the eye-hand unit vector is obtained by normalizing the hand centroid coordinates: the eye-hand unit vector at the $n$-th sampling time within action segment $k$ is $\hat{g}_{k,n} = \dfrac{p_{k,n}}{\|p_{k,n}\|}$, wherein $p_{k,n}$ represents the preprocessed hand centroid coordinates at that sampling time, expressed in the coordinate system with the operator's head as the origin, and $\|\cdot\|$ represents the modulus of the vector.
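Claim 1 derives operation visibility from how often this eye-to-hand sight line is blocked by the triangular patches of the environment. A minimal sketch using the standard Möller-Trumbore intersection test, restricted to hits strictly between eye and hand; the function names are my own, not taken from the patent:

```python
import numpy as np

def segment_blocked(eye, hand, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore test: does the straight segment from `eye` to
    `hand` pass through the triangle (v0, v1, v2)?"""
    d = hand - eye                       # unnormalized sight vector
    e1, e2 = v1 - v0, v2 - v0
    h = np.cross(d, e2)
    a = np.dot(e1, h)
    if abs(a) < eps:                     # sight line parallel to the triangle
        return False
    f = 1.0 / a
    s = eye - v0
    u = f * np.dot(s, h)
    if u < 0.0 or u > 1.0:               # hit outside the triangle (u axis)
        return False
    q = np.cross(s, e1)
    v = f * np.dot(d, q)
    if v < 0.0 or u + v > 1.0:           # hit outside the triangle (v axis)
        return False
    t = f * np.dot(e2, q)
    return eps < t < 1.0                 # hit strictly between eye and hand

def visibility(eyes, hands, triangles):
    """Operation visibility over an action segment: fraction of sampled
    moments at which no environment triangle occludes the sight line."""
    clear = sum(not any(segment_blocked(e, h, *tri) for tri in triangles)
                for e, h in zip(eyes, hands))
    return clear / len(eyes)
```

In the head-origin frame of the claims, `eye` would simply be the origin and `hand` the centroid coordinate $p_{k,n}$; the sketch keeps both explicit so it also works in a world frame.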
- 4. The method of claim 1, wherein the actual extension distance is determined by the euclidean distance between the coordinates of the centroid of the hand and the coordinates of the shoulder joint estimated from the height of the operator.
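The reachability ratio of claims 1 and 4 reduces to a one-line computation once the shoulder position has been estimated. In this sketch the shoulder coordinate is passed in directly rather than derived from the operator's height, and the function name is an assumption:

```python
import math

def reachability(hand_centroid, shoulder, arm_length):
    """Ratio of the actual extension distance (Euclidean distance from the
    hand centroid to the estimated shoulder joint) to the arm length;
    values approaching 1 indicate a fully extended, harder reach."""
    return math.dist(hand_centroid, shoulder) / arm_length

# Hypothetical sample: hand 0.59 m from the shoulder, 0.70 m arm.
print(reachability((0.5, -0.1, 0.3), (0.0, -0.2, 0.0), 0.70))
```

Any anthropometric model mapping stature to shoulder position can be plugged in upstream to supply the `shoulder` argument.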
- 5. The method of claim 1, wherein the motion smoothness is determined by calculating the integral of the squared jerk of the hand centroid during motion.
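The squared-jerk integral of claim 5 can be approximated from the sampled centroid trajectory with finite differences. A minimal sketch, assuming uniform sampling; the function name is my own:

```python
import numpy as np

def jerk_cost(centroids, dt):
    """Approximate integral of squared jerk (third time derivative of the
    hand centroid position) over a segment. Lower values indicate smoother
    motion, so the smoothness sub-score would be a decreasing function of
    this cost. Assumes uniformly sampled (N, 3) centroid coordinates."""
    jerk = np.diff(centroids, n=3, axis=0) / dt ** 3   # finite-difference jerk
    return float(np.sum(jerk ** 2) * dt)               # Riemann sum of |jerk|^2

# Constant-velocity motion has zero jerk, hence zero cost.
pts = np.array([[0., 0., 0.], [1., 0., 0.], [2., 0., 0.], [3., 0., 0.]])
print(jerk_cost(pts, dt=0.1))
```

In practice the cost is often made dimensionless (e.g. scaled by segment duration and amplitude) before normalization, so that short and long reaches are comparable.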
- 6. The method of claim 1, further comprising displaying the operation visibility, reachability and posture load scores of each action segment of the operator as a time profile.
- 7. An auxiliary assembly ergonomic evaluation system based on hand trajectories in an augmented reality environment, for implementing the auxiliary assembly ergonomic evaluation method based on hand trajectories in an augmented reality environment according to any one of claims 1-6, the system comprising: a data acquisition module for acquiring the three-dimensional coordinates of the head, the three-dimensional coordinates of the hand joint points and the triangular patches of object surfaces in the environment while an operator wearing augmented reality equipment performs an assembly task; a data preprocessing module for converting the three-dimensional coordinates of the hand joint points into a coordinate system with the operator's head as the origin, and preprocessing them to obtain the coordinates of the hand centroid; a motion segmentation module for calculating the motion speed and the change of motion direction of the hand centroid at each moment and dividing the hand motion trajectory into a plurality of discrete action segments by setting a speed threshold and a direction-change angle threshold; a hand trajectory analysis module for calculating, for each action segment, hand motion feature indexes including the eye-hand unit vector, the actual extension distance, the motion amplitude and the motion smoothness; and an ergonomic evaluation module for comprehensively calculating, based on the hand motion feature indexes, the operation visibility, reachability and posture load scores of each action segment of the operator and the overall posture load of the task, and carrying out real-time quantitative ergonomic evaluation, wherein the operation visibility is determined by counting, over the course of the operation, the proportion of time within an action segment during which the unit sight vector from the operator's eyes to the hand is occluded by triangular patches of environmental object surfaces, the reachability is determined by calculating the ratio between the actual extension distance and the operator's arm length, and the overall posture load of the task is determined by a weighted average of the posture load scores of all action segments, the durations of the corresponding action segments and preset weight factors, wherein the posture load score of each action segment is determined by normalizing the motion amplitude, the actual extension distance and the motion smoothness features within the segment and then weighting them.
- 8. A computer program product comprising a computer program, characterized in that the computer program when executed by a processor implements the steps of the assisted assembly ergonomics assessment method based on hand trajectories in an augmented reality environment according to any one of claims 1-6.
Description
Auxiliary assembly ergonomic efficiency evaluation method and system based on hand trajectory in an augmented reality environment

Technical Field

The invention belongs to the technical field of digital assembly integrating intelligent manufacturing and ergonomics, relates to augmented reality (AR) and ergonomic efficiency evaluation, and particularly relates to an auxiliary assembly ergonomic efficiency evaluation method and system based on hand trajectories in an AR environment.

Background

In recent years, with the rapid development of AR technology, human-machine interaction has gradually expanded from two-dimensional interfaces to three-dimensional space, and the interaction requirements of scenarios such as assembly operation, virtual manufacturing and intelligent training continue to rise. In complex industrial assembly processes, accurately identifying operator actions and evaluating assembly work efficiency in real time in an environment that fuses the virtual and the real has become an important research direction in intelligent manufacturing and human factors engineering. Rapid Upper Limb Assessment (RULA) and Hand Activity Level (HAL) are two widely used ergonomic assessment methods, mainly for assessing the risk of musculoskeletal disorders, focusing respectively on upper limb posture and high-intensity manual tasks. RULA is often automated in conjunction with sensing techniques, for example using Kinect-based systems to estimate the operator's posture, enabling non-invasive ergonomic assessment. In recent years, researchers have further used the Kinect v2 sensor to achieve real-time ergonomic evaluation and automatic RULA scoring, providing a low-cost approach to evaluating working postures.
However, these methods still have limitations: excessive reliance on expert scoring is not only time-consuming and laborious but also susceptible to inter-rater variability. HAL quantifies ergonomic risk based on task frequency and cycle time, but likewise relies on subjective evaluation, making consistency and standardization of assessments difficult. In addition, perception systems such as Kinect have certain defects, such as occlusion problems, insufficient precision in dynamic environments and difficulty in tracking complex actions, which weaken the accuracy and robustness of work efficiency evaluation.

Disclosure of Invention

Aiming at the defects of the prior art, the invention aims to provide an auxiliary assembly ergonomic efficiency evaluation method and system based on hand trajectories in an augmented reality environment, which can realize real-time dynamic quantification of three core ergonomic indexes, namely visibility, reachability and posture load, in an AR environment through a non-invasive data acquisition mode integrated in the augmented reality system, thereby realizing accurate quantitative ergonomic evaluation of the manual assembly process in an AR environment and providing reliable technical support for process flow optimization.
To achieve the above purpose, the technical scheme adopted is an auxiliary assembly ergonomic efficiency evaluation method based on the hand trajectory in an augmented reality environment, comprising the following steps: collecting the three-dimensional coordinates of the head, the three-dimensional coordinates of the hand joint points and the triangular patches of object surfaces in the environment while an operator wearing augmented reality equipment performs an assembly task; converting the three-dimensional coordinates of the hand joint points into a coordinate system with the operator's head as the origin, and preprocessing them to obtain the coordinates of the hand centroid; calculating the motion speed and the change of motion direction of the hand centroid at each moment, and dividing the hand motion trajectory into a plurality of discrete action segments by setting a speed threshold and a direction-change angle threshold; for each action segment, calculating hand motion feature indexes including the eye-hand unit vector, the actual extension distance, the motion amplitude and the motion smoothness; based on the hand motion feature indexes, comprehensively calculating the operation visibility, reachability and posture load scores of each action segment of the operator and the overall posture load of the task, and carrying out real-time quantitative ergonomic evaluation, wherein the operation visibility is determined by counting, over the course of the operation, the proportion of time within an action segment during which the unit sight vector from the operator's eyes to the hand is occluded by triangular patches of environmental object surfaces, and the reachability is determined by calculating the ratio between the actual extension dist