US-12623344-B2 - Method for robot assisted multi-view 3D scanning measurement based on path planning

US12623344B2

Abstract

The present invention provides a method for robot assisted multi-view 3D scanning measurement based on path planning. Firstly, establishing a virtual simulation platform to complete the setting of measurement poses and measurement paths and to perform path evaluations of the measurement paths, which improves the measurement efficiency and guarantees the measurement safety. Then, completing a preliminary hand-eye calibration based on the properties of the Kronecker product, which reduces the system noise in the process of calibration and the error influence caused by the calculation; the preliminary hand-eye calibration is optimized by establishing a reprojection error cost function as the fitness function of a particle swarm optimization algorithm, which further improves the accuracy of hand-eye calibration and guarantees the registration quality of each single-view point cloud. Lastly, moving the robot in turn to the measurement poses of the planned measurement paths, obtaining a single-view point cloud of the measured object and transforming it from the camera coordinate system to the robot base coordinate system based on the optimized hand-eye matrix to obtain a registered single-view point cloud. When the registered single-view point clouds of all measurement poses are obtained, the points in the robot base coordinate system form a complete point cloud of the measured object, and the 3D measurement is completed.

Inventors

  • Chun Yin
  • Yan Gao
  • Zhongbao YAN
  • Kai Chen
  • Yuhua Cheng
  • Xutong TAN
  • Junyang LIU

Assignees

  • UNIVERSITY OF ELECTRONIC SCIENCE AND TECHNOLOGY OF CHINA

Dates

Publication Date
2026-05-12
Application Date
2023-04-28
Priority Date
2023-03-17

Claims (1)

  1. A method comprising:

step 1: establishing a virtual simulation platform

step 1.1: based on a real measurement environment, establishing the virtual simulation platform on a computer with an open source robot operating system (ROS), importing a description file of a real robot of the real measurement environment into the virtual simulation platform to create a virtual robot in the virtual simulation platform, the virtual robot having the same links, joints, kinematics, and dynamic parameters as the real robot, and on the virtual simulation platform, installing virtual structured light measuring equipment and a virtual depth camera that correspond to real structured light measuring equipment and a binocular camera in the real measurement environment on a virtual fixture of the virtual robot's end flange;

step 1.2: simulating a measured object to create a simulated object and placing the simulated object right in front of the virtual robot and the virtual structured light measuring equipment, determining multiple measurement surfaces of the simulated object and multiple measurement paths on each measurement surface for image taking of the simulated object by the virtual structured light measuring equipment, wherein the i-th measurement path of the k-th measurement surface is denoted by ^kS_i, k=1, 2, …, K, i=1, 2, …, M_k, K is the number of measurement surfaces, and M_k is the number of measurement paths of the k-th measurement surface, and wherein the j-th measurement point, namely measurement pose, of the i-th measurement path of the k-th measurement surface is denoted by ^kP_ij, j=1, 2, …, N_k, where N_k is the number of measurement poses of a measurement path of the k-th measurement surface; and a view field of the virtual structured light measuring equipment is a rectangle with a length of m cm and a width of n cm, an outer rectangle of the k-th measurement surface is ^ka in length and ^kb in width, the measurement paths are vertical strips along a length, which satisfy the following constraints: any two adjacent measurement paths have a strip of common area, the number M_k of measurement paths of the k-th measurement surface is greater than or equal to ^ka/m, the number N_k of the measurement poses of a measurement path of the k-th measurement surface is greater than or equal to ^kb/n, and the j-th measurement pose ^kP_ij of the i-th measurement path of the k-th measurement surface is obtained according to the constraints;

step 2: planning measurement paths on the virtual simulation platform

step 2.1: on the virtual simulation platform, for the i-th measurement path of the k-th measurement surface, dragging an arm end of the virtual robot to a measurement pose ^kP_ij, wherein the measurement pose ^kP_ij is (^kPx_ij, ^kPy_ij, ^kPz_ij, ^kRx_ij, ^kRy_ij, ^kRz_ij), (^kPx_ij, ^kPy_ij, ^kPz_ij) being a position coordinate of the arm end of the virtual robot, and (^kRx_ij, ^kRy_ij, ^kRz_ij) being an attitude coordinate of the arm end of the virtual robot;

step 2.2: in sequence: taking an RGB-D image of the simulated object at the measurement pose ^kP_ij by the virtual depth camera and converting the RGB-D image into a grayscale image, filtering out a background of the RGB-D image by using threshold segmentation and extracting a contour from the RGB-D image to obtain a minimum circumscribed rectangle of the simulated object, traversing the depths of all pixels of the RGB-D image within the minimum circumscribed rectangle to find a shortest distance d_min between the simulated object and the virtual structured light measuring equipment and recording the pixel coordinate (u,v) that corresponds to the shortest distance d_min on the RGB-D image, judging whether the shortest distance d_min satisfies a measurement range of the virtual structured light measuring equipment using the condition d_min ∈ [D−δ, D+δ], wherein D is a focal length of the virtual structured light measuring equipment for camera calibration and δ is an allowable measurement error range of the virtual structured light measuring equipment;

step 2.3: in response to judging that the shortest distance d_min does not satisfy the measurement range of the virtual structured light measuring equipment, in sequence: obtaining a corresponding spatial coordinate (x_(u,v), y_(u,v), z_(u,v)) of the pixel coordinate (u,v) through a coordinate transformation, determining a spatial line according to the spatial coordinate (x_(u,v), y_(u,v), z_(u,v)) and the position coordinate (^kPx_ij, ^kPy_ij, ^kPz_ij) of the arm end of the virtual robot using the equation:

(x − ^kPx_ij)/(x_(u,v) − ^kPx_ij) = (y − ^kPy_ij)/(y_(u,v) − ^kPy_ij) = (z − ^kPz_ij)/(z_(u,v) − ^kPz_ij)

wherein (x,y,z) is a spatial coordinate on the spatial line; finding, starting from the position coordinate (^kPx_ij, ^kPy_ij, ^kPz_ij), a position coordinate (^kp̃x_ij, ^kp̃y_ij, ^kp̃z_ij) on the spatial line which satisfies the following measurement conditions:

d̃_min = √((^kp̃x_ij − x_(u,v))² + (^kp̃y_ij − y_(u,v))² + (^kp̃z_ij − z_(u,v))²), d̃_min ∈ [D−δ, D+δ],

and combining the position coordinate (^kp̃x_ij, ^kp̃y_ij, ^kp̃z_ij) with the attitude coordinate (^kRx_ij, ^kRy_ij, ^kRz_ij) to form a measurement pose (^kp̃x_ij, ^kp̃y_ij, ^kp̃z_ij, ^kRx_ij, ^kRy_ij, ^kRz_ij) and updating the measurement pose ^kP_ij with the measurement pose (^kp̃x_ij, ^kp̃y_ij, ^kp̃z_ij, ^kRx_ij, ^kRy_ij, ^kRz_ij);

step 2.4: in response to judging that the shortest distance d_min satisfies the measurement range of the virtual structured light measuring equipment, recording the measurement pose ^kP_ij and going to step 2.1 for judging a next measurement pose ^kP_i(j+1) until the shortest distances of all measurement poses of the measurement path ^kS_i are judged;

step 2.5: in response to the shortest distances of all measurement poses of the measurement path ^kS_i being judged, evaluating the measurement path ^kS_i:

step 2.5.1: executing the measurement path ^kS_i on the virtual simulation platform, the virtual robot moving continuously from the measurement pose ^kP_i1 to the measurement pose ^kP_iN_k, sampling and recording a position ^kṖ_it at equal time intervals during the movement, wherein all positions ^kṖ_it, t=0, 1, …, T, form a motion path ^kṠ_i, t is a serial number of sampling, and T is the number of samplings; calculating a shortest distance l_1 between a starting position (^kPx_i1, ^kPy_i1, ^kPz_i1) and an ending position (^kPx_iN_k, ^kPy_iN_k, ^kPz_iN_k) of the measurement path ^kS_i using the equation:

l_1 = √((^kPx_i1 − ^kPx_iN_k)² + (^kPy_i1 − ^kPy_iN_k)² + (^kPz_i1 − ^kPz_iN_k)²)

calculating a motion distance between the starting position (^kṖx_i1, ^kṖy_i1, ^kṖz_i1) and the ending position (^kṖx_iT, ^kṖy_iT, ^kṖz_iT) of the motion path ^kṠ_i using the equation:

l_2 = Σ_{t=1}^{T−1} √((^kṖx_i(t+1) − ^kṖx_it)² + (^kṖy_i(t+1) − ^kṖy_it)² + (^kṖz_i(t+1) − ^kṖz_it)²),

and obtaining a distance difference L, wherein L = l_2 − l_1, and creating an evaluation function ƒ_1(L), wherein ƒ_1(L) = (δ_1 − L)/δ_1, 0 ≤ L ≤ δ_1, and δ_1 is a maximum error threshold;

step 2.5.2: in sequence: finding the position ^kṖ_iw on the motion path ^kṠ_i which has a shortest distance to a barycenter of the simulated object, wherein the shortest distance is denoted by l_3, and creating an evaluation function ƒ_2(l_3), wherein ƒ_2(l_3) = (l_3 − δ_2)/l_3, and δ_2 is a shortest distance which guarantees no collision between the virtual structured light measuring equipment and the simulated object;

step 2.5.3: in sequence: creating an overall evaluation function ^kF_i for the measurement path ^kS_i, wherein ^kF_i = (0.5ƒ_1(L) + 0.5ƒ_2(l_3))×100, and determining that the measurement path ^kS_i has passed an evaluation according to the overall evaluation function ^kF_i when ^kF_i > g, wherein g is a threshold which is determined by a measurement operator according to an actual measurement scenario, and 0 < g < 100;

step 2.6: in response to the measurement path ^kS_i not passing the evaluation, in sequence: traversing all measurement poses of the measurement path ^kS_i to find the measurement pose ^kP_is which has a shortest distance to the position ^kṖ_iw, moving the virtual robot to the measurement pose ^kP_is, dragging the arm end of the virtual robot to manually increase the shortest distance d_min between the simulated object and the virtual structured light measuring equipment, wherein the increased shortest distance d_min satisfies the measurement range of the virtual structured light measuring equipment: d_min ∈ [D−δ, D+δ], replacing the measurement pose ^kP_is with the measurement pose after the increase of the shortest distance d_min to complete a correction of the measurement path ^kS_i, and going to step 2.7;

step 2.7: in response to the measurement path ^kS_i passing the evaluation,
for each measurement path of each measurement surface, performing step 2.1 to step 2.6 to complete the simulated planning of the measurement paths, converting the planned measurement paths into communication messages that are recognized by the real robot, and sending the communication messages from the virtual simulation platform to the real robot in the real measurement environment;

step 3: measuring in the real measurement environment

step 3.1: establishing a hand-eye calibration equation, in sequence: in the real measurement environment, installing the real structured light measuring equipment on a fixture of an end flange of the real robot, operating the real robot to move to an i′-th measurement pose and taking an image of a corner of a checkerboard, namely a calibration board, to obtain a calibration board image, recording the pose information of the real robot and the position P_i′ of the corner in the camera coordinate system, and for n′ measurement poses, obtaining n′ calibration board images and their corresponding pose information of the real robot and positions P_i′ of the corner in the camera coordinate system; in sequence: obtaining a rotation matrix R_ci′ and a translation vector T_ci′ of the calibration board relative to the binocular camera of the real structured light measuring equipment in each calibration board image according to Zhang's calibration method, combining the rotation matrix R_ci′ and the translation vector T_ci′ into an extrinsic parameter matrix H_ci′, i′=1, 2, …, n′, obtaining the rotation matrix R_gi′ and the translation vector T_gi′ of the end flange of the real robot relative to a base of the real robot according to the measurement pose, and combining the rotation matrix R_gi′ and the translation vector T_gi′ into a robot pose matrix H_gi′, i′=1, 2, …, n′, where:

H_ci′ = [R_ci′, T_ci′; 0, 1],  H_gi′ = [R_gi′, T_gi′; 0, 1];

establishing a hand-eye calibration equation based on the extrinsic parameter matrices H_cu′, H_cv′ and robot pose matrices H_gu′, H_gv′ of any two measurement poses:

[H_gv′⁻¹ H_gu′] H_cg = H_cg [H_cv′ H_cu′⁻¹]

wherein H_cg is a hand-eye matrix, u′, v′ are the serial numbers of any two measurement poses, u′ ≠ v′, and:

[H_gv′⁻¹ H_gu′] = [R_gu′,v′, T_gu′,v′; 0, 1],  H_cg = [R_cg, T_cg; 0, 1],  [H_cv′ H_cu′⁻¹] = [R_cu′,v′, T_cu′,v′; 0, 1];

wherein R_gu′,v′ is the rotation matrix of the matrix [H_gv′⁻¹ H_gu′], T_gu′,v′ is the translation vector of the matrix [H_gv′⁻¹ H_gu′], R_cg is the rotation matrix of the hand-eye matrix H_cg, T_cg is the translation vector of the hand-eye matrix H_cg, R_cu′,v′ is the rotation matrix of the matrix [H_cv′ H_cu′⁻¹], and T_cu′,v′ is the translation vector of the matrix [H_cv′ H_cu′⁻¹];

step 3.2: based on properties of the Kronecker product, transforming the hand-eye calibration equation into a least squares problem and calculating the hand-eye matrix H_cg by using singular value decomposition: establishing a linear equation set

(R_gu′,v′ ⊗ I − I ⊗ R_cu′,v′ᵀ)·vec(R_cg) = 0,

wherein I is a unit matrix, ⊗ is the Kronecker product operator, and vec is the vectorization operator; stacking the matrices (R_gu′,v′ ⊗ I − I ⊗ R_cu′,v′ᵀ) of all pairs of measurement poses by column to obtain a matrix R; performing a singular value decomposition on the matrix R to obtain a matrix V, namely the right singular matrix of the matrix R, and taking the 9 elements of the last column of the matrix V to revert to a 3×3 matrix R̄_cg; performing a singular value decomposition on the matrix R̄_cg, i.e. R̄_cg = U_R Σ_R V_Rᵀ, wherein U_R is the left singular matrix of the matrix R̄_cg, Σ_R is the singular value matrix of the matrix R̄_cg, and V_R is the right singular matrix of the matrix R̄_cg, then obtaining the rotation matrix R_cg of the hand-eye matrix H_cg: R_cg = U_R V_Rᵀ; stacking the matrices (R_gu′,v′ − I) of all pairs of measurement poses by column to obtain a matrix R_g, stacking the matrices (R_cg T_cu′,v′ − T_gu′,v′) of all pairs of measurement poses by column to obtain a matrix T′, and calculating the translation vector of the hand-eye matrix H_cg: T_cg = R_g⁻¹ T′;

step 3.3: based on minimizing a reprojection error, optimizing the hand-eye matrix H_cg by using a particle swarm optimization algorithm:

step 3.3.1: creating a particle swarm with a population size of K′, wherein the position and the velocity of the k′-th particle are denoted by p_k′ and v_k′, respectively, then initializing the positions p_k′ and the velocities v_k′ of all K′ particles:

p_k′ = H_cg, v_k′ = rand(v_min, v_max)

wherein k′=1, 2, …, K′, v_min and v_max are the lower and upper velocity thresholds at each iteration of a particle, and rand( ) is a random function;

step 3.3.2: establishing a reprojection error cost function as the fitness function of the particle swarm optimization algorithm:

φ(p_k′) = Σ_{i′=1, j′≠i′}^{n′} ‖P_i′ − (1/s_i′) K* exp[p_k′ H_gj′ H_gi′⁻¹ p_k′⁻¹]_3×4 P_j′‖₂²

wherein s_i′ is the scale factor of the i′-th measurement pose, K* is the intrinsic parameter matrix of the binocular camera, exp[·]_3×4 is an operator that takes the 3×4 submatrix in the upper left corner of the matrix in the square bracket, ‖·‖₂² is the squared 2-norm operator, and P_j′ is the position of the corner of the checkerboard of any non-i′-th measurement pose in the camera coordinate system;

step 3.3.3: according to the fitness function φ(p_k′), finding the position of the current individual extremum min{φ(p_k′), φ(p_k′*)} of each particle and taking it as a historical optimal position p_k′*, and finding the position of the global extremum min{φ(p_k′), k′=1, 2, …, K′}, namely the position of the particle with the minimum value of the fitness functions, and taking it as a global optimal position g*:

p_k′* ← min{φ(p_k′), φ(p_k′*)}
g* ← min{φ(p_k′), k′=1, 2, …, K′}

step 3.3.4: updating the velocity v_k′ and position p_k′ of each particle:

v_k′ = ω v_k′ + c_1 rand(0,1)×(p_k′* − p_k′) + c_2 rand(0,1)×(g* − p_k′)
p_k′ = p_k′ + v_k′

wherein ω is an inertia factor and c_1 and c_2 are acceleration factors; returning to step 3.3.3 until a termination condition is reached;

step 3.3.5: taking the global optimal position g* as an optimized hand-eye matrix, which is denoted by H_cg^revised;

step 3.4: performing a 3D measurement

step 3.4.1: adjusting the binocular camera of the real structured light measuring equipment so that it clearly captures the measured object, with the left and right cameras kept in the same horizontal position at a certain distance, thereby completing binocular calibration; adjusting the real robot so that it carries the real structured light measuring equipment to perform the 3D measurement and to guarantee that the full view of the measured object will be taken;

step 3.4.2: after receiving the planned measurement paths from the virtual simulation platform, moving the real robot in turn to the measurement poses of the planned measurement paths;

step 3.4.3: in sequence: for a measurement pose ^kP_ij, calculating a pose matrix ^kH_ij^g from the coordinate system of the end flange of the real robot to the real robot base coordinate system according to the measurement pose ^kP_ij, then obtaining a rigid body transformation matrix:

^kH_ij^pcl = ^kH_ij^g · H_cg^revised

obtaining a single-view point cloud ^kI_ij of the measured object through the binocular camera of the real structured light measuring equipment and transforming the single-view point cloud ^kI_ij of the measured object from the camera coordinate system to the real robot base coordinate system to obtain a registered single-view point cloud ^kI_ij^homo:

^kI_ij^homo = ^kH_ij^pcl · ^kI_ij

wherein when the registered single-view point clouds of all measurement poses are obtained, the points in the real robot base coordinate system form a complete point cloud of the measured object, and the 3D measurement is completed.
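The path evaluation of steps 2.5.1 to 2.5.3 combines a straightness score ƒ_1 and a clearance score ƒ_2 into an overall score ^kF_i. A minimal numpy sketch follows; the function name, the sampled-path representation, and the default thresholds δ_1, δ_2, g are illustrative assumptions, not values taken from the claim:

```python
import numpy as np

def evaluate_path(samples, start, end, barycenter,
                  delta1=0.05, delta2=0.10, g=80.0):
    """Score a simulated measurement path (sketch of claim steps 2.5.1-2.5.3).

    samples: (T, 3) positions sampled at equal time intervals along the
    executed path.  delta1: maximum allowed excess path length, delta2:
    minimum safe clearance to the object, g: pass threshold in (0, 100).
    """
    samples = np.asarray(samples, float)
    # l1: straight-line distance between the path's endpoints.
    l1 = np.linalg.norm(np.asarray(end, float) - np.asarray(start, float))
    # l2: actual motion distance, summed over consecutive samples.
    l2 = np.linalg.norm(np.diff(samples, axis=0), axis=1).sum()
    L = l2 - l1
    f1 = (delta1 - L) / delta1                  # straightness score
    # l3: shortest distance from the motion path to the object barycenter.
    dists = np.linalg.norm(samples - np.asarray(barycenter, float), axis=1)
    w = int(np.argmin(dists))                   # index of the closest sample
    l3 = dists[w]
    f2 = (l3 - delta2) / l3                     # clearance score
    F = (0.5 * f1 + 0.5 * f2) * 100.0           # overall evaluation
    return F, F > g, w
```

A perfectly straight path with ample clearance scores close to 100; the returned index `w` corresponds to the position ^kṖ_iw used in the correction of step 2.6.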
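The linear hand-eye solve of step 3.2 can be implemented compactly. Below is a minimal numpy sketch (the function names and the synthetic rotation helper are mine, not from the patent): it stacks the Kronecker-product constraints of all pose pairs, extracts vec(R_cg) as the null-space direction of the stacked matrix via SVD, projects the reshaped 3×3 matrix back onto the rotation group, and solves the stacked least-squares system for the translation. Note that the claim's equation (R_g ⊗ I − I ⊗ R_cᵀ)·vec(R_cg) = 0 corresponds to row-major vectorization, which matches numpy's default `flatten`/`reshape` order:

```python
import numpy as np

def axis_angle(axis, ang):
    """Rotation matrix from an axis-angle pair (Rodrigues' formula)."""
    a = np.asarray(axis, float)
    a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(ang) * K + (1 - np.cos(ang)) * (K @ K)

def solve_hand_eye(A_list, B_list):
    """Linear solve of A X = X B (sketch of claim step 3.2).

    A_list: robot flange motions,  A = Hg_v^-1 @ Hg_u  (4x4 each)
    B_list: camera motions,        B = Hc_v @ Hc_u^-1  (4x4 each)
    Returns the 4x4 hand-eye matrix X = H_cg.
    """
    I3 = np.eye(3)
    # Stack (R_A kron I - I kron R_B^T) for every pose pair.
    M = np.vstack([np.kron(A[:3, :3], I3) - np.kron(I3, B[:3, :3].T)
                   for A, B in zip(A_list, B_list)])
    # vec(R_X) spans the null space: last right-singular vector of M.
    _, _, Vt = np.linalg.svd(M)
    Rx = Vt[-1].reshape(3, 3)          # row-major reshape
    # Project onto SO(3): Rx = U_R V_R^T, fixing the sign so det = +1.
    U, _, VtR = np.linalg.svd(Rx)
    Rx = U @ VtR
    if np.linalg.det(Rx) < 0:
        Rx = -Rx
    # Translation: (R_A - I) t = R_X t_B - t_A, stacked least squares.
    C = np.vstack([A[:3, :3] - I3 for A in A_list])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(A_list, B_list)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, t
    return X
```

With noise-free synthetic motions satisfying B = X⁻¹AX and at least two non-parallel rotation axes, the true hand-eye matrix is recovered to numerical precision; with real calibration data this linear estimate is the starting point that the patent's particle swarm refinement (step 3.3) then improves.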
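The registration of step 3.4.3 is one rigid-body transform per view: compose the flange-to-base pose with the optimized hand-eye matrix and apply the result to the homogeneous point coordinates. A minimal numpy sketch (the function name is mine):

```python
import numpy as np

def register_single_view(points_cam, H_g, H_cg):
    """Transform one single-view cloud into the robot base frame (step 3.4.3).

    points_cam: (N, 3) points in the camera coordinate system.
    H_g:  4x4 flange-to-base pose matrix at the current measurement pose.
    H_cg: 4x4 optimized hand-eye matrix (camera frame -> flange frame).
    Returns the (N, 3) registered points in the robot base frame.
    """
    H_pcl = H_g @ H_cg                                   # camera -> base
    pts_h = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (H_pcl @ pts_h.T).T[:, :3]                    # drop the 1s
```

Concatenating the outputs over all measurement poses yields the complete point cloud of the measured object in the robot base coordinate system.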

Description

FIELD OF THE INVENTION

This application claims priority under the Paris Convention to Chinese Patent Application No. 202310261035.X, filed on Mar. 14, 2023, and Chinese Patent Application No. 202310297593.1, filed on Mar. 24, 2023, the entireties of which are hereby incorporated by reference for all purposes as if fully set forth herein.

The present invention relates to the field of three-dimensional (3D) scanning measurement, and more particularly to a method for robot assisted multi-view 3D scanning measurement based on path planning.

BACKGROUND OF THE INVENTION

With the modernization of the machinery manufacturing industry, higher requirements are put forward for the dimensional measurement of complex mechanical components. For a large and complex mechanical component, 3D measuring equipment such as three-coordinate measuring equipment or a laser scanner is used in traditional 3D measurement, which is not only costly but also inefficient. In addition, the space around some parts of a large and complex component may be limited, making them difficult for the field of view of the measuring equipment to reach, so that the shape measurement, namely the 3D measurement, cannot be completed. The general solution for the 3D measurement of a large and complex mechanical component is to move structured light measuring equipment to the pose corresponding to each angle of view, obtain multiple single-view point clouds of the component from multiple angles of view, and register the multiple single-view point clouds to make the point cloud of the component complete.
Three-dimensional reconstruction technology based on an industrial robot uses the robot as a motion carrier and determines the pose relationship among the unit measurement coordinate systems of the multiple angles of view through the constraints between the coordinate systems of the robot's axes. This retains the characteristics of visual measurement technology, such as non-contact operation and high speed, while the fast and flexible character of the robot enhances the flexibility of the entire measurement system. In a factory, the traditional method of generating a measurement path for a robot is usually manual teaching: the measurement operator selects the path points one by one for a 3D measurement, which requires a high technical level from the measurement operator and a high familiarity with the field working environment. Because the accuracy of a single-view point cloud is greatly affected by the measurement distance of the structured light measuring equipment, the measurement operator needs to manually measure the distance between the structured light measuring equipment and the component and judge whether it meets the shooting distance requirements. However, in some measurement conditions, for example in a high temperature environment, it is difficult to obtain this distance, and ensuring that the distance between the structured light measuring equipment and the component meets the requirement at each measurement pose becomes a problem. It is therefore necessary to adopt an effective method that reduces the difficulty of operating a robot in 3D measurement and obtains higher quality single-view point clouds by precisely obtaining the distance between the structured light measuring equipment and the large and complex mechanical component.
At the same time, in current industrial robot related technology, a hand-eye calibration equation is established by acquiring and recording the camera information and the robot attitude information under different poses; a hand-eye relationship matrix can then be obtained by linearly solving the hand-eye calibration equation. However, in this solving process the only consideration is minimizing the algebraic error; the influence of image noise on the hand-eye calibration accuracy is not considered, which leads to a lower calibration accuracy of the attitude relationship obtained by linear solving. This lower calibration accuracy in turn affects the accuracy and integrity of the three-dimensional measurement.

SUMMARY OF THE INVENTION

The present invention aims to overcome the deficiencies of the prior art and provides a method for robot assisted multi-view 3D scanning measurement based on path planning. One aim of the present invention is to improve the work efficiency of structured light measuring equipment so as to shorten the measurement time and to provide good initial values that lighten the burden of the subsequent fine registration; the other aim of the present invention is to optimize the accuracy of hand-eye calibration to guarantee the accuracy of registering multiple single-view point clouds, so as to improve the accuracy and integrity of 3D measurement. To achie