CN-119228893-B - Mechanical arm space pose self-adaptive estimation method for fusion of motion information and visual information

CN119228893B

Abstract

A mechanical arm space pose self-adaptive estimation method based on the fusion of motion information and visual information comprises the following steps: first, combining the intrinsic matrices of a binocular camera, three-dimensional reconstruction of two-dimensional image features is realized based on an epipolar constraint method; then, hand-eye calibration is completed through combined measurement with a laser tracker and the binocular camera, and the relative position between the machining end of the mechanical arm and the workpiece center is calculated from the calibration relation; finally, an adaptive extended Kalman filter is provided which, given the known measurement noise covariance matrix of the binocular camera, adaptively adjusts the state noise covariance matrix of the mechanical arm end at different positions, realizing smooth estimation of the end position during arm motion, so that the arm end finally moves accurately into the effective field of view of the binocular camera. The method solves the problem that the end position of the mechanical arm is difficult to estimate accurately under complex working conditions, and realizes adaptive, accurate estimation of the arm end position without requiring a priori accurate observation and motion noise covariance matrices.

Inventors

  • LU YONGKANG
  • WANG JIAWEI
  • HAN LEI
  • SONG LIGANG
  • ZHENG YAN
  • WEN KE
  • ZHANG YANG
  • LIU WEI

Assignees

  • Dalian University of Technology (大连理工大学)

Dates

Publication Date
2026-05-08
Application Date
2024-09-29

Claims (1)

  1. A mechanical arm space pose self-adaptive estimation method integrating motion information and visual information, characterized by comprising the following steps:

First, the three-dimensional coordinates of feature points in the camera coordinate system are obtained.

Firstly, the internal and external parameters of a binocular camera are calibrated, yielding the intrinsic matrices K_L and K_R of the left and right cameras and the extrinsic parameters R and t between them. Then, visual marker points are arranged in the local area to be measured, and images of the area are acquired with the binocular camera. Each image is binarized to separate foreground from background, edges are extracted with an edge detection algorithm, and finally image features are screened by roundness and area constraints; the screening result is given by formula (1):

    e = 4*pi*S / C^2, a region is retained if S_min <= S <= S_max and e >= T_C    (1)

wherein (S_min, S_max) is the set area threshold, T_C is the set roundness threshold, C is the perimeter of each region contour, S is the area of each region contour, and e is the roundness of each region contour.

Then, the fundamental matrix is calculated from the parameter matrices of the binocular camera:

    F = K_L^(-T) [t]_x R K_R^(-1)    (2)

wherein [t]_x is the antisymmetric matrix calculated from the translation vector t between the left and right cameras.

Secondly, the center coordinates [u_L v_L 1]^T and [u_R v_R 1]^T of the visual marker points of the feature regions in the images captured by the two cameras are calculated by ellipse fitting, and the following equation is constructed:

    [a b c]^T = F [u_R v_R 1]^T    (3)

wherein [a b c]^T is the product of the fundamental matrix F and [u_R v_R 1]^T. Formula (3) is converted into a linear equation, namely the epipolar line equation:

    a*u_L + b*v_L + c = 0    (4)

When the distance between an image point and the corresponding epipolar line is smaller than the set constraint threshold, the image point is taken as the matching point of the corresponding point in the other image.

Finally, combining the depth information Z of each marker point, the three-dimensional reconstruction of the marker point is realized:

    Z = f*B/d,  X = (u_L - c_x)*Z/f,  Y = (v_L - c_y)*Z/f    (5)

wherein (X, Y, Z) are the three-dimensional coordinates of the marker point in the camera coordinate system, c_x and c_y are the pixel coordinates of the camera principal point, f is the focal length of the camera, d is the disparity between the left and right images, and B is the distance between the optical centers of the left and right cameras.

Second, the mechanical arm end is positioned on line based on combined measurement.

Firstly, n combined laser-tracker and vision cooperative target points are placed on a calibration plate, and their three-dimensional coordinates are measured with the laser tracker and the binocular camera respectively:

    P_c = {p_1, p_2, p_3, ..., p_n},  Q_l = {q_1, q_2, q_3, ..., q_n}    (6)

wherein P_c and Q_l are the three-dimensional coordinates of the visual cooperative target points in the camera coordinate system and the laser tracker coordinate system respectively.

Then, the transformation between the camera coordinate system and the laser tracker coordinate system is solved based on the least-squares principle:

    (R_c^l, t_c^l) = argmin_(R,t) sum_(i=1..n) || q_i - (R p_i + t) ||^2    (7)

wherein R_c^l and t_c^l are respectively the rotation matrix and translation vector between the two coordinate systems.

Then, three laser tracker targets are installed on the tooling of the binocular camera, their coordinates are measured with the laser tracker, a tooling coordinate system is established from the three points, and the coordinates of the three points in the tooling coordinate system are calculated (8). Further, the transformation R_f^l, t_f^l between the tooling coordinate system and the laser tracker coordinate system is calculated, and the transformation R_c^f, t_c^f between the camera coordinate system and the tooling coordinate system follows from the camera-to-tracker relation:

    R_c^f = (R_f^l)^(-1) R_c^l,  t_c^f = (R_f^l)^(-1) (t_c^l - t_f^l)    (9)

Then, by changing the pose of the mechanical arm, the pose transformations T_f^(ij) between the tooling coordinate systems and T_e^(ij) between the arm end coordinate systems under a plurality of arm poses are obtained, wherein T_f^(ij) denotes the pose transformation of the tooling coordinate system from the j-th pose to the i-th pose, and T_e^(ij) denotes the pose transformation of the arm end coordinate system from the j-th pose to the i-th pose; the following relation holds:

    T_f^(ij) · T_e^f = T_e^f · T_e^(ij)    (10)

wherein T_e^f is the transformation between the arm end coordinate system and the tooling coordinate system, which is fixed under any pose. Namely, separating rotation and translation parts:

    R_f^(ij) R_e^f = R_e^f R_e^(ij),  R_f^(ij) t_e^f + t_f^(ij) = R_e^f t_e^(ij) + t_e^f    (11)

wherein R and t respectively denote the rotation part and translation part of the corresponding pose transformation T. The calculation can be regarded as solving the classical nonlinear optimization problem AX = XB; solving the above equation system yields T_e^f, namely the transformation between the tooling coordinate system and the arm end coordinate system. Combining with formula (9), the transformation R_c^e, t_c^e between the camera coordinate system and the arm end coordinate system is calculated, which completes the hand-eye calibration:

    R_c^e = (R_e^f)^(-1) R_c^f,  t_c^e = (R_e^f)^(-1) (t_c^f - t_e^f)    (12)

After the hand-eye calibration relation is obtained, a laser tracker target seat is fixed at the machining end on the end flange of the mechanical arm as the positioning object. The laser tracker measures the target seat coordinate P_l^m, and based on the transformation T_l^c between the laser tracker and the binocular camera from formula (7), the three-dimensional coordinate of the machining-end positioning point in the camera coordinate system is obtained:

    P_c^m = R_l^c · P_l^m + t_l^c    (13)

Then, using the three-dimensional coordinate P_c^w of the center of the workpiece to be machined in the camera coordinate system obtained according to formula (5), and solving the transformation T_e^b between the arm end coordinate system and the arm base coordinate system from the pose information reported by the mechanical arm itself, the three-dimensional coordinates P_b^m and P_b^w of the machining end and of the workpiece center in the arm base coordinate system are solved:

    P_b^m = T_e^b · T_c^e · P_c^m,  P_b^w = T_e^b · T_c^e · P_c^w    (14)

wherein T_c^e is the transformation between the camera coordinate system and the arm end coordinate system. Finally, the relative position between the machining end and the workpiece center is calculated and transmitted to the arm controller, which controls the arm end to reach the target area along a designated path.

Third, the arm end position is estimated based on improved adaptive extended Kalman filtering.

In order that the mechanical arm accurately reaches the target area, the end position during arm motion is dynamically estimated with an improved adaptive extended Kalman filter. First, the observation equation of the system is calculated and linearized:

    Z_k = H·X_k + v_k    (15)

wherein v_k is the system observation noise with mean 0 and covariance matrix R, H is the observation matrix, X_k is the state vector of the arm end at time k, and X_(k|k-1) is the state prediction at time k based on the information at time k-1.

Then, the motion state prediction equation of the arm end is:

    X_k = A·X_(k-1) + B·u_k + w_k    (16)

wherein A is the state transition matrix, B is the control input matrix, u_k is the control vector, and w_k is the motion-state noise with mean 0 and covariance matrix Q.

Based on the initialized motion-state noise covariance matrix Q, an adjustment quantity ΔQ is introduced. The covariance matrix R of the observation noise can be obtained by calibration and is a known quantity, so the covariance-estimation error is attributed to ΔQ, and the prediction equation of the actual error covariance matrix is adjusted as:

    P_(k|k-1) = A·P_(k-1)·A^T + Q_(k-1) + ΔQ_(k-1)    (17)

wherein P_k is the error covariance matrix at time k, P_(k|k-1) is the predicted error covariance matrix at time k based on the information at time k-1, P_(k-1) and Q_(k-1) are respectively the error covariance matrix and the motion-state noise covariance matrix at time k-1, and ΔQ_(k-1) is the adjustment of the motion-state noise covariance matrix at time k-1.

According to the covariance matching theory of Kalman filtering, the expected covariance of the observation residual can be expressed as:

    E(ε_k·ε_k^T) = H·P_(k|k-1)·H^T + R    (18)

wherein ε_k = Z_k - H·X_(k|k-1) is the observation residual.

Then, from the expectation of the observation residual, the formula for adjusting the motion-state noise covariance matrix is derived:

    ΔQ = (K·H)^(-1) (K·E(ε_k·ε_k^T)·K^T - P_(k|k-1)·H^T·K^T) ((K·H)^(-1))^T    (19)

wherein K is the Kalman gain.

Finally, based on the above information, the motion-state noise covariance matrix Q, the motion state X_k and the error covariance matrix P_k are updated:

    K_k = P_(k|k-1)·H^T (H·P_(k|k-1)·H^T + R)^(-1),
    X_k = X_(k|k-1) + K_k·ε_k,
    P_k = (I - K_k·H)·P_(k|k-1)    (20)

All observed data are processed in turn, the motion-state noise covariance matrix is dynamically adjusted, and the state estimate of the system is updated with each new observation, so that the machining end of the mechanical arm moves smoothly and accurately into the effective field of view of the binocular camera.
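The epipolar matching and disparity-based triangulation of the first step in the claim can be sketched in Python/NumPy. This is an illustrative sketch, not code from the patent: the intrinsic values, the 0.1 m baseline, and the right-to-left extrinsic convention x_L = R·x_R + t are assumed example choices.

```python
import numpy as np

# Illustrative calibration values (assumed, not from the patent): identical
# left/right intrinsics, rectified cameras, baseline B = 0.1 m.
K_L = np.array([[800.0, 0.0, 320.0],
                [0.0, 800.0, 240.0],
                [0.0, 0.0, 1.0]])
K_R = K_L.copy()
R = np.eye(3)                  # rotation from the right to the left camera frame
t = np.array([0.1, 0.0, 0.0])  # translation from the right to the left camera frame

def skew(v):
    """Antisymmetric (cross-product) matrix [t]_x of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Fundamental matrix mapping a right-image point to its epipolar line in the
# left image, in the convention x_L = R x_R + t (cf. formulas (2)-(4)).
F = np.linalg.inv(K_L).T @ skew(t) @ R @ np.linalg.inv(K_R)

def epipolar_distance(p_L, p_R):
    """Distance from the left image point to the epipolar line
    a*u_L + b*v_L + c = 0 induced by the right image point."""
    a, b, c = F @ np.array([p_R[0], p_R[1], 1.0])
    return abs(a * p_L[0] + b * p_L[1] + c) / np.hypot(a, b)

def reconstruct(p_L, p_R, f=800.0, cx=320.0, cy=240.0, B=0.1):
    """Disparity-based triangulation (cf. formula (5))."""
    d = p_L[0] - p_R[0]        # disparity between the left and right images
    Z = f * B / d
    X = (p_L[0] - cx) * Z / f
    Y = (p_L[1] - cy) * Z / f
    return np.array([X, Y, Z])

# A point 2 m in front of the left camera projects to (320, 240) on the left
# image and (280, 240) on the right image under these assumed parameters.
point = reconstruct((320.0, 240.0), (280.0, 240.0))
```

In practice the epipolar distance would be compared against the claim's constraint threshold to accept or reject each candidate correspondence before triangulating.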
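The least-squares solution of the camera-to-tracker transformation in formula (7) is a classical point-set registration problem; a common closed-form approach is the SVD-based (Kabsch/Umeyama) method. A minimal sketch, assuming the correspondences p_i ↔ q_i between the two coordinate systems are already known; the rotation and points below are synthetic example data:

```python
import numpy as np

def rigid_transform(P, Q):
    """Closed-form least-squares fit of (R, t) with q_i ≈ R p_i + t,
    via SVD of the cross-covariance (Kabsch/Umeyama)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                             # proper rotation, det = +1
    t = cQ - R @ cP
    return R, t

# Synthetic check: four non-coplanar target points seen in the "camera" frame
# (P_c) and in the "tracker" frame (Q_l), related by a known rigid motion.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0]])              # 90 degrees about z
t_true = np.array([1.0, 2.0, 3.0])
P_c = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
Q_l = P_c @ R_true.T + t_true
R_est, t_est = rigid_transform(P_c, Q_l)
```

The determinant correction D guards against the reflection case that plain SVD can return when the point set is noisy or nearly planar.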
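The adaptive filtering loop of the third step can be sketched as a linear Kalman filter with covariance-matching adaptation of the process noise. All matrices below are illustrative placeholders, not values from the patent; using a single sample ε·εᵀ as the estimate of E(ε·εᵀ) and projecting ΔQ onto the positive semidefinite cone are implementation choices of this sketch, not steps stated in the claim:

```python
import numpy as np

# Illustrative 2-D linear system (assumed placeholders, not patent values).
A = np.eye(2)                 # state transition matrix
B = np.eye(2)                 # control input matrix
H = np.eye(2)                 # observation matrix
R = 0.01 * np.eye(2)          # observation noise covariance (known, calibrated)
Q = 0.1 * np.eye(2)           # initial motion-state noise covariance
dQ = np.zeros((2, 2))         # adjustment ΔQ, re-estimated from the residuals

x = np.zeros(2)               # state estimate
P = np.eye(2)                 # error covariance

def step(z, u):
    """One predict/update cycle with covariance-matching adaptation of Q
    (cf. formulas (15)-(19) of the claim)."""
    global x, P, dQ
    # Prediction: the predicted covariance uses Q + ΔQ (cf. (17)).
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q + dQ
    # Observation residual, its expected covariance (18), and Kalman gain.
    eps = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # ΔQ from covariance matching (19); eps·epsᵀ serves as a one-sample
    # estimate of E(ε·εᵀ), and ΔQ is symmetrized and clipped to remain
    # positive semidefinite -- both are choices of this sketch.
    KH_inv = np.linalg.inv(K @ H)
    dQ_raw = KH_inv @ (K @ np.outer(eps, eps) @ K.T
                       - P_pred @ H.T @ K.T) @ KH_inv.T
    dQ_sym = 0.5 * (dQ_raw + dQ_raw.T)
    w, V = np.linalg.eigh(dQ_sym)
    dQ = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    # Measurement update of state and covariance.
    x = x_pred + K @ eps
    P = (np.eye(2) - K @ H) @ P_pred
    return x

# Feed a constant observation: the estimate converges to it, and ΔQ decays
# once the residuals become small.
for _ in range(50):
    estimate = step(np.array([1.0, -1.0]), np.zeros(2))
```

In the patent's setting z would be the binocular measurement of the arm end and u the commanded motion increment, with H the linearized observation Jacobian of the extended filter rather than the identity used here.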

Description

Mechanical arm space pose self-adaptive estimation method for fusion of motion information and visual information

Technical Field

The invention belongs to the fields of digital measurement and robotics, and relates to a mechanical arm space pose self-adaptive estimation method that fuses motion information and visual information.

Background

Components of high-end equipment in the aerospace field are becoming ever larger and more structurally complex, gradually exceeding the working range of machine tools. Mobile industrial mechanical arms offer a large operating range, good flexibility, and high efficiency, and have great potential for machining large parts. However, the absolute positioning accuracy of an industrial mechanical arm is low; machining accuracy requirements are difficult to meet by the arm's own positioning accuracy alone, and an additional accuracy-guarantee technique must be introduced. At present, methods for improving the absolute positioning accuracy of a mechanical arm fall mainly into off-line calibration methods and on-line positioning methods. On-line positioning has the advantage that the current actual pose of the arm end is acquired through online feedback of measurement information, without serial computation through each joint parameter, thereby guaranteeing the absolute positioning precision of the arm end.
"Eye-on-hand" auxiliary positioning of the robot end is one of the most widely used on-line positioning methods: a binocular camera is mounted at the end of the mechanical arm, coordinate information of the positioning points in the area to be measured is acquired by the binocular camera at close range, and the positioning information is transmitted to the arm control system, realizing high-precision positioning of the arm end. However, the field of view and depth of field of a binocular camera are limited, and under complex working conditions the arm end is difficult to move directly into the binocular positioning area, which seriously reduces the robustness and reliability of binocular positioning. To help the arm end move reliably into the binocular measurable area, the real-time motion information of the arm must be combined to accurately estimate the spatial pose of the arm end. Traditional methods for estimating the spatial pose of the arm end require a priori accurate observation and motion-state noise covariance matrices and have insufficient adaptive capability under complex working conditions; an adaptive pose estimation method for the arm end suited to complex field conditions is still lacking. Patent ZL202111045806.9 discloses a calculation method that combines camera information and joint rotation-angle information to compute the end pose of a mechanical arm. Patent ZL201710936467.0 discloses a method for estimating the grasping pose of a mechanical arm based on the intrinsic parameters and image information of a binocular camera.
Both inventions estimate the pose of the mechanical arm through a binocular camera and other arm parameters, but the arm joint parameters are difficult to calibrate accurately while the arm is operating, and the end position is difficult to estimate accurately from motion information alone, so the arm cannot move accurately into the effective field of view of the binocular camera. Therefore, the invention provides a mechanical arm space pose self-adaptive estimation method fusing motion information and visual information, realizing adaptive, accurate estimation of the arm end pose.

Disclosure of Invention

The invention provides a mechanical arm space pose self-adaptive estimation method based on the fusion of motion information and visual information, oriented to the positioning requirements of industrial mechanical arm operation scenarios and aimed at the problem that the pose information of the arm's machining end is difficult to estimate accurately from the arm's motion information alone. The method combines the intrinsic matrices of a binocular camera to realize three-dimensional reconstruction of two-dimensional image features based on an epipolar constraint method, completes hand-eye calibration through combined measurement with a laser tracker and the binocular camera, and calculates the relative position relation between the machining end of the mechanical arm and the center of a workpiece based on the calibration relation