
CN-121973280-A - Welding robot work monitoring method and system based on machine vision

CN 121973280 A

Abstract

The invention discloses a machine-vision-based welding robot work monitoring method and system. A monitoring platform is constructed from an industrial robot, a welding head, a machine vision sensor, an industrial personal computer and a controller. On this basis, a unified coordinate system is established through camera intrinsic calibration, hand-eye calibration and tool center point calibration; welding images, robot poses and process parameters are collected; weld-seam features are extracted through preprocessing and dynamic region-of-interest locking; three-dimensional reconstruction and trajectory fitting are performed by combining depth information to generate a weld-seam reference trajectory and a theoretical welding-gun pose; finally, the working state is judged by fusing deviation parameters, trajectories and process parameters, with lag compensation, online deviation correction and abnormality alarms. The scheme improves weld-seam monitoring precision, tracking stability and weld-quality consistency under strong-interference working conditions.
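
The unified coordinate system described above chains the hand-eye and tool-center-point calibration results as homogeneous transforms. A minimal sketch of that chaining, with purely illustrative transform values that are not taken from the patent, might look like this:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration results (illustrative values only):
T_base_flange = make_transform(np.eye(3), [0.5, 0.0, 0.8])   # robot forward kinematics
T_flange_cam  = make_transform(np.eye(3), [0.0, 0.05, 0.1])  # hand-eye calibration result
T_flange_tcp  = make_transform(np.eye(3), [0.0, 0.0, 0.25])  # tool center point calibration

# A weld-seam point measured in the camera frame (homogeneous coordinates).
p_cam = np.array([0.01, -0.02, 0.30, 1.0])

# Chain the transforms: camera frame -> flange frame -> robot base frame.
p_base = T_base_flange @ T_flange_cam @ p_cam

# Deviation of the weld point from the tool center point, expressed in the base frame.
p_tcp_base = T_base_flange @ T_flange_tcp @ np.array([0.0, 0.0, 0.0, 1.0])
deviation = p_base[:3] - p_tcp_base[:3]
```

Any real deployment would use the rotation matrices and translation vectors produced by the calibration steps of S2 rather than the identity rotations assumed here.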

Inventors

  • ZHENG XIANGPAN
  • ZHANG PENGTU
  • CHEN YUE
  • ZHU YUCHENG
  • CAI LISHENG

Assignees

  • Minjiang University (闽江学院)

Dates

Publication Date
2026-05-05
Application Date
2026-03-23

Claims (10)

  1. A welding robot work monitoring method based on machine vision, applied to a welding robot vision monitoring platform, the welding robot vision monitoring platform comprising an industrial robot, an industrial personal computer, a human-machine interaction terminal, a welding head, a machine vision sensor and a robot controller, wherein the welding head, the machine vision sensor and the robot controller are deployed on the industrial robot, the machine vision sensor is used for global monitoring of the welding area, the industrial personal computer is in communication connection with the robot controller and is used for work control of and information interaction with the industrial robot, the machine vision sensor and the welding head, and the human-machine interaction terminal is in communication connection with the industrial personal computer and the robot controller and is used for work information interaction, the work monitoring method being characterized by comprising the following steps: S1, acquiring the planned welding trajectory, welding process parameters and monitoring thresholds of the workpiece to be welded, and establishing a monitoring object set of the welding task; S2, performing camera intrinsic calibration and distortion correction on the machine vision sensor, performing hand-eye calibration between the machine vision sensor and the industrial robot base coordinate system, and calibrating the welding tool center point, so as to establish a unified coordinate transformation relation among the vision coordinate system, the robot coordinate system and the welding tool coordinate system; S3, acquiring welding-area image sequences before and during welding, and synchronously acquiring pose information and welding process parameter information of the industrial robot; S4, preprocessing the acquired image sequence, and constructing a dynamic region of interest from the weld-seam feature position at the previous moment to obtain a candidate welding monitoring area at the current moment; S5, performing weld feature extraction in the candidate welding monitoring area to obtain two-dimensional weld features and depth information, and calculating the work monitoring parameters of the welding gun relative to the weld; S6, constructing a three-dimensional point cloud of the welding area from the extracted two-dimensional weld features and depth information, registering and fusing multi-frame point clouds, and curve-fitting the weld spatial trajectory to generate a weld three-dimensional reference trajectory and the corresponding welding-gun theoretical posture field; and S7, fusing the weld features, the work monitoring parameters, the weld three-dimensional reference trajectory, the actual pose of the industrial robot and the welding process parameters at the current moment, and judging the current welding working state of the industrial robot.
  2. The method of claim 1, wherein the industrial robot is a six-axis industrial robot, and the machine vision sensor is an active vision sensor, namely a line-structured-light sensor, mounted on the flange structure at the end of the industrial robot and rigidly connected to the welding head to form an eye-in-hand vision monitoring structure.
  3. The welding robot work monitoring method based on machine vision according to claim 1, wherein in step S2 the camera intrinsic calibration is performed by the Zhang Zhengyou calibration method, and the hand-eye calibration is solved as a matrix equation from multi-pose sampling of a standard sphere, so as to obtain the rotation matrix and translation vector of the machine vision sensor relative to the end flange of the industrial robot; a unified coordinate transformation model is then established by combining the calibration result of the welding tool center point.
  4. The welding robot work monitoring method based on machine vision according to claim 1, wherein the preprocessing of the image sequence in step S4 includes de-distortion correction, gray-level normalization, anisotropic diffusion filtering, morphological opening and closing operations, and background suppression, and the dynamic region of interest is updated according to the weld center position, groove edge position, or structured-light center position of the previous frame.
  5. The welding robot work monitoring method based on machine vision according to claim 1, wherein in step S5 the weld feature extraction combines a lightweight semantic segmentation network with sub-pixel positioning: the network yields a weld candidate region, and the gray-level gravity-center method or the edge-normal intersection method then computes the coordinates of the weld center line or feature points; after feature extraction, at least one of a weld center line, a groove edge, a structured-light stripe, a molten-pool profile and a welding-gun projection is obtained; and the work monitoring parameter is at least one of the lateral deviation, height deviation and attitude deviation of the welding gun relative to the weld.
  6. The welding robot work monitoring method based on machine vision according to claim 1, wherein in step S6 the multi-frame point-cloud registration and fusion combine iterative-closest-point registration with normal-vector-feature-based registration; and in step S7 the welding working state includes at least a normal working state, a state to be corrected and an alarm state.
  7. The machine-vision-based welding robot work monitoring method of claim 1, wherein in step S7 the work monitoring parameters include at least the lateral deviation, height deviation, attitude deviation, weld continuity, and visual disturbance index; when the lateral, height and attitude deviations are all within a preset primary threshold range, the weld continuity is not below the continuity threshold, and the visual disturbance index is not above the disturbance threshold, the normal working state is judged; when at least one work monitoring parameter exceeds a primary threshold but does not exceed a secondary threshold, the state to be corrected is judged; and when at least one work monitoring parameter exceeds a secondary threshold, or valid weld features cannot be extracted for several consecutive frames, the alarm state is judged.
  8. The machine-vision-based welding robot work monitoring method of claim 6 or 7, further comprising: S8, when the judged result is the state to be corrected, compensating and predicting the work monitoring parameters according to the visual feedback delay and the robot motion response delay and outputting a robot end-effector correction instruction; and when the judged result is the alarm state, outputting alarm information and recording the abnormal image, abnormal position and process parameters.
  9. The welding robot work monitoring method based on machine vision according to claim 8, wherein in step S8 the visual feedback delay comprises an image acquisition delay, an image processing delay and a communication delay, and the robot motion response delay comprises a controller processing delay and an actuator inertia response delay; a discrete state prediction model is built from the visual feedback delay and the robot motion response delay, and the robot end-effector correction instruction is output after look-ahead prediction of the current work monitoring parameters.
  10. A machine-vision-based welding robot work monitoring system, comprising: a task initialization module for reading the theoretical welding trajectory, welding process parameters and monitoring thresholds of the workpiece to be welded and establishing a monitoring object set of the welding task; a calibration modeling module for performing camera intrinsic calibration, hand-eye calibration and welding tool center point calibration to establish a unified coordinate transformation relation; a visual acquisition module for acquiring image sequences before and during welding and synchronously acquiring pose information and welding process parameter information of the industrial robot; an image preprocessing module for preprocessing the image sequence and constructing a dynamic region of interest; a feature extraction module for extracting at least one of a weld seam, a groove edge, a structured-light stripe, a molten-pool profile and a welding-gun projection, and calculating the work monitoring parameters; a three-dimensional reconstruction module for constructing a three-dimensional point cloud of the welding area and generating a weld three-dimensional reference trajectory and a welding-gun theoretical posture field; a state evaluation module for fusing the weld features, the work monitoring parameters, the weld three-dimensional reference trajectory, the actual pose of the industrial robot and the welding process parameters, and judging the current welding working state; a compensation output module for outputting a robot end-effector correction instruction when the current welding working state is the state to be corrected; and an alarm recording module for outputting alarm information and recording abnormal images, abnormal positions and process parameters when the current welding working state is the alarm state.
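
The dynamic region-of-interest locking of claim 4 (updating the search window from the previous frame's weld center) can be sketched as a small clipping function. The window half-size and coordinate convention here are illustrative assumptions, not values from the patent:

```python
def update_roi(prev_center, image_shape, half_size=40):
    """Lock a square region of interest around the weld-seam center found in
    the previous frame, clipped to the image bounds.

    prev_center: (cx, cy) pixel position of the previous frame's weld center.
    image_shape: (height, width) of the image.
    Returns (x0, y0, x1, y1) bounds of the candidate monitoring area.
    """
    h, w = image_shape
    cx, cy = prev_center
    x0 = max(0, int(cx) - half_size)
    x1 = min(w, int(cx) + half_size)
    y0 = max(0, int(cy) - half_size)
    y1 = min(h, int(cy) + half_size)
    return x0, y0, x1, y1
```

In practice the half-size would be tuned to the expected inter-frame seam motion, so the window stays small under arc-light interference yet never loses the seam.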
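
The gray-level gravity-center method named in claim 5 locates the structured-light stripe at sub-pixel precision by treating each image column's intensities as weights. A minimal one-column sketch (the input here is a synthetic intensity profile, not real sensor data):

```python
import numpy as np

def stripe_center_gravity(column):
    """Sub-pixel center of a structured-light stripe in one image column,
    computed as the intensity-weighted mean of the row indices
    (gray-level gravity-center method). Returns None for an empty column."""
    idx = np.arange(len(column), dtype=float)
    total = float(column.sum())
    if total == 0.0:
        return None
    return float((idx * column).sum() / total)
```

Applying this per column across the region of interest yields the stripe center line whose distortion encodes the weld groove profile.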
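
Claim 6 registers multi-frame point clouds with iterative closest point (ICP) combined with normal-feature registration. The following is a minimal point-to-point ICP sketch only (the normal-vector component is omitted), using brute-force nearest neighbours and the SVD-based Kabsch solution; it is an illustration, not the patent's implementation:

```python
import numpy as np

def icp_point_to_point(src, dst, iters=20):
    """Align src (N,3) onto dst (M,3); returns accumulated rotation R and
    translation t such that src @ R.T + t approximates dst."""
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force, for clarity only).
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # Kabsch: best rigid transform between the matched point sets.
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T
        t_step = mu_d - R_step @ mu_s
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```

A production system would use a k-d tree for correspondences and add the normal-vector feature term the claim describes; this sketch shows only the rigid-alignment core.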
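
The three-level grading of claim 7 (normal / to-be-corrected / alarm, from primary and secondary thresholds plus continuity, disturbance and feature-loss checks) can be sketched as a single classification function. The fallthrough of a failed continuity or disturbance check to the to-be-corrected state is an assumption; the claim does not state that case explicitly:

```python
def classify_state(deviations, thr1, thr2, continuity, cont_thr,
                   disturbance, dist_thr, lost_frames=0, lost_limit=5):
    """Grade the welding working state from the monitored parameters.

    deviations: (lateral, height, attitude) deviations of gun vs. seam.
    thr1 / thr2: per-deviation primary and secondary thresholds.
    lost_frames: consecutive frames without valid weld features.
    """
    # Alarm: any deviation beyond its secondary threshold, or the seam
    # features lost for several consecutive frames.
    if lost_frames >= lost_limit or any(abs(d) > t2 for d, t2 in zip(deviations, thr2)):
        return "alarm"
    # Normal: all deviations within primary thresholds, continuity and
    # disturbance acceptable.
    if (all(abs(d) <= t1 for d, t1 in zip(deviations, thr1))
            and continuity >= cont_thr and disturbance <= dist_thr):
        return "normal"
    # Otherwise: past a primary but not a secondary limit -> on-line correction.
    return "to_correct"
```

The "to_correct" outcome is what triggers the lag-compensated correction instruction of claims 8 and 9, while "alarm" triggers the recording of abnormal images, positions and process parameters.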
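
Claim 9 builds a discrete prediction model over the summed vision and motion delays so that the correction instruction targets where the deviation will be, not where it was measured. A deliberately simple one-step linear extrapolation conveys the idea; the patent's actual discrete state prediction model is not specified here:

```python
def predict_deviation(history, total_delay, dt):
    """Look-ahead prediction of a monitored deviation over the combined delay.

    history: recent deviation samples, oldest first.
    total_delay: sum of image acquisition, image processing, communication,
                 controller processing and actuator-inertia delays (seconds).
    dt: sampling period of the vision loop (seconds).
    """
    if len(history) < 2:
        return history[-1]          # not enough samples to estimate a rate
    rate = (history[-1] - history[-2]) / dt
    return history[-1] + rate * total_delay
```

The corrected end-effector command would then be computed from this predicted deviation, cancelling the perceive-then-act mismatch the description discusses.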

Description

Welding robot work monitoring method and system based on machine vision

Technical Field

The invention relates to the technical field of intelligent manufacturing and welding, and in particular to a welding robot work monitoring method and system based on machine vision.

Background

Welding robots are an important component of intelligent manufacturing equipment and are widely applied in automobile manufacturing, engineering machinery, rail transit, new-energy battery components, high-end equipment component connection and similar scenes. Laser welding and high-precision arc welding in particular offer small heat input, high welding speed, high forming quality and easy automated integration, and therefore place higher requirements on the welding robot's trajectory precision, attitude control and process stability. The appendix also clearly indicates that vision-guided welding has become an important direction for improving welding quality and efficiency in the context of the digitization and intelligent upgrading of manufacturing. Existing welding robots execute welding in a teach-and-playback mode: a welding trajectory is obtained through manual teaching, and the preset path is then executed repeatedly by the robot. This is feasible for regular workpieces under stable working conditions, but it is difficult to adapt in time to the workpiece assembly errors, weld position deviations, gap fluctuations, workpiece thermal deformation and tool repositioning errors that commonly exist in actual production. In high-precision welding scenes in particular, the welding spot is small and the centering precision required between welding gun and weld seam is extremely high, so relying solely on an off-line or taught trajectory readily causes welding deviation, missed welds and inconsistent formation.
These limitations are discussed explicitly in the appendix. To ameliorate the above problems, the prior art began to introduce visual sensors for weld-seam identification and tracking. However, existing visual monitoring schemes still have defects: first, passive vision is easily disturbed by arc light, smoke, spatter and high workpiece reflectivity, making it difficult to stably acquire high signal-to-noise-ratio images; second, some image processing methods can extract two-dimensional weld features but can hardly reflect the spatial position of the weld and the relative posture of the welding gun accurately; third, some deep-learning-based recognition methods resist interference well but depend heavily on training-data scale and computing resources, and remain limited in real-time performance on the industrial floor; and fourth, the prior art has always been more concerned with weld recognition or trajectory generation per se, and lacks a unified monitoring and graded judging mechanism covering lateral deviation, height deviation, posture deviation, visual disturbance and process abnormality during welding robot operation. The appendix's analysis of the current situation and bottlenecks of weld visual identification, three-dimensional point-cloud processing and trajectory planning supports this statement of the technical problem. On the other hand, even where the prior art can obtain weld images or local point clouds, the superposed delays of vision acquisition, image processing, communication and the robot's inertial response still commonly exist in practical application. That is, the deviation the system perceives is already in the past, while the robot's motion adjustment acts in the present and the future; the two are inherently unsynchronized.
Without a unified coordinate calibration mechanism, a three-dimensional reference trajectory generation mechanism and a time-lag compensation mechanism, accurate monitoring, timely correction and reliable alarming of the welding robot's working state are difficult to realize. The hand-eye calibration, three-dimensional reconstruction, NURBS trajectory fitting, MPC real-time deviation correction and monitoring software platform proposed in the appendix reflect the urgent need for existing systems to evolve from a single recognition function towards full-flow closed-loop monitoring. Therefore, it is necessary to provide a welding robot work monitoring method and system based on machine vision, so as to solve at least one of the prior-art problems that weld feature extraction is unstable, weld spatial information and welding-gun posture are difficult to acquire in a unified way, the welding working state lacks continuous monitoring, visual feedback and robot motion suffer time-lag mismatch, the abnormal working condition i