CN-122018492-A - Visual guidance control method for underwater hull cleaning robot
Abstract
The invention discloses a visual guidance control method for an underwater hull cleaning robot, relating to the technical field of visual guidance. The method comprises: acquiring underwater original images of the vertical and inclined surfaces of a ship's side hull; longitudinally partitioning the images along the broadside height based on the gravity direction; applying adaptive exposure compensation to each partition by combining a partition brightness distribution prior; eliminating brightness jumps between adjacent partitions through a longitudinal consistency constraint to generate a broadside image with continuous brightness; on this basis, extracting the structural feature lines of hull welds and reinforcing ribs and constructing a visual reference coordinate system consistent with the hull structure; and performing real-time deviation-correction control on the cleaning robot according to the visual offset, guiding it to execute a stable and continuous broadside cleaning path along the direction of the structural feature lines. The method thereby solves the problems of visual distortion, unstable structural feature recognition, and easily deviating cleaning paths caused by uneven illumination and posture changes in underwater broadside cleaning scenarios.
Inventors
- BAI HAOLONG
- BA YUE
Assignees
- 越山海特种机器人(滨海)有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20251225
Claims (8)
- 1. A visual guidance control method for an underwater hull cleaning robot, characterized by comprising the following steps: acquiring underwater original visual images of the vertical and inclined surfaces of the side hull; performing longitudinal broadside partitioning of the original visual images based on the gravity direction to obtain a plurality of longitudinal first image partitions arranged along the broadside height direction; performing adaptive exposure compensation on each first image partition based on a partition brightness distribution prior to generate brightness-compensated second partition images; continuously correcting the brightness boundaries between second partition images adjacent along the gravity direction based on a longitudinal consistency constraint to generate a third broadside image with uniform brightness; extracting the structural feature lines of hull welds and reinforcing ribs from the third broadside image and constructing a visual reference coordinate system; and performing vision-guided deviation-correction control on the cleaning robot based on the offsets in the visual reference coordinate system, and executing broadside cleaning path control along the direction of the structural feature lines.
- 2. The visual guidance control method for an underwater hull cleaning robot according to claim 1, wherein acquiring the underwater original visual images of the vertical and inclined surfaces of the side hull specifically comprises: after the cleaning robot attaches to the side hull surface and enters the cleaning operation state, continuously acquiring, with an underwater camera mounted at the front end of the robot and facing the side hull surface, and with the normal component of the camera's optical axis kept oriented toward the hull surface, original underwater image frames covering the vertical surface and the inclined transition surface of the ship's side; and arranging the original image frames into a broadside original visual image sequence according to their acquisition order to obtain the original visual images.
- 3. The visual guidance control method for an underwater hull cleaning robot according to claim 2, wherein performing longitudinal broadside partitioning of the original visual images based on the gravity direction to obtain a plurality of longitudinal first image partitions arranged along the broadside height direction specifically comprises: for each frame of the broadside original visual image sequence, projecting the gravity direction vector, output by the inertial measurement unit in the robot body coordinate system, into the image coordinate system to determine the image longitudinal axis corresponding to the broadside hull height direction; taking the longitudinal axis as the partitioning direction, dividing the longitudinal extent of the original visual image proportionally along that direction into a plurality of continuous, non-overlapping longitudinal image regions according to preset partition proportions; for each longitudinal image region, extracting its pixel set in the original visual image, and defining the pixel sets corresponding to the longitudinal image regions as first image partitions arranged from top to bottom along the broadside height direction, thereby obtaining a group of longitudinally arranged first image partitions from a single original visual image frame; and repeating the longitudinal partitioning process for each frame of the broadside original visual image sequence to obtain a first image partition sequence that is continuously updated over time.
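The partitioning step of claim 3 can be illustrated with a minimal sketch. This is not part of the patent: it assumes the gravity vector has already been projected so that the image rows align with the broadside height direction, and all names are illustrative.

```python
def partition_rows(image_height, proportions):
    """Split the image's vertical extent into contiguous, non-overlapping
    row bands according to preset partition proportions (claim 3)."""
    assert abs(sum(proportions) - 1.0) < 1e-9
    bands, start = [], 0
    for i, p in enumerate(proportions):
        # The last band absorbs rounding so the bands exactly tile the image.
        end = image_height if i == len(proportions) - 1 else start + round(p * image_height)
        bands.append((start, end))
        start = end
    return bands

def first_image_partitions(image, proportions):
    """Extract the pixel rows of each longitudinal band, ordered top to
    bottom along the broadside height direction."""
    return [image[s:e] for s, e in partition_rows(len(image), proportions)]

# Example: a 10-row frame split into three bands of 30% / 40% / 30%.
frame = [[y] * 4 for y in range(10)]
parts = first_image_partitions(frame, [0.3, 0.4, 0.3])
```

Repeating `first_image_partitions` over every frame of the image sequence yields the time-updated first image partition sequence described in the claim.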
- 4. The visual guidance control method for an underwater hull cleaning robot according to claim 3, wherein performing adaptive exposure compensation on each first image partition based on the partition brightness distribution prior to generate brightness-compensated second partition images specifically comprises: computing the histogram distribution of pixel gray values from the pixel set of each first image partition in the first image partition sequence; calculating the mean brightness value, brightness variance, and highlight pixel proportion corresponding to the histogram distribution, and combining the mean brightness value, brightness variance, and highlight pixel proportion into a brightness distribution feature vector of the first image partition; mapping the brightness distribution feature vector to a corresponding target brightness interval based on a pre-established broadside hull brightness distribution prior model, and deriving the exposure compensation parameters of the first image partition from the deviation between the target brightness interval and the current brightness distribution; performing a brightness mapping transformation on the pixel gray values in the first image partition based on the exposure compensation parameters, mapping the original gray value of each pixel into the target brightness interval to obtain a second partition image with exposure compensation completed; and performing the exposure compensation processing for each first image partition arranged along the broadside height direction to generate a set of second partition images corresponding one-to-one to the first image partitions with their original spatial position relations unchanged, thereby obtaining a brightness-compensated second partition image sequence arranged along the broadside height direction.
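The statistics and compensation of claim 4 can be sketched as follows. The linear remapping into the target interval is an assumption for illustration; the patent specifies only that pixel gray values are mapped into the target brightness interval, not the exact transform, and the highlight threshold of 200 is likewise illustrative.

```python
from statistics import mean, pvariance

def brightness_features(pixels, highlight_threshold=200):
    """Per-partition brightness statistics from claim 4: mean brightness,
    brightness variance, and the proportion of highlight pixels."""
    highlight_ratio = sum(p >= highlight_threshold for p in pixels) / len(pixels)
    return (mean(pixels), pvariance(pixels), highlight_ratio)

def compensate(pixels, target_interval):
    """Linearly remap a partition's gray values into the target brightness
    interval looked up from the prior model (illustrative mapping only)."""
    lo, hi = min(pixels), max(pixels)
    t_lo, t_hi = target_interval
    if hi == lo:
        # Flat partition: place all pixels at the interval midpoint.
        return [(t_lo + t_hi) // 2] * len(pixels)
    scale = (t_hi - t_lo) / (hi - lo)
    return [round(t_lo + (p - lo) * scale) for p in pixels]

# Example: a dark partition stretched into a brighter target interval.
compensated = compensate([10, 20, 30, 40], (100, 200))
```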
- 5. The visual guidance control method for an underwater hull cleaning robot according to claim 4, wherein continuously correcting the brightness boundaries between second partition images adjacent along the gravity direction based on the longitudinal consistency constraint to generate a third broadside image with uniform brightness specifically comprises: for the set of second partition images arranged along the broadside height direction, sequentially selecting each two second partition images adjacent in the gravity direction as a pair of adjacent partition images according to the spatial arrangement order determined in the first image partitioning stage; for each pair of adjacent partition images, extracting the pixel gray-value sets in the corresponding boundary zones adjoining along the broadside height direction, and calculating the mean value and gradient distribution of the pixel gray values in the boundary zones to characterize the brightness continuity of the adjacent second partition images at the partition boundary; constructing a brightness consistency constraint function for the pair of adjacent partition images based on the difference in boundary gray means and the consistency of gradient directions, taking the magnitude of the gray jump at the boundary as the constraint error term; while keeping the internal brightness mapping of each second partition image unchanged, applying a boundary brightness correction mapping to at least one of the adjacent second partition images with the brightness consistency constraint function as the objective, continuously adjusting the gray values of pixels in the boundary zone so that the difference in gray means of the adjacent second partition images at their shared boundary falls within a preset consistency threshold; and sequentially performing the boundary brightness continuity correction on all pairs of adjacent second partition images arranged along the broadside height direction, and re-stitching the boundary-corrected second partition images without changing their original spatial arrangement order to generate a third broadside image that is continuous in brightness along the broadside height direction and free of partition exposure jumps.
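A minimal sketch of the boundary continuity correction in claim 5, under simplifying assumptions not stated in the patent: the boundary zone is a fixed number of rows, only the lower partition is corrected, and the correction is a uniform gray shift that drives the seam difference to zero (the claim only requires the residual difference to fall within a consistency threshold).

```python
def boundary_gap(upper_band, lower_band, zone_rows=1):
    """Mean gray difference across the seam between two vertically
    adjacent partition images (the constraint error term of claim 5)."""
    upper_zone = [p for row in upper_band[-zone_rows:] for p in row]
    lower_zone = [p for row in lower_band[:zone_rows] for p in row]
    return sum(upper_zone) / len(upper_zone) - sum(lower_zone) / len(lower_zone)

def correct_boundary(upper_band, lower_band, threshold=2.0, zone_rows=1):
    """Shift the lower band's boundary-zone gray values by the measured
    gap so the seam difference falls inside the consistency threshold,
    leaving the interior brightness mapping untouched."""
    gap = boundary_gap(upper_band, lower_band, zone_rows)
    if abs(gap) <= threshold:
        return lower_band  # already consistent; no correction applied
    corrected = [row[:] for row in lower_band]
    for r in range(zone_rows):
        corrected[r] = [p + gap for p in corrected[r]]
    return corrected

upper = [[100, 100], [110, 110]]
lower = [[90, 90], [95, 95]]
fixed = correct_boundary(upper, lower)
```

A full implementation would feather the shift over several rows rather than offsetting a single row, but the constraint logic is the same.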
- 6. The visual guidance control method for an underwater hull cleaning robot according to claim 5, wherein extracting the structural feature lines of hull welds and reinforcing ribs from the third broadside image and constructing a visual reference coordinate system specifically comprises: performing edge-enhancement filtering on the third broadside image based on its pixel gray distribution, suppressing large-scale brightness variation and highlighting the gray gradient response of linear structures; on the edge-enhanced third broadside image, calculating the gradient magnitude and gradient direction of each pixel to generate a corresponding gradient magnitude map and gradient direction map; extracting a candidate edge pixel set with significant gradient response from the gradient magnitude map using threshold-constrained non-maximum suppression; clustering linear structures within the candidate edge pixel set based on gradient direction consistency and spatial continuity constraints, and merging pixel sets that satisfy preset minimum length and direction consistency thresholds into linear structure candidate sets; for each linear structure candidate set, computing the corresponding line parameters by least-squares line fitting, and designating the linear structures whose fitting residuals satisfy the residual threshold constraint as hull weld or reinforcing rib structural feature lines; based on the line parameters of the structural feature lines, selecting the structural feature lines extending along the broadside height direction as main reference lines, and defining the main axis direction of the visual reference coordinate system by the direction vectors of the main reference lines; and, taking the image coordinate system of the third broadside image as the benchmark, mapping the direction vector of the main reference line to the longitudinal reference axis of the visual reference coordinate system and defining the direction orthogonal to the longitudinal reference axis as the transverse reference axis, thereby completing the extraction of hull weld and reinforcing rib structural feature lines from the third broadside image and constructing a visual reference coordinate system consistent with the structural direction of the broadside hull.
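The least-squares fitting and residual-threshold gate of claim 6 can be sketched as below. The parameterization x = a·y + b is an assumption chosen because the target feature lines run roughly along the image's vertical (broadside height) axis; the residual threshold value is illustrative.

```python
import math

def fit_line(points, residual_threshold=1.5):
    """Least-squares fit of candidate edge pixels (x, y) to a line.
    Returns the unit direction vector in image coordinates and the RMS
    residual, or None when the residual exceeds the fitting threshold
    (claim 6's residual constraint)."""
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx, my = sum(xs) / n, sum(ys) / n
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in points)
    a = sxy / syy                 # slope dx/dy
    b = mx - a * my
    rms = math.sqrt(sum((x - (a * y + b)) ** 2 for x, y in points) / n)
    if rms > residual_threshold:
        return None               # not a weld/rib candidate
    norm = math.hypot(a, 1.0)
    return (a / norm, 1.0 / norm), rms

# A perfectly vertical edge: direction (0, 1) along the longitudinal axis.
vertical = fit_line([(5, y) for y in range(10)])
```

Candidates that pass the gate and point along the broadside height direction would then serve as main reference lines defining the longitudinal axis of the visual reference frame.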
- 7. The method according to claim 6, wherein performing vision-guided deviation-correction control on the cleaning robot based on the offsets in the visual reference coordinate system and executing broadside cleaning path control along the structural feature line direction specifically comprises: mapping the position of the cleaning robot in the current third broadside image into the visual reference coordinate system, and acquiring the lateral offset and heading angle offset of the robot's current position relative to the main axis direction of the visual reference coordinate system; taking the lateral offset as the lateral position error input and the heading angle offset as the attitude error input, and generating the corresponding lateral correction control quantity and heading correction control quantity based on a preset proportional or proportional-integral control relation; converting the lateral correction control quantity and the heading correction control quantity into control commands for the thrust distribution and attitude adjustment mechanisms of the cleaning robot's propellers, and executing lateral position correction and heading attitude correction so that the cleaning robot maintains stable attached movement along the main axis direction of the visual reference coordinate system; after the deviation correction is completed and a preset stability threshold condition is satisfied, controlling the cleaning robot to take the main axis direction of the visual reference coordinate system as the cleaning path direction and execute continuous broadside cleaning path movement along the extension direction of the hull weld or reinforcing rib structural feature lines; and, during cleaning path execution, continuously recalculating the lateral offset and heading angle offset based on the updated third broadside image and updating the correction control quantities in real time, thereby forming a closed-loop process of vision-guided deviation-correction control and broadside cleaning path control based on the visual reference coordinate system.
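The "preset proportional or proportional-integral control relation" of claim 7 admits a standard P/PI sketch. All gains, the time step, and the choice of PI for the lateral channel versus pure P for heading are illustrative assumptions, not values from the patent.

```python
class CorrectionController:
    """Proportional-integral controller producing a correction control
    quantity from a visual offset error (claim 7)."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        # Accumulate the error integral and return kp*e + ki*integral(e).
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# Illustrative gains: PI on lateral offset, pure P on heading angle.
lateral = CorrectionController(kp=0.8, ki=0.1, dt=0.1)
heading = CorrectionController(kp=1.2, ki=0.0, dt=0.1)

# One control tick: offsets measured in the visual reference frame.
u_lat = lateral.update(0.05)    # lateral offset of 0.05 m
u_head = heading.update(-2.0)   # heading offset of -2 degrees
```

In the closed loop of claim 7, `update` would be called each time the offsets are recomputed from a fresh third broadside image, and the outputs converted into thruster and attitude commands.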
- 8. The visual guidance control method for an underwater hull cleaning robot according to claim 7, wherein the pre-established broadside hull brightness distribution prior model is specifically obtained as follows: based on underwater broadside image samples acquired during historical broadside hull cleaning operations, longitudinally partitioning the sample images along the broadside height direction, and separately computing, for each height partition, the ranges of the mean brightness, brightness variance, and highlight pixel proportion of the pixel gray values; based on the statistics corresponding to each height partition, expressing the target brightness interval along the broadside height direction as a piecewise function that varies monotonically with height; and storing the target brightness interval parameters corresponding to each height partition as the broadside hull brightness distribution prior model, which is used in subsequent exposure compensation to obtain the corresponding target brightness interval by table lookup according to the position index of the first image partition in the broadside height direction.
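The prior model of claim 8 reduces to a per-band lookup table. In this sketch the target interval is derived as a fixed-width window around each band's historical mean brightness; that derivation, the window width, and the dictionary layout are all illustrative assumptions, with the sample bands ordered top to bottom to mirror the bright-above, dark-below pattern described in the specification.

```python
def build_prior_model(samples_by_band):
    """Build the broadside brightness prior of claim 8: for each height
    band (indexed top to bottom), store the historical mean-brightness
    range and a target brightness interval for exposure compensation."""
    model = {}
    for band_index, pixels in enumerate(samples_by_band):
        mean_b = sum(pixels) / len(pixels)
        model[band_index] = {
            "mean_range": (min(pixels), max(pixels)),
            # Illustrative: a +/-20 gray-level window around the band mean.
            "target_interval": (int(mean_b) - 20, int(mean_b) + 20),
        }
    return model

def lookup_target_interval(model, band_index):
    """Table lookup by height-partition index, as used during the
    exposure compensation step (claim 8)."""
    return model[band_index]["target_interval"]

# Historical samples for three height bands, brightest at the top.
model = build_prior_model([[180, 200, 220], [100, 120, 140], [40, 60, 80]])
```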
Description
Visual guidance control method for underwater hull cleaning robot
Technical Field
The invention relates to the technical field of visual guidance, and in particular to a visual guidance control method for an underwater hull cleaning robot.
Background
Conventional underwater hull cleaning robots commonly adopt guidance control methods based on underwater vision for broadside cleaning operations. Such methods generally obtain image information of the hull surface in real time through an underwater camera mounted at the front end or bottom of the robot, and combine image processing with motion control algorithms to achieve positioning, attitude adjustment, and cleaning path planning on the hull surface. In the prior art, the visual information is mainly used to assist in judging the robot's position relative to the hull, correcting heading deviation, and, to a certain extent, identifying structural features of the hull surface, thereby guiding the robot to complete the cleaning operation along a preset path. Such methods have found some application in flat areas or in scenes with relatively uniform lighting conditions. However, in the cleaning of vertical or steeply inclined broadside surfaces, existing visual guidance control methods have significant shortcomings. The broadside area is usually a large vertical or inclined structure, and during operation the robot must simultaneously resist attitude disturbances caused by the imbalance of gravity and buoyancy, so the requirements on movement stability are markedly higher.
Meanwhile, the ship's side commonly carries structural features with definite directionality, such as welds and reinforcing ribs; these structures are important references for cleaning paths and have an obvious influence on image gray-scale and texture distribution. In addition, when underwater natural light and artificial light sources are superimposed, a non-uniform "bright-above, dark-below" illumination distribution often forms, so that the imaging brightness of the same broadside area differs markedly at different heights. In this scenario, visual guidance must not only support attitude and position control but also serve structural feature recognition and motion direction constraint, and existing general-purpose visual methods struggle to meet these combined requirements. Existing visual guidance schemes typically introduce a prediction or extrapolation window mechanism to estimate in advance the visual information and motion trend ahead of the robot. However, when the prediction window is set too large, although structural information over a longer distance can be obtained, too much history or far-field brightness and structural variation is introduced, delaying the control response and reducing correction accuracy; when the prediction window is set too small, it becomes extremely sensitive to local noise, illumination fluctuation, and short-range occlusion, easily causing frequent fluctuation of the control commands. Both extremes adversely affect the stable operation of the cleaning robot, increase the load fluctuation frequency of the propellers and attitude adjustment mechanisms, and introduce uncertainty into equipment service life, energy consumption evaluation, and long-term asset management.
To address the problem of uneven brightness in broadside underwater vision, research has attempted to borrow the idea of "partitioned exposure plus brightness distribution prior compensation" from aerial remote-sensing image processing. The idea is to divide a large-format image into several spatial partitions and, combined with a prior brightness distribution model, apply differentiated brightness correction to different areas, thereby improving the usability and contrast consistency of the image as a whole. When this analogy is applied to the underwater broadside scenario, a brightness distribution prior formed during historical cleaning operations can be introduced to address the inherent brightness variation along the broadside height direction, and targeted exposure or brightness compensation can be applied to different height areas, improving the imaging stability of existing visual guidance methods under strongly non-uniform illumination. At the same time, however, partitioned exposure and brightness compensation introduce new problems of their own. Because each partition differs in its exposure or brightness mapping parameters, brightness discontinuities easily appear between adjacent partitions if an effective continuity constraint is lacking. The discontinuity ap