
CN-122023538-A - Underwater hull cleaning robot area cleaning method based on binocular vision

CN122023538A

Abstract

The invention discloses a binocular-vision-based region cleaning method for an underwater hull cleaning robot, in the technical field of hull cleaning. The method comprises: acquiring synchronized binocular images from the underwater hull cleaning robot and obtaining the working distance between the robot and the hull; generating, in combination with the optical attenuation characteristics of the water body, composite structured light that satisfies the propagation conditions, and projecting it; acquiring binocular images under structured-light enhancement and forming constrained dense disparity information; generating from this information an enhanced depth map that highlights the thickness variation of attachments; identifying the boundaries of attachment regions from the distribution of thickness variation and generating regions to be cleaned; and guiding the robot to complete the region cleaning operation in the corresponding regions. This solves the problems that attachment regions on the flat, low-texture surfaces of underwater hulls are difficult to distinguish stably and that small thickness variations are difficult to perceive reliably.

Inventors

  • BAI HAOLONG
  • BA YUE

Assignees

  • 越山海特种机器人(滨海)有限公司

Dates

Publication Date
2026-05-12
Application Date
2025-12-29

Claims (9)

  1. A binocular-vision-based region cleaning method for an underwater hull cleaning robot, characterized by comprising the following steps: acquiring synchronized binocular images from the underwater hull cleaning robot, and performing initial dense disparity analysis on them to obtain the average working distance from the robot to the hull surface; determining, based on the average working distance and a preset water-body optical attenuation model, the upper limit of the structured-light spatial frequency under the current distance condition, and generating a structured-light frequency constraint parameter; generating, according to the structured-light frequency constraint parameter, a composite structured-light pattern comprising low-frequency and medium-frequency fringes, and controlling a structured-light projection module to project the composite pattern onto the hull area to be cleaned; under projection of the composite pattern, synchronously acquiring a left-eye image and a right-eye image containing the structured-light modulation information, to obtain a structured-light-enhanced binocular image pair; performing, based on that image pair, dense disparity matching within the effective spatial frequency band defined by the constraint parameter, to generate a structured-light-constrained dense disparity map; generating, from the structured-light-constrained dense disparity map, an enhanced depth map of the hull surface, the enhanced depth map being used to highlight the thickness variation of attachments; extracting the boundaries of attachment regions on the hull surface from the spatial distribution of thickness variation in the enhanced depth map, and generating a region to be cleaned; and controlling the underwater hull cleaning robot to perform the region cleaning operation in the corresponding region according to the region to be cleaned.
  2. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 1, wherein acquiring the synchronized binocular images, performing initial dense disparity analysis on them, and obtaining the average working distance from the robot to the hull surface specifically comprises: acquiring the left-eye and right-eye images of the underwater hull cleaning robot in its current working posture, and performing time synchronization and visual-axis alignment on them to obtain an aligned binocular image pair; performing epipolar rectification on the binocular image pair based on the intrinsic and extrinsic parameters of the binocular camera, to generate an epipolar-aligned rectified binocular image; performing dense disparity matching on the rectified binocular image to obtain an initial disparity map covering the current field of view; selecting, from the initial disparity map, valid disparity regions with continuous and stably distributed disparities, to generate a disparity-region set for distance estimation; and mapping the disparity information, based on the disparity-region set, into the distance distribution from the robot to the hull surface, to obtain the average working distance from the robot to the hull surface.
  3. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 2, wherein determining the upper limit of the structured-light spatial frequency under the current distance condition, based on the average working distance and in combination with the preset water-body optical attenuation model, and generating the structured-light frequency constraint parameter specifically comprises: invoking, based on the average working distance, the water-body optical attenuation model corresponding to the current type of operating waters, and reading the distance-dependent contrast-attenuation curve parameters in the model; inputting the average working distance into the model, and obtaining the spatial-frequency response interval within which the contrast of the structured-light fringes remains above a preset stability threshold at that distance; extracting, within that interval, the highest spatial frequency at which stable fringe boundaries can be formed in binocular imaging, as the structured-light spatial frequency upper limit; and constructing, based on that upper limit, a structured-light frequency constraint parameter comprising the frequency upper-limit value, the allowable frequency bandwidth range, and a frequency-band identifier, wherein the constraint parameter restricts subsequent structured-light pattern generation to the effective spatial frequency band matched to disparity matching.
  4. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 3, wherein generating the composite structured-light pattern comprising low-frequency and medium-frequency fringes according to the structured-light frequency constraint parameter, and controlling the structured-light projection module to project the composite pattern onto the hull area to be cleaned, comprises: reading the spatial frequency upper-limit value and the allowable frequency bandwidth range from the constraint parameter, and dividing the allowable range into a low frequency band and a medium frequency band; generating a first structured-light fringe primitive in the low band and a second structured-light fringe primitive in the medium band, and setting fringe direction, phase distribution, and period parameters for each type of fringe primitive; generating the composite structured-light pattern from the two fringe primitives according to a preset spatial superposition rule, such that the overall spatial spectrum of the composite pattern does not exceed the effective band defined by the constraint parameter; mapping the composite pattern into the projection coordinate system of the structured-light projection module, and geometrically correcting the projected pattern according to the current robot pose; and controlling the projection module to project the corrected composite pattern onto the hull area to be cleaned, thereby introducing frequency-constrained structured-light modulation information for subsequent binocular imaging.
  5. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 4, wherein synchronously acquiring, under projection of the composite structured-light pattern, the left-eye and right-eye images containing the structured-light modulation information, to obtain the structured-light-enhanced binocular image pair, specifically comprises: triggering the binocular camera into a synchronous imaging state while the projection module projects the composite pattern, and locking an exposure timing consistent with the structured-light projection period; acquiring, in the synchronous imaging state, a left-eye raw image and a right-eye raw image containing the composite fringe modulation information; performing timestamp alignment and imaging-time consistency checks on the two raw images, and discarding abnormal frames that do not fall within the same projection period; forming the consistency-checked left-eye and right-eye raw images into a binocular image pair with an unambiguous correspondence, as the structured-light-enhanced binocular image pair; and outputting that image pair for subsequent dense disparity matching within the effective spatial frequency band defined by the structured-light frequency constraint parameter.
  6. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 5, wherein performing dense disparity matching within the effective spatial frequency band defined by the structured-light frequency constraint parameter, based on the structured-light-enhanced binocular image pair, to generate the structured-light-constrained dense disparity map, specifically comprises: reading the spatial frequency upper limit and the allowable frequency bandwidth range from the constraint parameter, and constructing the corresponding effective-band mask; performing a spatial-spectrum transform on each image of the structured-light-enhanced binocular pair, and filtering the spectra with the effective-band mask, to obtain band-limited left-eye and right-eye feature maps; constructing a disparity search interval along the epipolar direction on the band-limited feature maps, wherein the search interval inherits the disparity range obtained by the initial dense disparity analysis; performing pixel-by-pixel matching between the band-limited feature maps within the search interval, to form a structured-light-constrained matching cost distribution; generating, from that cost distribution, a structured-light-constrained dense disparity map covering the current field of view; and outputting the map for constructing the subsequent enhanced depth map of the hull surface.
  7. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 6, wherein generating, based on the structured-light-constrained dense disparity map, the enhanced depth map of the hull surface used to highlight the thickness variation of attachments, specifically comprises: reading the disparity map, invoking the baseline and imaging calibration parameters of the binocular camera, and establishing the mapping between disparity values and spatial depth; converting, via that mapping, the disparity values in the map into corresponding hull-surface depth values, to generate an initial depth map; constructing, in the initial depth map, a local depth reference surface along the normal direction of the hull surface, representing the base hull shape in the absence of attachments; differencing the depth values in the initial depth map against the local depth reference surface at corresponding positions, to obtain a relative depth-change map characterizing surface relief; applying spatial-continuity constraints to the relative depth-change map, to suppress isolated depth anomalies caused by water scattering and residual disparity errors; and outputting the continuity-constrained relative depth-change map as the enhanced depth map of the hull surface, which highlights the thickness-variation characteristics of attachments and is used for extracting the attachment regions.
  8. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 7, wherein extracting the boundaries of attachment regions on the hull surface from the spatial distribution of thickness variation in the enhanced depth map, and generating the region to be cleaned, specifically comprises the following steps: reading the enhanced depth map, and obtaining spatial distribution data characterizing the relative depth change of the hull surface; constructing, on the enhanced depth map, a thickness-variation intensity map describing the depth-change gradient between adjacent positions; extracting, from the intensity map, candidate attachment regions with continuous depth change and stable amplitude; applying a region-connectivity constraint to the candidate attachment regions, merging spatially adjacent candidates to form a set of connected attachment regions; extracting the abrupt depth-change lines at the boundaries of the connected-region set, to generate the attachment region boundaries; and defining each attachment region boundary and the spatial area it encloses as a region to be cleaned, and outputting the region to be cleaned for control of the subsequent region cleaning operation.
  9. The binocular-vision-based region cleaning method for an underwater hull cleaning robot according to claim 8, wherein controlling the underwater hull cleaning robot to perform the region cleaning operation in the corresponding region according to the region to be cleaned comprises: reading the region to be cleaned, and obtaining its spatial position range and boundary contour in a hull-surface coordinate system; mapping the region, based on its boundary contour, into the motion-control coordinate system of the underwater hull cleaning robot, to generate a region correspondence; generating an in-region cleaning track inside the region to be cleaned according to a preset cleaning-coverage rule, such that the track fully covers the region without crossing its boundary; converting the cleaning track into a sequence of motion-control instructions executable by the robot, wherein the instruction sequence comprises pose-adjustment instructions and cleaning-execution instructions; controlling the robot, according to the instruction sequence, to enter the region to be cleaned and move along the cleaning track; executing the cleaning action synchronously while the robot moves, thereby cleaning the hull surface within the region; and, after the cleaning track has been fully covered, ending the current cleaning operation for the region and outputting a region-cleaning-completed status.
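
The distance-estimation step of claims 1 and 2 can be sketched as follows. This is a minimal illustration assuming a rectified pinhole stereo model; the baseline and focal-length values are hypothetical, since the patent specifies no camera parameters.

```python
import numpy as np

BASELINE_M = 0.12   # assumed stereo baseline in metres (not from the patent)
FOCAL_PX = 800.0    # assumed focal length of the rectified pair, in pixels

def average_working_distance(disparity, min_disp=1.0, max_grad=0.5):
    """Select disparity pixels that are finite, above a floor, and locally
    smooth (the "continuous and stably distributed" regions of claim 2),
    then map each disparity d to depth Z = f * B / d and average."""
    d = np.asarray(disparity, dtype=np.float64)
    valid = np.isfinite(d) & (d > min_disp)
    # Reject pixels with a large horizontal disparity gradient: these are
    # likely mismatches or depth discontinuities, not the flat hull plate.
    gx = np.abs(np.diff(d, axis=1, prepend=d[:, :1]))
    stable = valid & (gx < max_grad)
    return float((FOCAL_PX * BASELINE_M / d[stable]).mean())
```

With these assumed parameters, a uniform disparity of 48 px maps to a working distance of 2 m.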
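The frequency-constraint step of claim 3 depends on the water-body attenuation model, which the patent does not specify. As a sketch, assume a simple exponential contrast model C(f, z) = c0 · exp(−k · f · z), where f is the fringe spatial frequency and z the working distance; the constants c0, k, and the threshold c_min are illustrative assumptions.

```python
import numpy as np

def structured_light_frequency_limit(distance_m, c0=1.0, k=0.35, c_min=0.2):
    """Solve C(f, z) >= c_min for f under the assumed model
    C(f, z) = c0 * exp(-k * f * z), giving f_max = ln(c0/c_min) / (k * z).
    Returns a constraint-parameter dict shaped as claim 3 describes:
    upper-limit value, allowable band range, and a band identifier."""
    f_max = float(np.log(c0 / c_min) / (k * distance_m))
    return {"f_max": f_max, "band": (0.0, f_max), "band_id": "underwater-lowpass"}
```

The key behaviour is that the usable band shrinks as the working distance grows, which is why the pattern must be regenerated per distance estimate.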
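The composite-pattern generation of claim 4 can be sketched as the superposition of one low-band and one medium-band sinusoidal fringe, both strictly below the frequency upper limit. The band-split ratios below are illustrative choices; the claim only requires that the composite spectrum stay inside the effective band.

```python
import numpy as np

def composite_fringe_pattern(width, height, f_max, low_ratio=0.15, mid_ratio=0.6):
    """Superpose a low-band and a medium-band vertical fringe, with
    frequencies (in cycles/pixel, for simplicity) at fixed fractions of
    f_max so the whole spectrum stays inside the allowed band."""
    x = np.arange(width)
    f_low, f_mid = low_ratio * f_max, mid_ratio * f_max
    row = (0.5 + 0.25 * np.cos(2 * np.pi * f_low * x)
               + 0.25 * np.cos(2 * np.pi * f_mid * x))  # intensity in [0, 1]
    return np.tile(row, (height, 1))
```

A real projector pattern would also carry the per-primitive direction and phase parameters the claim mentions; the sketch fixes both for brevity.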
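The band-limited feature maps of claim 6 can be sketched with an FFT annulus mask: only frequencies the structured light is known to survive are kept before matching costs are computed.

```python
import numpy as np

def band_limited(img, f_lo, f_hi):
    """Keep only the spatial-frequency annulus [f_lo, f_hi] (cycles/pixel)
    of an image via an FFT mask. In the claimed pipeline this would be
    applied to both views before pixel-by-pixel matching."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    r = np.hypot(fx, fy)
    mask = (r >= f_lo) & (r <= f_hi)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))
```

For example, an image containing fringes at 0.125 and 0.375 cycles/pixel, filtered to the band [0.05, 0.2], retains only the 0.125 component.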
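The reference-surface differencing of claim 7 can be sketched by fitting a least-squares plane to the depth map as a stand-in for the "local depth reference surface" (a real hull may need a curved or locally fitted reference). The residual is positive where the surface sits proud of the plate, i.e. where attachments add thickness.

```python
import numpy as np

def thickness_change_map(depth):
    """Fit a plane z = a*x + b*y + c to the depth map and return
    (plane - depth): an attachment reduces the camera-to-surface depth,
    so it shows up as a positive residual."""
    h, w = depth.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.stack([xx.ravel(), yy.ravel(), np.ones(h * w)], axis=1)
    coef, *_ = np.linalg.lstsq(A, depth.ravel(), rcond=None)
    plane = (A @ coef).reshape(h, w)
    return plane - depth
```

Thresholding this residual and grouping connected components would then yield the candidate attachment regions and boundaries of claim 8.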
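For claim 9, one common "preset cleaning-coverage rule" is a boustrophedon (lawn-mower) track; the claim does not name a specific rule, so this is only one plausible instance, sketched over the region's axis-aligned bounding box.

```python
def lawnmower_path(x0, y0, x1, y1, lane_spacing):
    """Back-and-forth lanes covering the bounding box [x0,x1] x [y0,y1].
    A real planner would additionally clip each lane to the attachment
    boundary contour so the track never leaves the region to be cleaned."""
    path, y, flip = [], y0, False
    while y <= y1:
        path += [(x1, y), (x0, y)] if flip else [(x0, y), (x1, y)]
        y += lane_spacing
        flip = not flip
    return path
```

Each waypoint pair would then be translated into the pose-adjustment and cleaning-execution instructions the claim describes.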

Description

Underwater hull cleaning robot area cleaning method based on binocular vision

Technical Field

The invention relates to the technical field of hull cleaning, in particular to a binocular-vision-based region cleaning method for an underwater hull cleaning robot.

Background

Most existing underwater hull cleaning robots adopt a binocular-vision-based region cleaning approach: stereoscopic image information of the hull surface is acquired by a binocular camera, and the depth distribution of the surface is reconstructed from the disparity relation, so that the hull surface morphology can be perceived and a cleaning path planned on that basis. Such methods generally rely on image texture under natural light or auxiliary lighting, generate a depth map through dense disparity matching, and then identify suspected attachment regions by threshold segmentation or region growing, so as to control the robot to execute local or full cleaning; they have been applied to relatively flat hull-surface scenes.

However, existing binocular-vision-based underwater hull region cleaning methods still have obvious limitations in practice. On one hand, hull steel plates are typically flat over large areas and low in texture, and are further degraded by illumination attenuation and scattering in the underwater environment, so binocular matching is insufficiently stable. On the other hand, attachments tend to cause only tiny thickness variations; under natural-texture conditions the corresponding disparity variation range is limited and is easily drowned out by water-body noise and matching errors.
Some prior art attempts to enhance surface features by artificially introducing high-frequency spatial texture. Without constraints, however, although flat surfaces no longer appear as a "low-texture background", the structured-light deformations caused by attachments remain difficult to distinguish stably in disparity space; the mismatching regions may even be enlarged, so small thickness variations remain insufficiently resolvable in the depth map.

To improve binocular matching stability in low-texture scenes, the related field has proposed structured-light-assisted stereo matching: artificial coded light textures are projected onto the target surface, introducing controllable spatial texture into the imaging process and thereby improving disparity-matching reliability. This approach has been validated in scenes such as terrestrial three-dimensional measurement and industrial inspection; with a reasonably designed structured-light pattern it can effectively enhance the observability of smooth surfaces, so that surface relief that was originally hard to distinguish is expressed more clearly in the disparity and depth domains, and it therefore has certain technical reference value.

However, directly applying structured-light-assisted stereo matching to the underwater hull cleaning scene introduces new technical problems. The water body scatters and absorbs the structured light significantly, so the projected pattern readily degrades during propagation; in particular, in the large-area underwater side-plate scene, the working distance between the robot and the hull is not a fixed short-range condition.
As the propagation distance increases, suspended particles in the water scatter the structured light, the absorption characteristics of different wavelength bands differ markedly in water, high-spatial-frequency structured light attenuates rapidly, and pattern contrast decreases nonlinearly with distance. The binocular system then struggles to extract the structured-light coding features stably at the imaging end, weakening the assistance that structured light provides to disparity matching.

In summary, in the prior art, binocular vision methods based on natural texture can hardly distinguish, in a stable manner, the tiny thickness variations of attachments on low-texture underwater hull surfaces, while directly introducing conventional structured-light-assisted stereo matching is limited by the optical attenuation characteristics of the water body, so the structured light fails at long distances or in complex waters. The prior art therefore lacks a region cleaning method that effectively constrains the structured-light spatial frequency according to the optical characteristics of the water body, guaranteeing binocular matching stability while simultaneously highlighting the thickness differences of attachments; this is the core technical problem to be solved by the invention. The present invention proposes a solution to the above-mentioned problem.