CN-121739980-B - Ultra-shallow water blue-green laser sounding dynamic refraction vision correction method
Abstract
The invention discloses a dynamic refraction vision correction method for blue-green laser sounding in extremely shallow water, belonging to the technical field of ocean mapping and underwater detection and used for laser refraction correction. The method comprises: synchronously acquiring binocular images and inertial measurement unit attitude data and generating a three-dimensional point cloud; indexing the laser spot area in the point cloud from the laser emission angle and obtaining the original normal vector of the water surface by fitting a micro-tangent plane; computing a confidence weight from the image texture sharpness; iteratively computing an optimal estimated normal vector from the original normal vector and the weight; computing the true incidence angle from the optimal estimated normal vector and the laser emission direction; computing the direction vector of the underwater refracted ray by Snell's law; and computing the three-dimensional coordinates of the water bottom at the target point by combining the propagation distances of the laser in air and in water with the emission-origin coordinates. By introducing visual confidence, the invention solves the problem of data failure when conventional visual sounding encounters water-surface reflections, glare or broken waves, and markedly improves robustness under complex sea conditions.
Inventors
- Wang Zongsheng
- Ji Zhenghe
- Zhang Linyu
- Wang Shengli
- Zhang Wei
- Qin Huiqi
- Liu Chengming
- Ping Jinzhi
- Liu Jia
- Zhang Zhihao
Assignees
- 山东科技大学 (Shandong University of Science and Technology)
Dates
- Publication Date: 2026-05-08
- Application Date: 2026-02-26
Claims (8)
- 1. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method is characterized by comprising the following steps:
S1, mounting a blue-green laser depth finder, a binocular vision sensor, a synchronous control unit, an inertial measurement unit and a global navigation satellite system on an unmanned ship; during the measurement run of the unmanned ship, the synchronous control unit generates a unified time reference, the binocular vision sensor exposes at the initial moment of laser emission of the blue-green laser depth finder, the inertial measurement unit records the attitude angle of the blue-green laser depth finder, and the global navigation satellite system measures the current coordinates of the unmanned ship, yielding a binocular image sequence;
S2, performing stereo matching on the binocular image sequence to generate a disparity map, converting the disparity map into a three-dimensional point cloud, setting a calibration matrix and a preset Euclidean-distance radius, indexing the laser spot area in the three-dimensional point cloud using the calibration matrix and the laser emission angle of the blue-green laser depth finder, and fitting an instantaneous inclined spatial plane to the area corresponding to the laser spot center by least squares to obtain the original observed normal vector of the water surface;
S3, setting an image quality threshold, computing adaptive confidence weights from the image texture sharpness scores of the laser drop-point area, and computing the optimal estimated normal vector by a recursion formula based on the original observed water-surface normal vectors and the adaptive confidence weights;
S4, computing the corrected true laser incidence angle from the refractive index of the laser in air, the refractive index of the laser in the water body and the unit vector of the laser's outgoing direction in air, and computing the direction vector of the underwater refracted ray by Snell's law based on the corrected true incidence angle;
S5, computing the three-dimensional coordinates of the water bottom at the target point from the direction vector of the underwater refracted ray, combined with the propagation distance of the laser in air, the propagation distance of the laser in water and the coordinates of the laser emission origin.
S3 comprises:
S3.1, converting the image area corresponding to the laser spot area into a gray-level image; computing the horizontal gradient G_x and the vertical gradient G_y with the Sobel operator; and computing the sum of the squared gradient magnitudes over all pixel points as the raw sharpness value F:
F = Σ_(u,v) (G_x(u,v)² + G_y(u,v)²);
mapping F to the interval [0, 1] to obtain the image texture sharpness score S_k:
S_k = clamp((F − F_min) / (F_max − F_min), 0, 1),
where F_max is a preset empirical maximum sharpness threshold and F_min is a preset empirical minimum sharpness threshold;
S3.2, computing the adaptive confidence weight w_k of the image at moment k:
w_k = 1 / (1 + e^(−κ(S_k − S_th))),
where e is the base of the exponential function, S_th is the preset image quality threshold and κ is a sensitivity coefficient;
S3.3, computing the optimal estimated normal vector n̂_k by the recursion formula:
n̂_k = w_k · n_obs,k + (1 − w_k) · f(n̂_(k−1), ω̂, Δt),
where n̂_(k−1) is the optimal estimated normal vector at the previous moment, Δt is the sampling time interval, ω̂ is an estimate of the wave-surface angular velocity, and f(·) propagates the previous normal over Δt using ω̂.
- 2. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 1, wherein S1 comprises: the attitude angle comprises a roll angle φ, a pitch angle θ and a yaw angle ψ; the synchronous control unit unifies the time references of the left-eye and right-eye image frames of the binocular vision sensor, generating left-eye image frames I_L(t) aligned to the time reference and right-eye image frames I_R(t) aligned to the time reference, and merging I_L(t) and I_R(t) into a binocular image sequence.
- 3. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 2, wherein S2 comprises S2.1, performing epipolar rectification on the binocular image sequence so that corresponding pixels of the left-eye and right-eye frames lie on the same horizontal line, computing the disparity value d(u, v) of corresponding pixel points with a semi-global matching algorithm, and summarizing the disparity values into a disparity map, where u is the horizontal image coordinate and v is the vertical image coordinate.
- 4. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 3, wherein S2 comprises S2.2, mapping each pixel of the disparity map to three-dimensional coordinates in the binocular vision sensor coordinate system with the reprojection matrix Q of the binocular vision sensor, generating the three-dimensional point cloud:
[X, Y, Z, W]^T = Q · [u, v, d(u, v), 1]^T,
where X, Y and Z are the x-, y- and z-axis coordinates in the binocular vision sensor coordinate system, W is the homogeneous component and T is the transpose symbol; setting a calibration matrix, using the calibration matrix and the laser emission angle of the blue-green laser depth finder to obtain the area corresponding to the laser spot center in the three-dimensional point cloud, and fitting an instantaneous inclined spatial plane to that area by least squares to obtain the original observed normal vector of the water surface.
- 5. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 4, wherein S2 comprises S2.3, obtaining the joint calibration matrix of the binocular vision sensor and the blue-green laser depth finder, comprising a rotation matrix R and a translation vector t; computing the ray path of the laser beam in the binocular vision sensor frame from the current emission angle of the laser depth finder; setting the preset Euclidean-distance radius r; and searching the three-dimensional point cloud for the point set P whose Euclidean distance to the laser spot center is smaller than the preset radius r, taking P as the laser spot area.
- 6. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 5, wherein S2 comprises S2.4, the instantaneous inclined spatial plane calculation comprising constructing the objective function E(a, b, c) from the laser spot area points:
E(a, b, c) = Σ_i (a·x_i + b·y_i + c − z_i)²,
where x_i, y_i and z_i are the x-, y- and z-axis coordinates of the i-th laser spot area point; and solving by least squares for the parameters (a, b, c) that minimize E; S2 comprises S2.5, normalizing the fitted plane parameters to obtain the original observed normal vector of the water surface:
n_obs = (−a, −b, 1) / sqrt(a² + b² + 1).
- 7. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 6, wherein S4 comprises S4.1, computing the corrected true laser incidence angle θ_i based on the optimal estimated normal vector n̂:
θ_i = arccos(|n̂ · û|),
where arccos is the inverse cosine function, û is the unit vector of the laser's outgoing direction in air and |·| denotes the absolute value; S4 comprises S4.2, computing the direction vector t of the underwater refracted ray:
t = (n_1/n_2)·û + ((n_1/n_2)·cos θ_i − cos θ_t)·n̂, with cos θ_t = sqrt(1 − (n_1/n_2)²·sin² θ_i),
where n_1 is the refractive index of the laser in air and n_2 is the refractive index of the laser in the water body.
- 8. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method as set forth in claim 7, wherein S5 comprises computing the three-dimensional coordinates P of the water bottom at the target point:
P = O + d_air · û + d_water · t,
where O is the laser emission origin coordinate, namely the three-dimensional coordinate of the laser emission center in the geodetic coordinate system at the emission time, d_air is the propagation distance of the laser in air, and d_water is the propagation distance of the laser in water.
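The geometric steps of claims 5 and 6 — selecting the spot region by a Euclidean-radius search and least-squares fitting the instantaneous water-surface plane — can be sketched in pure Python. Function names, the plane parameterisation z = a·x + b·y + c, and the upward orientation of the normal are illustrative assumptions, not the patent's own notation:

```python
import math

def spot_region(cloud, center, radius):
    """Claim 5 (sketch): keep point-cloud points whose Euclidean distance
    to the predicted laser-spot centre is below the preset radius r."""
    return [p for p in cloud if math.dist(p, center) < radius]

def water_surface_normal(points):
    """Claim 6 (sketch): least-squares fit of z = a*x + b*y + c to the
    spot-region points, returning the unit normal oriented upward (+z)."""
    # Accumulate the 3x3 normal equations  M @ (a, b, c) = rhs.
    Sxx = Sxy = Syy = Sx = Sy = Sxz = Syz = Sz = 0.0
    for x, y, z in points:
        Sxx += x * x; Sxy += x * y; Syy += y * y
        Sx += x; Sy += y
        Sxz += x * z; Syz += y * z; Sz += z
    M = [[Sxx, Sxy, Sx], [Sxy, Syy, Sy], [Sx, Sy, float(len(points))]]
    rhs = [Sxz, Syz, Sz]

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det3(M)
    # Cramer's rule for the plane parameters a, b, c.
    params = []
    for j in range(3):
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = rhs[i]
        params.append(det3(Mj) / d)
    a, b, _c = params
    mag = math.sqrt(a * a + b * b + 1.0)
    return (-a / mag, -b / mag, 1.0 / mag)
```

For points sampled from an ideally flat tilted surface such as z = 0.1x + 0.05y + 2, the fit is exact and the returned normal is proportional to (−0.1, −0.05, 1).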
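Claims 7 and 8 chain into a short computation: incidence angle from the estimated normal, vector-form Snell refraction, then the seabed point from the two propagation distances. A minimal standard-library sketch — the `n_air`/`n_water` defaults and the vector form of Snell's law are textbook optics rather than values lifted from the patent's rendered formulas:

```python
import math

def refract(n_hat, u_hat, n_air=1.0, n_water=1.333):
    """Claim 7 (sketch): true incidence angle theta_i = arccos(|n . u|)
    and the unit direction of the underwater refracted ray."""
    dot = sum(a * b for a, b in zip(n_hat, u_hat))
    cos_i = abs(dot)
    theta_i = math.acos(cos_i)
    r = n_air / n_water
    cos_t = math.sqrt(1.0 - r * r * (1.0 - cos_i * cos_i))
    # Orient the normal against the incoming ray, then apply
    #   t = r*u + (r*cos_i - cos_t)*n   (vector Snell's law).
    n = n_hat if dot < 0 else tuple(-a for a in n_hat)
    t = tuple(r * ui + (r * cos_i - cos_t) * ni for ui, ni in zip(u_hat, n))
    return theta_i, t

def seabed_point(origin, u_hat, t_hat, d_air, d_water):
    """Claim 8 (sketch): P = O + d_air * u + d_water * t."""
    return tuple(o + d_air * ui + d_water * ti
                 for o, ui, ti in zip(origin, u_hat, t_hat))
```

With a vertical ray and a level surface the refracted ray continues straight down; tilting the surface bends the underwater leg by θ_i − θ_t, which is exactly the deviation the correction removes.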
Description
Ultra-shallow water blue-green laser sounding dynamic refraction vision correction method

Technical Field
The invention discloses a dynamic refraction vision correction method for blue-green laser sounding in extremely shallow water, and belongs to the technical field of ocean mapping and underwater detection.

Background
When underwater topography mapping is carried out in extremely shallow waters (water depth less than 2 m) such as river channels and island peripheries, measurement is disturbed by wind and waves, and the water-air interface presents complex dynamic fluctuation characteristics. Conventional shipborne laser sounding systems typically suffer from the following drawbacks.
Severe geometric distortion: the prior art mostly assumes that the water surface is horizontal (the normal vector always points vertically upward), or compensates only the vertical wave height with a laser displacement sensor. However, the slope of a wave changes the incidence angle of the laser, and by Snell's law the direction of the refracted ray deviates severely. At a water depth of 1 m, a water-surface inclination of 10 degrees can shift the position of the bottom spot by more than 15 cm, seriously degrading the plane accuracy of the sounding point.
High hardware cost: to keep the laser vertically incident, conventional equipment is often fitted with an expensive mechanically stabilized gimbal, whose mechanical response speed can hardly keep up with high-frequency wave motion. In very shallow water of 0.2 m to 1 m depth, echo signals mix severely with water-surface clutter, and wave-induced refraction errors account for a very large proportion of the total shallow-water error, making high-precision mapping of such areas difficult.
Therefore, a method is needed that captures the instantaneous wave slope in real time and replaces the mechanical gimbal with a software algorithm performing high-precision refraction correction.

Disclosure of Invention
The invention aims to provide a dynamic refraction vision correction method for blue-green laser sounding in extremely shallow water, so as to solve the problems of insufficient accuracy and water-surface fluctuation interference of conventional sounding technology in extremely shallow water. The ultra-shallow water blue-green laser sounding dynamic refraction vision correction method comprises the following steps:
S1, mounting a blue-green laser depth finder, a binocular vision sensor, a synchronous control unit, an inertial measurement unit and a global navigation satellite system on an unmanned ship; during the measurement run of the unmanned ship, the synchronous control unit generates a unified time reference, the binocular vision sensor exposes at the initial moment of laser emission of the blue-green laser depth finder, the inertial measurement unit records the attitude angle of the blue-green laser depth finder, and the global navigation satellite system measures the current coordinates of the unmanned ship, yielding a binocular image sequence;
S2, performing stereo matching on the binocular image sequence to generate a disparity map, converting the disparity map into a three-dimensional point cloud, setting a calibration matrix and a preset Euclidean-distance radius, indexing the laser spot area in the three-dimensional point cloud using the calibration matrix and the laser emission angle of the blue-green laser depth finder, and fitting an instantaneous inclined spatial plane to the area corresponding to the laser spot center by least squares to obtain the original observed normal vector of the water surface;
S3, setting an image quality threshold, computing adaptive confidence weights from the image texture sharpness scores of the laser drop-point area, and computing the optimal estimated normal vector by a recursion formula based on the original observed water-surface normal vectors and the adaptive confidence weights;
S4, computing the corrected true laser incidence angle from the refractive index of the laser in air, the refractive index of the laser in the water body and the unit vector of the laser's outgoing direction in air, and computing the direction vector of the underwater refracted ray by Snell's law based on the corrected true incidence angle;
S5, computing the three-dimensional coordinates of the water bottom at the target point from the direction vector of the underwater refracted ray, combined with the propagation distance of the laser in air, the propagation distance of the laser in water and the coordinates of the laser emission origin.
S1 comprises: the attitude angle comprises a roll angle, a pitch angle and a yaw angle.
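The adaptive-confidence fusion of step S3 can be sketched as follows. The exact weight and recursion formulas are rendered as images in the source, so the logistic weight and the complementary-filter update below are plausible standard forms consistent with the symbols the text describes (quality threshold, sensitivity coefficient, previous-moment normal), not the patent's verbatim equations:

```python
import math

def sharpness_score(F, F_min, F_max):
    """S3.1 (sketch): map the raw gradient-energy value F onto [0, 1]
    using the preset empirical min/max sharpness thresholds."""
    return min(1.0, max(0.0, (F - F_min) / (F_max - F_min)))

def confidence_weight(S, S_thr, kappa):
    """S3.2 (assumed logistic form): near 1 for sharp texture, near 0
    under glare, reflections or broken waves."""
    return 1.0 / (1.0 + math.exp(-kappa * (S - S_thr)))

def fuse_normal(n_prev, n_obs, w):
    """S3.3 (assumed complementary-filter recursion):
    n_k = w * n_obs + (1 - w) * n_prev, renormalised to unit length.
    A fuller model would additionally propagate n_prev over the sampling
    interval using the wave angular-velocity estimate."""
    n = [w * o + (1.0 - w) * p for o, p in zip(n_obs, n_prev)]
    mag = math.sqrt(sum(c * c for c in n))
    return [c / mag for c in n]
```

When the spot image is washed out by glare the weight collapses toward 0 and the filter coasts on the previous estimate, which is the robustness behaviour the abstract claims for reflective and broken-wave conditions.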