CN-121095091-B - Underwater image enhancement model training method, system and equipment
Abstract
The invention discloses a training method, system, and device for an underwater image enhancement model, relating to the technical field of image enhancement. The method comprises: extracting a natural light field map and a detail map of an underwater image; combining the detail map, after Gaussian filtering, with the natural light field map to obtain a detail degradation map; determining the non-offset color channel among the green and blue channels of the natural light field map; performing color compensation on the red channel and the non-offset color channel; combining the color-compensated natural light field map with the detail map into a color compensation map; reducing the brightness of the underwater image to obtain a low-brightness map; adding Gaussian noise to the underwater image to obtain a noise map; and taking the detail degradation map, the low-brightness map, and the noise map as negative samples, and the reference clear map and the color compensation map as positive samples, so as to complete training of the underwater image enhancement model. The color calibration capability and contrast enhancement performance of the underwater image enhancement model on underwater images are remarkably improved.
Inventors
- ZHANG LONG
- DU HONGWEI
- WANG JIAN
- LIU JIQIAO
- WANG WEI
- SUN YIHAO
- LIANG GE
Assignees
- 浪潮通用软件有限公司 (Inspur General Software Co., Ltd.)
Dates
- Publication Date: 2026-05-05
- Application Date: 2025-11-11
Claims (8)
- 1. An underwater image enhancement model training method, characterized by comprising the following steps: extracting a natural light field map of an underwater image, and obtaining a detail map from the underwater image and the natural light field map; performing Gaussian filtering on the detail map, and combining the filtered detail map with the natural light field map to obtain a detail degradation map; determining the non-offset color channel among the green and blue channels of the natural light field map, performing color compensation on the red channel and the non-offset color channel, and combining the color-compensated natural light field map with the detail map into a color compensation map, the color compensation being: [formula omitted], wherein the symbols denote the color channel before compensation, the color channel with the highest mean value, and the compensated color channel; reducing the brightness of the underwater image to obtain a low-brightness map, wherein the brightness is highest at the center of the light spot and decreases gradually along the radial direction; adding Gaussian noise to the underwater image to obtain a noise map; taking the detail degradation map, the low-brightness map, and the noise map as negative samples, and the reference clear map and the color compensation map as positive samples, so as to complete training of a pre-constructed underwater image enhancement model; and determining a contrastive learning loss function using sample weights: [formula omitted], wherein the symbols denote the reference clear map; the color compensation map K and the weight of the color compensation map; the model output map; the i-th hidden feature extracted from a pre-trained ResNet model, with C being the total number of hidden features; the coefficients, set to 1/16, 1/8, 1/4, and 1; N, the number of negative samples; the weight of the n-th negative sample; and a positive number approaching 0.
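The weighted contrastive loss of claim 1 can be sketched as follows. This is a minimal illustration, not the patent's exact formula (the formula images are omitted from the source): the ratio form, the L1 feature distance, and all function names here are assumptions consistent with the described ingredients (two weighted positives, N weighted negatives, per-level coefficients 1/16, 1/8, 1/4, 1, and a small stabilizer ε).

```python
import numpy as np

def l1(a, b):
    """Mean absolute distance between two feature maps."""
    return float(np.mean(np.abs(a - b)))

def contrastive_loss(feats_out, feats_ref, feats_comp, feats_negs,
                     w_comp=0.1, neg_weights=None, eps=1e-8,
                     coeffs=(1/16, 1/8, 1/4, 1.0)):
    """Weighted contrastive loss over C hidden-feature levels.

    Each feats_* argument is a list of C feature maps (one per stage of
    a pre-trained ResNet); feats_negs is a list of such lists, one per
    negative sample. Positives are pulled toward the model output,
    weighted negatives are pushed away; the exact form is an assumption.
    """
    if neg_weights is None:
        neg_weights = [1.0] * len(feats_negs)
    loss = 0.0
    for i, lam in enumerate(coeffs):
        # numerator: weighted distances to the positive samples (pull)
        num = l1(feats_out[i], feats_ref[i]) + w_comp * l1(feats_out[i], feats_comp[i])
        # denominator: weighted distances to the negative samples (push)
        den = sum(w * l1(feats_out[i], fn[i])
                  for w, fn in zip(neg_weights, feats_negs)) + eps
        loss += lam * num / den
    return loss
```

In practice the feature lists would come from fixed intermediate ResNet stages, with gradients flowing only through the model output's features.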
- 2. The method of claim 1, wherein determining the non-offset color channel among the green and blue channels of the natural light field map comprises: calculating the green channel mean and blue channel mean of the underwater image; calculating the green channel mean and blue channel mean of the reference clear image; and calculating, for the green and blue channels, the mean difference between the underwater image and the reference clear image, wherein the channel with the larger difference is the offset color channel and the other channel is the non-offset color channel.
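The channel-selection step of claim 2 amounts to comparing per-channel means. A minimal sketch (array layout and function name are illustrative assumptions):

```python
import numpy as np

def non_offset_channel(underwater, reference):
    """Determine the non-offset channel among green and blue (claim 2).

    Both inputs are H x W x 3 RGB arrays. The channel whose mean differs
    most between the underwater image and the reference clear image is
    the offset channel; the other one is returned as non-offset.
    """
    diffs = {
        name: abs(float(underwater[..., idx].mean())
                  - float(reference[..., idx].mean()))
        for name, idx in (("green", 1), ("blue", 2))
    }
    offset = max(diffs, key=diffs.get)  # largest mean difference = offset channel
    return "blue" if offset == "green" else "green"
```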
- 3. The underwater image enhancement model training method according to claim 1, wherein the brightness reduction processing of the underwater image is: [formulas omitted]; wherein the symbols denote, respectively, the underwater image and the low-brightness image; the center coordinates of the artificial illumination spot; the spot radius and the maximum illumination intensity; and the basic brightness coefficient of the region outside the spot in the underwater image.
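The spot-illumination model of claim 3 can be sketched as below. Since the patent's formulas are omitted from the source, the Gaussian radial falloff used here is an illustrative assumption; only the described behavior (peak at the spot center, radial decrease, base brightness outside the spot) is taken from the claim.

```python
import numpy as np

def low_brightness(img, center, radius, max_intensity=1.0, base=0.2):
    """Simulate artificial illumination to build the low-brightness map.

    img is an H x W x 3 float image in [0, 1]; center = (cx, cy) is the
    spot center, radius the spot radius, max_intensity the peak
    illumination, and base the brightness coefficient outside the spot.
    """
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(xx - center[0], yy - center[1])          # radial distance
    # brightness mask: max_intensity at the center, decaying to base
    mask = base + (max_intensity - base) * np.exp(-(r / radius) ** 2)
    return img * mask[..., None]
```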
- 4. The underwater image enhancement model training method according to claim 1, comprising performing multiple training iterations on the underwater image enhancement model, calculating a loss value from the contrastive learning loss function in each iteration, recalculating the positive sample weights and the contrastive learning loss function after every set number of iterations, and ending training when the loss value is smaller than a preset value.
- 5. The underwater image enhancement model training method according to claim 4, wherein the weights of the detail degradation map, the low-brightness map, and the noise map in contrastive learning are set to 1, the weight of the reference clear map in contrastive learning is set to 1, and the weight of the color compensation map in contrastive learning is set to 0.1; after every set number of iterations, the peak signal-to-noise ratio between the model output map of the underwater image enhancement model and the reference clear map, and the peak signal-to-noise ratio between the color compensation map and the reference clear map, are calculated respectively; if the peak signal-to-noise ratio between the color compensation map and the reference clear map is greater than or equal to the peak signal-to-noise ratio between the model output map and the reference clear map, the color compensation map is kept as a positive sample and its weight is adjusted to [value omitted]; otherwise, the color compensation map is changed into a negative sample in contrastive learning and its weight is adjusted to [value omitted].
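The periodic role check of claim 5 can be sketched as follows. The adjusted-weight formulas are omitted from the source, so this sketch only decides whether the color compensation map remains a positive sample; the PSNR helper and all names are illustrative.

```python
import numpy as np

def psnr(a, b, peak=1.0):
    """Peak signal-to-noise ratio between two images scaled to [0, peak]."""
    mse = float(np.mean((a - b) ** 2))
    return 10.0 * np.log10(peak ** 2 / (mse + 1e-12))

def compensation_role(model_out, comp, ref):
    """Claim 5 check: keep the color compensation map as a positive
    sample while it is at least as close to the reference clear map as
    the model output is; otherwise demote it to a negative sample.
    """
    return "positive" if psnr(comp, ref) >= psnr(model_out, ref) else "negative"
```

Intuitively, once the model output overtakes the hand-crafted color compensation map in PSNR, the compensation map stops being a useful target and is pushed away instead.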
- 6. The underwater image enhancement model training method according to claim 1, wherein multi-scale Gaussian filtering is applied to the underwater image, the filtered underwater image is transformed into the logarithmic domain and scaled to obtain the natural light field map of the underwater image, and a pixel-wise difference between the underwater image and the corresponding natural light field map yields the detail map.
- 7. An underwater image enhancement model training system, comprising: a separation module configured to extract a natural light field map of an underwater image and obtain a detail map from the underwater image and the natural light field map; a detail map processing module configured to perform Gaussian filtering on the detail map and combine the filtered detail map with the natural light field map to obtain a detail degradation map; a natural light field map processing module configured to determine the non-offset color channel among the green and blue channels of the natural light field map, perform color compensation on the red channel and the non-offset color channel, and combine the color-compensated natural light field map with the detail map into a color compensation map, the color compensation being: [formula omitted], wherein the symbols denote the color channel before compensation, the color channel with the highest mean value, and the compensated color channel; a first original-image processing module configured to reduce the brightness of the underwater image to obtain a low-brightness map, wherein the brightness is highest at the center of the light spot and decreases gradually along the radial direction; a second original-image processing module configured to add Gaussian noise to the underwater image to obtain a noise map; and a training module configured to take the detail degradation map, the low-brightness map, and the noise map as negative samples and the reference clear map and the color compensation map as positive samples, so as to complete training of a pre-constructed underwater image enhancement model, and to determine a contrastive learning loss function using sample weights: [formula omitted], wherein the symbols denote the reference clear map; the color compensation map K and the weight of the color compensation map; the model output map; the i-th hidden feature extracted from a pre-trained ResNet model, with C being the total number of hidden features; the coefficients, set to 1/16, 1/8, 1/4, and 1; N, the number of negative samples; the weight of the n-th negative sample; and a positive number approaching 0.
- 8. An electronic device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the instructions, when executed by the processor, perform the method of any one of claims 1-6.
Description
Underwater image enhancement model training method, system and equipment

Technical Field

The present invention relates to the field of image enhancement technologies, and in particular to a training method, system, and apparatus for an underwater image enhancement model.

Background

Underwater images carry rich color information and, as an important medium for sensing the underwater environment and acquiring key information, are significant in fields such as marine scientific research and underwater engineering operations. However, because water absorbs light of different wavelengths and suspended matter scatters light, acquired underwater images exhibit degradation characteristics such as color shift and low contrast, which seriously reduce the visibility and interpretability of the image. Traditional underwater image enhancement methods either adjust the pixel values of the image or construct a physical imaging model to obtain an enhanced image; such methods apply to few scenes and have limited effect on detail enhancement. Deep-learning-based underwater image enhancement models can adaptively learn the degradation characteristics of underwater images; in particular, a supervised training model establishes a mapping relationship between the underwater image and the enhanced image and adapts well to varied environments. Preprocessing the input images before training can improve the robustness of the trained model and expand the training samples. However, existing preprocessing methods perform no preprocessing targeted at the characteristics of underwater images, and therefore cannot improve the model's ability to learn the physical characteristics of underwater degradation.
Disclosure of Invention

To solve these problems, the present invention provides a training method, system, and device for an underwater image enhancement model, which combine the degradation characteristics of underwater images with a contrastive learning mechanism to remarkably improve the model's color calibration capability and contrast enhancement performance on underwater images, addressing the problems of color deviation and low contrast in underwater images. To this end, the present invention adopts the following technical scheme. In a first aspect, the present invention provides a training method for an underwater image enhancement model, including: extracting a natural light field map of the underwater image, and obtaining a detail map from the underwater image and the natural light field map; performing Gaussian filtering on the detail map, and combining the filtered detail map with the natural light field map to obtain a detail degradation map; determining the non-offset color channel among the green and blue channels of the natural light field map, performing color compensation on the red channel and the non-offset color channel, and combining the color-compensated natural light field map with the detail map into a color compensation map; reducing the brightness of the underwater image to obtain a low-brightness map, wherein the brightness is highest at the center of the light spot and decreases gradually along the radial direction; adding Gaussian noise to the underwater image to obtain a noise map; and taking the detail degradation map, the low-brightness map, and the noise map as negative samples and the reference clear map and the color compensation map as positive samples, thereby completing training of the pre-constructed underwater image enhancement model.
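The Gaussian-noise negative sample from the summary above can be generated as follows. The patent only states that Gaussian noise is added to the underwater image; the noise level and clipping range here are illustrative assumptions.

```python
import numpy as np

def make_noise_image(img, sigma=0.05, seed=0):
    """Build a noise-map negative sample by adding Gaussian noise to an
    underwater image in [0, 1] and clipping back to the valid range.
    """
    rng = np.random.default_rng(seed)
    return np.clip(img + rng.normal(0.0, sigma, img.shape), 0.0, 1.0)
```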
As an alternative embodiment, the process of determining the non-offset color channel among the green and blue channels of the natural light field map includes: calculating the green channel mean and blue channel mean of the underwater image; calculating the green channel mean and blue channel mean of the reference clear image; and calculating, for the green and blue channels, the mean difference between the underwater image and the reference clear image, wherein the channel with the larger difference is the offset color channel and the other channel is the non-offset color channel. As an alternative embodiment, the color compensation is: [formula omitted]; wherein the symbols denote the color channel before compensation, the color channel with the highest mean value, and the compensated color channel. As an alternative embodiment, the brightness reduction processing of the underwater image is: [formulas omitted]; wherein the symbols denote, respectively, the underwater image and the low-brightness image; the center coordinates of the artificial illumination spot; and the spo