CN-122023884-A - Automatic water and soil loss monitoring method and system based on image recognition
Abstract
The application relates to the technical field of image processing and discloses an automatic water and soil loss monitoring method and system based on image recognition. The method comprises the steps of obtaining multi-temporal, multi-scale images; establishing a feature database and associating environmental parameters; retrieving records of similar environmental history from the feature database; calculating feature reliability scoring vectors; inputting the reliability scoring vectors into an attention network to obtain weight vectors; weighting and fusing the features; and, after a classifier identifies the water and soil loss level, updating the feature database by calculating each feature's accuracy contribution. The application improves the recognition stability and accuracy of the water and soil loss monitoring system under different illumination conditions, cloud cover, and vegetation phenological periods, and realizes environmental self-adaptation and continuous performance optimization during long-term operation of the monitoring system.
Inventors
- WU WEIYI
- WANG XIAOLING
- XIN KAIFENG
Assignees
- 承德市嘉宏环境科技有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20260113
Claims (10)
- 1. An automatic water and soil loss monitoring method based on image recognition, characterized by comprising the following steps: Step S1, acquiring multi-temporal, multi-scale images of a monitoring area, extracting topographic features, erosion features and soil features at different time levels and spatial scales, establishing a feature database, and associating the environmental parameters corresponding to each feature; Step S2, extracting the illumination intensity, cloud occlusion rate and vegetation phenological period of an image to be identified as current environmental parameters; retrieving from the feature database historical records whose illumination intensity difference is smaller than a preset illumination threshold, whose cloud occlusion rate difference is smaller than a preset occlusion threshold, and whose vegetation phenological period is the same; performing a weighted-average calculation over the recognition accuracy of each feature type in the historical records and the environmental similarity weights, the environmental similarity weights being calculated with a Gaussian function from the illumination intensity difference and the cloud occlusion rate difference; and outputting a reliability scoring vector for each feature type; Step S3, inputting the current environmental parameters and the reliability scoring vector into an attention network whose output layer is normalized with a Softmax function to obtain a weight vector, and weighting and fusing the multi-scale features of the image to be identified with the weight vector to obtain a fused feature vector; and Step S4, inputting the fused feature vector into a classifier to identify the water and soil loss level, and, after verification data are obtained, calculating each feature's accuracy contribution and updating the reliability scores in the feature database.
- 2. The automatic water and soil loss monitoring method based on image recognition according to claim 1, wherein step S1 comprises: acquiring image data of the monitoring area at the daily, weekly, monthly and quarterly scales with monitoring equipment, and decomposing the image data according to three spatial scales, namely macroscopic topography, mesoscopic slope surface and microscopic erosion point, to obtain a multi-scale image set; extracting a topographic gradient change rate, an overall vegetation coverage and a bare soil area ratio from the macroscopic topography images in the multi-scale image set, extracting an erosion gully length density, a slope roughness index and a depositional fan distribution area from the mesoscopic slope images, and extracting soil color parameters, a histogram of oriented gradients of the surface texture and an erosion gully wall inclination angle from the microscopic erosion point images, to obtain a feature parameter set; organizing the feature parameter set into a three-dimensional feature tensor along three dimensions, namely time level, spatial scale and feature type; and extracting the illumination intensity, cloud occlusion rate and vegetation phenological period identifier at each image acquisition time as environmental parameters, and establishing an index association between the three-dimensional feature tensor and the environmental parameters to obtain the feature database.
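The three-dimensional feature tensor of claim 2 could be organized as in the following minimal Python sketch; the scale and feature names, the number of features per scale, and the environment-record fields are illustrative assumptions, not taken from the patent.

```python
import numpy as np

# Hypothetical sketch: organize the feature parameter set of claim 2 into a
# 3-D tensor indexed as (time level, spatial scale, feature type), with an
# environment-parameter record attached per acquisition.

TIME_LEVELS = ["daily", "weekly", "monthly", "quarterly"]
SPATIAL_SCALES = ["macro_topography", "meso_slope", "micro_erosion_point"]
FEATURES_PER_SCALE = 3  # e.g. gradient change rate, vegetation coverage, bare-soil ratio

def build_feature_tensor(records):
    """records: dict mapping (time_level, scale) -> list of feature values."""
    tensor = np.full(
        (len(TIME_LEVELS), len(SPATIAL_SCALES), FEATURES_PER_SCALE), np.nan
    )
    for (t, s), values in records.items():
        tensor[TIME_LEVELS.index(t), SPATIAL_SCALES.index(s), :] = values
    return tensor

# A feature-database entry: the tensor plus the environment parameters
# recorded at acquisition time (illumination, occlusion rate, phenology id).
entry = {
    "tensor": build_feature_tensor({("daily", "macro_topography"): [0.12, 0.55, 0.20]}),
    "environment": {"illumination": 820.0, "cloud_rate": 0.15, "phenology": "growing"},
}
```

Keeping the environment record alongside each tensor is what enables the similarity retrieval of claim 4.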
- 3. The automatic water and soil loss monitoring method based on image recognition according to claim 2, wherein extracting the illumination intensity, cloud occlusion rate and vegetation phenological period of the image to be identified as current environmental parameters in step S2 comprises: obtaining the current illumination intensity by calculating the pixel mean of the red, green and blue channels of the image to be identified and multiplying this mean by a calibration coefficient; calculating the contrast standard deviation of each local region of the image to be identified, judging regions whose contrast standard deviation is smaller than a preset contrast threshold to be cloud-occluded regions, and taking the ratio of the number of pixels in the cloud-occluded regions to the total number of pixels as the cloud occlusion rate; and calculating a normalized difference vegetation index (NDVI) value of the image to be identified and determining the vegetation phenological period identifier from the numerical interval in which the NDVI value lies, to obtain the current environmental parameters.
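The three extractions of claim 3 (calibrated RGB mean, local-contrast cloud detection, NDVI-interval phenology) might be sketched as follows. The calibration coefficient, contrast threshold, tile size and NDVI intervals are placeholder values, and a near-infrared band is assumed to be available for NDVI, which the claim does not specify.

```python
import numpy as np

CALIBRATION = 3.2          # hypothetical calibration coefficient
CONTRAST_THRESHOLD = 8.0   # regions flatter than this count as cloud
WINDOW = 16                # local-region size in pixels

def illumination_intensity(rgb):
    # Mean of the R, G, B channels scaled by a calibration coefficient.
    return float(rgb.mean()) * CALIBRATION

def cloud_cover_rate(gray):
    # Tile the image; low local contrast (std-dev) marks a cloud region.
    h, w = gray.shape
    cloud_pixels = 0
    for i in range(0, h - WINDOW + 1, WINDOW):
        for j in range(0, w - WINDOW + 1, WINDOW):
            block = gray[i:i + WINDOW, j:j + WINDOW]
            if block.std() < CONTRAST_THRESHOLD:
                cloud_pixels += block.size
    return cloud_pixels / gray.size

def phenology_id(red, nir):
    # NDVI = (NIR - Red) / (NIR + Red); bucket the mean by interval.
    ndvi = (nir - red) / (nir + red + 1e-9)
    mean_ndvi = float(ndvi.mean())
    if mean_ndvi < 0.2:
        return "dormant"
    if mean_ndvi < 0.5:
        return "green-up"
    return "peak-growth"
```

A production version would calibrate the thresholds per sensor and use a proper cloud-masking product where available.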
- 4. The automatic water and soil loss monitoring method based on image recognition according to claim 3, wherein retrieving from the feature database, in step S2, historical records whose illumination intensity difference is smaller than a preset illumination threshold, whose cloud occlusion rate difference is smaller than a preset occlusion threshold and whose vegetation phenological period is the same comprises: reading the environmental parameters corresponding to each historical record from the feature database, calculating the difference between each record's illumination intensity and the current illumination intensity as the illumination intensity difference, and calculating the difference between each record's cloud occlusion rate and the current cloud occlusion rate as the cloud occlusion rate difference; screening the historical records whose illumination intensity difference is smaller than the preset illumination threshold, whose cloud occlusion rate difference is smaller than the preset occlusion threshold, and whose vegetation phenological period identifier matches the current identifier, to obtain a similar-environment historical record set; and reading, from the similar-environment historical record set, the feature parameters corresponding to each record and the accuracy data of each feature parameter in historical recognition tasks.
- 5. The automatic water and soil loss monitoring method based on image recognition according to claim 4, wherein performing, in step S2, the weighted-average calculation over the recognition accuracy of each feature type in the historical records and the environmental similarity weights, calculating the environmental similarity weights with a Gaussian function from the illumination intensity difference and the cloud occlusion rate difference, and outputting the reliability scoring vector of each feature type comprises: dividing the illumination intensity difference by a preset illumination normalization coefficient and dividing the cloud occlusion rate difference by a preset occlusion normalization coefficient to obtain a normalized illumination difference and a normalized occlusion difference; squaring the normalized illumination difference and the normalized occlusion difference respectively, adding the squares, negating the sum, and applying an exponential operation to obtain the environmental similarity weight of each historical record; for each feature type in the similar-environment historical record set, multiplying the recognition accuracy of that feature type in each historical record by the environmental similarity weight of that record, summing the products, and dividing the sum by the sum of all environmental similarity weights to obtain the reliability score of that feature type; and arranging the reliability scores of all feature types in feature-type order to obtain the reliability scoring vector.
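The Gaussian similarity weighting and the weighted-average reliability score of claim 5 reduce to a few lines of numpy; the normalization coefficients below are placeholder values.

```python
import numpy as np

LIGHT_NORM = 200.0   # preset illumination normalization coefficient (assumed)
SHIELD_NORM = 0.3    # preset occlusion normalization coefficient (assumed)

def similarity_weights(light_diffs, shield_diffs):
    dl = np.asarray(light_diffs) / LIGHT_NORM
    ds = np.asarray(shield_diffs) / SHIELD_NORM
    # Gaussian in the normalized differences: exp(-(dl^2 + ds^2)).
    return np.exp(-(dl ** 2 + ds ** 2))

def reliability_vector(accuracy_matrix, weights):
    """accuracy_matrix: (n_records, n_feature_types) historical accuracies.

    Returns the similarity-weighted average accuracy per feature type,
    i.e. the reliability scoring vector of claim 5."""
    w = np.asarray(weights)
    return (accuracy_matrix * w[:, None]).sum(axis=0) / w.sum()
```

Records whose environment exactly matches the current one receive weight 1, so the reliability score degenerates to a plain mean over those records.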
- 6. The automatic water and soil loss monitoring method based on image recognition according to claim 1, wherein step S3 comprises: extracting features of the image to be identified at the three spatial scales of macroscopic topography, mesoscopic slope and microscopic erosion point, to obtain a current multi-scale feature matrix; flattening the current multi-scale feature matrix into a feature vector, and expanding the reliability scoring vector along the spatial-scale dimension to match, obtaining an expanded reliability vector; splicing the feature vector and the expanded reliability vector and feeding them into the input layer of an attention network, processing through two fully connected hidden layers, and transmitting to an output layer that normalizes the neuron activation values with a Softmax function to obtain a weight vector; and performing an element-wise product of each weight coefficient in the weight vector with the corresponding feature element in the current multi-scale feature matrix and summing to obtain the fused feature vector.
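One possible shape for the attention fusion of claim 6 is sketched below. The layer sizes are arbitrary and the weights are random stand-ins for a trained network; the summation over scales is one plausible reading of the claim's "element-wise product and summing".

```python
import numpy as np

rng = np.random.default_rng(0)  # stand-in for trained network parameters

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_fuse(feature_matrix, reliability_vector):
    """feature_matrix: (n_scales, n_feats); reliability_vector: (n_feats,)."""
    n_scales, n_feats = feature_matrix.shape
    flat = feature_matrix.ravel()
    # Expand the reliability vector across the spatial scales to match.
    expanded = np.tile(reliability_vector, n_scales)
    x = np.concatenate([flat, expanded])        # spliced input layer
    W1 = rng.standard_normal((16, x.size))
    W2 = rng.standard_normal((16, 16))
    W3 = rng.standard_normal((flat.size, 16))   # one output weight per feature
    h = np.tanh(W2 @ np.tanh(W1 @ x))           # two fully connected hidden layers
    weights = softmax(W3 @ h)                   # Softmax-normalized weight vector
    # Weight each feature element, then sum over scales to fuse.
    fused = (weights * flat).reshape(n_scales, n_feats).sum(axis=0)
    return weights, fused
```

Because the output passes through Softmax, the weight coefficients are non-negative and sum to one regardless of the network parameters.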
- 7. The automatic water and soil loss monitoring method based on image recognition according to claim 1, wherein step S4 comprises: inputting the fused feature vector into a support vector machine classifier, calculating the radial basis function value between the fused feature vector and the support vectors of each level, calculating the decision function value of each level from these function values, and selecting the level with the largest decision function value as the predicted level; obtaining field verification data of the monitoring area corresponding to the predicted level, and comparing the field verification data with the predicted level to judge recognition correctness and obtain a correctness mark; multiplying each weight coefficient in the weight vector by the correctness mark to obtain the accuracy contribution of each feature; and constructing the current environmental parameters, the accuracy contribution of each feature and the correctness mark into a feedback sample, storing the feedback sample in the feature database, and updating the reliability scores of the corresponding feature parameters according to the correctness mark with a moving-average method.
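Claim 7's decision and update steps might look like the following sketch. The support vectors, dual coefficients, kernel width, smoothing factor, and the exact form of the moving-average update are all assumptions; the claim does not pin them down.

```python
import numpy as np

GAMMA = 0.5   # RBF kernel width (assumed)
ALPHA = 0.1   # moving-average smoothing factor (assumed)

def rbf(x, sv):
    # Radial basis function value between x and each support vector in sv.
    return np.exp(-GAMMA * np.sum((x - sv) ** 2, axis=-1))

def predict_level(fused, level_svs, level_coefs, level_bias):
    # Decision value per level: sum of coef * K(fused, sv) + bias,
    # then pick the level with the largest decision value.
    scores = {
        lvl: float(level_coefs[lvl] @ rbf(fused, level_svs[lvl]) + level_bias[lvl])
        for lvl in level_svs
    }
    return max(scores, key=scores.get)

def update_reliability(scores, weights, correct):
    # Accuracy contribution = fusion weight times the correctness mark;
    # nudge the stored reliability scores toward it (one plausible
    # moving-average reading of the claim).
    mark = 1.0 if correct else 0.0
    contribution = np.asarray(weights) * mark
    return (1 - ALPHA) * np.asarray(scores) + ALPHA * contribution
```

In practice the per-level one-vs-rest decision functions would come from a trained SVM library rather than hand-set coefficients.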
- 8. An automatic water and soil loss monitoring system based on image recognition, used to implement the automatic water and soil loss monitoring method based on image recognition according to any one of claims 1 to 7, the system comprising: an extraction module for acquiring multi-temporal, multi-scale images of the monitoring area, extracting topographic features, erosion features and soil features at different time levels and spatial scales, establishing a feature database, and associating the environmental parameters corresponding to each feature; a retrieval module for extracting the illumination intensity, cloud occlusion rate and vegetation phenological period of the image to be identified as current environmental parameters, retrieving from the feature database historical records whose illumination intensity difference is smaller than a preset illumination threshold, whose cloud occlusion rate difference is smaller than a preset occlusion threshold and whose vegetation phenological period is the same, performing a weighted-average calculation over the recognition accuracy of each feature type in the historical records and the environmental similarity weights, calculating the environmental similarity weights with a Gaussian function from the illumination intensity difference and the cloud occlusion rate difference, and outputting a reliability scoring vector for each feature type; an input module for inputting the current environmental parameters and the reliability scoring vector into an attention network whose output layer is normalized with a Softmax function to obtain a weight vector, and weighting and fusing the multi-scale features of the image to be identified with the weight vector to obtain a fused feature vector; and an updating module for inputting the fused feature vector into a classifier to identify the water and soil loss level, calculating each feature's accuracy contribution after verification data are obtained, and updating the reliability scores in the feature database.
- 9. The system of claim 8, wherein acquiring multi-temporal, multi-scale images of the monitoring area, extracting topographic features, erosion features and soil features at different time levels and spatial scales, establishing the feature database and associating the environmental parameters corresponding to each feature comprises: acquiring image data of the monitoring area at the daily, weekly, monthly and quarterly scales with monitoring equipment, and decomposing the image data according to three spatial scales, namely macroscopic topography, mesoscopic slope surface and microscopic erosion point, to obtain a multi-scale image set; extracting a topographic gradient change rate, an overall vegetation coverage and a bare soil area ratio from the macroscopic topography images in the multi-scale image set, extracting an erosion gully length density, a slope roughness index and a depositional fan distribution area from the mesoscopic slope images, and extracting soil color parameters, a histogram of oriented gradients of the surface texture and an erosion gully wall inclination angle from the microscopic erosion point images, to obtain a feature parameter set; organizing the feature parameter set into a three-dimensional feature tensor along three dimensions, namely time level, spatial scale and feature type; and extracting the illumination intensity, cloud occlusion rate and vegetation phenological period identifier at each image acquisition time as environmental parameters, and establishing an index association between the three-dimensional feature tensor and the environmental parameters to obtain the feature database.
- 10. The system of claim 8, wherein extracting the illumination intensity, cloud occlusion rate and vegetation phenological period of the image to be identified as current environmental parameters comprises: obtaining the current illumination intensity by calculating the pixel mean of the red, green and blue channels of the image to be identified and multiplying this mean by a calibration coefficient; calculating the contrast standard deviation of each local region of the image to be identified, judging regions whose contrast standard deviation is smaller than a preset contrast threshold to be cloud-occluded regions, and taking the ratio of the number of pixels in the cloud-occluded regions to the total number of pixels as the cloud occlusion rate; and calculating a normalized difference vegetation index (NDVI) value of the image to be identified and determining the vegetation phenological period identifier from the numerical interval in which the NDVI value lies, to obtain the current environmental parameters.
Description
Automatic water and soil loss monitoring method and system based on image recognition
Technical Field
The application relates to the technical field of image processing, in particular to an automatic water and soil loss monitoring method and system based on image recognition.
Background
In image-recognition-based monitoring, camera devices are installed in a monitoring area, or monitoring images are obtained via satellite remote sensing, unmanned aerial vehicles and the like, and image recognition algorithms are applied to automatically recognize and quantitatively analyze water and soil loss characteristics such as surface features, vegetation coverage, soil exposure, erosion gullies and sediment deposition. The implementation of the prior art generally comprises: acquiring and preprocessing multi-temporal image data of the monitoring area; extracting surface elements such as soil, vegetation and water bodies with image segmentation techniques; recognizing key water and soil loss features with feature extraction algorithms; building a classification or deep learning model with machine learning or deep learning algorithms to automatically judge the water and soil loss level; comparing quantitative parameters such as soil erosion area and sediment loss across the multi-temporal images; and finally displaying the recognition results visually and generating a monitoring report.
However, prior-art automatic water and soil loss monitoring methods based on image recognition still have various defects. They are strongly affected by environmental factors such as illumination conditions, weather changes and vegetation occlusion, and differences in the quality of images acquired in different seasons and at different times of day cause noticeably unstable recognition accuracy. The limited information of a single image source makes it difficult to accurately reflect hidden forms of water and soil loss such as subsurface erosion and rill erosion. Models trained for a specific landform or erosion type generalize poorly, so conventional algorithms require re-labeling samples and re-training when applied to regions with different geological conditions and climates. They also lack a multi-source data fusion mechanism that would fully integrate auxiliary information such as rainfall monitoring, soil attributes and landform data to improve discrimination accuracy.
Disclosure of Invention
The application provides an automatic water and soil loss monitoring method and system based on image recognition, which establish a quantitative association between environmental parameters and feature reliability, statistically analyze the accuracy of each feature type under different environmental conditions using historical monitoring data, construct an adaptive weight distribution network that dynamically adjusts feature fusion weights according to the current environmental conditions and feature reliability scores, and continuously update the feature reliability evaluation system through a feedback optimization closed loop. This solves the prior-art problems that feature fusion weights are fixed and cannot adapt to environmental changes, that recognition accuracy is unstable under complex environmental conditions, and that system performance cannot self-learn and self-optimize. The application improves the recognition stability and accuracy of the water and soil loss monitoring system under different illumination conditions, cloud cover and vegetation phenological periods, and realizes environmental self-adaptation and continuous performance optimization during long-term operation of the monitoring system.
In a first aspect, the application provides an automatic water and soil loss monitoring method based on image recognition, which comprises the following steps: Step S1, acquiring multi-temporal, multi-scale images of a monitoring area, extracting topographic features, erosion features and soil features at different time levels and spatial scales, establishing a feature database, and associating the environmental parameters corresponding to each feature; Step S2, extracting the illumination intensity, cloud occlusion rate and vegetation phenological period of an image to be identified as current environmental parameters, retrieving from the feature database historical records whose illumination intensity difference is smaller than a preset illumination threshold, whose cloud occlusion rate difference is smaller than a preset occlusion threshold, and whose vegetation phenological period is the same, performing a weighted-average calculation over the recognition accuracy of each feature type in the historical records and the environmental similarity weights, and calculating the environmental similarity weight by adopti