CN-122024108-A - Wild animal community state monitoring method and system based on unmanned aerial vehicle
Abstract
The invention relates to the technical field of ecological monitoring, in particular to a wild animal community state monitoring method and system based on an unmanned aerial vehicle. The method comprises: first controlling the visible light camera, thermal imaging camera and multispectral camera carried by the unmanned aerial vehicle to perform hardware-synchronized acquisition, obtaining spatio-temporally aligned multimodal data; performing preprocessing and lightweight AI preliminary identification on board the unmanned aerial vehicle; performing edge computing and cross-modal fusion analysis at a ground control station to achieve species identification, quantity statistics, behavior analysis and vegetation index calculation; integrating multiple spatio-temporal data for deep modeling in the cloud to achieve fine calculation of distribution density, semantic understanding of behavior patterns and comprehensive evaluation of habitat suitability; and finally generating a comprehensive monitoring report and ecological early warnings. The method solves the problems of single data dimension, poor analysis timeliness and lack of habitat association evaluation in traditional monitoring methods, and realizes efficient, all-weather, automated comprehensive monitoring and evaluation of the wild animal community state.
Inventors
- XU XIAOHONG
- LIU LEI
- ZHANG PENGWEI
- HE LIQIANG
Assignees
- Yangxian Wetland Protection and Management Center (洋县湿地保护管理中心)
Dates
- Publication Date: 20260512
- Application Date: 20260211
Claims (10)
- 1. A wild animal community state monitoring method based on an unmanned aerial vehicle, characterized by comprising the following steps: controlling an unmanned aerial vehicle carrying a visible light camera, a thermal imaging camera and a multispectral camera to fly along a planned flight path, and synchronously collecting spatio-temporally aligned visible light, thermal imaging and multispectral image data; preprocessing the visible light and thermal imaging images by using an AI model, detecting preliminary animal targets, extracting suspected animal region image blocks, and packaging and transmitting the suspected animal region image blocks together with the corresponding multispectral image data; performing cross-modal interactive fusion analysis on the visible light, thermal imaging and multispectral data in the received data packets by using an edge AI model, and outputting preliminary analysis results of animal species, quantities and behavior patterns together with a vegetation index; integrating multiple spatio-temporal data, performing animal quantity statistics, distribution density calculation and deep recognition of animal behavior patterns by using deep learning models, and comprehensively evaluating habitat suitability by combining the vegetation index with the animal distribution to generate a habitat quality map; and, based on the habitat quality map, automatically generating a comprehensive monitoring report and ecological early warning information comprising population distribution, behavior pattern statistics and habitat assessment.
- 2. The unmanned aerial vehicle-based wild animal community state monitoring method of claim 1, wherein the method further comprises: dynamically demarcating a target monitoring area based on multi-source geographic information data and habitat model prediction, and carrying out adaptive grid waypoint layout and terrain-following flight path planning within the area.
- 3. The unmanned aerial vehicle-based wild animal community state monitoring method of claim 2, wherein dynamically demarcating the target monitoring area based on multi-source geographic information data and habitat model predictions comprises: integrating historical animal distribution data, environmental factor remote sensing data and terrain data, generating a habitat suitability probability distribution map of the target species by using an ecological niche model, defining the monitoring area boundary by combining seasonal variation with a migration prediction model, and performing gridding.
- 4. The method for monitoring the state of a wild animal community based on an unmanned aerial vehicle according to claim 1, wherein the AI model is of a dual-stream parallel architecture comprising a visible light image processing stream and a thermal imaging image processing stream connected to a feature fusion and decision module, and the extraction of the suspected animal region image block specifically comprises: performing confidence fusion and spatio-temporal matching on the dual-stream detection results, generating an optimal bounding box, performing adaptive context expansion, and then cropping to obtain a multimodal image block.
- 5. The method for monitoring the state of a wild animal community based on an unmanned aerial vehicle according to claim 1, wherein the edge AI model is a multi-branch network comprising a visible light branch, a thermal imaging branch, a multispectral branch and a metadata encoder, and performs feature fusion through a cross-modal interaction fusion module, wherein the cross-modal interaction fusion analysis specifically comprises intra-modal feature enhancement, feature map spatial alignment based on deformable convolution, and semantic fusion through a layered cross-modal attention mechanism.
- 6. The method for monitoring the state of a wild animal community based on an unmanned aerial vehicle according to claim 5, wherein the cross-modal interaction fusion analysis further comprises dynamically adjusting the fusion weight of each modal feature through a gating fusion unit, the weights being computed adaptively according to the quality score of each modality's data and the environmental conditions.
- 7. The method for monitoring the state of a wild animal community based on an unmanned aerial vehicle according to claim 1, wherein the animal quantity statistics and distribution density calculation comprise: based on the preliminary results uploaded from the edge computing stage, reconstructing a high-resolution animal distribution density map using a super-resolution network, estimating density by an adaptive-bandwidth kernel density estimation method, and performing de-duplicated counting by combining cross-frame tracking with individual characteristics.
- 8. The unmanned aerial vehicle-based wild animal community state monitoring method of claim 1, wherein the deep recognition of animal behavior patterns specifically comprises: reconstructing individual spatio-temporal motion trajectories, extracting kinematic, space-utilization and social characteristics, constructing a dynamic spatio-temporal graph, performing inference with a hierarchical spatio-temporal graph convolutional network, and carrying out layer-by-layer recognition and semantic understanding from basic action units through mid-level behavior patterns to high-level ecological behavior patterns.
- 9. The unmanned aerial vehicle-based wild animal community state monitoring method according to claim 1, wherein the comprehensive evaluation of habitat suitability specifically comprises: extracting multidimensional suitability factors of food resources, concealment shelter, water sources, disturbance, topography and climate from an ecological spatio-temporal data cube, constructing a resource selection function model for a specific species or functional group, quantifying the weight of each factor and calculating habitat suitability indexes, synthesizing the suitability indexes of multiple species to generate a regional habitat quality map, and performing limiting factor analysis.
- 10. A wild animal community state monitoring system based on an unmanned aerial vehicle, applied to the wild animal community state monitoring method based on the unmanned aerial vehicle as claimed in claim 1, characterized by comprising: an unmanned aerial vehicle aerial platform, used for controlling an unmanned aerial vehicle carrying a visible light camera, a thermal imaging camera and a multispectral camera to fly along a planned flight path, synchronously collecting spatio-temporally aligned visible light, thermal imaging and multispectral image data, preprocessing the visible light and thermal imaging images by using an AI model, detecting preliminary animal targets, extracting suspected animal region image blocks, and packaging and transmitting the suspected animal region image blocks together with the corresponding multispectral image data; a ground control station, used for performing cross-modal interaction fusion analysis on the visible light, thermal imaging and multispectral data in the received data packets by using an edge AI model, and outputting preliminary analysis results of animal species, quantities and behavior patterns together with a vegetation index; and a cloud data center platform, used for integrating multiple spatio-temporal data, performing animal quantity statistics, distribution density calculation and deep recognition of animal behavior patterns by using deep learning models, comprehensively evaluating habitat suitability by combining the vegetation index with the animal distribution to generate a habitat quality map, and, based on the habitat quality map, automatically generating a comprehensive monitoring report and ecological early warning information containing population distribution, behavior pattern statistics and habitat assessment.
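The adaptive-bandwidth kernel density estimation recited in claim 7 can be illustrated with a minimal sketch. The specific bandwidth rule below (each detection's bandwidth set to its nearest-neighbour distance, floored at a base value, so kernels widen where detections are sparse) is an illustrative assumption, not the scheme fixed by the claim:

```python
import numpy as np

def adaptive_kde_density(points, grid_x, grid_y, k=1, base_bw=10.0):
    """Animal distribution density via Gaussian kernel density estimation
    with a per-detection adaptive bandwidth: each point's bandwidth is its
    distance to the k-th nearest neighbour, floored at base_bw."""
    points = np.asarray(points, dtype=float)            # (N, 2) positions in metres
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    d.sort(axis=1)                                      # row i: sorted distances, self (0) first
    bw = np.maximum(d[:, min(k, len(points) - 1)], base_bw)

    gx, gy = np.meshgrid(grid_x, grid_y)                # evaluation grid
    density = np.zeros_like(gx, dtype=float)
    for (px, py), h in zip(points, bw):
        r2 = (gx - px) ** 2 + (gy - py) ** 2
        density += np.exp(-r2 / (2.0 * h ** 2)) / (2.0 * np.pi * h ** 2)
    return density                                      # integrates to ~N over the plane

# Example: five detections forming two clusters and one lone individual
pts = [(10, 10), (12, 14), (60, 70), (62, 68), (90, 20)]
dens = adaptive_kde_density(pts, np.arange(0, 100, 5), np.arange(0, 100, 5))
```

The resulting density surface peaks at the detection clusters, while the lone individual, whose nearest neighbour is far away, is smoothed over a wider kernel.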
Description
Wild animal community state monitoring method and system based on unmanned aerial vehicle
Technical Field
The invention relates to the technical field of ecological monitoring, in particular to a wild animal community state monitoring method and system based on an unmanned aerial vehicle.
Background
Traditional wild animal community state monitoring mainly relies on manual field investigation, fixed-position camera monitoring or satellite remote sensing. Manual investigation is inefficient, limited in coverage, easily constrained by terrain and climatic conditions, and disturbs the animals; fixed cameras have a fixed monitoring range and cannot flexibly track animal migration; satellite remote sensing offers wide coverage but limited resolution, making individual animal identification and behavior analysis difficult. Existing unmanned aerial vehicle monitoring typically adopts a single sensor (such as a visible light camera), so the data dimension is single, comprehensive all-weather, multidimensional and high-precision analysis of the animal community state is difficult to achieve, and the capability for collaborative evaluation of species, behaviors and habitat vegetation conditions is lacking.
Disclosure of Invention
The invention aims to provide a wild animal community state monitoring method and system based on an unmanned aerial vehicle, which realize all-weather, multidimensional and high-precision comprehensive analysis of the animal community state and improve the collaborative assessment of species, behaviors and habitat vegetation conditions.
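The method computes a vegetation index from the multispectral imagery to assess habitat vegetation conditions, but the patent does not name a specific index. A minimal sketch using the standard NDVI (normalized difference of near-infrared and red reflectance), chosen here as an illustrative assumption, could look like:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index from multispectral bands.
    nir, red: reflectance values or arrays in [0, 1]; the result lies in
    [-1, 1], with dense healthy vegetation typically above about 0.3."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)   # eps guards against division by zero

# Dense canopy reflects strongly in NIR and absorbs red light:
high = float(ndvi(0.50, 0.08))   # vegetated pixel, NDVI about 0.72
low = float(ndvi(0.30, 0.30))    # bare soil / water, NDVI about 0.0
```

Applied per pixel to the multispectral frames, this yields the vegetation layer that is later combined with animal distribution for habitat suitability evaluation.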
To achieve the above object, in a first aspect, the present invention provides a wild animal community state monitoring method based on an unmanned aerial vehicle, comprising the steps of: controlling an unmanned aerial vehicle carrying a visible light camera, a thermal imaging camera and a multispectral camera to fly along a planned flight path, and synchronously collecting spatio-temporally aligned visible light, thermal imaging and multispectral image data; preprocessing the visible light and thermal imaging images by using an AI model, detecting preliminary animal targets, extracting suspected animal region image blocks, and packaging and transmitting them together with the corresponding multispectral image data; performing cross-modal interactive fusion analysis on the visible light, thermal imaging and multispectral data in the received data packets by using an edge AI model, and outputting preliminary analysis results of animal species, quantities and behavior patterns together with a vegetation index; integrating multiple spatio-temporal data, performing animal quantity statistics, distribution density calculation and deep recognition of animal behavior patterns by using deep learning models, and comprehensively evaluating habitat suitability by combining the vegetation index with the animal distribution to generate a habitat quality map; and, based on the habitat quality map, automatically generating a comprehensive monitoring report and ecological early warning information comprising population distribution, behavior pattern statistics and habitat assessment. The method further comprises: dynamically demarcating a target monitoring area based on multi-source geographic information data and habitat model prediction, and carrying out adaptive grid waypoint layout and terrain-following flight path planning within the area.
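The grid waypoint layout with terrain-following flight described above can be sketched as follows; the serpentine (lawnmower) traversal and the `dem` callable are illustrative assumptions, since the patent does not fix a concrete layout algorithm:

```python
import numpy as np

def terrain_following_waypoints(x0, y0, x1, y1, spacing, dem, agl=60.0):
    """Generate a serpentine (lawnmower) waypoint grid over the monitoring
    area [x0, x1] x [y0, y1], setting each waypoint's altitude to a fixed
    height above ground level (AGL) so the aircraft follows the terrain.

    `dem(x, y)` is any callable returning ground elevation in metres;
    a real system would sample a raster digital elevation model."""
    xs = np.arange(x0, x1 + 1e-9, spacing)
    ys = np.arange(y0, y1 + 1e-9, spacing)
    waypoints = []
    for i, y in enumerate(ys):
        row = xs if i % 2 == 0 else xs[::-1]     # reverse every other row
        for x in row:
            waypoints.append((float(x), float(y), float(dem(x, y) + agl)))
    return waypoints

# Example: flat 10 m ground plane, 100 m x 60 m area, 20 m grid spacing
wps = terrain_following_waypoints(0, 0, 100, 60, 20, dem=lambda x, y: 10.0)
```

In practice the grid spacing would itself be adapted, for example tightened where the habitat suitability map predicts higher animal density, which is how the "adaptive" aspect of the layout could enter.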
Dynamically demarcating the target monitoring area based on multi-source geographic information data and habitat model predictions comprises: integrating historical animal distribution data, environmental factor remote sensing data and terrain data, generating a habitat suitability probability distribution map of the target species by using an ecological niche model, defining the monitoring area boundary by combining seasonal variation with a migration prediction model, and performing gridding. The AI model is of a dual-stream parallel architecture comprising a visible light image processing stream and a thermal imaging image processing stream connected to a feature fusion and decision module, and the extraction of the suspected animal region image block specifically comprises: performing confidence fusion and spatio-temporal matching on the dual-stream detection results, generating an optimal bounding box, performing adaptive context expansion, and then cropping to obtain a multimodal image block. The edge AI model is a multi-branch network comprising a visible light branch, a thermal imaging branch, a multispectral branch and a metadata encoder, and performs feature fusion through a cross-modal interaction fusion module, wherein the cross-modal interaction fusion analysis specifically comprises intra-modal feature enhancement, feature map spatial alignment based on deformable convolution, and semantic fusion through a layered cross-modal attention mechanism.
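The gating fusion unit of claim 6, which adapts fusion weights to per-modality quality scores so that degraded modalities (for example, visible light at night) contribute less, can be sketched as a softmax gate over the quality scores. The softmax form, the temperature parameter and the modality names below are illustrative assumptions, not the gate fixed by the patent:

```python
import numpy as np

def gated_fusion(features, quality_scores, temperature=1.0):
    """Fuse per-modality feature vectors with weights computed from scalar
    modality quality scores via a softmax gate.

    features:        dict name -> 1-D feature vector (all same length)
    quality_scores:  dict name -> scalar quality estimate"""
    names = list(features)
    q = np.array([quality_scores[n] for n in names], dtype=float)
    w = np.exp((q - q.max()) / temperature)  # subtract max for numerical stability
    w /= w.sum()                             # softmax weights, sum to 1
    fused = sum(w_i * np.asarray(features[n], dtype=float)
                for w_i, n in zip(w, names))
    return fused, dict(zip(names, w))

# Night flight: thermal imaging is the most reliable modality
feats = {"visible": np.ones(4), "thermal": 2 * np.ones(4), "multispec": 3 * np.ones(4)}
quality = {"visible": 0.1, "thermal": 0.9, "multispec": 0.5}
fused, weights = gated_fusion(feats, quality)
```

Here the thermal branch receives the largest weight, and the fused feature vector is a convex combination of the modality features, which matches the claim's idea of dynamically down-weighting low-quality modalities.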