CN-121999525-A - Large yellow croaker behavior recognition method and device, electronic equipment and storage medium
Abstract
The application relates to the technical field of behavior recognition of large yellow croakers, and discloses a method, a device, an electronic device and a storage medium for recognizing the behavior of large yellow croakers. The method works as follows: by analyzing historical data over a plurality of time windows, the model learns the dynamic mapping between signal features and behavior types in different periods, effectively overcoming the feature-drift problem caused by growth stages and environmental changes; through attribution analysis, the model identifies the most discriminative feature dimensions in each period, improving its adaptability and interpretability at different time nodes; finally, a graph neural network is constructed from the original fusion features of multiple time nodes, so that the model can both recognize individual behaviors and capture the temporal evolution of group interaction patterns, and thereby recognize the behavior type of a specified large yellow croaker. The beneficial effects are that the accuracy and robustness of behavior recognition in a complex aquaculture environment are remarkably improved, and accurate recognition of the time-varying behavior of the large yellow croaker is achieved.
Inventors
- FU CHENGYU
- ZHU HANHAO
- TANG YUNFENG
- ZHOU ZIHAO
- WANG ZHUO
- SU ZHENG
- HOU XIAOBO
- ZHANG ZHIYAO
- ZHOU HUIJUN
- LI JUNQI
Assignees
- Zhejiang Ocean University (浙江海洋大学)
Dates
- Publication Date
- 2026-05-08
- Application Date
- 2025-12-25
Claims (7)
- 1. A method for recognizing behavior of large yellow croaker, the method comprising: collecting multiple groups of original data of a target water area in a current time window, wherein each group of original data comprises an original acoustic signal and an original visual signal; performing timestamp synchronization on the original acoustic signal and the original visual signal; extracting first acoustic features from the synchronized acoustic signal; identifying key-point pixel coordinates of a plurality of large yellow croakers in the visual signal through a key-point detection model, calculating centroid coordinates representing the spatial position of each large yellow croaker based on the key-point pixel coordinates, generating motion trajectory data of each large yellow croaker based on the consecutive centroid coordinate sequence, and constructing the centroid coordinates, the motion trajectory data and related kinematic parameters into first visual features; detecting the clarity of the original visual signal through a preset clarity detection model; assigning a first visual weight to the first visual features and a first acoustic weight to the first acoustic features based on the clarity; performing weighted fusion of the first visual features and the first acoustic features according to the first visual weight and the first acoustic weight to obtain original fusion features; performing attribution analysis on each original fusion feature to obtain a behavior type corresponding to each original fusion feature; taking the original fusion features of each large yellow croaker and the corresponding behavior type under each time window as a node; calculating, based on the centroid coordinates in the first visual features, Euclidean distances between different large yellow croaker nodes in the same time window; establishing a connection edge for each node pair whose distance is smaller than a preset threshold value;
assigning weights to the connection edges according to the Euclidean distance, wherein a smaller distance yields a higher weight; constructing graph structure data from the weighted nodes and connection edges so as to train a graph neural network behavior recognition model; collecting a current acoustic signal and a current visual signal of a specified large yellow croaker in the target water area, and performing feature extraction and fusion to obtain a current fusion feature; and inputting the current fusion feature into the pre-trained behavior recognition model to recognize the behavior type of the specified large yellow croaker.
- 2. The method for recognizing behavior of large yellow croaker according to claim 1, wherein the step of collecting the current acoustic signal and the current visual signal of the specified large yellow croaker in the target water area and performing feature extraction and fusion to obtain the current fusion feature comprises: extracting features from the current acoustic signal and the current visual signal to obtain current acoustic features and current visual features, wherein the extraction flow is the same as that used for the original data; acquiring the real-time environmental clarity of the current visual signal, and assigning a real-time visual weight to the current visual features and a real-time acoustic weight to the current acoustic features according to the real-time environmental clarity; and performing weighted fusion of the current visual features and the current acoustic features according to the real-time visual weight and the real-time acoustic weight to obtain the current fusion feature.
- 3. The method for recognizing behavior of large yellow croaker according to claim 1, wherein the step of inputting the current fusion feature into a pre-trained behavior recognition model to recognize the behavior type of the specified large yellow croaker further comprises: comparing the recognized behavior type with an actual observation record; if the comparison is inconsistent, marking the current fusion feature and the corresponding behavior type as a sample to be verified; and adding the sample to be verified to an incremental learning data set, and performing parameter fine-tuning of the behavior recognition model based on the updated incremental learning data set so as to optimize model performance.
- 4. The method for recognizing behavior of large yellow croaker according to claim 1, wherein the step of performing attribution analysis on each original fusion feature to obtain the behavior type corresponding to each original fusion feature comprises: inputting the original fusion features into a pre-trained behavior classifier to obtain a preliminary behavior type probability distribution; calculating, through a preset attribution algorithm, the contribution degree of each feature dimension of the original fusion features to the preliminary behavior type probability distribution; and screening key feature dimensions based on the contribution degree, and determining the final behavior type according to the matching result of the key feature dimensions against preset behavior judgment rules.
- 5. A behavior recognition apparatus for large yellow croaker, the apparatus comprising: a collection module for collecting multiple groups of original data of a target water area in a current time window, wherein each group of original data comprises an original acoustic signal and an original visual signal; an extraction module for extracting and fusing features of the original acoustic signal and the original visual signal in each group of original data to obtain original fusion features; an attribution module for performing attribution analysis on each original fusion feature to obtain a behavior type corresponding to each original fusion feature; a construction module for constructing a graph neural network behavior recognition model based on the original fusion features and the corresponding behavior types; a fusion module for collecting a current acoustic signal and a current visual signal of a specified large yellow croaker in the target water area, and performing feature extraction and fusion to obtain a current fusion feature; and a recognition module for inputting the current fusion feature into the pre-trained behavior recognition model to recognize the behavior type of the specified large yellow croaker; wherein the extraction module comprises: a synchronization sub-module for timestamp-synchronizing the original acoustic signal and the original visual signal; a first feature extraction sub-module for extracting first acoustic features from the synchronized acoustic signal and first visual features from the synchronized visual signal; a clarity detection sub-module for detecting the clarity of the original visual signal through a preset clarity detection model; an assignment sub-module for assigning a first visual weight to the first visual features and a first acoustic weight to
the first acoustic features based on the clarity; and an original fusion feature acquisition sub-module for performing weighted fusion of the first visual features and the first acoustic features according to the first visual weight and the first acoustic weight to obtain the original fusion features; wherein the first feature extraction sub-module comprises: a key-point pixel coordinate identification unit for identifying key-point pixel coordinates of a plurality of large yellow croakers in the visual signal through a key-point detection model; a centroid coordinate calculation unit for calculating centroid coordinates representing the spatial position of each large yellow croaker based on the key-point pixel coordinates; a motion trajectory data generation unit for generating motion trajectory data of each large yellow croaker based on the consecutive centroid coordinate sequence; and a first visual feature construction unit for constructing the centroid coordinates, the motion trajectory data and related kinematic parameters into the first visual features; and wherein the construction module comprises: a node marking sub-module for taking the original fusion features of each large yellow croaker and the corresponding behavior types under each time window as a node; a Euclidean distance calculation sub-module for calculating, based on the centroid coordinates in the first visual features, Euclidean distances between different large yellow croaker nodes in the same time window; a connection edge establishment sub-module for establishing a connection edge for each node pair whose distance is smaller than a preset threshold value; a weight assignment sub-module for assigning weights to the connection edges according to the Euclidean distance, wherein a smaller distance yields a higher weight; and a graph
structure data construction sub-module for constructing graph structure data from the weighted nodes and connection edges so as to train the graph neural network behavior recognition model.
- 6. A computer-readable storage medium on which a computer program is stored which, when executed by a processor, causes the processor to perform the steps of the method for recognizing behavior of large yellow croaker according to any one of claims 1 to 4.
- 7. An electronic device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method for recognizing behavior of large yellow croaker according to any one of claims 1 to 4.
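The graph-construction steps at the heart of claim 1 (fused features as nodes, edges between fish whose centroids are closer than a threshold, edge weights increasing as distance decreases) can be sketched as follows. This is a minimal illustration, not the patented implementation; the inverse-distance weighting and the function and key names are assumptions chosen for concreteness.

```python
import numpy as np

def build_behavior_graph(centroids, fused_features, dist_threshold=0.5):
    """Sketch of the graph data described in claim 1.

    centroids:      (N, 2) array of per-fish centroid coordinates
    fused_features: (N, D) array of fused acoustic+visual features
    Nodes are per-fish fused features; an undirected edge connects any
    pair of fish whose centroid distance is below the threshold, with a
    weight that grows as the distance shrinks (here: inverse distance,
    one possible choice consistent with "smaller distance, higher weight").
    """
    n = len(centroids)
    diffs = centroids[:, None, :] - centroids[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)  # (N, N) pairwise Euclidean distances

    edges, weights = [], []
    for i in range(n):
        for j in range(i + 1, n):
            if dists[i, j] < dist_threshold:
                edges.append((i, j))
                weights.append(1.0 / (dists[i, j] + 1e-6))  # avoid divide-by-zero
    return {"x": fused_features, "edges": edges, "weights": weights}
```

In practice the returned node features, edge list and edge weights would be packed into the graph format of whatever GNN library trains the behavior recognition model.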
Description
Large yellow croaker behavior recognition method and device, electronic device and storage medium. Technical Field: The invention relates to the technical field of behavior recognition of large yellow croakers, in particular to a method, a device, an electronic device and a storage medium for recognizing the behavior of large yellow croakers. Background: As an important economic fish species in China, the refined cultivation and behavior monitoring of the large yellow croaker are important to improving cultivation efficiency. Existing behavior recognition technology faces challenges at three levels. First, it depends on single-sensor data: visual analysis is easily affected by water turbidity and illumination, while acoustic analysis is sensitive to environmental noise. Second, existing multi-modal fusion methods mainly adopt simple feature concatenation and cannot fully account for the dynamic correlation between the signal features and behavior types of the large yellow croaker at different time nodes. Most importantly, the existing technology ignores the influence of the time-varying environment and the growth cycle of the large yellow croaker: under different growth stages, day-night cycles and seasonal changes, the same behavior type may exhibit differentiated acoustic and visual characteristics, and existing static models struggle to adapt to these dynamic changes. Disclosure of Invention: Based on this, it is necessary to provide a method, a device, an electronic device and a storage medium for recognizing the behavior of large yellow croaker, aiming at the above problems.
A method for recognizing behavior of large yellow croaker, the method comprising: collecting multiple groups of original data of a target water area in a current time window, wherein each group of original data comprises an original acoustic signal and an original visual signal; extracting and fusing features of the original acoustic signal and the original visual signal in each group of original data to obtain original fusion features; performing attribution analysis on each original fusion feature to obtain a behavior type corresponding to each original fusion feature; constructing a graph neural network behavior recognition model based on each original fusion feature and the corresponding behavior type; collecting a current acoustic signal and a current visual signal of a specified large yellow croaker in the target water area, and performing feature extraction and fusion to obtain a current fusion feature; and inputting the current fusion feature into the pre-trained behavior recognition model to recognize the behavior type of the specified large yellow croaker.
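The attribution analysis step above (detailed in claim 4: a classifier yields a preliminary probability distribution, a preset attribution algorithm scores each feature dimension's contribution, and key dimensions are screened) can be sketched with a simple occlusion-style attribution. The occlusion scheme, the `top_k` screening and all names here are illustrative assumptions; the patent does not specify which attribution algorithm is used.

```python
import numpy as np

def feature_contributions(classify, fused, baseline=0.0):
    """Occlusion-style attribution: the contribution of each feature
    dimension is the drop in the top-class probability when that
    dimension is replaced with a baseline value.

    classify: callable mapping a (D,) feature vector to a probability vector
    fused:    (D,) original fusion feature
    Returns the preliminary top class and a (D,) contribution vector.
    """
    probs = classify(fused)
    top = int(np.argmax(probs))
    contrib = np.empty_like(fused)
    for d in range(len(fused)):
        occluded = fused.copy()
        occluded[d] = baseline  # knock out one dimension at a time
        contrib[d] = probs[top] - classify(occluded)[top]
    return top, contrib

def key_dimensions(contrib, top_k=5):
    """Screen the most discriminative dimensions by contribution degree."""
    return np.argsort(contrib)[::-1][:top_k]
```

The screened dimensions would then be matched against the preset behavior judgment rules to confirm the final behavior type.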
Further, the step of extracting and fusing the features of the original acoustic signal and the original visual signal in each group of the original data to obtain original fusion features includes: performing timestamp synchronization on the original acoustic signal and the original visual signal; extracting first acoustic features from the synchronized acoustic signal and first visual features from the synchronized visual signal; detecting the clarity of the original visual signal through a preset clarity detection model; assigning a first visual weight to the first visual features and a first acoustic weight to the first acoustic features based on the clarity; and performing weighted fusion of the first visual features and the first acoustic features according to the first visual weight and the first acoustic weight to obtain the original fusion features. Further, in the step of extracting the first acoustic features of the synchronized acoustic signal and the first visual features of the synchronized visual signal, the extraction of the first visual features includes: identifying key-point pixel coordinates of a plurality of large yellow croakers in the visual signal through a key-point detection model; calculating centroid coordinates representing the spatial position of each large yellow croaker based on the key-point pixel coordinates; generating motion trajectory data of each large yellow croaker based on the consecutive centroid coordinate sequence; and constructing the centroid coordinates, the motion trajectory data and related kinematic parameters into the first visual features.
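The visual-feature and fusion steps above can be sketched as follows: a centroid per fish from its detected key points, simple kinematics from the consecutive centroid sequence, and a clarity-weighted fusion. The linear weighting scheme (visual weight equal to clarity, acoustic weight its complement) is an assumption for illustration; the patent only requires that the weights follow the detected clarity.

```python
import numpy as np

def centroid_from_keypoints(keypoints):
    """Centroid of one fish as the mean of its key-point pixel coordinates."""
    return np.asarray(keypoints, dtype=float).mean(axis=0)

def trajectory_kinematics(centroid_seq, dt=1.0):
    """Per-frame velocity and speed from a consecutive centroid sequence."""
    traj = np.asarray(centroid_seq, dtype=float)
    vel = np.diff(traj, axis=0) / dt          # frame-to-frame displacement rate
    speed = np.linalg.norm(vel, axis=1)       # scalar speed per frame
    return vel, speed

def fuse(visual_feat, acoustic_feat, clarity):
    """Clarity-weighted fusion: clearer video -> heavier visual weight."""
    w_v = float(np.clip(clarity, 0.0, 1.0))   # first visual weight
    w_a = 1.0 - w_v                           # first acoustic weight
    return np.concatenate([w_v * np.asarray(visual_feat, dtype=float),
                           w_a * np.asarray(acoustic_feat, dtype=float)])
```

In turbid water the clarity score drops, so the fused feature leans on the acoustic channel, which matches the motivation given in the Background.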
Further, the step of constructing the graph neural network behavior recognition model based on each original fusion feature and the corresponding behavior type includes: taking the original fusion features of each large yellow croaker and the corresponding behavior types under each time window as a node; calculating, based on the centroid coordinates in the first visual features, Euclidean distances between different large yellow croaker nodes in the same time window; establishing a connection edge for a node pair with a distan