
CN-120599516-B - Fish respiration monitoring method and system

CN 120599516 B

Abstract

Embodiments of the invention provide a fish respiration monitoring method and system in the technical field of artificial intelligence. The method comprises: acquiring fish body video data; cropping images from the video data; performing real-time inference on each video frame with a hybrid cascade network model to obtain a motion bounding box for the fish mouth and gill motion in each frame; performing a spatiotemporal feature extraction operation on the motion bounding boxes to obtain time-series motion signals of the fish mouth and gill; determining a fish respiration rate value from the time-series motion signals; and determining a fish stress level from the respiration rate value. Compared with the prior art, the method and system achieve accurate monitoring of fish respiration under water-wave disturbance by combining the hybrid cascade network model with the time-series motion signals.

Inventors

  • Ye Ziran
  • Kong Dedong
  • Tan Xiangfeng
  • Dai Mengdi
  • Ruan Binjie
  • Chen Xuting
  • Kong Feiling
  • Li Bojun

Assignees

  • Zhejiang Academy of Agricultural Sciences (浙江省农业科学院)

Dates

Publication Date
2026-05-05
Application Date
2025-06-03

Claims (8)

  1. A fish respiration monitoring method, comprising: acquiring fish body video data; cropping images from the fish body video data; performing real-time inference on each video frame with a hybrid cascade network model to obtain a motion bounding box for the fish mouth and gill motion in each frame; performing a spatiotemporal feature extraction operation on the motion bounding boxes to obtain time-series motion signals of the fish mouth and gill; determining a fish respiration rate value from the time-series motion signals; and determining a fish stress level from the respiration rate value, comprising: determining a respiration rate coefficient of variation, a peak-period standard deviation and an energy-entropy ratio from the fish respiration rate value; constructing a feature matrix from the coefficient of variation, the peak-period standard deviation and the energy-entropy ratio; weighting the feature matrix with three-level stress labels and, in combination with SHAP-value interpretability analysis, determining the degree of incoordination between fish mouth and gill motion; and determining the fish stress level from the degree of incoordination.
  2. The method of claim 1, wherein acquiring the fish body video data comprises: applying Gaussian filtering to the acquired raw video stream for low-frequency filtering; processing the low-frequency-filtered stream with median filtering; adjusting the processed stream with multispectral channel equalization; and enhancing the adjusted stream with contrast-limited adaptive histogram equalization (CLAHE).
  3. The method of claim 1, further comprising: splitting the fish body video frame by frame to obtain fish body images; labeling the fish head region in each image; applying at least one of random rotation, dynamic scaling, fog simulation and motion blur to the labeled images; combining the processed images into a training data set; constructing an initial model using a ConvNeXt backbone network; determining the loss function of the initial model according to formulas (1) to (3): FL(p) = -α(1-p)^γ·log(p), (1); BCE(p) = -log(p), (2); Loss(p) = FL(p) + BCE(p), (3); where α is a balance factor, p is the prediction probability of the initial model, γ is a focusing factor, and Loss(p) is the loss function; and training the initial model on the data set to obtain the hybrid cascade network model.
  4. The method of claim 1, wherein performing the spatiotemporal feature extraction operation on the motion bounding boxes to obtain time-series motion signals of the fish mouth and gill comprises: extracting spatiotemporal motion features between the motion bounding boxes of every two frames using a multi-scale feature pyramid; computing optical flow field components from the spatiotemporal motion features with an iterative update mechanism; computing a motion intensity characterization from the optical flow field components; generating a time-series signal from the motion intensity characterization at a preset sampling rate; performing phase-offset inference on the time-series signals with a time-alignment compensation model; and synchronizing the time-series signals according to the phase-offset inference result to obtain the time-series motion signals.
  5. The method of claim 4, wherein computing the optical flow field components from the spatiotemporal motion features with an iterative update mechanism comprises: eliminating outlier optical flow components with a median filter.
  6. The method of claim 4, wherein computing the motion intensity characterization from the optical flow field components comprises: calculating the motion intensity characterization using equation (4): M_i = √(u_i² + v_i²), (4); where M_i is the motion intensity characterization of the i-th sampling point, u_i and v_i are respectively different subcomponents of the optical flow field component of the i-th sampling point, and λ is a statistical quantile.
  7. The method of claim 1, wherein determining the fish respiration rate value from the time-series motion signals comprises: preprocessing the time-series motion signals sequentially with a fourth-order Butterworth low-pass filter and a Symlet wavelet function; determining respiratory cycle feature points from the preprocessed signals with a sliding-window dynamic peak detection algorithm; determining a weighted average of adjacent peak interval times from the respiratory cycle feature points; and calculating the fish respiration rate value from the weighted average.
  8. A fish respiration monitoring system comprising a processor configured to perform the method of any one of claims 1 to 7.
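The loss function in claim 3 combines a focal term with plain binary cross-entropy. A minimal NumPy sketch of formulas (1) to (3), assuming scalar prediction probabilities; the default hyperparameter values α = 0.25 and γ = 2 are common choices, not values stated in the patent:

```python
import numpy as np

def focal_loss(p, alpha=0.25, gamma=2.0):
    """FL(p) = -alpha * (1 - p)**gamma * log(p), formula (1)."""
    return -alpha * (1.0 - p) ** gamma * np.log(p)

def bce_loss(p):
    """BCE(p) = -log(p), formula (2)."""
    return -np.log(p)

def combined_loss(p, alpha=0.25, gamma=2.0):
    """Loss(p) = FL(p) + BCE(p), formula (3)."""
    return focal_loss(p, alpha, gamma) + bce_loss(p)

# A confident correct prediction vs. an uncertain one
l_easy = combined_loss(0.9)
l_hard = combined_loss(0.5)
```

For a well-classified sample (p close to 1) the factor (1-p)^γ strongly down-weights the focal term, so the combined loss approaches plain cross-entropy; hard examples keep a larger focal contribution, which is the usual motivation for this kind of loss.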

Description

Fish respiration monitoring method and system

Technical Field

The invention relates to the technical field of artificial intelligence, in particular to a fish respiration monitoring method and system.

Background

In current aquaculture, the assessment of fish health often depends on manual observation or traditional biosensors, which suffer from inefficiency, low accuracy and susceptibility to environmental factors. With the development of computer vision technology, non-contact monitoring using video images has become possible. Optical flow has been used as an effective motion estimation method in a variety of biological behavior analysis scenarios. However, conventional optical flow methods (e.g., Farneback) rely on image gradient changes and tend to lose motion information in low-texture regions or when fish swim rapidly.

Disclosure of Invention

Embodiments of the invention aim to provide a fish respiration monitoring method and system. To this end, an embodiment of the invention provides a fish respiration monitoring method, comprising: acquiring fish body video data; cropping images from the fish body video data; performing real-time inference on each video frame with a hybrid cascade network model to obtain a motion bounding box for the fish mouth and gill motion in each frame; performing a spatiotemporal feature extraction operation on the motion bounding boxes to obtain time-series motion signals of the fish mouth and gill; determining a fish respiration rate value from the time-series motion signals; and determining a fish stress level from the respiration rate value.
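The last two steps of the method turn a periodic mouth/gill motion signal into breaths per minute. A minimal NumPy sketch of that idea, assuming a clean time-series motion signal and using simple local-maxima detection as a stand-in for the patent's sliding-window dynamic peak detection (the function names are illustrative, not from the patent):

```python
import numpy as np

def find_peaks_simple(signal):
    """Indices of strict local maxima (stand-in for sliding-window detection)."""
    s = np.asarray(signal, dtype=float)
    return np.where((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:]))[0] + 1

def respiration_rate_bpm(signal, fps):
    """Estimate respiration rate in breaths/min from the mean peak spacing."""
    peaks = find_peaks_simple(signal)
    if len(peaks) < 2:
        return 0.0
    mean_interval_s = np.mean(np.diff(peaks)) / fps  # seconds per breath cycle
    return 60.0 / mean_interval_s

# Synthetic gill-motion signal: a 1.5 Hz oscillation sampled at 30 fps,
# i.e. 90 breath cycles per minute
fps = 30
t = np.arange(0, 10, 1 / fps)
signal = np.sin(2 * np.pi * 1.5 * t)
rate = respiration_rate_bpm(signal, fps)
```

On real signals the preprocessing named in the patent (low-pass filtering, wavelet denoising) would precede peak detection, and the weighted averaging of peak intervals would replace the plain mean used here.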
Optionally, acquiring the fish body video data includes: applying Gaussian filtering to the acquired raw video stream for low-frequency filtering; processing the low-frequency-filtered stream with median filtering; adjusting the processed stream with multispectral channel equalization; and enhancing the adjusted stream with contrast-limited adaptive histogram equalization (CLAHE). Optionally, the method further comprises: splitting the fish body video frame by frame to obtain fish body images; labeling the fish head region in each image; applying at least one of random rotation, dynamic scaling, fog simulation and motion blur to the labeled images; combining the processed images into a training data set; constructing an initial model using a ConvNeXt backbone network; determining the loss function of the initial model according to formulas (1) to (3): FL(p) = -α(1-p)^γ·log(p), (1); BCE(p) = -log(p), (2); Loss(p) = FL(p) + BCE(p), (3); where α is a balance factor, p is the prediction probability of the initial model, γ is a focusing factor, and Loss(p) is the loss function; and training the initial model on the data set to obtain the hybrid cascade network model.
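The enhancement stage of the optional preprocessing chain can be illustrated in plain NumPy. This sketch implements global histogram equalization as a simplified stand-in for the contrast-limited adaptive variant (CLAHE) named in the text; in practice one would use a real CLAHE implementation such as OpenCV's `cv2.createCLAHE`, which additionally clips the histogram and equalizes per tile (the simplification is ours, not the patent's):

```python
import numpy as np

def equalize_histogram(gray):
    """Global histogram equalization for an 8-bit grayscale frame.

    Simplified stand-in for CLAHE: no histogram clipping, no tiling.
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalize to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)         # intensity mapping table
    return lut[gray]

# Low-contrast synthetic frame: gray values squeezed into [100, 150]
rng = np.random.default_rng(0)
frame = rng.integers(100, 151, size=(64, 64)).astype(np.uint8)
enhanced = equalize_histogram(frame)
```

After equalization the occupied intensity range is stretched toward the full [0, 255] span, which is the effect the patent relies on to make small mouth and gill movements visible under poor underwater contrast.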
Optionally, performing the spatiotemporal feature extraction operation on the motion bounding boxes to obtain time-series motion signals of the fish mouth and gill includes: extracting spatiotemporal motion features between the motion bounding boxes of every two frames using a multi-scale feature pyramid; computing optical flow field components from the spatiotemporal motion features with an iterative update mechanism; computing a motion intensity characterization from the optical flow field components; generating a time-series signal from the motion intensity characterization at a preset sampling rate; performing phase-offset inference on the time-series signals with a time-alignment compensation model; and synchronizing the time-series signals according to the phase-offset inference result to obtain the time-series motion signals. Optionally, computing the optical flow field components from the spatiotemporal motion features with an iterative update mechanism includes: eliminating outlier optical flow components with a median filter. Optionally, computing a motion intensity characterization from the optical flow field components comprises: calculating the motion intensity characterization using equation (4), where M_i is the motion intensity characterization of the i-th sampling point, u_i and v_i are respectively different subcomponents of the optical flow field component of the i-th sampling point, and λ is a statistical quantile. Optionally, determining the fish respiration rate value from the time-series motion signals comprises: preprocessing the time-series motion signals sequentially with a fourth-order Butterworth low-pass filter and a Symlet wavelet function; determining respiratory cycle feature points from the preprocessed signals with a sliding-window dynamic peak detection algorithm; determining a weighted average of adjacent peak interval times from the respiratory cycle feature points; and calculating the fish respiration rate value from the weighted average.
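The motion-intensity step can be sketched as follows. Equation (4) is not fully legible in this text, so this sketch assumes a common formulation: the per-point flow magnitude √(u_i² + v_i²), clipped at the λ statistical quantile of the magnitudes to suppress outliers. The clipping role of λ is our assumption, consistent with the median-filter outlier removal described above but not confirmed by the patent:

```python
import numpy as np

def motion_intensity(u, v, lam=0.95):
    """Per-point flow magnitude, clipped at the lam-quantile.

    u, v : arrays of optical-flow subcomponents per sampling point.
    lam  : statistical quantile (assumed role: outlier clipping).
    """
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    mag = np.sqrt(u ** 2 + v ** 2)          # Euclidean flow magnitude
    return np.minimum(mag, np.quantile(mag, lam))

# Three small gill motions and one outlier flow vector
u = np.array([0.1, 0.2, 0.1, 5.0])
v = np.array([0.1, 0.1, 0.2, 5.0])
m = motion_intensity(u, v, lam=0.75)
```

The outlier point is clipped toward the bulk of the distribution while the small, genuine motions pass through unchanged, so the resulting time-series signal is less sensitive to spurious flow vectors caused by water-wave disturbance.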