CN-115497021-B - Fine granularity identification method for sow lactation based on computer vision
Abstract
The invention discloses a fine-grained identification method for sow lactation behavior based on computer vision, which comprises the steps of: collecting top-view (overhead) videos of sows and piglets during the lactation period; establishing a fine-grained classification data set of sow lactation behavior; training a behavior identification model for fine-grained classification of sow lactation; inputting long-duration monitoring video into the behavior identification model to generate the corresponding sequence of fine-grained behavior category labels; modeling the fine-grained classification problem of sow lactation with a hidden Markov model and correcting errors in the category label sequence with the Viterbi algorithm; and outputting classification results for the four categories of piglet breast-feeding, lactation in progress, lactation interruption, and non-lactation. The invention largely avoids occlusion interference from the rails of the sow farrowing crate, has strong robustness, consumes few computing resources, runs quickly, and is suitable for video monitoring systems in real farming environments.
Inventors
- LI BO
- XU WEIJIE
- CHEN TIANMING
Assignees
- Nanjing Agricultural University (南京农业大学)
Dates
- Publication Date
- 20260512
- Application Date
- 20220915
- Priority Date
- 20220909
Claims (8)
- 1. A fine-grained identification method for sow lactation behavior based on computer vision, characterized by comprising the following steps: (1) collecting top-view videos of sows and piglets during the lactation period; (2) dividing sow-lactation-related behavior into four states, namely piglet breast-feeding, lactation in progress, lactation interruption, and non-lactation, according to behavioral characteristics, and establishing a fine-grained classification data set of sow lactation behavior; (3) building and training a dual-stream behavior recognition model for fine-grained classification of sow lactation behavior; (4) inputting video into the trained model to generate a category label sequence for fine-grained classification of sow lactation behavior; (5) preprocessing the behavior category label sequence; and (6) setting hidden Markov model parameters, taking the preprocessed category label sequence as the observation sequence of the hidden Markov model, performing error correction with the Viterbi algorithm, and outputting the final fine-grained classification result of sow lactation behavior.
- 2. The method for fine-grained identification of sow lactation behavior based on computer vision according to claim 1, wherein the data set in step (2) comprises a training data set for the behavior recognition model and a data set of sow lactation video clips for verifying the feasibility of the method.
- 3. The method for fine-grained identification of sow lactation behavior based on computer vision according to claim 1, wherein step (3) specifically comprises: (3.1) constructing a dual-stream neural network behavior recognition model that performs feature extraction on two video inputs with different frame rates; (3.2) fusing the extracted features; and (3.3) inputting the data set into the constructed behavior recognition model and training the model.
- 4. The method for fine-grained identification of sow lactation behavior based on computer vision according to claim 1, wherein step (4) specifically comprises: (4.1) segmenting the long video into short clips, inputting them one by one into the trained behavior recognition model, and storing the behavior category label of each short clip; and (4.2) when the whole long video has been segmented to the end, obtaining the behavior category label sequence of the entire video.
- 5. The method for fine-grained identification of sow lactation behavior based on computer vision according to claim 1, wherein the preprocessing in step (5) comprises filtering and binarizing the category label sequence.
- 6. The method for fine-grained identification of sow lactation behavior based on computer vision according to claim 1, wherein step (6) specifically comprises: (6.1) setting the hidden Markov model parameters according to the logical relations among the four fine-grained categories of sow lactation behavior, and modeling the fine-grained classification problem with the hidden Markov model; and (6.2) inputting the observation sequence composed of the observed values into the Viterbi algorithm, performing error correction with the Viterbi algorithm, and outputting the final fine-grained behavior classification label result.
- 7. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the computer-vision-based method for fine-grained identification of sow lactation behavior as claimed in any one of claims 1-6.
- 8. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the computer-vision-based method for fine-grained identification of sow lactation behavior as claimed in any one of claims 1 to 6.
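The two-frame-rate feature extraction and fusion of claim 3 can be sketched as follows. The stride values, tensor shapes, and concatenation-based fusion are illustrative assumptions; the claim states only that features are extracted from two video inputs with different frame rates and then fused.

```python
import numpy as np

def sample_two_pathways(frames, slow_stride=8, fast_stride=2):
    """Split one clip into a low-frame-rate and a high-frame-rate view.

    frames: array of shape (T, H, W, C). The stride defaults are
    illustrative, not values taken from the patent.
    """
    slow = frames[::slow_stride]  # sparse sampling: appearance cues
    fast = frames[::fast_stride]  # dense sampling: motion cues
    return slow, fast

def fuse_features(slow_feat, fast_feat):
    """Fuse the two pathways' feature vectors by concatenation
    (one simple choice; the claim only says the features are fused)."""
    return np.concatenate([slow_feat, fast_feat], axis=-1)

clip = np.zeros((64, 224, 224, 3), dtype=np.float32)
slow, fast = sample_two_pathways(clip)
print(slow.shape[0], fast.shape[0])  # 8 frames vs. 32 frames
```

In practice each pathway would feed a convolutional backbone before fusion; the sampling and fusion steps above are the parts the claim actually fixes.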
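The long-video processing of claim 4 amounts to sliding a fixed-length window over the recording and storing one label per short clip. A minimal sketch, in which the clip length and the classifier are placeholders not fixed by the patent:

```python
def label_long_video(num_frames, clip_len, classify):
    """Slide a non-overlapping window of clip_len frames over a long
    video and collect one behavior label per short clip."""
    labels = []
    for start in range(0, num_frames - clip_len + 1, clip_len):
        labels.append(classify(start, start + clip_len))
    return labels

# Toy stand-in for the trained recognizer: the label depends only on
# where the clip lies in the video.
seq = label_long_video(10_000, 250, lambda s, e: 1 if s >= 5_000 else 0)
print(len(seq), seq[0], seq[-1])  # 40 clips; first label 0, last label 1
```

The resulting per-clip sequence is what steps (5) and (6) of claim 1 go on to filter and decode.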
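The preprocessing of claim 5 can be illustrated with a median filter that suppresses isolated misclassifications, followed by a binarization that maps the four fine-grained classes onto a lactating / non-lactating signal. The window width and the choice of which class ids count as "lactating" are hypothetical; the claim names only "filtering" and "binarizing".

```python
def median_filter(labels, k=5):
    """Odd-width median filter over an integer label sequence;
    the ends are padded by reflection."""
    pad = k // 2
    padded = labels[:pad][::-1] + labels + labels[-pad:][::-1]
    return [sorted(padded[i:i + k])[pad] for i in range(len(labels))]

def binarize(labels, positive=(1, 2)):
    """Collapse fine-grained classes to a binary lactation signal.
    Treating class ids 1 and 2 as 'lactating' is an assumption."""
    return [1 if x in positive else 0 for x in labels]

raw = [0, 0, 3, 0, 1, 1, 0, 1, 2, 2]   # noisy per-clip labels
smoothed = median_filter(raw)
print(smoothed)           # isolated outliers are removed
print(binarize(smoothed))
```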
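The error correction of claim 6 is standard Viterbi decoding over a hidden Markov model whose observation sequence is the preprocessed label sequence. The two-state setup, "sticky" transition matrix, and observation-noise values below are illustrative assumptions; the patent derives its parameters from the logical relations among the four lactation categories.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for an observed label sequence.
    pi: initial probs (N,), A: transitions (N, N), B: emissions (N, M)."""
    T, N = len(obs), len(pi)
    delta = np.log(pi) + np.log(B[:, obs[0]])
    psi = np.zeros((T, N), dtype=int)        # backpointers
    for t in range(1, T):
        scores = delta[:, None] + np.log(A)  # scores[from_state, to_state]
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):            # follow backpointers
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Two hidden states (0 = not lactating, 1 = lactating), noisy labels.
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.1, 0.9]])   # states tend to persist
B = np.array([[0.8, 0.2], [0.2, 0.8]])   # recognizer correct ~80% of the time
obs = [0, 0, 1, 0, 1, 1, 1, 0, 1, 1]
print(viterbi(obs, pi, A, B))  # isolated flips are smoothed away
```

Because the transition matrix penalizes state changes, short spurious flips in the observation sequence are decoded as continuations of the surrounding state, which is exactly the error correction the claim describes.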
Description
Fine-grained identification method for sow lactation behavior based on computer vision

Technical Field
The invention relates to image processing, computer vision, and interactive behavior recognition, and in particular to a computer-vision-based method for fine-grained recognition of sow lactation behavior.

Background
According to statistics, piglet mortality before weaning is as high as 10-15% under large-scale commercial farming management, and the average number of healthy piglets per sow is an important factor affecting the income of a pig farm. Enterprises therefore intervene manually in sow farrowing and lactation so as to fully exploit the sows' reproductive and lactation potential and strive for larger litter sizes. For better manual intervention, statistics on and analysis of sow lactation are important. However, recording information such as a sow's lactation state and its duration by manual monitoring requires a large investment of labor, and manual recording can introduce subjective bias into the data. Analysis of sow lactation behavior by automatic identification techniques has therefore become an important subject in the field of pig farming. Current research, however, focuses mainly on recognizing whether a sow is suckling, and ignores information about how a suckling bout starts and ends. In actual production, the manner and duration of the starting and ending phases of sow lactation largely reflect the sow's lactation habits and maternal quality. Fine classification of the sow lactation process and collection of the associated information are therefore of great significance for the behavioral analysis of different sows and for the selection and culling of sows in pig farming.
In the field of animal behavior recognition, much research has been devoted to behavior classification based on wearable sensors. However, sensors worn by live pigs are prone to friction damage, falling off, and similar problems. With the development of technology, non-contact computer vision techniques have begun to be used for behavior and posture recognition of sows. A typical approach extracts specific parameters from images and formulates judgment criteria from those parameters: Zhu Weixing et al. obtained pig contours from images by Otsu threshold segmentation, judged pig behavior by computing contour similarity, and on that basis filed the patent "Contour-based pig drinking behavior recognition method" (publication No. CN 107437069A). Although this method recognizes simple behavioral postures, it does not consider the temporal motion characteristics of pigs, and its ability to recognize complex behaviors is limited. In addition, in the field of sow behavior recognition, Xue Yueju, Aqing Yang, et al. combined deep neural networks with optical flow for judging sow lactation behavior, published their results in the international journals Biosystems Engineering and Computers and Electronics in Agriculture, and filed the patent "Method for recognizing sow lactation behavior by computer vision" (publication No. CN 109492535A), which realizes lactation behavior recognition using optical flow, a convolutional neural network, and a support vector machine. However, such methods only recognize whether lactation occurs and cannot extract detailed information about the lactation process. Providing a method for detailed, i.e. fine-grained, classification and identification of sow lactation is therefore a problem to be solved by those skilled in the art.

Disclosure of Invention
The invention aims to provide a computer-vision-based fine-grained recognition method for sow lactation behavior, in which a dual-stream neural network model preliminarily detects the fine-grained behavior categories of sow lactation, a hidden Markov model models the fine-grained classification problem, and the Viterbi algorithm corrects the preliminary results to obtain the final state sequence, thereby realizing fine-grained recognition of lactation. The method comprises the following steps: (1) collecting top-view videos of sows and piglets during the lactation period; (2) establishing a fine-grained classification data set of sow lactation behavior; (3) constructing and training a dual-stream behavior recognition model for fine-grained classification of sow lactation behavior; (4) inputting the monitori