
CN-121708556-B - Monitoring system and method for flight guarantee node

CN121708556B

Abstract

The invention discloses a monitoring system and method for flight guarantee nodes. The monitoring system comprises: an acquisition module for acquiring a flight guarantee node strategy and the video sequences of all monitoring points; a behavior analysis module for identifying multi-scale targets through target detection that fuses a multi-path network with position attention, and for correcting the multi-scale targets through a space perception function based on a class void scale index, so as to obtain flight guarantee targets and guarantee target nodes; and an event construction module for constructing flight guarantee node events based on matching analysis of the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy. Target correction determines the flight guarantee targets and their corresponding guarantee target nodes, which overcomes flight detection errors caused by environmental changes; a node judgment model then constructs the flight guarantee node events, thereby achieving real-time, accurate monitoring of flight guarantee nodes.

Inventors

  • Mou Jianliang
  • Li Xin
  • Wang Yu
  • Wei Haiyang
  • Wang Zehui
  • Jia Zhijie
  • Yan Yifeng
  • Xu Haoyuan
  • Jin Yongan
  • Lian Mengyu

Assignees

  • 北京司拓民航科技有限责任公司

Dates

Publication Date
2026-05-08
Application Date
2026-02-12

Claims (8)

  1. A monitoring system for flight guarantee nodes, characterized by comprising: an acquisition module for acquiring a flight guarantee node strategy and the video sequences of all monitoring points; a behavior analysis module for analyzing the video sequences through target detection that fuses a multi-path network with position attention, and identifying multi-scale targets, which concretely comprises: reading the video sequence of each monitoring point to obtain a plurality of video frame images; giving an identification range based on the target area of flight guarantee; cropping each video frame image by the identification range to obtain video frame key images; extracting image targets of different scales in the video frame key images through multi-scale feature extraction paths to obtain image features of different scales; and performing fusion analysis on the image features of different scales in combination with a position-enhanced multi-feature fusion matching strategy, and giving an image output vector for each image target, which comprises: fusing the image features of different scales in each video frame key image in combination with scale weights to obtain fusion features; performing position enhancement on the fusion features according to their feature positions in the video frame images to form an enhancement feature vector for each image target; and performing target positioning judgment on the enhancement feature vector of each image target based on a pre-trained target detection head, and giving the image output vector of each image target in each video frame image, wherein the image output vector comprises a target position detection parameter, a target spatial degree parameter, and a target class; the behavior analysis module further corrects the multi-scale targets through a space perception function based on a class void scale index to obtain flight guarantee targets and guarantee target nodes, which concretely comprises: setting a class calibration function according to the target class of each image target and the number of target classes, wherein the class calibration function is used for eliminating the overall bias caused by class sample scale imbalance and performing class scale adjustment, a void enhancement strength is used for controlling the enhancement strength of the class void scale index on tail classes, and the class void scale index is used for describing the sparsity differences of different target classes in space; performing scale mapping on the class calibration function, and determining, in combination with the void enhancement strength, a confidence correction function to correct the detection confidence of each image target; determining the flight guarantee targets from the image output vectors of the image targets based on the corrected detection confidence; judging the image output vector of each flight guarantee target in combination with the target area of flight guarantee and giving a judgment result; and fusing the judgment results over continuous time steps to form a time state sequence, and judging the motion state of each flight guarantee target based on statistical analysis of the time state sequence; and an event construction module for constructing flight guarantee node events based on matching analysis of the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy.
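The patent publishes no reference code. As a minimal illustrative sketch only (all function and variable names here are hypothetical, and the real system would use learned attention rather than fixed weights), the scale-weighted fusion and position enhancement described in claim 1 could look like the following, assuming per-scale features for each image target have already been extracted:

```python
import numpy as np

def fuse_and_enhance(scale_features, scale_weights, positions, img_size):
    """Fuse per-scale features with scale weights, then append position cues.

    scale_features: list of (num_targets, dim) arrays, one array per scale
    scale_weights:  list of floats, one per scale (assumed to sum to 1)
    positions:      (num_targets, 2) array of (x, y) centers in pixels
    img_size:       (width, height) of the video frame
    """
    # Scale-weighted fusion of the multi-scale features ("fusion features").
    fused = sum(w * f for w, f in zip(scale_weights, scale_features))

    # Position enhancement: append normalized feature positions so a
    # detection head can exploit where in the frame each target lies.
    norm_pos = positions / np.asarray(img_size, dtype=float)
    return np.concatenate([fused, norm_pos], axis=1)

# Hypothetical usage: 3 scales, 4 targets, 8-dimensional features.
feats = [np.random.rand(4, 8) for _ in range(3)]
pos = np.array([[100, 200], [50, 60], [640, 360], [10, 700]])
vecs = fuse_and_enhance(feats, [0.5, 0.3, 0.2], pos, (1280, 720))
print(vecs.shape)  # (4, 10)
```

A pre-trained detection head would then map each enhanced vector to the claimed image output vector (position parameter, spatial degree, class).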
  2. The monitoring system of the flight guarantee node according to claim 1, wherein the video sequence of each monitoring point is obtained by: collecting initial video data at preset monitoring point positions; extracting, based on event-triggered time intervals, key video streams from the initial video data to form a plurality of video clips; and collecting all video clips to form the video sequence of each monitoring point.
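The event-triggered extraction of key video streams in claim 2 can be sketched as slicing a fixed window around each trigger time (a toy illustration with hypothetical names; the patent does not specify the window logic):

```python
def extract_key_clips(frames, event_times, fps=25, window=2.0):
    """Cut event-triggered clips out of the initial video data.

    frames:      list of frames; index i corresponds to time i / fps
    event_times: times (seconds) at which an event trigger fired
    window:      seconds kept on each side of a trigger
    """
    clips = []
    for t in event_times:
        start = max(0, int((t - window) * fps))
        stop = min(len(frames), int((t + window) * fps))
        clips.append(frames[start:stop])
    return clips

# Hypothetical usage: 10 s of dummy frames at 25 fps, triggers at 1 s and 8 s.
frames = list(range(250))
clips = extract_key_clips(frames, [1.0, 8.0])
print([len(c) for c in clips])  # [75, 100] -- clipped at the stream boundaries
```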
  3. The monitoring system of the flight guarantee node according to claim 1, wherein the class calibration function is expressed as: ; where Cal(·) is the class calibration function, a pre-correction detection confidence is defined for the flight guarantee target e corresponding to target class number c, c ranges over all target class numbers, A is the number of target classes, bg is the background class number, N_c is the number of samples of target class number c in the training dataset, and β is the logarithmic base.
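The patent's exact calibration formula is not reproduced in this text. Purely to illustrate the kind of class-frequency calibration that the stated variables (per-class sample count N_c, logarithmic base β, pre-correction confidence) describe, the sketch below applies a common logit-adjustment-style shift; it is an assumption, not the claimed function:

```python
import math

def class_calibration(conf, n_c, n_total, beta=math.e):
    """Illustrative class-frequency calibration (NOT the patent's formula).

    Shifts the confidence logit by the log-frequency of the class, which
    lifts rare (tail) classes relative to frequent (head) classes.

    conf:    pre-correction detection confidence in (0, 1)
    n_c:     training sample count of the detected class
    n_total: total training sample count over all classes
    beta:    logarithmic base
    """
    prior = n_c / n_total
    # Work in logit space so the adjustment is a simple additive shift.
    logit = math.log(conf / (1.0 - conf))
    adjusted = logit - math.log(prior) / math.log(beta)
    return 1.0 / (1.0 + math.exp(-adjusted))

# Hypothetical usage: same raw confidence for a head class and a tail class.
head = class_calibration(0.8, n_c=9000, n_total=10000)
tail = class_calibration(0.8, n_c=100, n_total=10000)
print(head < tail)  # True: the tail class is lifted, as the claim intends
```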
  4. The monitoring system of the flight guarantee node according to claim 1, wherein the confidence correction function is obtained by: determining the class distribution void fraction of each target class by analyzing the quantity distribution of each target class in the training dataset; performing multi-scale analysis based on the class distribution void fraction in combination with different window sizes, and determining the class void scale index of each target class; performing scale mapping on the image output vectors to obtain the scale mapping parameters of each target class; determining space perception calibration parameters through the space perception function based on the scale mapping parameters in combination with the standardized class void scale indexes; and performing normalization processing on the space perception calibration parameters of the different target classes to complete the construction of the confidence correction function.
  5. The monitoring system of the flight guarantee node according to claim 1, wherein constructing a flight guarantee node event based on matching analysis of the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy comprises: acquiring the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy, wherein the flight guarantee node strategy comprises the standard guarantee node events corresponding to each flight guarantee target; performing, based on a pre-constructed node judgment model, matching analysis of the flight guarantee targets and the guarantee target nodes against the flight guarantee node strategy; and constructing the flight guarantee node event based on the result of the matching analysis.
  6. The monitoring system of the flight guarantee node according to claim 5, wherein the node judgment model comprises a behavior unit, a probability calculation unit, and a rule unit; the behavior unit is used for analyzing the relation between each flight guarantee target and the standard guarantee node events based on the flight guarantee targets and the guarantee target nodes to obtain the current node state; the probability calculation unit is used for calculating the probability distribution of the current node state based on the probability density function of each guarantee target node in combination with the current node state, and giving the probability distribution likelihood value of each guarantee target node; and the rule unit is used for judging the probability distribution likelihood values of the guarantee target nodes through preset logical expression relations, and giving the matching result of each flight guarantee target against the guarantee target nodes and the flight guarantee nodes.
  7. The monitoring system of the flight guarantee node according to claim 6, wherein the probability density function is a Gaussian mixture model obtained by: modeling, for each guarantee node, the state distribution of the guarantee node with a Gaussian mixture model to obtain an initial distribution model; calculating the posterior responsibility corresponding to each Gaussian component of each guarantee node; and updating the model parameters of the initial distribution model in combination with a preset learning rate and the posterior responsibilities of the Gaussian components to determine the Gaussian mixture model, wherein the model parameters comprise the component weights, center positions, and covariance matrices of the Gaussian components.
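Claim 7's learning-rate update of a Gaussian mixture can be sketched as one online EM-style step. This is a minimal one-dimensional illustration with hypothetical names and a scalar variance standing in for the claimed covariance matrix; the patent's exact parameterization is not given:

```python
import numpy as np

def gmm_online_update(x, weights, means, variances, lr=0.05):
    """One online EM-style step for a 1-D Gaussian mixture (sketch only).

    x:         new scalar observation of the guarantee-node state
    weights:   (K,) component weights, summing to 1
    means:     (K,) component center positions
    variances: (K,) component variances (1-D stand-in for covariances)
    lr:        the preset learning rate from claim 7
    """
    # E-step: posterior responsibility of each component for x.
    dens = weights * np.exp(-0.5 * (x - means) ** 2 / variances) \
        / np.sqrt(2 * np.pi * variances)
    resp = dens / dens.sum()

    # M-step with learning rate: nudge each parameter toward the observation,
    # in proportion to that component's responsibility.
    weights = (1 - lr) * weights + lr * resp
    means = means + lr * resp * (x - means)
    variances = variances + lr * resp * ((x - means) ** 2 - variances)
    return weights / weights.sum(), means, variances

# Hypothetical usage: two components; the observation lies near the first.
w, m, v = gmm_online_update(1.1, np.array([0.5, 0.5]),
                            np.array([1.0, 5.0]), np.array([1.0, 1.0]))
print(w[0] > w[1])  # True: the component near x gains weight
```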
  8. A monitoring method for flight guarantee nodes, characterized in that the monitoring system of the flight guarantee node according to any one of claims 1-7 is adopted, comprising the following steps: acquiring a flight guarantee node strategy and the video sequence of each monitoring point; reading the video sequence of each monitoring point to obtain a plurality of video frame images; giving an identification range based on the target area of flight guarantee; cropping each video frame image by the identification range to obtain video frame key images; extracting image targets of different scales in the video frame key images through multi-scale feature extraction paths to obtain image features of different scales; performing fusion analysis on the image features of different scales in combination with a position-enhanced multi-feature fusion matching strategy, and giving an image output vector for each image target, which concretely comprises: fusing the image features of different scales in each video frame key image in combination with scale weights to obtain fusion features; performing position enhancement on the fusion features according to their feature positions in the video frame images to form an enhancement feature vector for each image target; and performing target positioning judgment on the enhancement feature vector of each image target based on a pre-trained target detection head, and giving the image output vector of each image target in each video frame image, wherein the image output vector comprises a target position detection parameter, a target spatial degree parameter, and a target class; correcting the multi-scale targets through a space perception function based on the class void scale index to obtain flight guarantee targets and guarantee target nodes, which concretely comprises: setting a class calibration function according to the target class of each image target and the number of target classes, wherein the class calibration function is used for eliminating the overall bias caused by class sample scale imbalance and performing class scale adjustment, a void enhancement strength is used for controlling the enhancement strength of the class void scale index on tail classes, and the class void scale index is used for describing the sparsity differences of different target classes in space; performing scale mapping on the class calibration function, and determining, in combination with the void enhancement strength, a confidence correction function to correct the detection confidence of each image target; determining the flight guarantee targets from the image output vectors of the image targets based on the corrected detection confidence; judging the image output vector of each flight guarantee target in combination with the target area of flight guarantee and giving a judgment result; and fusing the judgment results over continuous time steps to form a time state sequence, and judging the motion state of each flight guarantee target based on statistical analysis of the time state sequence; and constructing a flight guarantee node event based on matching analysis of the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy.
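The final per-claim step above, fusing per-frame judgment results into a time state sequence and deciding a motion state statistically, can be sketched with a simple dominance statistic (illustrative only; the patent does not specify which statistic is used, and the 0.6 ratio below is an arbitrary assumption):

```python
from collections import Counter

def motion_state(per_frame_states, min_ratio=0.6):
    """Fuse per-frame judgment results over continuous time steps.

    Returns the state that dominates the time state sequence, or
    'uncertain' when no single state reaches the required ratio.
    """
    counts = Counter(per_frame_states)
    state, n = counts.most_common(1)[0]
    return state if n / len(per_frame_states) >= min_ratio else "uncertain"

# Hypothetical usage: 10 consecutive judgments of a ground-service vehicle.
seq = ["moving"] * 7 + ["stopped"] * 3
print(motion_state(seq))  # moving
```

Smoothing over the sequence in this way is what lets the system tolerate single-frame detection errors caused by environmental changes.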

Description

Monitoring system and method for flight guarantee node

Technical Field

The invention belongs to the technical field of aviation data analysis, and particularly relates to a monitoring system and method for flight guarantee nodes.

Background

The airport apron is an important area for aircraft ground guarantee operations, and its operational safety and guarantee efficiency directly influence flight operation quality and aviation safety. The apron guarantee process involves the cooperative operation of various personnel, vehicles, and guarantee equipment; the operation flow is complex, key events occur frequently, and real-time, accurate event acquisition is in high demand. Traditional flight guarantee node data acquisition relies mainly on manual reporting and interphone notification, so information consistency and timeliness are difficult to guarantee, data lag or errors easily arise, and flight scheduling and operation efficiency are affected. In addition, most current flight guarantee service monitoring systems are based on traditional fixed-rule algorithms or image recognition technologies, judging a service state through feature extraction and analysis of single-frame images. Although these methods can capture the static state of a service node, they are not effective at identifying dynamic changes and potential violations in the service process. Especially under abnormal conditions such as unreasonable equipment placement or personnel not being in place, traditional monitoring systems often fail to make accurate judgments in time, leaving potential safety hazards.
For example, patent CN117523500A provides a monitoring system, method, and storage medium for a flight guarantee node. That monitoring system comprises: a video acquisition module for acquiring real-time video streams of the stand, aircraft doors, and other viewing angles in the aircraft guarantee area; a positioning module for acquiring real-time positioning information of workers, aircraft, and ground moving targets in the aircraft guarantee area; a real-time video analysis module for inputting the real-time video frames of all viewing angles at the current moment into a trained machine learning model to obtain the aircraft entry/exit state, door opening/closing state, ground service state, and other guarantee node states corresponding to each viewing angle, and for calibrating these guarantee node states against the real-time positioning information acquired at the current moment to obtain high-precision guarantee node states; and a data integration module for generating and reporting an XML message for the aircraft according to the high-precision guarantee node states, the flight identification, and the current moment. This scheme can effectively improve the accuracy and real-time performance of flight guarantee node monitoring. However, flight monitoring videos often contain multiple targets with large scale differences, such as aircraft, ground equipment, and personnel. When the prior art identifies and positions such targets and judges node states, targets are easily missed or misidentified in severe environments, target positioning becomes inaccurate or is lost entirely, and the real-time performance and accuracy of flight guarantee node monitoring suffer.
Therefore, how to overcome flight detection errors caused by environmental changes and realize automatic acquisition, release, and presentation of flight guarantee node data, so as to ensure the real-time performance and accuracy of flight guarantee node monitoring, is a problem to be solved by those skilled in the art.

Disclosure of Invention

Aiming at the defects of the prior art, the invention provides a monitoring system and method for flight guarantee nodes. The monitoring system comprises an acquisition module, a behavior analysis module, and an event construction module. The acquisition module is used for acquiring a flight guarantee node strategy and the video sequences of all monitoring points; the behavior analysis module is used for analyzing the video sequences and identifying multi-scale targets through target detection that fuses a multi-path network with position attention, and for correcting the multi-scale targets through a space perception function based on a class void scale index to obtain flight guarantee targets and guarantee target nodes; and the event construction module is used for constructing flight guarantee node events based on matching analysis of the flight guarantee targets, the guarantee target nodes, and the flight guarantee node strategy. The video sequence of each monitoring point is analyzed through target detection that fuses the multi-path network with position attention