CN-121982876-A - Expressway emergency lane real-time monitoring method and system based on unmanned aerial vehicle
Abstract
The invention discloses a real-time monitoring method and system for expressway emergency lanes based on an unmanned aerial vehicle (UAV). The method comprises: collecting an emergency-lane video data stream of a preset target road section with the UAV, and constructing a three-dimensional dynamic situation map of objects in the lane using multi-source information fusion; identifying, from the three-dimensional dynamic situation map, special vehicles that are permitted to pass as well as ordinary vehicles and obstacles that illegally occupy the lane, and generating a classification result set; generating, from the classification result set, a hierarchical response strategy for the violating targets with a spatial situation assessment algorithm; and controlling the UAV to collect evidence on and track the violating targets according to the hierarchical response strategy, while sending an early-warning data packet containing target characteristics and position information to the road management and control center in real time. Embodiments of the invention realize three-dimensional, intelligent monitoring of the emergency-lane state and improve the automation level of monitoring, the accuracy of violation identification, and the precision of the management response.
Inventors
- ZHANG QING
- GONG WEIAN
- PENG YUEQING
Assignees
- 江苏纬创智能交通科技有限公司
Dates
- Publication Date
- 20260505
- Application Date
- 20251215
Claims (10)
- 1. A UAV-based method for real-time monitoring of an expressway emergency lane, characterized by comprising the following steps: collecting an emergency-lane video data stream of a preset target road section with an unmanned aerial vehicle, and constructing a three-dimensional dynamic situation map of objects in the lane using multi-source information fusion; based on the three-dimensional dynamic situation map, performing spatio-temporal behavior-pattern analysis on the objects in the lane with a deep convolutional neural network, identifying special vehicles that are permitted to pass as well as ordinary vehicles and obstacles that illegally occupy the lane, and generating a classification result set containing the identified target types and behavior states; based on the classification result set, generating a hierarchical response strategy for the violating targets with a spatial situation assessment algorithm, the strategy comprising differentiated tracking paths and multi-angle evidence-collection schemes for violating targets of different grades; and controlling the UAV to collect evidence on and track the violating targets according to the hierarchical response strategy, and sending an early-warning data packet containing target characteristics and position information to a road management and control center in real time.
- 2. The method according to claim 1, wherein collecting the emergency-lane video data stream of the preset target road section with the UAV and constructing the three-dimensional dynamic situation map of objects in the lane using multi-source information fusion comprises: controlling a UAV group carrying multispectral cameras to fly cooperatively over the preset target road section and collect multi-angle emergency-lane video data along preset trajectories, generating an original multi-view video stream; performing spatio-temporal registration on the original multi-view video stream, aligning video frames from different UAVs with a feature-point matching algorithm and compensating image shake caused by UAV motion, generating a stably registered video sequence; based on the stably registered video sequence, computing depth information of objects in the lane with a stereoscopic vision algorithm and constructing three-dimensional point clouds of the objects by fusing GPS/BeiDou navigation and IMU data, generating an initial three-dimensional scene point cloud; and extracting and tracking dynamic targets in the initial scene point cloud, fusing multi-frame point-cloud data into continuous three-dimensional motion trajectories, and finally generating the three-dimensional dynamic situation map of the objects in the lane.
- 3. The method according to claim 2, wherein performing spatio-temporal behavior-pattern analysis on the objects in the lane with the deep convolutional neural network based on the three-dimensional dynamic situation map, identifying the special vehicles that are permitted to pass as well as the ordinary vehicles and obstacles that illegally occupy the lane, and generating the classification result set containing the identified target types and behavior states comprises: extracting motion-trajectory sequences and appearance features of targets from the three-dimensional dynamic situation map, constructing target description vectors containing spatio-temporal features, and generating a target feature data set; feeding the target feature data set into a pre-trained deep convolutional neural network, extracting motion patterns and morphological features of the targets through spatio-temporal convolutional layers, and generating high-level semantic feature representations; based on the high-level semantic feature representations, identifying the type attributes of the targets in parallel with multiple classifiers, the type attributes comprising at least special vehicle, ordinary vehicle, and obstacle, and generating target-type recognition results; and judging, by combining the target-type recognition results with motion-trajectory analysis, whether the behavior state of each target conforms to the emergency-lane usage rules, finally generating the classification result set containing the target types and behavior states.
- 4. The method according to claim 3, wherein generating the hierarchical response strategy for the violating targets with the spatial situation assessment algorithm based on the classification result set, the strategy comprising differentiated tracking paths and multi-angle evidence-collection schemes for violating targets of different grades, comprises: establishing a violation-grade evaluation model according to the types and severity of the violating targets in the classification result set, computing a threat index for each violating target, and generating target threat-grade evaluation results; based on the target threat-grade evaluation results, generating differentiated tracking-path schemes with a differentiated UAV tracking-path planning algorithm; determining, in combination with the differentiated tracking-path schemes, a multi-angle evidence-collection strategy that fixes the optimal shooting angles and capture frequency on each tracking path, and generating a multi-angle evidence-collection scheme; and integrating the differentiated tracking-path schemes and the multi-angle evidence-collection schemes, configuring corresponding response parameters for the different threat grades, and finally generating the hierarchical response strategy for the violating targets.
- 5. The method according to claim 4, wherein controlling the UAV to collect evidence on and track the violating targets according to the hierarchical response strategy and sending the early-warning data packet containing target characteristics and position information to the road management and control center in real time comprises: parsing the tracking paths and evidence-collection schemes in the hierarchical response strategy, generating flight-control and gimbal-control instructions for the UAV, and producing a UAV control instruction set; executing the UAV control instruction set, controlling the UAV to track the violating target along the preset path while adjusting the camera angle to capture multi-angle images, and generating target evidence data; processing the target evidence data in real time, extracting the visual features, motion features, and position information of the target, packaging them into a standard-format data packet, and generating an early-warning data packet; and transmitting the early-warning data packet to the road management and control center in real time over a 5G communication link, receiving acknowledgement feedback from the control center, and completing the closed-loop delivery of the early-warning data.
- 6. A UAV-based system for real-time monitoring of an expressway emergency lane, characterized in that the system comprises: a construction module for collecting an emergency-lane video data stream of a preset target road section with an unmanned aerial vehicle and constructing a three-dimensional dynamic situation map of objects in the lane using multi-source information fusion; an analysis module for performing spatio-temporal behavior-pattern analysis on the objects in the lane with a deep convolutional neural network based on the three-dimensional dynamic situation map, identifying special vehicles that are permitted to pass as well as ordinary vehicles and obstacles that illegally occupy the lane, and generating a classification result set containing the identified target types and behavior states; a generation module for generating a hierarchical response strategy for the violating targets with a spatial situation assessment algorithm based on the classification result set, the strategy comprising differentiated tracking paths and multi-angle evidence-collection schemes for violating targets of different grades; and a control module for controlling the UAV to collect evidence on and track the violating targets according to the hierarchical response strategy and sending an early-warning data packet containing target characteristics and position information to the road management and control center in real time.
- 7. The system according to claim 6, characterized in that the construction module is specifically configured to: control a UAV group carrying multispectral cameras to fly cooperatively over the preset target road section and collect multi-angle emergency-lane video data along preset trajectories, generating an original multi-view video stream; perform spatio-temporal registration on the original multi-view video stream, aligning video frames from different UAVs with a feature-point matching algorithm and compensating image shake caused by UAV motion, generating a stably registered video sequence; based on the stably registered video sequence, compute depth information of objects in the lane with a stereoscopic vision algorithm and construct three-dimensional point clouds of the objects by fusing GPS/BeiDou navigation and IMU data, generating an initial three-dimensional scene point cloud; and extract and track dynamic targets in the initial scene point cloud, fusing multi-frame point-cloud data into continuous three-dimensional motion trajectories, and finally generating the three-dimensional dynamic situation map of the objects in the lane.
- 8. The system according to claim 7, wherein the analysis module is specifically configured to: extract motion-trajectory sequences and appearance features of targets from the three-dimensional dynamic situation map, constructing target description vectors containing spatio-temporal features and generating a target feature data set; feed the target feature data set into a pre-trained deep convolutional neural network, extracting motion patterns and morphological features of the targets through spatio-temporal convolutional layers and generating high-level semantic feature representations; based on the high-level semantic feature representations, identify the type attributes of the targets in parallel with multiple classifiers, the type attributes comprising at least special vehicle, ordinary vehicle, and obstacle, and generate target-type recognition results; and judge, by combining the target-type recognition results with motion-trajectory analysis, whether the behavior state of each target conforms to the emergency-lane usage rules, finally generating the classification result set containing the target types and behavior states.
- 9. A storage medium in which a computer program is stored, wherein the computer program is arranged to perform the method of any one of claims 1-5 when run.
- 10. An electronic device comprising a memory and a processor, wherein the memory stores a computer program and the processor is arranged to run the computer program to perform the method of any one of claims 1-5.
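The stereo-vision depth step of claim 2 conventionally relies on the disparity relation z = f·B/d (focal length times baseline over disparity), after which each pixel can be back-projected into a camera-frame point cloud. The NumPy sketch below illustrates that standard model; the patent does not specify the exact algorithm, and all parameter values are assumptions.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth (m) from stereo disparity: z = f * B / d.
    Invalid (zero or negative) disparities map to infinity."""
    disparity_px = np.asarray(disparity_px, dtype=float)
    z = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    z[valid] = focal_px * baseline_m / disparity_px[valid]
    return z

def to_point_cloud(depth_m, focal_px, cx, cy):
    """Back-project a depth map into camera-frame 3-D points (pinhole model)."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / focal_px
    y = (v - cy) * depth_m / focal_px
    return np.stack([x, y, depth_m], axis=-1)  # shape (h, w, 3)
```

In a full pipeline, the per-frame clouds would then be transformed into a common world frame using the GPS/BeiDou and IMU poses before multi-frame fusion.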
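Claim 3 ends by judging whether a target's behavior state conforms to the emergency-lane usage rules on the basis of its motion trajectory. The sketch below is a rule-based stand-in for that judgment; the patent claims a neural classifier, and the thresholds, labels, and function names here are illustrative assumptions.

```python
import math

def mean_speed(track, dt=1.0):
    """Mean speed (m/s) over a sequence of (x, y) positions sampled every dt seconds."""
    if len(track) < 2:
        return 0.0
    dist = sum(math.dist(a, b) for a, b in zip(track, track[1:]))
    return dist / (dt * (len(track) - 1))

def behavior_state(kind, track, stop_thresh=0.5):
    """Label the behavior of a target observed in the emergency lane."""
    if kind == "special_vehicle":
        return "permitted"            # ambulances, police vehicles, etc.
    speed = mean_speed(track)
    if kind == "obstacle" or speed < stop_thresh:
        return "illegal_stop"         # stationary object or parked vehicle
    return "illegal_driving"          # ordinary vehicle moving in the lane
```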
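Claim 4's violation-grade evaluation model computes a threat index per violating target and maps it to response parameters. The following sketch uses assumed weights and tier boundaries, since the patent does not disclose concrete values; the field names are likewise illustrative.

```python
def threat_index(kind, duration_s, blocks_lane_fully):
    """Weighted threat score in [0, 1]; weights are illustrative assumptions."""
    base = {"ordinary_vehicle": 0.4, "obstacle": 0.6}.get(kind, 0.0)
    score = base + min(duration_s / 300.0, 1.0) * 0.3   # dwell time, capped at 5 min
    if blocks_lane_fully:
        score += 0.3                                     # lane fully obstructed
    return round(min(score, 1.0), 2)

def response_level(score):
    """Map a threat score to a tiered tracking/evidence-collection plan."""
    if score >= 0.8:
        return {"level": "high", "orbit_radius_m": 20, "shots_per_min": 12}
    if score >= 0.5:
        return {"level": "medium", "orbit_radius_m": 40, "shots_per_min": 6}
    return {"level": "low", "orbit_radius_m": 60, "shots_per_min": 2}
```

Higher-threat targets get a tighter orbit and more frequent captures, matching the claim's differentiated tracking paths and evidence-collection frequencies.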
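Claim 5 packages the target's features and position into a standard-format early-warning data packet for the control center. The sketch below assumes JSON as the wire encoding; the patent only requires "a standard format", and every field name here is an assumption.

```python
import json
import time

def build_warning_packet(target_id, kind, lat, lon, threat, evidence_uris):
    """Assemble the early-warning packet sent to the control center.
    Field names are illustrative; the patent only requires target
    characteristics and position information."""
    return {
        "packet_type": "emergency_lane_violation",
        "timestamp": int(time.time()),
        "target": {"id": target_id, "kind": kind, "threat_index": threat},
        "position": {"lat": lat, "lon": lon},
        "evidence": evidence_uris,
    }

def serialize(packet):
    """Compact JSON wire encoding of the packet."""
    return json.dumps(packet, separators=(",", ":")).encode("utf-8")
```

The serialized bytes would then be sent over the 5G link, and the control center's acknowledgement would close the delivery loop described in the claim.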
Description
Expressway emergency lane real-time monitoring method and system based on unmanned aerial vehicle
Technical Field
The invention belongs to the technical field of traffic monitoring, and in particular relates to a UAV-based method and system for real-time monitoring of expressway emergency lanes.
Background
With the continuing growth of expressway networks, illegal occupation of emergency lanes has become an increasingly prominent problem that seriously impedes the passage of emergency rescue vehicles. Existing monitoring relies mainly on fixed-position cameras, which suffer from inherent defects such as large blind areas, limited viewing angles, and the inability to continuously track moving targets. Traditional video-analysis algorithms degrade markedly under complex road conditions, vehicle occlusion, and lighting changes, making accurate classification of violations and timely evidence collection difficult. Fixed monitoring systems also lack flexible response capability: the evidence-collection strategy cannot be adjusted dynamically according to the severity of a violation, so management efficiency is low.
Disclosure of Invention
The invention aims to provide a UAV-based method and system for real-time monitoring of expressway emergency lanes that overcome the above defects of the prior art, realize three-dimensional, intelligent monitoring of the emergency-lane state, and improve the automation level of emergency-lane monitoring, the accuracy of violation identification, and the precision of the management response.
An embodiment of the application provides a UAV-based method for real-time monitoring of an expressway emergency lane, comprising the following steps: collecting an emergency-lane video data stream of a preset target road section with an unmanned aerial vehicle, and constructing a three-dimensional dynamic situation map of objects in the lane using multi-source information fusion; based on the three-dimensional dynamic situation map, performing spatio-temporal behavior-pattern analysis on the objects in the lane with a deep convolutional neural network, identifying special vehicles that are permitted to pass as well as ordinary vehicles and obstacles that illegally occupy the lane, and generating a classification result set containing the identified target types and behavior states; based on the classification result set, generating a hierarchical response strategy for the violating targets with a spatial situation assessment algorithm, the strategy comprising differentiated tracking paths and multi-angle evidence-collection schemes for violating targets of different grades; and controlling the UAV to collect evidence on and track the violating targets according to the hierarchical response strategy, and sending an early-warning data packet containing target characteristics and position information to a road management and control center in real time.
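The four steps above amount to a sense-classify-plan-act pipeline. The Python sketch below illustrates the classify and plan stages on already-detected targets; the class, field, and label names and the rule-based logic are illustrative assumptions standing in for the claimed neural network and situation-assessment algorithm.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object detected in the three-dimensional dynamic situation map."""
    target_id: int
    kind: str               # "special_vehicle" | "ordinary_vehicle" | "obstacle"
    in_emergency_lane: bool
    position: tuple         # (x, y, z) in meters

def classify(detections):
    """Step 2: split detections into permitted and violating targets."""
    violations = [d for d in detections
                  if d.in_emergency_lane and d.kind != "special_vehicle"]
    permitted = [d for d in detections if d not in violations]
    return permitted, violations

def plan_response(violations):
    """Step 3: assign a coarse response action per violating target."""
    return {d.target_id: ("track_and_record" if d.kind == "ordinary_vehicle"
                          else "report_obstacle")
            for d in violations}
```

Step 4 would translate each planned action into flight and gimbal commands and stream the resulting evidence back to the control center.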
Optionally, collecting the emergency-lane video data stream of the preset target road section with the UAV and constructing the three-dimensional dynamic situation map of objects in the lane using multi-source information fusion includes: controlling a UAV group carrying multispectral cameras to fly cooperatively over the preset target road section and collect multi-angle emergency-lane video data along preset trajectories, generating an original multi-view video stream; performing spatio-temporal registration on the original multi-view video stream, aligning video frames from different UAVs with a feature-point matching algorithm and compensating image shake caused by UAV motion, generating a stably registered video sequence; based on the stably registered video sequence, computing depth information of objects in the lane with a stereoscopic vision algorithm and constructing three-dimensional point clouds of the objects by fusing GPS/BeiDou navigation and IMU data, generating an initial three-dimensional scene point cloud; and extracting and tracking dynamic targets in the initial scene point cloud, fusing multi-frame point-cloud data into continuous three-dimensional motion trajectories, and finally generating the three-dimensional dynamic situation map of the objects in the lane. Optionally, based on the three-dimensional dynamic situation map, performing spatio-temporal behavior-pattern analysis on the objects in the lane with a deep convolutional neural network, identifying special vehicles that are permitted to pass as well as ordinary vehicles and obstacles that illegally occupy the lane, and generating a classification result set including the identified