CN-121979234-A - Unmanned aerial vehicle airspace situation awareness fusion system and method
Abstract
The invention discloses an unmanned aerial vehicle airspace situation awareness fusion system and method. In the method, an edge computing node divides an airspace into grids and judges each grid into a situation of red, yellow, or green risk level, generating lightweight situation data. A communication link is then adaptively selected according to the type and communication capability of the unmanned aerial vehicle, and the data are transmitted to the unmanned aerial vehicle using a priority scheduling mechanism. After receiving the data, the unmanned aerial vehicle fuses the external situation data with real-time perception data acquired by its own sensors, generates a final airspace environment analysis decision through a dynamic weight rule, and controls flight accordingly. The method solves the problems of the limited perception range and lack of global situation awareness of the unmanned aerial vehicle, expands the early-warning range through "edge-end" cooperation, improves the accuracy and robustness of decision making, and enhances the flight safety of the unmanned aerial vehicle in complex airspace.
Inventors
- CHENG CHENGQI
- WU XUEMIN
- HU XUELIAN
Assignees
- Beidou Fuxi Information Technology Co., Ltd. (北斗伏羲信息技术有限公司)
Dates
- Publication Date: 2026-05-05
- Application Date: 2026-01-04
Claims (20)
- 1. An unmanned aerial vehicle airspace situation awareness fusion method, characterized by comprising the following steps: an edge computing node acquires multi-source airspace data and divides a target airspace into a plurality of grids based on a preset grid code system; according to preset situation division standards, the edge computing node judges the grids as situations with different risk grades and generates lightweight situation data comprising grid codes and corresponding situations; the edge computing node, cooperating with the unmanned aerial vehicle, adaptively selects a communication link according to the preset type and real-time communication capability of the unmanned aerial vehicle, and transmits the lightweight situation data to the unmanned aerial vehicle using a priority scheduling mechanism based on the risk level of the situation; the unmanned aerial vehicle receives the lightweight situation data and acquires real-time perception data collected by its own perception module; the unmanned aerial vehicle performs multi-source data fusion processing, distributes weights to the lightweight situation data and the real-time perception data according to a preset dynamic weight rule, and generates a fused airspace environment analysis decision based on the weighted result; and a flight controller of the unmanned aerial vehicle controls the flight state of the unmanned aerial vehicle according to the airspace environment analysis decision.
- 2. The method of claim 1, wherein the multi-source data fusion processing comprises: data layer processing, in which space-time calibration is performed on the lightweight situation data and the real-time perception data; feature layer processing, in which feature extraction and target association are performed on the calibrated data; and decision layer processing, in which fusion decision is carried out on the associated features to generate the airspace environment analysis decision.
- 3. The method of claim 2, wherein the decision layer processing comprises a dynamic weight voting mechanism configured to dynamically adjust trust weights of the lightweight situation data and the real-time perception data based on real-time environmental factors or operating states of the perception modules.
- 4. The method according to claim 3, wherein the dynamic weight voting mechanism is specifically configured to: initialize the weight Wext of the lightweight situation data and the weight Wint of the real-time perception data; monitor at least one environmental parameter in real time, the environmental parameter comprising visibility, rainfall intensity, or electromagnetic interference level; and, when the environmental parameter is detected to be worse than a preset threshold value, reduce the weight of the real-time perception data corresponding to the affected perception module and correspondingly increase the weight Wext of the lightweight situation data, wherein the weight adjustment follows a preset nonlinear mapping function, so that when the reliability of a data source changes, the information source with higher confidence is preferentially adopted, ensuring the robustness of the airspace environment analysis decision when a single sensor fails or is interfered with.
- 5. The method of claim 1, wherein adaptively selecting a communication link comprises: for a consumer-grade unmanned aerial vehicle, configuring Wi-Fi communication as a main link and Beidou short message communication as a redundant backup link; and for an industrial unmanned aerial vehicle, adopting 5G communication as a main link and LoRa communication as a redundant backup link.
- 6. The method of claim 5, wherein the priority scheduling mechanism is configured to: divide situations into three grades of high risk, medium risk, and low risk; allocate different transmission priorities to situation data of different risk levels, the high-risk situation data having the highest priority; transmit situation data of all levels when the main link bandwidth is sufficient; stop transmitting low-risk situation data when the main link bandwidth is monitored to fall below a first preset threshold value; and switch to the redundant backup link and transmit only high-risk situation data when the main link is monitored to be interrupted or its bandwidth falls below a second preset threshold value.
- 7. The method of claim 1, wherein the unmanned aerial vehicle further performs: monitoring the occupancy rate of its own computing resources in real time; and, when the occupancy rate exceeds a preset overload threshold value, automatically switching to an emergency processing mode, in which the unmanned aerial vehicle simplifies the complexity of the multi-source data fusion processing.
- 8. The method of claim 7, wherein simplifying the complexity of the multi-source data fusion processing specifically comprises: in the emergency processing mode, the processor of the unmanned aerial vehicle pauses processing of the lightweight situation data of the medium and low risk levels; meanwhile, the processor pauses calls to its own perception algorithms except for high-risk target recognition; and the multi-source data fusion processing is simplified into rule-based decision logic that only judges whether the current position of the unmanned aerial vehicle overlaps a high-risk situation grid, and triggers a preset minimal-computation emergency obstacle avoidance maneuver when it does, so that the occupancy rate of computing resources is reduced to a safe level while core safety is ensured.
- 9. The method of claim 1, wherein the situation division standards comprise: judging a grid containing a permanent or temporary no-fly zone, a high-probability conflict track with other aircraft, or a fixed obstacle such as an ultra-high-voltage line as a red light situation with the highest risk level; judging a grid with a potential conflict track, in an electronic fence boundary transition area, or with unstable communication signals as a yellow light situation with a medium risk level; and judging other safe airspace grids as green light situations with a low risk level.
- 10. The method of claim 9, wherein generating the lightweight situation data further comprises: encoding and compressing the generated situation data in binary format by the edge computing node; and, in successive transmission periods, sending only the situation data of grids whose state has changed, so as to realize incremental updating.
- 11. An unmanned aerial vehicle airspace situation awareness fusion system, characterized by comprising an edge computing node and an unmanned aerial vehicle; the edge computing node is configured to: acquire multi-source airspace data and divide a target airspace into a plurality of grids based on a preset grid code system; and, according to preset situation division standards, judge the grids as situations with different risk grades and generate lightweight situation data comprising grid codes and corresponding situations; the edge computing node is connected to the unmanned aerial vehicle by a communication link, the system being configured to: adaptively select the communication link according to the preset type and real-time communication capability of the unmanned aerial vehicle, and transmit the lightweight situation data to the unmanned aerial vehicle using a priority scheduling mechanism based on the risk level of the situation; the unmanned aerial vehicle comprises a self-perception module and a flight controller configured to: receive the lightweight situation data; acquire real-time perception data collected by the self-perception module; perform multi-source data fusion processing, distribute weights to the lightweight situation data and the real-time perception data according to a preset dynamic weight rule, and generate a fused airspace environment analysis decision based on the weighted result; and control the flight state of the unmanned aerial vehicle according to the airspace environment analysis decision.
- 12. The system of claim 11, wherein the flight controller is configured to perform the multi-source data fusion processing by: performing data layer processing to carry out space-time calibration on the lightweight situation data and the real-time perception data; performing feature layer processing to carry out feature extraction and target association on the calibrated data; and performing decision layer processing to carry out fusion decision on the associated features, so as to generate the airspace environment analysis decision.
- 13. The system of claim 12, wherein the decision layer processing within the flight controller is implemented using a dynamic weight voting module configured to dynamically adjust trust weights of the lightweight situation data and the real-time perception data based on real-time environmental factors or operating states of the perception modules.
- 14. The system of claim 13, wherein the internal logic of the dynamic weight voting module is configured to: store initial values of the weight Wext of the lightweight situation data and the weight Wint of the real-time perception data; connect to an environment sensor to acquire at least one environmental parameter in real time, the environmental parameter comprising visibility, rainfall intensity, or electromagnetic interference level; and, when the environmental parameter value received from the environment sensor is worse than a preset threshold value, execute a preset weight adjustment algorithm that reduces the weight Wint of the real-time perception data corresponding to the affected perception module and increases the weight Wext of the lightweight situation data according to constraint conditions, the algorithm ensuring that when part of the perception information is unreliable, system decisions automatically favor the data source with higher confidence, thereby greatly improving the decision reliability of the system in complex and severe environments.
- 15. The system of claim 11, wherein the communication link is established between a communication module on the unmanned aerial vehicle and a communication module on the edge computing node, the communication modules being configured such that: if the unmanned aerial vehicle is a consumer-grade unmanned aerial vehicle, the communication module supports Wi-Fi communication and Beidou short message communication, with Wi-Fi as the main link and Beidou short message as the backup link; and if the unmanned aerial vehicle is an industrial unmanned aerial vehicle, the communication module supports 5G communication and LoRa communication, with 5G as the main link and LoRa as the backup link.
- 16. The system of claim 15, wherein a priority scheduling module within the edge computing node is configured to: divide situations into three risk levels of high, medium, and low; assign different transmission priorities to situation data of each level; allow transmission of situation data of all levels when the Channel Quality Indicator (CQI) value of the main link is monitored to be higher than a first threshold value; place low-risk data packets at the end of the sending queue, or directly discard them, when the CQI value is between the first threshold value and a second threshold value; and, when the CQI value falls below the second threshold value or the link is switched, construct a minimal data packet containing only high-risk situation data and transmit it over the backup link.
- 17. The system of claim 11, wherein the flight controller of the unmanned aerial vehicle further comprises a built-in resource monitoring unit for monitoring the processor occupancy rate of the flight controller in real time; when the occupancy rate output by the resource monitoring unit exceeds a preset overload threshold value, the flight controller switches its operating mode from a standard mode to an emergency processing mode, in which the flight controller executes a simplified fusion processing algorithm.
- 18. The system of claim 17, wherein the execution logic of the flight controller in the emergency processing mode is as follows: an algorithm scheduler in the flight controller blocks decoding and processing instructions for medium-risk and low-risk situation data so as to release the related computing resources; the algorithm scheduler simultaneously pauses dispatching image processing and point cloud analysis tasks, other than high-risk target recognition, to the processor of the self-perception module; and the flight controller loads a preset rule-based emergency decision engine whose only inputs are the position of the unmanned aerial vehicle and the grid positions of high-risk situations, and whose output is a predefined emergency avoidance instruction sequence, this architecture ensuring that the unmanned aerial vehicle can still execute the most basic and critical safety actions under extreme shortage of computing resources.
- 19. The system of claim 11, wherein a situation determination module within the edge computing node comprises a situation division rule base defining: the triggering condition of the red light situation is that the overlap ratio of the grid range with a database of known no-fly zones exceeds 90 percent, or that the grid contains a conflict target, predicted from ADS-B data, whose space-time distance from the unmanned aerial vehicle's track will be less than a safety threshold value within the next 60 seconds; and the triggering condition of the yellow light situation is that the grid lies within 500 meters outside the boundary of a known no-fly zone, or that other unmanned aerial vehicles are present in the grid without direct collision risk.
- 20. A computer readable storage medium having stored thereon a computer program which when executed by a processor implements the method of any of claims 1-10.
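The grid division and red/yellow/green judgment of claims 1 and 9 can be sketched as below. This is a minimal illustrative sketch, not the patent's implementation: the patent relies on a preset grid code system (e.g. Beidou grid codes), whereas the cell size, the risk-score thresholds, and all function names here are assumptions for demonstration.

```python
# Illustrative stand-in for the claimed grid code system; the 0.01-degree
# cell size and 0.7/0.3 thresholds are assumptions, not patent values.
CELL_DEG = 0.01  # roughly 1 km per cell at mid-latitudes

def grid_code(lat: float, lon: float) -> str:
    """Quantize a position into a cell identifier (stand-in for a grid code)."""
    return f"{int(lat / CELL_DEG)}_{int(lon / CELL_DEG)}"

def judge_situation(risk_score: float) -> str:
    """Map a grid's risk score to the red/yellow/green levels of the claims."""
    if risk_score >= 0.7:
        return "red"     # highest risk: no-fly zone or predicted conflict
    if risk_score >= 0.3:
        return "yellow"  # medium risk: potential conflict or boundary area
    return "green"       # low risk: safe airspace

# A lightweight situation datum: grid code plus situation level
situation = {"code": grid_code(39.9042, 116.4074), "level": judge_situation(0.8)}
```

Pairing only the grid code with a one-byte level is what keeps the situation data "lightweight" enough for constrained links such as Beidou short messages.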
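The dynamic weight rule of claims 4 and 14 can be sketched as follows. The exponential mapping, the 500 m visibility threshold, and the initial weights are illustrative assumptions; the patent only requires a preset nonlinear mapping that shifts trust toward the external situation data as sensor reliability degrades.

```python
import math

# Assumed initial weights and visibility threshold (not patent values)
W_EXT, W_INT = 0.4, 0.6    # external situation data / onboard perception data
VIS_THRESHOLD_M = 500.0    # assumed visibility threshold

def adjust_weights(visibility_m: float, w_ext=W_EXT, w_int=W_INT):
    """Reduce the onboard-sensor weight nonlinearly as visibility degrades."""
    if visibility_m >= VIS_THRESHOLD_M:
        return w_ext, w_int
    # Nonlinear (exponential) degradation factor in (0, 1]
    factor = math.exp(-(VIS_THRESHOLD_M - visibility_m) / VIS_THRESHOLD_M)
    w_int_new = w_int * factor
    # Renormalize so the weights still sum to 1, shifting trust externally
    total = w_ext + w_int_new
    return w_ext / total, w_int_new / total
```

For example, at 100 m visibility the onboard weight shrinks and the external weight grows, so the fused decision leans on the edge node's situation data exactly when the cameras are least trustworthy.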
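The priority scheduling of claims 6 and 16 reduces to a filter-and-sort over the pending situation packets. A minimal sketch, assuming bandwidth in kbps as the link-quality metric and placeholder threshold values (the patent leaves the thresholds preset):

```python
# Risk levels: smaller number = higher transmission priority
HIGH, MEDIUM, LOW = 0, 1, 2
T1_KBPS, T2_KBPS = 100.0, 20.0  # assumed first/second bandwidth thresholds

def select_for_transmission(packets, bandwidth_kbps, link_up=True):
    """Filter and order situation packets per the claimed scheduling rules."""
    if not link_up or bandwidth_kbps < T2_KBPS:
        allowed = {HIGH}            # backup link: high-risk data only
    elif bandwidth_kbps < T1_KBPS:
        allowed = {HIGH, MEDIUM}    # degraded main link: drop low-risk data
    else:
        allowed = {HIGH, MEDIUM, LOW}
    # Highest priority first
    return sorted((p for p in packets if p["risk"] in allowed),
                  key=lambda p: p["risk"])
```

The same structure applies when CQI rather than raw bandwidth is monitored, as in claim 16; only the thresholds change.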
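The emergency-mode fallback of claims 8 and 18 collapses the whole fusion pipeline into one membership test: does the drone's current grid overlap a red grid? A minimal sketch, with the action names as hypothetical placeholders:

```python
# Rule-based fallback used when onboard computing resources are overloaded.
# "EMERGENCY_AVOID" stands in for the preset minimal-computation maneuver.
def emergency_decision(current_grid: str, red_grids: set) -> str:
    """Only check: does the drone's grid overlap a high-risk (red) grid?"""
    if current_grid in red_grids:
        return "EMERGENCY_AVOID"
    return "CONTINUE"
```

A set-membership test is O(1) and needs no sensor processing at all, which is why this logic can run even when the main fusion pipeline has been suspended.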
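The binary encoding and incremental update of claim 10 can be sketched as a delta between two snapshot dictionaries. The 5-byte record layout (32-bit grid id plus a 1-byte level) is an illustrative assumption; the patent only requires a compressed binary format and change-only transmission.

```python
import struct

# Assumed record layout: little-endian 32-bit grid id + 1-byte level
LEVELS = {"green": 0, "yellow": 1, "red": 2}

def encode_delta(prev: dict, curr: dict) -> bytes:
    """Serialize only grids whose situation level changed since last cycle."""
    out = b""
    for grid_id, level in curr.items():
        if prev.get(grid_id) != level:
            out += struct.pack("<IB", grid_id, LEVELS[level])
    return out
```

In steady state most grids stay green, so the delta is usually a handful of records rather than the whole airspace, which is what makes Beidou short messages or LoRa viable as backup links.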
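The red-light trigger of claim 19 (a predicted conflict within the next 60 seconds) can be sketched by stepping two tracks forward and testing minimum separation. Constant-velocity extrapolation in a flat 2-D frame and the 150 m safety threshold are illustrative simplifications; real ADS-B conflict prediction works in 3-D with track uncertainty.

```python
# Assumed safety threshold and prediction horizon (horizon per claim 19)
SAFETY_M = 150.0
HORIZON_S = 60.0

def predicts_conflict(p_uav, v_uav, p_tgt, v_tgt, dt=1.0) -> bool:
    """Extrapolate both tracks and test separation over the next 60 s."""
    t = 0.0
    while t <= HORIZON_S:
        dx = (p_uav[0] + v_uav[0] * t) - (p_tgt[0] + v_tgt[0] * t)
        dy = (p_uav[1] + v_uav[1] * t) - (p_tgt[1] + v_tgt[1] * t)
        if (dx * dx + dy * dy) ** 0.5 < SAFETY_M:
            return True
        t += dt
    return False
```

A head-on encounter closing at 40 m/s from 1 km away triggers the check well inside the horizon, while a diverging target never does.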
Description
Unmanned aerial vehicle airspace situation awareness fusion system and method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle autonomous flight, and in particular to an unmanned aerial vehicle airspace situation awareness fusion system and method, involving unmanned aerial vehicle autonomous flight control, low-altitude airspace situation awareness, multi-source data fusion, and Beidou grid code application.
Background
In recent years, unmanned aerial vehicle technology has been developing towards autonomy, intelligence, and scale, and has been widely used in fields such as urban logistics, power inspection, and environmental monitoring. Such unmanned aerial vehicles are typically equipped with diverse sensing modules, such as millimeter-wave radars, vision cameras, and Inertial Measurement Units (IMUs), as well as autonomous computing units, such as embedded Artificial Intelligence (AI) chips, to achieve autonomous sensing and decision making in the local environment. However, the low-altitude airspace environment is increasingly complex, including not only other unmanned aerial vehicles but also low-altitude aircraft, high-rise buildings, and temporarily designated no-fly areas. In such an environment, relying only on the unmanned aerial vehicle's self-perception module for airspace environment analysis has obvious technical defects. First, the detection range and reliability of the self-perception module are limited. The short-range radar carried by a consumer unmanned aerial vehicle typically has a detection distance of 0.5 to 1 km, and the effective recognition distance of a vision sensor can drop below 500 meters under unfavorable illumination and weather conditions such as night, backlight, heavy rain, or dense fog.
As a result, the unmanned aerial vehicle cannot perceive risks in distant airspace in advance, such as a temporary no-fly zone 2 km away or another unmanned aerial vehicle flying at high speed in the same direction. Considering the prevailing cruising speed of unmanned aerial vehicles (30-80 km/h), a perception range of 1 km provides only a theoretical reaction time of 45-120 seconds; after deducting the time consumed by data processing and decision calculation, the practically available handling time is often less than 30 seconds, which is far from sufficient for ensuring flight safety. Secondly, global airspace situation information is lacking. An existing unmanned aerial vehicle can only acquire local environment data within its own detection range and cannot obtain macroscopic information on airspace busyness and flight permissions over a larger range. For example, a logistics drone flying through an urban area may detect no direct obstacle with its own radar, while the airspace 1.5 km ahead on its course has been temporarily marked as a no-fly area (i.e., a "red light" area) due to an emergency event. Because this key situation information cannot be acquired, the unmanned aerial vehicle continues along its original route and does not trigger emergency avoidance until it approaches the edge of the no-fly zone, causing route delay and potential safety risks. In multi-machine collaborative operation scenarios, for example when several unmanned aerial vehicles patrol the same transmission line simultaneously, each unmanned aerial vehicle can only sense peers within a few hundred meters and cannot know the distribution of unmanned aerial vehicles farther away, which easily causes local airspace congestion (i.e., a yellow light area).
Thirdly, multi-source data are not effectively fused, so the environmental analysis accuracy is low. The unmanned aerial vehicle's own perception data (e.g., radar point clouds, visual images) and external airspace management information (e.g., no-fly zone ranges, dynamics of other aircraft) are typically independent of each other. The radar may misjudge a bird as another unmanned aerial vehicle (false alarm rate about 15%), and the vision sensor easily misses obstacles (missed detection rate about 10%) when facing complex backgrounds such as high-rise glass curtain walls. Lacking external situation information for cross verification, the autonomous computing unit of the unmanned aerial vehicle has difficulty distinguishing real risks from perception errors, which can lead to wrong obstacle avoidance decisions, such as unnecessary maneuvers caused by misjudging birds, affecting flight efficiency and task execution. The environmental analysis accuracy of the prior art is typically less than 75%. Fourth, the transmission mechanism of airspace situation information is not matched with the airborne capacity of the unmanned aerial vehicle. At present, situation data issued