
CN-122022333-A - Emergency rescue method and system based on unmanned aerial vehicle and urban three-dimensional platform

CN122022333A

Abstract

The application discloses an emergency rescue method based on an unmanned aerial vehicle and an urban three-dimensional platform, comprising the following steps: S1, in the urban three-dimensional emergency command platform, generating a structured digital command work order according to disaster information, wherein the work order comprises spatial position information of a task target; S2, the platform issuing the digital command work order to a designated unmanned aerial vehicle operation terminal; S3, the operation terminal generating guidance information according to the work order so as to control the flight position of the unmanned aerial vehicle, and performing field operations matching the work order requirements. With the emergency rescue method and system based on the unmanned aerial vehicle and the urban three-dimensional platform, rescue tasks are digitized into executable instructions, the unmanned aerial vehicle is driven to accurately collect field information, and the processed data is automatically backfilled into the three-dimensional situation model, so that the rescue command can make iterative decisions according to dynamically updated real-scene data, systematically improving the overall efficiency of emergency response.

Inventors

  • HUANG QUANFENG

Assignees

  • 黄权烽 (HUANG QUANFENG)

Dates

Publication Date
2026-05-12
Application Date
2026-01-30

Claims (10)

  1. An emergency rescue method based on an unmanned aerial vehicle and an urban three-dimensional platform, characterized by comprising the following steps: S1, in an urban three-dimensional emergency command platform, generating a structured digital command work order according to disaster information, wherein the digital command work order comprises spatial position information of a task target; S2, the platform issuing the digital command work order to a designated unmanned aerial vehicle operation terminal; S3, the operation terminal generating guidance information according to the digital command work order so as to control the unmanned aerial vehicle to fly to the spatial position, and performing field operations matching the work order requirements; S4, the unmanned aerial vehicle returning the field data acquired during the operation to the platform; and S5, the platform updating the disaster situation according to the returned field data and generating a new rescue instruction based on the updated situation, thereby forming a task closed loop.
  2. The method of claim 1, wherein generating the digital command work order in step S1 comprises designating a target area or target object on a three-dimensional geographic information model, based on a preset task template library, to generate the digital command work order including a task type, data acquisition requirements and a task priority.
  3. The method of claim 2, wherein in step S1, graphical interaction instructions between the user and the three-dimensional geographic information model are captured and converted into geospatial coordinate data in real time; and based on the coordinate data, a geofence coordinate sequence of the target area or a spatial position identifier of the target object is written in structured form into the digital command work order.
  4. The method of claim 1, wherein in step S2 the platform compiles the digital command work order into a task file executable by the unmanned aerial vehicle; the task file is packaged in a structured format and comprises: a list of waypoints defined in the WGS-84 coordinate system, each waypoint including coordinate, altitude, speed and hover-time parameters; a device control instruction set corresponding to the data acquisition requirements in the work order, for automatically triggering camera shooting parameters, gimbal angles or sensor sampling at specific waypoints; and task execution logic defining the flight sequence and the conditional jump relationships between waypoints.
  5. The method of claim 1 or 4, wherein in step S3 the guidance information is an augmented reality guidance interface, generated by: S31, parsing the task type and the data acquisition requirements in the digital command work order; S32, dynamically generating a corresponding set of guide elements based on the parsed task requirements and the spatial position information, the set comprising at least one of a navigation arrow indicating the flight path, a virtual frame outlining the observation target, and a text description prompting the current task step; and S33, spatially aligning the set of guide elements with the video feed returned by the unmanned aerial vehicle in real time and rendering them as an overlay, to generate a task-adapted augmented reality guidance interface.
  6. The method of claim 1, wherein in step S4 the platform processes and analyzes the returned field data and automatically associates and binds the analysis results with the corresponding spatial positions in the digital command work order and the three-dimensional geographic information model; the processing and analysis comprise real-time analysis of images or videos by an AI model to automatically identify abnormal targets or measure key parameters; and the identification or measurement results obtained from the AI model analysis are automatically backfilled, as structured data, into the corresponding fields of the digital command work order.
  7. A system for implementing the emergency rescue method based on an unmanned aerial vehicle and an urban three-dimensional platform of any one of claims 1-6, comprising: an intelligent command subsystem deployed on the urban three-dimensional emergency command platform side; an intelligent operation subsystem deployed on the unmanned aerial vehicle and ground operation terminal side; and a communication and positioning guarantee subsystem for establishing a data transmission link between the intelligent command subsystem and the intelligent operation subsystem and for providing positioning services for the unmanned aerial vehicle; wherein the intelligent command subsystem is configured to generate and issue digital command work orders, and to receive and process field data returned by the intelligent operation subsystem to update the three-dimensional situation; and the intelligent operation subsystem is configured to receive and execute the digital command work order, guide field operations, and collect and return the field data.
  8. The system of claim 7, wherein the intelligent command subsystem comprises: a three-dimensional situation engine for hosting and rendering the three-dimensional geographic information model; a work order creation and management module, communicatively connected to the three-dimensional situation engine, for generating and managing digital command work orders according to disaster information; a task planning and distribution engine, connected to the work order creation and management module, for planning unmanned aerial vehicle tasks and issuing digital command work orders to the intelligent operation subsystem; and a data fusion and backfill module, connected to the three-dimensional situation engine and the communication and positioning guarantee subsystem, for performing fusion analysis on the returned field data and updating the situation information in the three-dimensional situation engine.
  9. The system of claim 7, wherein the intelligent operation subsystem comprises: a work order and task parsing module for receiving and parsing the digital command work order from the intelligent command subsystem; an augmented reality guidance generation and presentation module, connected to the work order and task parsing module, for generating an augmented reality guidance interface according to the parsed work order task parameters and presenting it on the interactive terminal; and a task automation execution engine, connected to the work order and task parsing module, for controlling the flight and data acquisition operations of the unmanned aerial vehicle according to the parsed work order execution logic.
  10. The system of claim 9, wherein the task automation execution engine is configured to control the unmanned aerial vehicle, according to instructions in the digital command work order, to automatically execute a sequence of flight and data acquisition actions including at least one of fixed-point hover, orbit flight, route scan, and multi-angle photography.
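The work-order structure and its compilation into a drone-executable task file (claims 1-4) can be illustrated with a minimal sketch. All class and field names here are hypothetical; the patent does not specify a concrete schema, and the trivial "one waypoint per geofence vertex" planner merely stands in for the real task planning and distribution engine:

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Waypoint:
    # WGS-84 position plus flight parameters, per claim 4
    lat: float
    lon: float
    altitude_m: float
    speed_mps: float
    hover_s: float
    # Device control instructions triggered at this waypoint
    device_actions: List[str] = field(default_factory=list)


@dataclass
class CommandWorkOrder:
    # Structured digital command work order, per claims 1-3
    task_type: str                        # e.g. "damage_survey"
    priority: int
    acquisition_requirements: List[str]   # e.g. ["visible_light_photo"]
    geofence: List[Tuple[float, float]]   # (lat, lon) boundary of target area


def compile_task_file(order: CommandWorkOrder) -> dict:
    """Compile a work order into a structured task file (claim 4).

    Placeholder logic: one photo waypoint per geofence vertex; a real
    planner would generate coverage paths and conditional jump logic.
    """
    waypoints = [
        Waypoint(lat, lon, altitude_m=80.0, speed_mps=5.0, hover_s=3.0,
                 device_actions=["capture_photo"])
        for lat, lon in order.geofence
    ]
    return {
        "coordinate_system": "WGS-84",
        "task_type": order.task_type,
        "priority": order.priority,
        "waypoints": [vars(w) for w in waypoints],
        # Simple sequential execution logic; claim 4 also allows conditional jumps
        "execution_logic": {"sequence": list(range(len(waypoints)))},
    }
```

For example, a work order whose geofence has three vertices compiles to a task file with three waypoints, each carrying the coordinate, altitude, speed and hover-time parameters named in claim 4.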

Description

Emergency rescue method and system based on unmanned aerial vehicle and urban three-dimensional platform

Technical Field

The application relates to the technical field of rescue systems, and in particular to an emergency rescue method and system based on an unmanned aerial vehicle and an urban three-dimensional platform.

Background

With the acceleration of urban digitization, the urban three-dimensional emergency command platform has become a core infrastructure of modern emergency rescue systems. By integrating multi-source information such as high-precision geographic information, building information models and Internet of Things data, such a platform constructs a digital twin of the urban physical space, provides commanders with visual, three-dimensional situation display and spatial analysis capabilities, and markedly improves macroscopic decision-making in disaster assessment and resource scheduling. However, existing platforms still fall short of a closed perception-decision loop: the acquisition of real-time, accurate information from the disaster site still relies heavily on traditional manual inspection or isolated video monitoring. Specifically, when an emergency occurs, the platform itself cannot directly and actively acquire structured field data. If the command center needs information such as building damage details, the diffusion range of dangerous goods, or the precise positions of trapped persons, it usually has to wait for on-site personnel to report or must call up fixed camera feeds, which suffer from information lag, limited viewing angles, and insufficient coverage. This mode of field information acquisition, relying on manual work and passive reception, has become a major factor limiting emergency response speed and decision accuracy.
Disclosure of Invention

The present application aims to solve, at least to some extent, one of the technical problems in the related art. An object of the present application is therefore to provide an emergency rescue method and system based on an unmanned aerial vehicle and an urban three-dimensional platform, in which the system digitizes rescue tasks into executable instructions, drives the unmanned aerial vehicle to accurately collect field information, and automatically backfills the processed data into the three-dimensional situation model, so that the emergency rescue command can make iterative decisions according to dynamically updated real-scene data, thereby systematically improving the overall efficiency of emergency response. To this end, an embodiment of a first aspect of the present application provides an emergency rescue method based on an unmanned aerial vehicle and an urban three-dimensional platform, comprising the following steps: S1, in an urban three-dimensional emergency command platform, generating a structured digital command work order according to disaster information, wherein the digital command work order comprises spatial position information of a task target; S2, the platform issuing the digital command work order to a designated unmanned aerial vehicle operation terminal; S3, the operation terminal generating guidance information according to the digital command work order so as to control the unmanned aerial vehicle to fly to the spatial position, and performing field operations matching the work order requirements; S4, the unmanned aerial vehicle returning the field data acquired during the operation to the platform; and S5, the platform updating the disaster situation according to the returned field data and generating a new rescue instruction based on the updated situation, thereby forming a task closed loop.
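The S1-S5 task closed loop can be sketched as a simple control loop. The `Platform` and `DroneTerminal` classes below are hypothetical stand-ins for the command platform and the operation terminal, with placeholder internals; only the loop structure reflects the method:

```python
class Platform:
    """Minimal stand-in for the urban 3-D emergency command platform (hypothetical)."""

    def __init__(self):
        self.rounds = 0

    def generate_work_order(self, situation):           # S1: structured work order
        return {"target": situation["target"], "round": self.rounds}

    def dispatch(self, order, terminal):                # S2: issue to the terminal
        terminal.inbox = order

    def update_situation(self, field_data):             # S5: refresh the 3-D situation
        self.rounds += 1
        # Placeholder: severity decreases as more field data comes in
        return {"target": field_data["target"], "severity": max(0, 2 - self.rounds)}

    def needs_follow_up(self, situation):
        return situation["severity"] > 0


class DroneTerminal:
    """Minimal stand-in for the UAV operation terminal (hypothetical)."""

    inbox = None

    def execute(self, order):                           # S3: guided flight + S4: data return
        return {"target": order["target"], "images": 10}


def run_rescue_loop(platform, terminal, disaster_info, max_rounds=5):
    """Run the S1-S5 loop until no follow-up is needed, forming the task closed loop."""
    situation = disaster_info
    for _ in range(max_rounds):
        order = platform.generate_work_order(situation)    # S1
        platform.dispatch(order, terminal)                 # S2
        field_data = terminal.execute(order)               # S3 + S4
        situation = platform.update_situation(field_data)  # S5
        if not platform.needs_follow_up(situation):
            break
    return situation
```

The key design point the loop illustrates is that each S5 situation update feeds the next S1 work order, so commands iterate on dynamically updated field data rather than on the initial disaster report.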
According to the emergency rescue method and system based on the unmanned aerial vehicle and the urban three-dimensional platform of the present application, the system digitizes rescue tasks into executable instructions, drives the unmanned aerial vehicle to accurately collect field information, and automatically backfills the processed data into the three-dimensional situation model, so that the rescue command can make iterative decisions according to dynamically updated real-scene data, systematically improving the overall efficiency of emergency response. In addition, the emergency rescue method based on the unmanned aerial vehicle and the urban three-dimensional platform provided by the application may also have the following additional technical features. In one embodiment of the application, generating the digital command work order specifically comprises designating a target area or target object on a three-dimensional geographic information model, based on a preset task template library, to generate the digital command work order comprising a task type, data acquisition requirements and a task priority. In one embodiment of the present application, in step S1, a graphical interaction instruction of a user and a three-di