CN-116700334-B - Collaborative processing method and device for multiple unmanned aerial vehicles
Abstract
Embodiments of this specification provide a collaborative processing method and device for multiple unmanned aerial vehicles. The collaborative processing method comprises: calculating the geographic position coordinates of a target position point based on measurement data of an image containing the target position point; performing a coding search, according to an index table, on a processing area determined from the geographic position coordinates to obtain an attribute code set; determining the spatial code and grid image of the processing grid of the target position point in the image; reading the unmanned aerial vehicle identifiers in each attribute code of the attribute code set, and determining the corresponding collaborative unmanned aerial vehicles based on those identifiers; and sending collaborative processing data containing the spatial code, the grid image and the corresponding attribute codes to the collaborative unmanned aerial vehicles to carry out collaborative processing of the target position point.
Inventors
- Tong Xiaochong
- Guo Congzhou
- Lei Yi
- Qiu Chunping
- Sun Yuekun
- An Zige
- Tang Jiayi
- Lei Yaxian
- Song Haoshuai
- Li He
Assignees
- Information Engineering University of the PLA Strategic Support Force (中国人民解放军战略支援部队信息工程大学)
Dates
- Publication Date: 20260505
- Application Date: 20230621
Claims (19)
- 1. A collaborative processing method for multiple unmanned aerial vehicles, applied to a requesting unmanned aerial vehicle in an unmanned aerial vehicle swarm, the method comprising: calculating geographic position coordinates of a target position point based on measurement data of an image containing the target position point; performing a coding search, according to an index table, on a processing area determined from the geographic position coordinates to obtain an attribute code set, and determining the spatial code and grid image of the processing grid of the target position point in the image; reading the unmanned aerial vehicle identifiers in each attribute code of the attribute code set, and determining the corresponding collaborative unmanned aerial vehicles based on those identifiers; and transmitting collaborative processing data comprising the spatial code, the grid image and the corresponding attribute codes to the collaborative unmanned aerial vehicles so as to perform collaborative processing of the target position point. The method further comprises determining an image gridding level giving a suitable image coverage area according to the size of the flight area, the camera parameters of the unmanned aerial vehicle and the performance of the onboard computer, the image gridding level being calculated as L = l + d, where L represents the image gridding level, S represents the image area, S_l represents the grid area when the grid is at level l (l being the level at which S_l matches S), and d represents a preset depth; the image area refers to the area of the image acquired by the unmanned aerial vehicle, and the grid area refers to the area of a grid at the corresponding level, calculated from the scale of the preset-level grid.
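The level selection in claim 1 can be sketched as follows. The quadtree relationship between levels (each level quartering the cell area) and all function and parameter names are assumptions for illustration, not taken from the patent.

```python
import math

def image_gridding_level(image_area: float, level0_grid_area: float, depth: int) -> int:
    """Hypothetical sketch of L = l + d: pick the level l whose grid-cell
    area best matches the image area, then descend a preset depth d.
    Assumes a quadtree-style grid where each level quarters the cell area."""
    # at level l a cell covers level0_grid_area / 4**l, so solve for l
    l = max(0, round(math.log(level0_grid_area / image_area, 4)))
    return l + depth

# e.g. level-0 cells of 1,048,576 m^2 and an image covering 4,096 m^2 at depth 2
level = image_gridding_level(4_096.0, 1_048_576.0, 2)  # -> 6
```

Rounding to the nearest level keeps the processing grid close to the image footprint before the depth offset is applied.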
- 2. The multi-unmanned-aerial-vehicle collaborative processing method of claim 1, further comprising: collecting a geographic image, and performing spatial coding and attribute coding on the geographic image to obtain a corner spatial code, a range spatial code and an image attribute code; updating the index table based on the range spatial code and the image attribute code, and compressing the corner spatial code and the image attribute code to obtain a compressed code; and sending the compressed code to the candidate unmanned aerial vehicles other than the requesting unmanned aerial vehicle in the swarm, so as to update the index table of each candidate unmanned aerial vehicle.
- 3. The collaborative processing method of multiple unmanned aerial vehicles according to claim 2, the spatial coding process comprising: calculating the corner position coordinates of the corners in the geographic image; calculating the coding index of each corner based on its corner position coordinates and the spatial coding level; and calculating the corner spatial code of each corner according to the coding index and the spatial coding level.
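One common way to realize the corner coding index of claim 3, used here purely as an illustrative assumption, is a Morton (Z-order) index that quantizes the corner's longitude/latitude at the given spatial coding level and interleaves the bits:

```python
def corner_code(lon: float, lat: float, level: int) -> int:
    """Morton (Z-order) index of a corner point at a given spatial coding
    level: quantize lon/lat into 2**level columns/rows, interleave the bits."""
    col = int((lon + 180.0) / 360.0 * (1 << level))
    row = int((lat + 90.0) / 180.0 * (1 << level))
    col = min(col, (1 << level) - 1)  # clamp points on the +180/+90 edge
    row = min(row, (1 << level) - 1)
    code = 0
    for i in range(level):
        code |= ((col >> i) & 1) << (2 * i)      # even bits: column
        code |= ((row >> i) & 1) << (2 * i + 1)  # odd bits: row
    return code
```

Any space-filling-curve index with the same locality property would serve equally well; the patent does not name a specific curve.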
- 4. The collaborative processing method of multiple unmanned aerial vehicles according to claim 3, the spatial coding process further comprising: gridding the geographic image based on the grids of the image gridding level to obtain at least one single-scale grid; and aggregating the at least one single-scale grid to obtain at least one multi-scale grid, taking the spatial code of each grid in the at least one multi-scale grid as the range spatial code.
- 5. The collaborative processing method for multiple unmanned aerial vehicles according to claim 4, wherein the spatial codes of the grids are obtained as follows: calculating the spatial code of the maximum level, and calculating the spatial code of each level based on the spatial code of the maximum level; and determining the grid level of each grid, and reading the spatial code of each grid based on its grid level.
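With a quadrant-digit code such as the Morton index, deriving each level's code from the maximum-level code, as claim 5 describes, reduces to dropping trailing 2-bit digits. This prefix-truncation reading is an assumption for illustration:

```python
def code_at_level(max_code: int, max_level: int, level: int) -> int:
    """Derive a coarser grid's code from the maximum-level code by dropping
    the trailing bit pairs (each level contributes one 2-bit quadrant digit)."""
    assert 0 <= level <= max_level
    return max_code >> (2 * (max_level - level))

# the level-2 ancestor of max-level (3) cell 0b110110 is 0b1101
parent = code_at_level(0b110110, 3, 2)  # -> 0b1101
```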
- 6. The multi-unmanned-aerial-vehicle collaborative processing method according to claim 2, wherein the attribute coding process comprises: reading the attribute data in the metadata of the geographic image; and performing attribute coding on the geographic image based on the attribute data, the image gridding level and a preset coding algorithm to obtain the attribute code of the geographic image; the attribute data comprising at least one of image acquisition time, image frame number, unmanned aerial vehicle number and image type.
- 7. The collaborative processing method of multiple unmanned aerial vehicles according to claim 2, wherein updating the index table based on the range spatial code and the image attribute code comprises: querying whether the primary keys of the index table contain the range spatial code; if so, mapping the image attribute code into the key values of the primary key equal to the range spatial code; if not, adding to the index table an entry that takes the range spatial code as primary key and the image attribute code as key value.
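Claim 7's index-table update is an upsert into a key-value index whose primary key is the range spatial code. A minimal sketch, using a dictionary of sets with assumed names:

```python
def update_index(index: dict, range_code: int, attr_code: str) -> None:
    """If the primary key (range spatial code) already exists, map the image
    attribute code into its key values; otherwise create a new index entry."""
    index.setdefault(range_code, set()).add(attr_code)

index = {}
update_index(index, 0b1101, "uav03|t001|frame42")
update_index(index, 0b1101, "uav07|t002|frame11")  # same key: mapping is appended
```

The attribute-code strings above are hypothetical placeholders; the patent only says they encode fields such as acquisition time, frame number and drone number.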
- 8. The collaborative processing method of multiple unmanned aerial vehicles according to claim 2, wherein any candidate unmanned aerial vehicle other than the requesting unmanned aerial vehicle performs the following operations: receiving the compressed code sent by the requesting unmanned aerial vehicle and decompressing it to obtain the corner spatial code and the image attribute code; resolving the corner spatial code to obtain the range spatial code; and updating the index table of that candidate unmanned aerial vehicle based on the range spatial code and the image attribute code.
- 9. The collaborative processing method of multiple unmanned aerial vehicles according to claim 1, wherein performing the coding search, according to the index table, on the processing area determined from the geographic position coordinates to obtain the attribute code set comprises: taking the geographic position coordinates as the region center and dividing the region with a preset length as the division length to obtain the processing area; and performing the coding search on the processing area according to the index table to obtain the attribute code set.
- 10. The collaborative processing method of multiple unmanned aerial vehicles according to claim 9, wherein performing the coding search on the processing area according to the index table to obtain the attribute code set comprises: performing spatial coding on the processing region to obtain the region spatial code of the processing region, and calculating the coding interval of each grid spatial code in the region spatial code according to a preset interval algorithm; reading from the index table the primary keys lying in the coding intervals, and reading the key values mapped by those primary keys; and screening the key values, and constructing the attribute code set based on the screened key values.
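Claim 10's interval algorithm has a natural reading under the Morton-code assumption used above: all maximum-level descendants of one grid cell share its code as a bit prefix, so each cell covers one contiguous primary-key interval. A sketch, all names assumed:

```python
def code_interval(code: int, level: int, max_level: int):
    """Max-level code interval covered by one grid cell: its descendants share
    the cell's bit prefix and therefore form a single contiguous range."""
    shift = 2 * (max_level - level)
    return code << shift, ((code + 1) << shift) - 1

def retrieve(index: dict, region_codes, level: int, max_level: int):
    """Scan index primary keys (stored at max level) against each cell's
    coding interval and collect the mapped attribute codes."""
    hits = set()
    for c in region_codes:
        lo, hi = code_interval(c, level, max_level)
        for key, attrs in index.items():
            if lo <= key <= hi:
                hits |= attrs
    return hits
```

A sorted key structure would allow each interval to be answered with two binary searches instead of a full scan; the linear scan here is only for clarity.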
- 11. The collaborative processing method of multiple unmanned aerial vehicles according to claim 1, wherein determining the spatial code and grid image of the processing grid of the target position point in the image comprises: determining the spatial code of the processing grid containing the target position point in the image; and, based on the spatial code, clipping the image portion corresponding to the processing grid from the image as the grid image.
- 12. The collaborative processing method of multiple unmanned aerial vehicles according to claim 1, the collaborative processing comprising: receiving the collaborative processing data sent by the requesting unmanned aerial vehicle; reading a target image based on the attribute codes contained in the collaborative processing data, and clipping the target image based on the spatial code contained in the collaborative processing data to obtain a target grid image; matching the grid image and the target grid image to obtain the image position point coordinates of the target position point in the target grid image; and calculating a collaborative processing result for the target position point based on the image position point coordinates and sending the result to the requesting unmanned aerial vehicle.
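The matching algorithm in claim 12 is unspecified; as a stand-in, a brute-force sum-of-squared-differences template match locates the grid image inside the target grid image. Representing images as 2-D lists of pixel values, the SSD criterion, and all names are assumptions:

```python
def match_position(grid_img, target_img):
    """Slide the (smaller) grid image over the target grid image and return
    the (row, col) placement minimizing the sum of squared differences."""
    gh, gw = len(grid_img), len(grid_img[0])
    th, tw = len(target_img), len(target_img[0])
    best, best_rc = None, (0, 0)
    for r in range(th - gh + 1):
        for c in range(tw - gw + 1):
            ssd = sum((target_img[r + i][c + j] - grid_img[i][j]) ** 2
                      for i in range(gh) for j in range(gw))
            if best is None or ssd < best:
                best, best_rc = ssd, (r, c)
    return best_rc
```

An onboard implementation would more likely use a robust feature-based matcher, since the two grid images come from different viewpoints and illumination; raw SSD is shown only to make the pipeline concrete.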
- 13. The collaborative processing method of multiple unmanned aerial vehicles according to claim 1, wherein after the step of sending, to the collaborative unmanned aerial vehicles, the collaborative processing data comprising the spatial code, the grid image and the corresponding attribute codes so as to perform the collaborative processing of the target position point, the method further comprises: receiving the collaborative position coordinates returned after the collaborative unmanned aerial vehicles perform the collaborative processing; and calculating the positioning position coordinates of the target position point based on the collaborative position coordinates and the geographic position coordinates.
- 14. The multi-unmanned-aerial-vehicle collaborative processing method of claim 1, further comprising: rendering an initial image coverage situation map according to historical images and a global discrete grid system, wherein the image grids in the initial image coverage situation map are in a first display state; reading at least one range spatial code in the index table, and updating the display state of the image grids corresponding to the at least one range spatial code in the initial image coverage situation map to a second display state to obtain an intermediate image coverage situation map; and reading the attribute codes mapped by each of the at least one range spatial code in the index table, and displaying the geographic images associated with those attribute codes in association with the corresponding image grids to obtain the image coverage situation map.
- 15. A collaborative processing method for multiple unmanned aerial vehicles, applied to a collaborative unmanned aerial vehicle in an unmanned aerial vehicle swarm, the method comprising: receiving collaborative processing data sent by a requesting unmanned aerial vehicle in the swarm, wherein the collaborative processing data comprises the spatial code and grid image of the processing grid of a target position point in an image containing the target position point, and attribute codes; reading a target image based on the attribute codes, and clipping the target image based on the spatial code to obtain a target grid image; matching the grid image and the target grid image to obtain the image position point coordinates of the target position point in the target grid image; and calculating a collaborative processing result for the target position point based on the image position point coordinates and sending the result to the requesting unmanned aerial vehicle. The method further comprises the requesting unmanned aerial vehicle determining an image gridding level giving a suitable image coverage range according to the size of the flight area, the camera parameters of the unmanned aerial vehicle and the performance of the onboard computer, the image gridding level being calculated as L = l + d, where L represents the image gridding level, S represents the image area, S_l represents the grid area when the grid is at level l (l being the level at which S_l matches S), and d represents a preset depth; the image area refers to the area of the image acquired by the unmanned aerial vehicle, and the grid area refers to the area of a grid at the corresponding level, calculated from the scale of the preset-level grid.
- 16. A collaborative processing apparatus for multiple unmanned aerial vehicles, being a requesting unmanned aerial vehicle operating in an unmanned aerial vehicle swarm, comprising: a coordinate calculation module configured to calculate geographic position coordinates of a target position point based on measurement data of an image containing the target position point; a coding search module configured to perform a coding search, according to an index table, on the processing area determined from the geographic position coordinates, obtain an attribute code set, and determine the spatial code and grid image of the processing grid of the target position point in the image; an unmanned aerial vehicle determining module configured to read the unmanned aerial vehicle identifiers in each attribute code of the attribute code set and determine the corresponding collaborative unmanned aerial vehicles based on those identifiers; and a collaborative processing data transmitting module configured to transmit collaborative processing data comprising the spatial code, the grid image and the corresponding attribute codes to the collaborative unmanned aerial vehicles, so as to perform collaborative processing of the target position point. The coding search module is further configured to determine an image gridding level giving a suitable image coverage area according to the size of the flight area, the camera parameters of the unmanned aerial vehicle and the performance of the onboard computer, the image gridding level being calculated as L = l + d, where L represents the image gridding level, S represents the image area, S_l represents the grid area when the grid is at level l (l being the level at which S_l matches S), and d represents a preset depth; the image area refers to the area of the image acquired by the unmanned aerial vehicle, and the grid area refers to the area of a grid at the corresponding level, calculated from the scale of the preset-level grid.
- 17. A collaborative processing apparatus for multiple unmanned aerial vehicles, being a collaborative unmanned aerial vehicle operating in an unmanned aerial vehicle swarm, comprising: a collaborative processing data receiving module configured to receive collaborative processing data sent by a requesting unmanned aerial vehicle in the swarm, wherein the collaborative processing data comprises the spatial code and grid image of the processing grid of a target position point in an image containing the target position point, and attribute codes; an image clipping module configured to read a target image based on the attribute codes and clip the target image based on the spatial code to obtain a target grid image; a matching processing module configured to match the grid image and the target grid image to obtain the image position point coordinates of the target position point in the target grid image; and a collaborative processing result calculation module configured to calculate a collaborative processing result for the target position point based on the image position point coordinates and send the result to the requesting unmanned aerial vehicle. The requesting unmanned aerial vehicle determines an image gridding level giving a suitable image coverage area according to the size of the flight area, the camera parameters of the unmanned aerial vehicle and the performance of the onboard computer, the image gridding level being calculated as L = l + d, where L represents the image gridding level, S represents the image area, S_l represents the grid area when the grid is at level l (l being the level at which S_l matches S), and d represents a preset depth; the image area refers to the area of the image acquired by the unmanned aerial vehicle, and the grid area refers to the area of a grid at the corresponding level, calculated from the scale of the preset-level grid.
- 18. A collaborative processing apparatus for multiple unmanned aerial vehicles, comprising: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to implement the collaborative processing method of multiple unmanned aerial vehicles of any one of claims 1 to 15.
- 19. A computer-readable storage medium storing computer-executable instructions that, when executed by a processor, implement the collaborative processing method of multiple unmanned aerial vehicles of any one of claims 1 to 15.
Description
Collaborative processing method and device for multiple unmanned aerial vehicles

Technical Field

This document relates to the technical field of unmanned aerial vehicles, and in particular to a collaborative processing method and device for multiple unmanned aerial vehicles.

Background

With the improvement of the computing capacity of edge computing devices, the capability of onboard data processing and analysis has improved significantly and has become a research hotspot in fields such as onboard real-time remote sensing observation and onboard computer vision; how to realize collaborative management of onboard data has accordingly become a challenge for researchers.

Disclosure of Invention

One or more embodiments of the present disclosure provide a collaborative processing method for multiple unmanned aerial vehicles. The method is applied to a requesting unmanned aerial vehicle in an unmanned aerial vehicle swarm and comprises calculating the geographic position coordinates of a target position point based on measurement data of an image containing the target position point; performing a coding search, according to an index table, on the processing area determined from the geographic position coordinates to obtain an attribute code set, and determining the spatial code and grid image of the processing grid of the target position point in the image; reading the unmanned aerial vehicle identifiers in each attribute code of the attribute code set, and determining the corresponding collaborative unmanned aerial vehicles based on those identifiers; and sending collaborative processing data comprising the spatial code, the grid image and the corresponding attribute codes to the collaborative unmanned aerial vehicles so as to perform collaborative processing of the target position point.
One or more embodiments of the present disclosure provide another collaborative processing method for multiple unmanned aerial vehicles, applied to a collaborative unmanned aerial vehicle in an unmanned aerial vehicle swarm, which comprises receiving collaborative processing data sent by a requesting unmanned aerial vehicle in the swarm, where the collaborative processing data comprises the spatial code and grid image of the processing grid of a target position point in an image containing the target position point, and attribute codes; reading a target image based on the attribute codes, and clipping the target image based on the spatial code to obtain a target grid image; matching the grid image and the target grid image to obtain the image position point coordinates of the target position point in the target grid image; and calculating a collaborative processing result for the target position point based on the image position point coordinates and sending the result to the requesting unmanned aerial vehicle. One or more embodiments of the present disclosure provide a collaborative processing apparatus for multiple unmanned aerial vehicles, being a requesting unmanned aerial vehicle operating in an unmanned aerial vehicle swarm, which comprises a coordinate calculation module configured to calculate the geographic position coordinates of a target position point based on measurement data of an image containing the target position point; and a coding search module configured to perform a coding search, according to an index table, on the processing area determined from the geographic position coordinates, obtain an attribute code set, and determine the spatial code and grid image of the processing grid of the target position point in the image.
The apparatus further comprises an unmanned aerial vehicle determining module configured to read the unmanned aerial vehicle identifiers in each attribute code of the attribute code set and determine the corresponding collaborative unmanned aerial vehicles based on those identifiers; and a collaborative processing data transmitting module configured to transmit collaborative processing data containing the spatial code, the grid image and the corresponding attribute codes to the collaborative unmanned aerial vehicles so as to perform collaborative processing of the target position point. One or more embodiments of the present disclosure provide another collaborative processing device for multiple unmanned aerial vehicles, being a collaborative unmanned aerial vehicle operating in an unmanned aerial vehicle swarm, which includes a collaborative processing data receiving module configured to receive collaborative processing data sent by a requesting unmanned aerial vehicle in the swarm, where the collaborative processing data comprises the spatial code and grid image of the processing grid of a target position point in an image