CN-121994196-A - Space measurement method based on unmanned aerial vehicle real-time video
Abstract
The invention discloses a space measurement method based on real-time unmanned aerial vehicle video. The method comprises: acquiring unmanned aerial vehicle parameters in real time, including the pixel coordinates of a target point in the video image, the image size, the camera sensor size, the focal length, the zoom multiple, the attitude angle of the unmanned aerial vehicle camera, and the position and height of the unmanned aerial vehicle; defining a plurality of coordinate systems, including a pixel coordinate system, a camera coordinate system, a body North-East-Down (NED) coordinate system, an Earth-Centered Earth-Fixed (ECEF) coordinate system and a WGS84 geographic coordinate system; performing coordinate conversion calculations among the coordinate systems based on the unmanned aerial vehicle parameters; calculating the geographic coordinates of all target points in an identification area according to step 3; constructing a geometric object representing the identification area from the geographic coordinates of each target point; and calculating the area of the geometric object as the area of the identification area. The method is suitable for rapidly calculating target positions and target areas in real time in emergency scenes.
Inventors
- QIU YINGEN
- LIN SI
- ZHU SHITAO
Assignees
- 福州市勘测院有限公司 (Fuzhou Surveying and Mapping Institute Co., Ltd.)
Dates
- Publication Date
- 20260508
- Application Date
- 20251217
Claims (10)
- 1. A space measurement method based on unmanned aerial vehicle real-time video, characterized by comprising the following steps: Step 1, acquiring unmanned aerial vehicle parameters in real time, wherein the parameters comprise the pixel coordinates of a target point in the video image, the image size, the camera sensor size, the focal length, the zoom multiple, the attitude angle of the unmanned aerial vehicle camera, and the position and height of the unmanned aerial vehicle; Step 2, defining a plurality of coordinate systems, including a pixel coordinate system, a camera coordinate system, a body North-East-Down (NED) coordinate system, a body North-East-Up (NEU) coordinate system, an Earth-Centered Earth-Fixed (ECEF) coordinate system and a WGS84 geographic coordinate system; Step 3, performing coordinate conversion calculations among the coordinate systems based on the unmanned aerial vehicle parameters, which specifically comprises: Step 31, converting the pixel coordinates of the target point in the pixel coordinate system into camera coordinates in the camera coordinate system, based on the pixel coordinates, the image size, the camera sensor size, the focal length and the zoom multiple; Step 32, converting the camera coordinates into NED coordinates in the body NED coordinate system through a preset rotation matrix, based on the camera coordinates and the attitude angle of the unmanned aerial vehicle camera; Step 33, converting the NED coordinates into NEU coordinates in the body NEU coordinate system, based on the position and height of the unmanned aerial vehicle; Step 34, converting the NEU coordinates into ECEF coordinates in the ECEF coordinate system, based on the NEU coordinates and the position and height of the unmanned aerial vehicle; Step 35, converting the ECEF coordinates into geographic coordinates in the WGS84 geographic coordinate system; Step 4, calculating the geographic coordinates of all target points in the identification area according to Step 3; Step 5, constructing a geometric object representing the identification area from the geographic coordinates of each target point, and calculating the area of the geometric object as the area of the identification area.
- 2. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein the camera sensor size refers to the width and height of the effective imaging area of the photosensitive element used for camera imaging; the zoom multiple refers to the ratio of the current focal length of the camera lens to the reference focal length; the attitude angle comprises a yaw angle (0 degrees due north, 90 degrees due east), a pitch angle (0 degrees horizontally forward, 90 degrees vertically downward), and a roll angle (positive when rolling right and negative when rolling left relative to the heading of the airframe); the unmanned aerial vehicle position is the longitude and latitude of the unmanned aerial vehicle; and the height includes both the absolute altitude and the height of the unmanned aerial vehicle relative to the ground.
- 3. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein Step 2 specifically comprises: Step 21, establishing the pixel coordinate system: taking the upper-left corner pixel of the image plane captured by the unmanned aerial vehicle as the origin, defining the horizontal rightward direction as the u-axis and the vertical downward direction as the v-axis, with coordinate values in integer pixels, so as to calibrate the position of each pixel of a target point on the image plane; Step 22, establishing the camera coordinate system: taking the intersection of the optical axis of the unmanned aerial vehicle camera with the image plane as the origin, which lies at the center of the image plane; Step 23, establishing the body North-East-Down (NED) coordinate system: taking the position of the unmanned aerial vehicle as the origin, defining the direction pointing north as the X-axis, the direction pointing east as the Y-axis, and the direction pointing vertically toward the ground as the Z-axis; Step 24, establishing the body North-East-Up (NEU) coordinate system: taking the position of the unmanned aerial vehicle as the origin, defining the X1-axis pointing north, the Y1-axis pointing east, and the Z1-axis pointing vertically upward; Step 25, establishing the Earth-Centered Earth-Fixed (ECEF) coordinate system: taking the Earth's center of mass as the origin, defining the direction parallel to the Earth's rotation axis and pointing to the North Pole as the Z2-axis, the direction from the origin to the intersection of the prime meridian and the equator as the X2-axis, and the direction perpendicular to the prime meridian plane and pointing to the intersection of the 90-degrees-east meridian and the equator as the Y2-axis; Step 26, establishing the WGS84 geographic coordinate system, which takes the Earth's center of mass as the origin and expresses longitude, latitude and altitude relative to the WGS84 reference ellipsoid.
- 4. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein Step 31 specifically comprises: Step 311, determining the pixel coordinates (u0, v0) of the image center point in the pixel coordinate system, where the coordinate value u0 on the u-axis is half the image width W and the coordinate value v0 on the v-axis is half the image height H, i.e. u0 = W/2, v0 = H/2; Step 312, calculating the physical size of a unit pixel at the target point, where the width-direction component dx is the sensor width w divided by the image width W, and the height-direction component dy is the sensor height h divided by the image height H, i.e. dx = w/W, dy = h/H; Step 313, subtracting u0 from the u-axis value of the target point's pixel coordinates (u, v) and multiplying the difference by dx to obtain the x-axis coordinate in the camera coordinate system, subtracting v0 from the v-axis value and multiplying the difference by dy to obtain the y-axis coordinate, and multiplying the focal length f by the zoom multiple k to obtain the z-axis coordinate, i.e. x = (u - u0) · dx, y = (v - v0) · dy, z = f · k, wherein x, y and z are the camera coordinates in the camera coordinate system.
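The pixel-to-camera conversion of Step 31 can be sketched as follows. This is a minimal illustration, not the patented implementation; the function and parameter names (`img_w`, `sensor_w`, `zoom`, etc.) are illustrative assumptions, and sensor dimensions and focal length are assumed to share one physical unit (e.g. millimetres).

```python
def pixel_to_camera(u, v, img_w, img_h, sensor_w, sensor_h, f, zoom):
    """Convert pixel coordinates (u, v) to camera-frame coordinates (x, y, z).

    img_w/img_h: image size in pixels; sensor_w/sensor_h: physical size of
    the sensor's effective imaging area; f: reference focal length;
    zoom: zoom multiple (current focal length / reference focal length).
    """
    u0, v0 = img_w / 2.0, img_h / 2.0            # step 311: image center point
    dx, dy = sensor_w / img_w, sensor_h / img_h  # step 312: physical size per pixel
    x = (u - u0) * dx                            # step 313: offset from center,
    y = (v - v0) * dy                            # scaled to physical units
    z = f * zoom                                 # effective focal length
    return x, y, z
```

The image center pixel maps to (0, 0) on the sensor plane, and the z coordinate is simply the effective focal length, which is what the rotation in Step 32 then operates on.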
- 5. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein Step 32 specifically comprises: the rotation matrix R is an Euler-angle rotation matrix; the camera coordinates (x, y, z) in the camera coordinate system are converted into North-East-Down (NED) coordinates in the body NED coordinate system using the rotation matrix R, with the calculation formula (X, Y, Z)ᵀ = R · (x, y, z)ᵀ, wherein X, Y and Z are the coordinates in the body NED coordinate system; the construction of the rotation matrix R depends on the sine and cosine values of the attitude angles after conversion from degrees to radians, where ψ is the radian value corresponding to the yaw angle, θ is the radian value corresponding to the pitch angle, and φ is the radian value corresponding to the roll angle.
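A sketch of the Euler-angle rotation of Step 32. The claim specifies only that R is built from the sines and cosines of the radian-converted yaw, pitch and roll; the Z-Y-X (yaw-pitch-roll) composition order used below is a common aerospace convention and is an assumption of this sketch, as are all names.

```python
import math

def euler_rotation_matrix(yaw_deg, pitch_deg, roll_deg):
    """Z-Y-X Euler rotation matrix R = Rz(psi) @ Ry(theta) @ Rx(phi).

    Angles are given in degrees and converted to radians, matching the
    claim's degree-to-radian step. The composition order is an assumed
    convention, not taken from the patent text.
    """
    psi, theta, phi = (math.radians(a) for a in (yaw_deg, pitch_deg, roll_deg))
    cy, sy = math.cos(psi), math.sin(psi)      # yaw
    cp, sp = math.cos(theta), math.sin(theta)  # pitch
    cr, sr = math.cos(phi), math.sin(phi)      # roll
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def rotate(R, vec):
    """Apply R to a 3-vector: camera (x, y, z) -> body NED (X, Y, Z)."""
    return [sum(R[i][j] * vec[j] for j in range(3)) for i in range(3)]
```

With all angles zero, R is the identity; a 90-degree yaw rotates the first axis onto the second, which is a quick sanity check on the convention.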
- 6. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein Step 33 specifically comprises: Step 331, calculating, from the body NED coordinates, the direction vector (x_neu, y_neu, z_neu) pointing from the unmanned aerial vehicle position to the ground point corresponding to the target point in the image, expressed in the body North-East-Up (NEU) coordinate system, wherein x_neu is the component of the direction vector on the X-axis of the body NEU coordinate system, y_neu is the component on the Y-axis, and z_neu is the component on the Z-axis; Step 332, constructing a ray starting from the position of the unmanned aerial vehicle with direction vector (x_neu, y_neu, z_neu); Step 333, checking the direction vector: if z_neu is greater than or equal to 0, the camera lens is determined not to be oriented downward, and the calculation is not performed; Step 334, calculating the intersection parameter t of the ray with the ground plane from the height agl of the unmanned aerial vehicle relative to the ground, with the calculation formula t = -agl / z_neu; Step 335, calculating the NEU coordinates (X1, Y1, Z1) of the ground point in the body NEU coordinate system from the intersection parameter t: X1 = t · x_neu, Y1 = t · y_neu, Z1 = t · z_neu, wherein X1 is the north coordinate of the target point in the body NEU coordinate system, Y1 is the east coordinate, and Z1 is the vertical coordinate.
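The ray-to-ground intersection of Step 33 can be sketched in a few lines. This is an illustrative sketch under the flat-ground assumption implied by the claim (the ground is a horizontal plane `agl` metres below the drone, with the NEU z-axis positive upward); the function name is assumed.

```python
def intersect_ground(d_neu, agl):
    """Intersect a ray from the drone with the ground plane.

    d_neu: direction vector (x_neu, y_neu, z_neu) in the body NEU frame
    (z positive up); agl: height of the drone above ground. Returns the
    ground point's NEU coordinates (X1, Y1, Z1), or None when the ray does
    not point downward (z_neu >= 0), mirroring the claim's validity check.
    """
    x, y, z = d_neu
    if z >= 0:
        return None       # step 333: camera not looking down, no intersection
    t = -agl / z          # step 334: ray parameter where altitude drops by agl
    return (t * x, t * y, t * z)   # step 335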
- 7. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein in Step 34 the North-East-Up (NEU) coordinates are converted into Earth-Centered Earth-Fixed (ECEF) coordinates in the ECEF coordinate system as follows: using a geographic coordinate transfer function, the NEU coordinates (X1, Y1, Z1) and the latitude, longitude and absolute altitude of the unmanned aerial vehicle position are taken as input, and the ECEF coordinates (X2, Y2, Z2) are produced as output.
- 8. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein in Step 35 the Earth-Centered Earth-Fixed (ECEF) coordinates are converted into geographic coordinates in the WGS84 geographic coordinate system as follows: the ECEF coordinates (X2, Y2, Z2) are converted into longitude L, latitude B and altitude H in the WGS84 geographic coordinate system using a coordinate conversion library, wherein L, B and H represent the actual geographic position of the target point on the Earth.
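Steps 34 and 35 (claims 7 and 8) rely on a "geographic coordinate transfer function" and a "coordinate conversion library" without naming them. The sketch below writes both conversions out explicitly using standard WGS84 geodesy: a local north/east/up offset anchored at the drone's geodetic position is rotated into ECEF, and ECEF is converted back to longitude, latitude and height by a common fixed-point iteration. All function names are illustrative; a production system would typically delegate this to a library such as PROJ.

```python
import math

A = 6378137.0                 # WGS84 semi-major axis (m)
F = 1 / 298.257223563         # WGS84 flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """WGS84 geodetic (deg, deg, m) -> ECEF (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def neu_to_ecef(neu, lat_deg, lon_deg, h):
    """Step 34 sketch: body NEU offset (north, east, up) in metres -> ECEF,
    anchored at the drone's latitude, longitude and absolute altitude."""
    nth, e, u = neu
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x0, y0, z0 = geodetic_to_ecef(lat_deg, lon_deg, h)
    sl, cl = math.sin(lat), math.cos(lat)
    so, co = math.sin(lon), math.cos(lon)
    # Standard local-tangent-plane to ECEF rotation
    x = x0 - so * e - sl * co * nth + cl * co * u
    y = y0 + co * e - sl * so * nth + cl * so * u
    z = z0 + cl * nth + sl * u
    return x, y, z

def ecef_to_geodetic(x, y, z, iterations=10):
    """Step 35 sketch: ECEF (X2, Y2, Z2) -> WGS84 (L, B, H) by iteration."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))                # initial guess
    for _ in range(iterations):
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    return math.degrees(lon), math.degrees(lat), h
```

A round trip (geodetic to ECEF and back) recovers the input to sub-millimetre accuracy, and a pure-north NEU offset at the equator translates directly into the ECEF Z-axis, which are the two natural sanity checks on this pair of functions.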
- 9. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein the number of target points in the identification area in Step 4 is at least three, and all target points are assumed to lie at the same elevation when the area is calculated.
- 10. The space measurement method based on unmanned aerial vehicle real-time video according to claim 1, wherein Step 5 specifically comprises: Step 51, according to the order of the target points in the identification area, concatenating the longitude value and latitude value of each target point in turn to form a coordinate sequence text in which longitude and latitude are separated by a space, placing the coordinate sequence text inside the geometry description prefix and parentheses representing the polygon type, and making the first and last coordinates of the sequence identical, thereby forming a WKT (Well-Known Text) string; Step 52, loading the WKT string into a geometric object using the WKT parsing function of a geometric calculation library; and Step 53, calling the area calculation method of the geometric object to obtain the area value of the identification area represented by the geometric object.
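Steps 51 through 53 can be sketched without naming a specific geometry library. The WKT construction below follows the claim directly (space-separated longitude/latitude pairs, POLYGON prefix, closed ring); the shoelace formula stands in for the library's area method, which is an assumption of this sketch. Note that applying it to raw longitude/latitude yields an area in square degrees, so a metric projection would be needed for an area in square metres.

```python
def points_to_wkt(points):
    """Step 51 sketch: a sequence of (lon, lat) pairs -> POLYGON WKT string.

    The ring is closed by repeating the first vertex, as the claim requires.
    """
    ring = list(points)
    if ring[0] != ring[-1]:
        ring.append(ring[0])
    coords = ", ".join(f"{lon} {lat}" for lon, lat in ring)
    return f"POLYGON(({coords}))"

def shoelace_area(points):
    """Planar polygon area via the shoelace formula, standing in for the
    geometric object's area method of steps 52-53 (assumed substitute)."""
    ring = list(points)
    if ring[0] != ring[-1]:
        ring.append(ring[0])
    s = sum(x1 * y2 - x2 * y1
            for (x1, y1), (x2, y2) in zip(ring, ring[1:]))
    return abs(s) / 2.0
```

For example, the unit square [(0, 0), (1, 0), (1, 1), (0, 1)] produces the WKT string "POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))" and a planar area of 1.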
Description
Space measurement method based on unmanned aerial vehicle real-time video

Technical Field

The invention relates to the technical field of unmanned aerial vehicle image processing, and in particular to a space measurement method based on unmanned aerial vehicle real-time video.

Background

With the development of unmanned aerial vehicle technology, unmanned aerial vehicles are widely applied in fields such as aerial photography, surveying and mapping, and AI inspection. In these applications it is often necessary to convert target point positions into actual geographic coordinates, calculate the size of an identified area, and so on. Because the unmanned aerial vehicle is affected during shooting by factors such as attitude changes (pitch, roll and yaw), camera focal length, sensor size and terrain undulation, the conversion process involves complex coordinate transformations; the prior art is mainly aimed at post-hoc image calculation and lacks a method and system for rapidly calculating the video target area of an unmanned aerial vehicle in real time.

Disclosure of Invention

Therefore, the invention aims to provide a space measurement method based on unmanned aerial vehicle real-time video, which uses unmanned aerial vehicle parameters and real-time pose information to calculate the geographic coordinates of target points and the area of the target figure, and is suitable for rapidly calculating target positions and areas in real time in emergency scenes.
In order to achieve the above technical purpose, the invention adopts the following technical scheme. The invention provides a space measurement method based on unmanned aerial vehicle real-time video, comprising the following steps: Step 1, acquiring unmanned aerial vehicle parameters in real time, wherein the parameters comprise the pixel coordinates of a target point in the video image, the image size, the camera sensor size, the focal length, the zoom multiple, the attitude angle of the unmanned aerial vehicle camera, and the position and height of the unmanned aerial vehicle; Step 2, defining a plurality of coordinate systems, including a pixel coordinate system, a camera coordinate system, a body North-East-Down (NED) coordinate system, a body North-East-Up (NEU) coordinate system, an Earth-Centered Earth-Fixed (ECEF) coordinate system and a WGS84 geographic coordinate system; Step 3, performing coordinate conversion calculations among the coordinate systems based on the unmanned aerial vehicle parameters, which specifically comprises: Step 31, converting the pixel coordinates of the target point in the pixel coordinate system into camera coordinates in the camera coordinate system, based on the pixel coordinates, the image size, the camera sensor size, the focal length and the zoom multiple; Step 32, converting the camera coordinates into NED coordinates in the body NED coordinate system through a preset rotation matrix, based on the camera coordinates and the attitude angle of the unmanned aerial vehicle camera; Step 33, converting the NED coordinates into NEU coordinates in the body NEU coordinate system, based on the position and height of the unmanned aerial vehicle; Step 34, converting the NEU coordinates into ECEF coordinates in the ECEF coordinate system, based on the NEU coordinates and the position and height of the unmanned aerial vehicle; Step 35, converting the ECEF coordinates into geographic coordinates in the WGS84 geographic coordinate system; Step 4, calculating the geographic coordinates of all target points in the identification area according to Step 3; Step 5, constructing a geometric object representing the identification area from the geographic coordinates of each target point, and calculating the area of the geometric object as the area of the identification area. Further, the camera sensor size refers to the width and height of the effective imaging area of the photosensitive element used for camera imaging; the zoom multiple refers to the ratio of the current focal length of the camera lens to the reference focal length; the attitude angle comprises a yaw angle (0 degrees due north, 90 degrees due east), a pitch angle (0 degrees horizontally forward, 90 degrees vertically downward), and a roll angle (positive when rolling right and negative when rolling left relative to the heading of the airframe); the unmanned aerial vehicle position is the longitude and latitude of the unmanned aerial vehicle; and the height includes both the absolute altitude and the height of the unmanned aerial vehicle relative to the ground. Further, Step 2 specifically includes: Step 21, establishing a pixel coordinate system, namely defining a horizontal rightward direction as a u-axis, a ve