
CN-121995763-A - Low-cost unmanned aerial vehicle terminal guidance solving method based on monocular camera

CN 121995763 A

Abstract

The low-cost unmanned aerial vehicle (UAV) terminal guidance solving method based on a monocular camera comprises the following steps: normalizing the camera model and pixel coordinates, modeling the relative attitude kinematics and dynamics of the UAV terminal guidance task, and designing a control law for the UAV terminal guidance task. To address the problem of torque saturation constraints, the invention introduces a target-pointing coordinate system to define the attitude control objective and designs an anti-saturation controller that is robust to small disturbance torques. The controller ensures that the z axis of the UAV continuously points at the moving target, solves the target azimuth angle through monocular vision, and generates control commands, achieving efficient approach while respecting the torque constraint.
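The monocular azimuth solving described above (back-project a pixel through the pinhole model, normalize, rotate into the body frame, then take yaw and pitch with the four-quadrant arctangent) can be sketched in a few lines of numpy. This is a minimal illustration, not the patent's implementation: the intrinsics `K`, the camera-to-body rotation `R_BC`, and all function names are assumptions, and the pixel is taken to be already distortion-corrected.

```python
import numpy as np

# Hypothetical intrinsics and camera mounting; the patent's actual
# numerical values are not recoverable from this record.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
# Forward-looking camera: camera z (optical axis) -> body x (nose),
# camera x (u, right) -> body y, camera y (v, down) -> body z.
R_BC = np.array([[0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0]])

def pixel_to_body_direction(u, v, K, R_bc):
    """Back-project an undistorted pixel through the pinhole model,
    normalize it to a unit ray, and rotate it into the body frame."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    e_cam = ray_cam / np.linalg.norm(ray_cam)
    return R_bc @ e_cam

def direction_to_angles(e_body):
    """Target yaw and pitch from a body-frame unit vector,
    with x forward (nose), y right, z down."""
    ex, ey, ez = e_body
    yaw = np.arctan2(ey, ex)                   # four-quadrant arctangent
    pitch = np.arctan2(-ez, np.hypot(ex, ey))  # positive = target above the nose
    return yaw, pitch
```

With this mounting, a target imaged at the principal point lies straight ahead of the nose and yields zero yaw and zero pitch.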

Inventors

  • LI PEIRAN
  • ZHANG QINGQI
  • LUO YANBO
  • WANG MENG
  • GAO JINGLONG
  • KONG XU
  • PANG CHUNLEI
  • WU HAO
  • ZHANG LIANG
  • LI RUI
  • ZHAO CHENHAO
  • MO YIFEI
  • GU WENKUN
  • JIANG YAO

Assignees

  • Air Force Engineering University of the Chinese People's Liberation Army (中国人民解放军空军工程大学)

Dates

Publication Date
2026-05-08
Application Date
2026-02-20

Claims (1)

  1. The low-cost unmanned aerial vehicle terminal guidance solving method based on a monocular camera is characterized by comprising the following steps:

Step one, solving a direction error vector from the monocular camera observation (monocular visual resolution):

(1) Normalizing the camera model and the pixel coordinates. The monocular camera adopts the standard pinhole model: the image origin is at the upper-left corner, the u axis points right, and the v axis points down. (u, v) is the raw pixel observed by the camera; because of lens distortion, distortion correction must be applied so that the coordinates become ideal pixel coordinates conforming to the pinhole model, where the correction is a nonlinear undistortion mapping of the raw pixel coordinates based on the camera intrinsic matrix and the distortion parameters. Denoting the distortion-corrected pixel coordinates again by (u, v), the camera intrinsic matrix K is expressed as

K = [f_x 0 c_x; 0 f_y c_y; 0 0 1] (1)

where f_x denotes the focal length of the sensor in the horizontal direction, f_y denotes the focal length in the vertical direction, and (c_x, c_y) denotes the camera principal point. K determines how the camera projects 3D rays onto the 2D pixel plane. Defining the ray directed at the target, up to a positive real scale factor, its components along the camera x, y, and z axes are obtained from K as

r = K^(-1) [u, v, 1]^T (2)

Normalizing yields the unit direction vector of the target in the camera coordinate system:

e_c = r / ||r|| (3)

(2) Converting from the camera coordinate system to the body coordinate system. Let R_bc be the rotation matrix from the camera coordinate system to the body coordinate system; the target direction in the body coordinate system is then

e_b = R_bc e_c (4)

where (e_x, e_y, e_z) denote the components of e_b along the three body coordinate axes.

(3) Solving the body-frame direction vector of the target azimuth. Assuming that the body coordinate system satisfies the convention that the x axis points toward the nose, the y axis points to the right side of the body, and the z axis points down, the target yaw angle is calculated as

psi_t = atan2(e_y, e_x) (5)

where atan2 denotes the four-quadrant arctangent function, and the target pitch angle is expressed as

theta_t = atan2(-e_z, sqrt(e_x^2 + e_y^2)) (6)

(4) Converting the target yaw angle and target pitch angle obtained by monocular vision into a direction error vector in the body coordinate system. The direction vector constructed from the yaw and pitch angles in the body coordinate system is

d = [cos(theta_t) cos(psi_t), cos(theta_t) sin(psi_t), -sin(theta_t)]^T (7)

In this frame, the unit vector of the nose-forward direction is

x_b = [1, 0, 0]^T (8)

and the direction error vector is defined by formula (9). This error vector is fed into the anti-saturation controller to generate the corresponding control commands, which are sent through the flight controller to the actuators of the unmanned aerial vehicle, driving it to adjust its attitude and flight direction.

Step two, modeling the relative attitude kinematics and dynamics of the unmanned aerial vehicle terminal guidance task.

In the current body coordinate system of the tracking unmanned aerial vehicle, define the difference of attitude angular velocities of the current body coordinate system relative to the target body coordinate system, projected in the current body coordinate system; define the angular velocity of the current body coordinate system relative to the inertial coordinate system, projected in the current body coordinate system; and define the target attitude angular velocity of the tracking unmanned aerial vehicle, projected in its target body coordinate system, so that formula (10) holds, where the rotation matrix from the target coordinate system to the current coordinate system takes the value given in formula (11), in which the attitude error is represented by the modified Rodrigues parameters (MRPs), I denotes the 3x3 identity matrix, and formulas (12) and (13) hold, with the overdot denoting the first derivative. This yields formula (14), where the skew-symmetric matrix corresponding to the angular velocity error (w_ex, w_ey, w_ez) is

[w_e]x = [0 -w_ez w_ey; w_ez 0 -w_ex; -w_ey w_ex 0] (15)

with w_ex, w_ey, w_ez denoting the roll-rate, pitch-rate, and yaw-rate errors, respectively. The relative attitude kinematics represented by the MRPs is given by formula (16), where the coefficient matrix related to the modified Rodrigues parameters is given by formula (17).

Define the inertia matrix of the tracking unmanned aerial vehicle as a nominal component plus an uncertainty term, let the control torque be provided by the actuators, and let the external disturbance torque arise from the space environment; formula (18) is then obtained, from which formula (19) follows, where the superscript -1 denotes the matrix inverse and the uncertainty terms are collected into one lumped term. The relative attitude dynamics equation is therefore obtained as formula (20), with the lumped term given by formula (21).

It is necessary to analyze the coupling between the target attitude and the relative position to give a specific formula for the desired angular acceleration. The measurement sensors mounted on the tracking unmanned aerial vehicle provide the unit vectors of the relevant body axes and the unit vector from the center of gravity of the tracking unmanned aerial vehicle to the center of gravity of the target unmanned aerial vehicle; the measurement sensor is kept perpendicular to the sun rays, which are expressed in the inertial frame by a real-valued direction vector. The unit vectors corresponding to the three axes of the target coordinate system are expressed in the inertial frame by formula (22), whose terms represent the x, y, and z axes of the target coordinate system in the inertial frame in unit-vector form; the rotation matrix from the inertial coordinate system to the current body coordinate system is given by formula (23), in which the attitude of the unmanned aerial vehicle is represented by the modified Rodrigues parameters, and the rotation matrix from the desired attitude of the tracking unmanned aerial vehicle to the inertial frame is given by formula (24). The conversion between the target-pointing coordinate system and the body coordinate system is characterized by the rotation matrix from the target attitude to the current body attitude in formula (25).

Define the target attitude angular velocity vector, whose components are the target roll rate, target pitch rate, and target yaw rate; its cross-product (skew-symmetric) matrix is given by formula (26). Thus formula (27) holds; considering formula (28), formula (29) follows, and the relative attitude kinematics and dynamics model of the unmanned aerial vehicle terminal guidance task is obtained as formula (30).

Define a first state variable and a second state variable; the attitude tracking error dynamic model of the unmanned aerial vehicle terminal guidance task is expressed as formula (31), with a first process function and first, second, and third process parameters. Select the sliding-mode surface defined in formula (32), where the first coefficient is a positive real number.

Step three, designing the control law of the unmanned aerial vehicle terminal guidance task.

Assumption 1: for the control system described by formula (31), a feasible control input exists, i.e., attitude consistency between the target unmanned aerial vehicle and the tracking unmanned aerial vehicle can be achieved. Assumption 2: the maximum speed of the tracking unmanned aerial vehicle is higher than the maximum speed of the target unmanned aerial vehicle. Assumption 1 implies that, when actuator saturation is taken into account, the difference between the actual control input of the actuator and the commanded control input is bounded, i.e., there exists an unknown positive real number bounding this difference.

The control law is designed as formula (33), in which the ideal control torque, the actual control torque under actuator saturation, and their difference appear, together with a diagonal matrix whose diagonal elements are all positive, a second coefficient that is a positive real number, and an adaptive compensation term. The actual attitude control torque under actuator saturation is given by formula (34), where the constant represents the maximum control torque that a single axis of the unmanned aerial vehicle can output, and the adaptive compensation law satisfies formula (35). If the second and third coefficients satisfy the stated conditions together with the fourth, fifth, and sixth coefficients, then the sliding variable and the tracking errors asymptotically converge to a neighborhood of the origin.
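The MRP kinematics of step two and the saturated sliding-mode law of step three can be illustrated with a short numpy sketch. This is an interpretation under stated assumptions, not the patent's controller: the equation symbols were lost in extraction, so the gains `c1`, `k2`, the bound estimate `delta_hat`, and the torque limit `tau_max` are hypothetical stand-ins, and the reaching law is a generic sliding-mode form.

```python
import numpy as np

def skew(w):
    """Skew-symmetric cross-product matrix, as in Eq. (15)."""
    return np.array([[ 0.0, -w[2],  w[1]],
                     [ w[2],  0.0, -w[0]],
                     [-w[1],  w[0],  0.0]])

def mrp_kinematics(sigma, omega):
    """Standard MRP attitude kinematics sigma_dot = G(sigma) @ omega
    (the role of Eqs. 16-17): G = 1/4 [(1 - s.s) I + 2 [s]x + 2 s s^T]."""
    s2 = float(sigma @ sigma)
    G = 0.25 * ((1.0 - s2) * np.eye(3) + 2.0 * skew(sigma)
                + 2.0 * np.outer(sigma, sigma))
    return G @ omega

def control_step(sigma_e, omega_e, c1, k2, delta_hat, tau_max):
    """One step of a generic anti-saturation sliding-mode law in the
    spirit of Eqs. (32)-(34): sliding surface s = omega_e + c1 * sigma_e,
    a reaching term with gain k2, an adaptive bound estimate delta_hat,
    and a per-axis clamp to the maximum single-axis torque tau_max."""
    s = omega_e + c1 * sigma_e                          # sliding surface
    tau_ideal = -k2 * s - delta_hat * np.sign(s)        # commanded torque
    tau_actual = np.clip(tau_ideal, -tau_max, tau_max)  # actuator saturation
    return tau_actual, tau_ideal - tau_actual           # torque and saturation gap
```

The returned saturation gap is exactly the bounded command/actuator mismatch that Assumption 1 relies on, and it is the quantity the adaptive compensation law of Eq. (35) would act upon.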

Description

Low-cost unmanned aerial vehicle terminal guidance solving method based on monocular camera

Technical Field

The invention relates to the technical field of unmanned aerial vehicles, and in particular to a low-cost unmanned aerial vehicle terminal guidance scheme based on a monocular camera.

Background

During a drone mission, the terminal guidance phase is critical for accurate arrival at the target. Traditional unmanned aerial vehicle terminal guidance schemes often rely on complex sensor systems, such as multi-camera rigs or lidar, which are costly and increase the overall manufacturing cost of the unmanned aerial vehicle. A monocular camera cannot obtain target depth information, but it is inexpensive and convenient to use.

Disclosure of Invention

Aiming at the defects of the prior art, the invention provides a low-cost unmanned aerial vehicle terminal guidance solving method based on a monocular camera, which comprises the following steps:

Step one, solving a direction error vector from the monocular camera observation (monocular visual resolution):

(1) Normalizing the camera model and the pixel coordinates. The monocular camera adopts the standard pinhole model: the image origin is at the upper-left corner, the u axis points right, and the v axis points down. (u, v) is the raw pixel observed by the camera; because of lens distortion, distortion correction must be applied so that the coordinates become ideal pixel coordinates conforming to the pinhole model, where the correction is a nonlinear undistortion mapping of the raw pixel coordinates based on the camera intrinsic matrix and the distortion parameters. Denoting the distortion-corrected pixel coordinates again by (u, v), the camera intrinsic matrix K is expressed as

K = [f_x 0 c_x; 0 f_y c_y; 0 0 1] (1)

where f_x denotes the focal length of the sensor in the horizontal direction, f_y denotes the focal length in the vertical direction, and (c_x, c_y) denotes the camera principal point. K determines how the camera projects 3D rays onto the 2D pixel plane. Defining the ray directed at the target, up to a positive real scale factor, its components along the camera x, y, and z axes are obtained from K as

r = K^(-1) [u, v, 1]^T (2)

Normalizing yields the unit direction vector of the target in the camera coordinate system:

e_c = r / ||r|| (3)

(2) Converting from the camera coordinate system to the body coordinate system. Let R_bc be the rotation matrix from the camera coordinate system to the body coordinate system; the target direction in the body coordinate system is then

e_b = R_bc e_c (4)

where (e_x, e_y, e_z) denote the components of e_b along the three body coordinate axes.

(3) Solving the body-frame direction vector of the target azimuth. Assuming that the body coordinate system satisfies the convention that the x axis points toward the nose, the y axis points to the right side of the body, and the z axis points down, the target yaw angle is calculated as

psi_t = atan2(e_y, e_x) (5)

where atan2 denotes the four-quadrant arctangent function, and the target pitch angle is expressed as

theta_t = atan2(-e_z, sqrt(e_x^2 + e_y^2)) (6)

(4) Converting the target yaw angle and target pitch angle obtained by monocular vision into a direction error vector in the body coordinate system. The direction vector constructed from the yaw and pitch angles in the body coordinate system is

d = [cos(theta_t) cos(psi_t), cos(theta_t) sin(psi_t), -sin(theta_t)]^T (7)

In this frame, the unit vector of the nose-forward direction is

x_b = [1, 0, 0]^T (8)

and the direction error vector is defined by formula (9). This error vector is fed into the anti-saturation controller to generate the corresponding control commands, which are sent through the flight controller to the actuators of the unmanned aerial vehicle, driving it to adjust its attitude and flight direction.

Step two, modeling the relative attitude kinematics and dynamics of the unmanned aerial vehicle terminal guidance task.

In the current body coordinate system of the tracking unmanned aerial vehicle, define the difference of attitude angular velocities of the current body coordinate system relative to the target body coordinate system, projected in the current body coordinate system; define the angular velocity of the current body coordinate system relative to the inertial coordinate system, projected in the current body coordinate system; and define the target attitude angular velocity of the tracking unmanned aerial vehicle, projected in its target body coordinate system, so that formula (10) holds, where the rotation matrix from the target coordinate system to the current coordinate system takes the value given in formula (11), in which the attitude error is represented by the modified Rodrigues parameters (MRPs) correspon