CN-121414794-B - Method and system for tracking and controlling ground vehicle

CN121414794B

Abstract

The application relates to a method and a system for tracking and controlling a ground vehicle. The method comprises: dividing each frame of the captured footage into a region of interest (ROI) and a background area, dividing the background area into a plurality of grids, and assigning corresponding feature points to the ROI and to the grids; computing the optical-flow residuals of the ROI feature points and determining the real-time position of the tracked vehicle in the image from those residuals; performing occlusion prediction on the state of the tracked vehicle to obtain its latest state; computing the real-time longitude and latitude of the tracked vehicle from the latest state and latest position; and issuing waypoints to the unmanned aerial vehicle (UAV) based on that longitude and latitude so as to track the vehicle in real time. The method and system can overcome the interference of UAV vibration with image recognition and can quickly recover tracking after the tracked vehicle is temporarily occluded.
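As an illustration only (this sketch is not part of the patent text): the occlusion prediction described in the abstract can be realized with a Kalman filter that keeps predicting the vehicle's image position while measurements are missing. The constant-velocity motion model, the noise values, and all names below are assumptions of this sketch, not the patent's specified implementation.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal constant-velocity Kalman filter over image coordinates.

    State: [x, y, vx, vy]. During occlusion only predict() runs, so the
    estimate coasts along the last estimated velocity until the vehicle
    reappears and update() can resume.
    """

    def __init__(self, x, y, dt=1.0):
        self.x = np.array([x, y, 0.0, 0.0])              # state estimate
        self.P = np.eye(4) * 10.0                        # state covariance
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)   # motion model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)   # we measure (x, y)
        self.Q = np.eye(4) * 0.01                        # process noise
        self.R = np.eye(2) * 1.0                         # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, zx, zy):
        z = np.array([zx, zy])
        innov = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # Kalman gain
        self.x = self.x + K @ innov
        self.P = (np.eye(4) - K @ self.H) @ self.P

kf = ConstantVelocityKF(100.0, 50.0)
# Vehicle moves ~2 px/frame in x; feed a few measurements, then occlude.
for t in range(1, 6):
    kf.predict()
    kf.update(100.0 + 2.0 * t, 50.0)
for _ in range(3):                                       # occlusion: predict only
    pred = kf.predict()
print(pred)  # x keeps increasing at roughly the learned velocity
```

During the occluded frames the filter's predicted position continues along the estimated velocity, which is what allows tracking to resume quickly once the vehicle reappears.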

Inventors

  • Nian Muxin
  • Liao Jingyang
  • Deng Yonghong
  • Lou Diming
  • Chen Zhiwei
  • Xu Feng

Assignees

  • 南昌智能新能源汽车研究院 (Nanchang Intelligent New Energy Vehicle Research Institute)

Dates

Publication Date
2026-05-12
Application Date
2025-12-30

Claims (8)

  1. A method of tracking and controlling a ground vehicle, the method comprising: dividing each frame of the captured footage into a region of interest (ROI) and a background area, dividing the background area into a plurality of grids, and assigning corresponding feature points to the ROI and to the grids, wherein the ROI is the area in which the tracked vehicle is located; computing optical-flow residuals of the feature points of the ROI by an optical-flow method, and determining the real-time position of the tracked vehicle in the image based on the optical-flow residuals; and performing occlusion prediction on the state of the tracked vehicle based on a Kalman filter to obtain the latest state of the tracked vehicle, computing the real-time longitude and latitude of the tracked vehicle based on the latest state and the real-time position, and issuing a waypoint to an unmanned aerial vehicle (UAV) based on the real-time longitude and latitude of the tracked vehicle so as to track it in real time; wherein the step of computing the optical-flow residuals of the feature points of the ROI by an optical-flow method comprises: computing an optical-flow vector by a pyramid Lucas-Kanade (LK) optical-flow method from the feature points in the previous frame and the current frame; acquiring the attitude angles of the UAV in real time through an IMU, linearly interpolating the angles to obtain an Euler-angle vector aligned with the timestamp of the current frame, constructing a rotation matrix from a preset rotation order and the Euler-angle vector, and applying rotation compensation to the feature points based on the rotation matrix to obtain the latest positions of the feature points in the current frame; and computing the offset caused by the rotation of the UAV from the latest positions of the feature points in the current frame and their positions in the previous frame, computing the expected residual of the feature points under the normal flight motion of the UAV, and computing the optical-flow residual of the optical-flow vector from the offset and the expected residual; and wherein the step of assigning corresponding feature points to the ROI and the grids comprises: constructing an adaptive FAST threshold based on the ROI and the flight altitude, extracting a plurality of feature points from the captured footage using the FAST threshold, and assigning corresponding feature points to the ROI based on the ratio of the area of the ROI to the image area of the captured footage; and computing the actual distance from the center point of each grid to the center point of the ROI, computing an importance value for each grid based on the gradient magnitude of the grid and the actual distance, and assigning corresponding feature points to the grids based on the importance values.
  2. The method according to claim 1, wherein the division range of the ROI is calculated as follows: W_ROI = k · w_v · N_w / (2·H·tan(θ_h/2)); L_ROI = k · l_v · N_h / (2·H·tan(θ_v/2)); where W_ROI represents the width of the ROI, L_ROI represents the length of the ROI, w_v represents the actual width of the tracked vehicle, N_w represents the number of pixels across the width of the camera image, H represents the preset altitude at which the drone flies, θ_h represents the horizontal field of view of the camera, θ_v represents the vertical field of view of the camera, l_v represents the actual length of the tracked vehicle, N_h represents the number of pixels across the height of the camera image, and k represents a safety factor.
  3. The method of tracking control of a ground vehicle according to claim 1, characterized in that the method further comprises: computing the x-direction gradient and the y-direction gradient of each grid with a Sobel operator, and computing the gradient magnitude of the corresponding grid from the x- and y-direction gradients.
  4. A method of tracking control of a ground vehicle according to claim 3, characterized in that the importance value is calculated as follows: G = √(G_x² + G_y²); d = √((x_g − x_c)² + (y_g − y_c)²); I = α·G + β·d; where I represents the importance value, d represents the actual distance from the center point of the grid to the center point of the ROI, G represents the gradient magnitude of the grid, G_x represents the gradient of the grid in the x direction, G_y represents the gradient of the grid in the y direction, α and β represent preset weights, (x_g, y_g) represent the coordinates of the center point of the grid, and (x_c, y_c) represent the coordinates of the center point of the ROI.
  5. The method of tracking control of a ground vehicle according to claim 1, characterized in that the step of assigning corresponding feature points to the ROI and to the grids comprises: computing the target number of feature points for each grid based on its importance value, sorting the feature points by their response values, and selecting the feature points of each grid based on the sorting result and the target number, wherein the response value of a feature point is the smallest absolute gray-level difference between the feature-point pixel and the pixels on its neighborhood circle; the target number of feature points in each grid is calculated as follows: n_i = N_bg · I_i / Σ_{j∈S} I_j; where n_i represents the target number of feature points in grid i, N_bg represents the total number of feature points in the background area, I_i represents the importance value matched to grid i, and S represents the set of grids.
  6. The method of claim 1, wherein the step of determining the real-time location of the tracked vehicle on the image based on the optical-flow residuals comprises: comparing each optical-flow residual with a dynamic threshold; when the residual is greater than the dynamic threshold, determining that the corresponding feature point lies on the tracked vehicle and that the tracked vehicle lies within the ROI, wherein the dynamic threshold is determined from the UAV's velocity vector and preset flight altitude; and when the residual is smaller than the dynamic threshold, determining that the feature point does not lie on the tracked vehicle and lies in the background area.
  7. The method for tracking and controlling a ground vehicle according to claim 1, wherein the real-time longitude and latitude of the tracked vehicle are expressed as follows: lon_t = lon_u + Δlon; lat_t = lat_u + Δlat; where lon_t represents the longitude of the tracked vehicle, Δlon represents the longitude offset relative to the UAV position, obtained in turn from the local-plane approximation formula and the radian-to-degree conversion, lon_u represents the longitude of the UAV position, lat_t represents the latitude of the tracked vehicle, Δlat represents the latitude offset relative to the UAV position, obtained in turn from the local-plane approximation formula and the radian-to-degree conversion, and lat_u represents the latitude of the UAV position.
  8. A ground vehicle tracking control system for implementing the ground vehicle tracking control method according to any one of claims 1 to 7, characterized in that the system comprises: a dividing module for dividing each frame of the captured footage into a region of interest (ROI) and a background area, dividing the background area into a plurality of grids, and assigning corresponding feature points to the ROI and the grids, wherein the ROI is the area in which the tracked vehicle is located; a first calculation module for computing optical-flow residuals of the feature points of the ROI by an optical-flow method, and determining the real-time position of the tracked vehicle in the image based on the optical-flow residuals; and a real-time tracking module for performing occlusion prediction on the state of the tracked vehicle based on a Kalman filter to obtain the latest state of the tracked vehicle, computing the real-time longitude and latitude of the tracked vehicle based on the latest state and the real-time position, and issuing a waypoint to the UAV based on the real-time longitude and latitude of the tracked vehicle so as to track it in real time.
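As an illustrative sketch only (not part of the claims): the rotation-compensation step of claim 1 can be approximated by mapping previous-frame feature points through the homography induced by a pure camera rotation, then scoring each point by how far its measured position deviates from that rotation-only prediction. The ZYX rotation order, the pinhole intrinsics, and all names below are assumptions of this sketch.

```python
import numpy as np

def euler_to_R(roll, pitch, yaw):
    """Rotation matrix from Euler angles in ZYX (yaw-pitch-roll) order.
    The rotation order is an assumption of this sketch."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def rotation_compensated_points(pts, R, K):
    """For a pure camera rotation, pixels map through H = K @ R @ K^-1;
    this predicts where each previous-frame feature would land if only
    the UAV had rotated (no independent scene motion)."""
    H = K @ R @ np.linalg.inv(K)
    homog = np.c_[pts, np.ones(len(pts))] @ H.T
    return homog[:, :2] / homog[:, 2:3]

def optical_flow_residuals(prev_pts, curr_pts, R, K):
    """Residual = measured position minus the rotation-explained position.
    Large residuals indicate independent motion (e.g. the tracked vehicle)."""
    expected = rotation_compensated_points(prev_pts, R, K)
    return np.linalg.norm(curr_pts - expected, axis=1)

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = euler_to_R(0.0, 0.0, np.deg2rad(1.0))        # small rotation between frames
prev = np.array([[300.0, 200.0], [400.0, 260.0]])
# First point only rotated with the camera; second also moved 10 px (vehicle).
curr = rotation_compensated_points(prev, R, K)
curr[1] += np.array([10.0, 0.0])
res = optical_flow_residuals(prev, curr, R, K)
print(res)   # background point ~0, vehicle point ~10
```

In a real pipeline the measured `curr` points would come from pyramid LK tracking rather than being synthesized, and the IMU Euler angles would be linearly interpolated to the image timestamp before building `R`.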

Description

Method and system for tracking and controlling a ground vehicle

Technical Field

The invention relates to the technical field of unmanned aerial vehicle control, and in particular to a method and system for tracking and controlling ground vehicles.

Background

Existing approaches have several shortcomings. The target is easily lost when the vehicle moves rapidly (for example, when it turns), and algorithms that process full-pixel optical flow struggle to run in real time on embedded platforms, so they cannot meet the requirement of tracking a vehicle in real time. Accurate detection can be achieved by combining multiple sensors such as a visible-light camera, a thermal infrared imager, and a radar, which improves detection precision; however, such multi-sensor schemes significantly increase the power consumption of the UAV, conflicting with long-duration tracking missions, and fusing the multi-sensor data consumes substantial computing resources, further limiting deployability on resource-constrained platforms. Prior-art schemes thus have three key defects: first, the UAV's own vibration in flight (caused by motor rotation, airflow disturbance, and so on) severely interferes with image stability and target-recognition accuracy; second, when recognizing random target motion (such as sudden lane changes and sharp turns), conventional methods lack dynamic modeling and prediction of the target's motion characteristics, which can cause the UAV to lag behind the target or even lose it entirely; third, conventional methods fail to balance performance against power consumption, sacrificing either real-time performance or endurance.
Disclosure of Invention

In view of the above, the invention aims to provide a method and system for tracking and controlling a ground vehicle so as to remedy the defects of the prior art. To achieve this object, the invention provides a method for tracking and controlling a ground vehicle, the method comprising: dividing each frame of the captured footage into a region of interest (ROI) and a background area, dividing the background area into a plurality of grids, and assigning corresponding feature points to the ROI and the grids, wherein the ROI is the area in which the tracked vehicle is located; computing optical-flow residuals of the feature points of the ROI by an optical-flow method, and determining the real-time position of the tracked vehicle in the image based on the optical-flow residuals; and performing occlusion prediction on the state of the tracked vehicle based on a Kalman filter to obtain the latest state of the tracked vehicle, computing the real-time longitude and latitude of the tracked vehicle based on the latest state and the real-time position, and issuing a waypoint to the UAV based on the real-time longitude and latitude of the tracked vehicle so as to track it in real time.

The method thus divides each frame of the captured footage into the ROI and the background area, divides the background area into a plurality of grids, and assigns corresponding feature points to the ROI and the grids; it then computes the optical-flow residuals of the ROI feature points by an optical-flow method and determines the tracked vehicle's real-time image position from those residuals; it then performs occlusion prediction on the vehicle's state with a Kalman filter to obtain the vehicle's latest state, computes the vehicle's real-time longitude and latitude from the latest state and real-time position, and issues waypoints to the UAV accordingly to carry out real-time tracking.

Further, the division range of the ROI is calculated as follows: W_ROI = k · w_v · N_w / (2·H·tan(θ_h/2)); L_ROI = k · l_v · N_h / (2·H·tan(θ_v/2)); where W_ROI represents the width of the ROI, L_ROI represents the length of the ROI, w_v represents the actual width of the tracked vehicle, N_w represents the number of pixels across the width of the camera image, H represents the preset altitude at which the drone flies, θ_h represents the horizontal field of view of the camera, θ_v represents the vertical field of view of the camera, l_v represents the actual length of the tracked vehicle, N_h represents the number of pixels across the height of the camera image, and k represents a safety factor. Further, each of the grids is matched with a corresponding importance value, and the me
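As an illustration only (not part of the patent text): the ROI sizing described above can be sketched with pinhole ground-footprint geometry. At altitude H with field of view θ, the camera footprint is 2·H·tan(θ/2) metres across, so a vehicle dimension in metres converts to pixels via N / (2·H·tan(θ/2)). The function name, the example parameter values, and the safety-factor value are assumptions of this sketch.

```python
import math

def roi_size_px(w_v, l_v, H, fov_h, fov_v, N_w, N_h, k=1.5):
    """ROI width/length in pixels from ground-footprint geometry.

    The ground footprint at altitude H is 2*H*tan(fov/2) metres across,
    imaged onto N pixels, giving a scale of N / (2*H*tan(fov/2)) px/m.
    k is a safety factor that enlarges the box (assumed value here).
    """
    px_per_m_x = N_w / (2.0 * H * math.tan(fov_h / 2.0))
    px_per_m_y = N_h / (2.0 * H * math.tan(fov_v / 2.0))
    return k * w_v * px_per_m_x, k * l_v * px_per_m_y

# Example: a 1.8 m x 4.5 m car, 50 m altitude, 1280x720 image, 70/50 deg FOV.
w_px, l_px = roi_size_px(1.8, 4.5, 50.0,
                         math.radians(70), math.radians(50),
                         1280, 720, k=1.5)
print(round(w_px, 1), round(l_px, 1))   # roughly 49.4 104.2
```

The ROI shrinks as the UAV climbs (larger H means fewer pixels per metre), which is why the claims tie both the ROI division and the adaptive FAST threshold to the preset flight altitude.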