
CN-122015824-A - Navigation control method and device of robot and robot

CN122015824A

Abstract

Embodiments of the application provide a navigation control method and device for a robot, and the robot itself. The control method comprises: embedding the spatial extrinsic parameters and the time offset as state variables in a real-time optimization framework; performing, within the real-time optimization framework, a time-synchronization evaluation on the visual image stream data and the measurement data to obtain an evaluation result for the time offset; performing, within the same framework, an extrinsic-parameter consistency evaluation on the visual image stream data and the measurement data to obtain an evaluation result for the spatial extrinsic parameters; and determining a confidence score for the spatio-temporal parameters from the evaluation result for the time offset and the evaluation result for the spatial extrinsic parameters. Because the confidence score for the spatio-temporal parameters is determined within the real-time optimization framework, the calibration process of the online self-calibration technique can be evaluated comprehensively: alignment quality can be assessed without access to a motion-capture system or a laser map, thereby enabling online monitoring of the robot's navigation state.

Inventors

  • Cui Haohan
  • Wang Chuan
  • Bian Chunhua
  • Zhang Zihao
  • Qin Qiang
  • Shi Guangdie
  • Wang Chunbo

Assignees

  • 无锡北微传感科技有限公司 (Wuxi BeWis Sensing Technology Co., Ltd.)

Dates

Publication Date
2026-05-12
Application Date
2026-02-03
Priority Date
2025-12-22

Claims (10)

  1. A navigation control method for a robot, comprising: acquiring visual image stream data collected by a camera mounted on the robot and measurement data of an inertial measurement unit; embedding the spatial extrinsic parameters and the time offset as state variables in a real-time optimization framework; performing, within the real-time optimization framework, a time-synchronization evaluation on the visual image stream data and the measurement data to obtain an evaluation result for the time offset; performing, within the real-time optimization framework, an extrinsic-parameter consistency evaluation on the visual image stream data and the measurement data to obtain an evaluation result for the spatial extrinsic parameters; determining a confidence score for the spatio-temporal parameters from the evaluation result for the time offset and the evaluation result for the spatial extrinsic parameters; and comparing the confidence score for the spatio-temporal parameters with a target threshold, and triggering a robot navigation-state correction instruction according to the comparison result, so as to control the robot's navigation.
  2. The method according to claim 1, wherein acquiring the visual image stream data of the camera and the measurement data of the inertial measurement unit comprises: acquiring the visual image stream data of the camera and the measurement data of the inertial measurement unit through online subscription or by offline import of a preset configuration file; and dynamically updating, based on a sliding-window buffer module and according to a preset time, the visual image stream data and the measurement data acquired at the current moment.
  3. The method according to claim 1, wherein the real-time optimization framework includes a sliding-window optimization framework, and performing the time-synchronization evaluation according to the visual image stream data and the measurement data based on the real-time optimization framework to obtain the evaluation result for the time offset includes: defining a time-offset residual function in the sliding-window optimization framework; generating a pixel trajectory from the visual image stream data; generating a pose trajectory of the inertial measurement unit by integrating the angular velocity in the measurement data; iteratively solving the time-offset residual function according to the pixel trajectory and the pose trajectory of the inertial measurement unit, and determining the optimized time offset; and determining the evaluation result for the time offset from the posterior covariance of the optimized time offset.
  4. The method according to claim 1, wherein the real-time optimization framework includes an expectation-maximization framework, and performing the time-synchronization evaluation according to the visual image stream data and the measurement data based on the real-time optimization framework to obtain the evaluation result for the time offset includes: defining a probabilistic generative model of the time offset within the expectation-maximization framework; generating a pixel trajectory from the visual image stream data; generating a pose trajectory of the inertial measurement unit by integrating the angular velocity in the measurement data; iteratively solving the probabilistic generative model of the time offset according to the pixel trajectory and the pose trajectory of the inertial measurement unit, and determining the optimized time offset; and determining the evaluation result for the time offset from the posterior covariance of the optimized time offset.
  5. The method according to claim 1, wherein the real-time optimization framework includes a Kalman filter framework, and performing the time-synchronization evaluation according to the visual image stream data and the measurement data based on the real-time optimization framework to obtain the evaluation result for the time offset includes: defining a nonlinear clock-error model of the time offset in the Kalman filter framework; constructing, from the nonlinear clock-error model of the time offset, a time-offset state observation equation that takes the time-offset residual as the measurement; generating a pixel trajectory from the visual image stream data; generating a pose trajectory of the inertial measurement unit by integrating the angular velocity in the measurement data; iteratively solving the time-offset state observation equation according to the pixel trajectory and the pose trajectory of the inertial measurement unit, and determining the optimized time offset; and determining the evaluation result for the time offset from the posterior covariance of the optimized time offset.
  6. The navigation control method of a robot according to claim 1, wherein the real-time optimization framework includes a sliding-window optimization framework, and performing, based on the real-time optimization framework, the extrinsic-parameter consistency evaluation according to the visual image stream data and the measurement data to obtain the evaluation result for the spatial extrinsic parameters includes: defining a spatial-extrinsic-parameter residual function in the sliding-window optimization framework; generating a pixel trajectory from the visual image stream data; generating a pose trajectory of the inertial measurement unit by integrating the angular velocity in the measurement data; iteratively solving the spatial-extrinsic-parameter residual function according to the pixel trajectory and the pose trajectory of the inertial measurement unit, and determining the optimized spatial extrinsic parameters; and determining the evaluation result for the spatial extrinsic parameters from the posterior covariance of the optimized spatial extrinsic parameters.
  7. The navigation control method of a robot according to claim 1, wherein the real-time optimization framework includes an expectation-maximization framework, and performing, based on the real-time optimization framework, the extrinsic-parameter consistency evaluation according to the visual image stream data and the measurement data to obtain the evaluation result for the spatial extrinsic parameters includes: defining a probabilistic generative model of the spatial extrinsic parameters within the expectation-maximization framework; generating a pixel trajectory from the visual image stream data; generating a pose trajectory of the inertial measurement unit by integrating the angular velocity in the measurement data; iteratively solving the probabilistic generative model of the spatial extrinsic parameters according to the pixel trajectory and the pose trajectory of the inertial measurement unit, and determining the optimized spatial extrinsic parameters; and determining the evaluation result for the spatial extrinsic parameters from the posterior covariance of the optimized spatial extrinsic parameters.
  8. The navigation control method of a robot according to any one of claims 1 to 7, further comprising: providing a visualization interface configured to generate formatted reports; and, based on a data-update signal provided by the robot, displaying on the visualization interface the visual image stream data and the measurement data updated in real time, the evaluation result for the time offset, the evaluation result for the spatial extrinsic parameters, the confidence score for the spatio-temporal parameters, and the navigation state of the robot.
  9. A navigation control device for a robot, comprising: a data acquisition module configured to acquire visual image stream data collected by a camera mounted on the robot and measurement data of an inertial measurement unit; an online calibration module configured to embed the spatial extrinsic parameters and the time offset as state variables in a real-time optimization framework, perform a time-synchronization evaluation on the visual image stream data and the measurement data within the real-time optimization framework to obtain an evaluation result for the time offset, and perform an extrinsic-parameter consistency evaluation on the visual image stream data and the measurement data within the real-time optimization framework to obtain an evaluation result for the spatial extrinsic parameters; a comprehensive evaluation module configured to determine a confidence score for the spatio-temporal parameters from the evaluation result for the time offset and the evaluation result for the spatial extrinsic parameters; and a signal triggering module configured to compare the confidence score for the spatio-temporal parameters with a target threshold and trigger a robot navigation-state correction instruction according to the comparison result, so as to control the robot's navigation.
  10. A robot comprising the navigation control device according to claim 9.
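The time-synchronization evaluation in claims 3 to 5 rests on one residual: after shifting the camera timestamps by a candidate offset, the angular-rate track recovered from the pixel trajectory should coincide with the gyroscope track integrated from the IMU. The sketch below is not from the patent; the test signal, sampling rates, and the coarse grid search are illustrative stand-ins for the sliding-window iterative solver the claims describe.

```python
import math

def yaw_rate(t):
    # Shared ground-truth angular-rate signal (hypothetical test signal).
    return math.sin(math.pi * t)

def sample(signal, dt, n, delay=0.0):
    """Timestamped samples of `signal`, with the sensor clock late by `delay`."""
    return [(i * dt + delay, signal(i * dt)) for i in range(n)]

def interp(track, t):
    """Piecewise-linear lookup in a time-sorted (time, value) list, clamped at the ends."""
    if t <= track[0][0]:
        return track[0][1]
    if t >= track[-1][0]:
        return track[-1][1]
    for (t0, v0), (t1, v1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return v0 + a * (v1 - v0)

def time_offset_residual(cam_track, imu_track, offset):
    """Sum of squared rate differences after undoing a candidate offset."""
    return sum((v - interp(imu_track, t - offset)) ** 2 for t, v in cam_track)

def estimate_offset(cam_track, imu_track, search=0.1, step=0.001):
    """Coarse grid search standing in for the patent's iterative solver."""
    steps = int(round(2 * search / step))
    candidates = [i * step - search for i in range(steps + 1)]
    return min(candidates, key=lambda d: time_offset_residual(cam_track, imu_track, d))

imu_track = sample(yaw_rate, dt=0.005, n=1000)              # 200 Hz IMU rate track
cam_track = sample(yaw_rate, dt=0.033, n=140, delay=0.023)  # ~30 Hz camera, 23 ms late
print(round(estimate_offset(cam_track, imu_track), 3))      # recovers ~0.023
```

The curvature of the residual around its minimum is what the claims' posterior covariance captures: a sharp minimum means a well-observed offset, a flat one means poor excitation.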
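Claims 1 and 9 gate navigation on a single confidence score derived from the two posterior covariances and compared against a target threshold. The patent does not give the scoring formula; the mapping below (exponential decay of each variance against a scale, fused by a geometric mean) is a hypothetical normalization illustrating only the threshold-trigger logic.

```python
import math

def confidence_score(offset_var, extrinsic_var, offset_scale=1e-6, extrinsic_scale=1e-4):
    """Map the two posterior variances to a score in (0, 1].
    exp(-var/scale) per parameter, combined by a geometric mean -- a
    hypothetical choice; the patent only requires that both evaluation
    results feed one normalized index."""
    c_t = math.exp(-offset_var / offset_scale)
    c_x = math.exp(-extrinsic_var / extrinsic_scale)
    return math.sqrt(c_t * c_x)

def navigation_gate(score, threshold=0.5):
    """Compare the score with the target threshold (claim 1's final step)."""
    return "trigger_correction" if score < threshold else "nominal"

well_calibrated = confidence_score(offset_var=1e-7, extrinsic_var=1e-5)
drifting        = confidence_score(offset_var=5e-6, extrinsic_var=5e-4)
print(navigation_gate(well_calibrated))  # nominal
print(navigation_gate(drifting))         # trigger_correction
```

The scales and the 0.5 threshold are tuning parameters; the essential property is monotonicity, so that growing posterior uncertainty in either the time offset or the extrinsics eventually fires the correction instruction.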

Description

Navigation control method and device of robot and robot

Technical Field

The application relates to the technical field of robot navigation, in particular to a navigation control method and device for a robot, and to the robot itself.

Background

In robot navigation, a visual-inertial odometry architecture exploits the complementarity of a camera and an inertial measurement unit (IMU): the camera provides high-precision geometric constraints, while the IMU provides high-frequency angular velocity and linear acceleration, effectively compensating for the camera's limited frame rate and illumination sensitivity and for the drift of pure IMU integration. In the related art, an offline calibration toolchain (such as Kalibr) jointly optimizes camera intrinsics, distortion, IMU noise, biases, spatial extrinsic parameters and time offset, but such toolchains must operate under specific motion excitation (such as high-frequency rotation) with calibration targets (such as a checkerboard), and outdoor use is easily affected by wind disturbance and target occlusion, causing parameter deviation. Online self-calibration techniques require adequate excitation and observability; otherwise, coupling between states and parameters can lead to slow convergence or bias. The evaluation step depends heavily on ground-truth sources such as a motion-capture system or a laser map to compute the Absolute Trajectory Error (ATE) and Relative Pose Error (RPE), yet such facilities cannot be deployed in outdoor or unstructured scenes (such as forests and ruins). Existing ground-truth-free methods cannot quantify the quality of spatio-temporal alignment, owing to single indices, the lack of a unified baseline, and blind spots in long-term drift monitoring, and therefore cannot meet the real-time evaluation requirements of robot navigation trajectories.
Thus, there is a need to build a systematic, ground-truth-free assessment framework to support reliable navigation of robots in complex dynamic environments.

Disclosure of the Invention

Embodiments of the application provide a navigation control method and device for a robot, and the robot itself, in which a confidence score for the spatio-temporal parameters is determined within a real-time optimization framework. This confidence score normalizes the evaluation result for the time offset and the evaluation result for the spatial extrinsic parameters, and comprehensively evaluates the calibration process of the online self-calibration technique. Parameters drawn from the two evaluation results replace external references, so alignment quality can be assessed without access to a motion-capture system or a laser map, thereby enabling online monitoring of the robot's navigation state.
The navigation control method of the robot comprises: obtaining visual image stream data collected by a camera of the robot and measurement data of an inertial measurement unit; embedding the spatial extrinsic parameters and the time offset as state variables in a real-time optimization framework; performing, within the real-time optimization framework, a time-synchronization assessment on the visual image stream data and the measurement data to obtain an assessment result for the time offset; performing, within the same framework, an extrinsic-parameter consistency assessment on the visual image stream data and the measurement data to obtain an assessment result for the spatial extrinsic parameters; determining a confidence score for the spatio-temporal parameters from the assessment result for the time offset and the assessment result for the spatial extrinsic parameters; and comparing that confidence score with a target threshold and triggering a robot navigation-state correction instruction according to the comparison result, so as to control the robot's navigation. In some embodiments, acquiring the visual image stream data of the camera and the measurement data of the inertial measurement unit includes acquiring them through online subscription or by offline import of a preset configuration file.
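The Kalman-filter variant of claim 5 treats the time offset as a filtered state whose posterior covariance is itself the evaluation result. The one-dimensional sketch below makes simplifying assumptions not stated in the patent: a random-walk model stands in for the nonlinear clock-error model, the observations are scalar offset values derived from the time-offset residual, and all noise magnitudes are hypothetical.

```python
class OffsetKalmanFilter:
    """1-D Kalman filter over the camera-IMU time offset (illustrative sketch).

    State: the offset x in seconds, modeled as a random walk (a simplified
    stand-in for the patent's nonlinear clock-error model). Measurements z
    are offset observations derived from the time-offset residual."""

    def __init__(self, x0=0.0, p0=1e-2, q=1e-10, r=1e-6):
        self.x = x0   # offset estimate
        self.p = p0   # posterior variance (the quantity the patent scores)
        self.q = q    # process noise: clock drift per step (hypothetical)
        self.r = r    # measurement noise of the observation (hypothetical)

    def step(self, z):
        self.p += self.q                 # predict: random-walk drift
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # update with the observation
        self.p *= 1.0 - k                # posterior variance after the update
        return self.x, self.p

kf = OffsetKalmanFilter()
for _ in range(50):
    x, p = kf.step(0.023)                # noiseless observations for brevity
print(round(x, 4), p < 1e-6)             # estimate near 0.023; variance shrinks
```

As the variance p shrinks, a confidence score built on it rises; if excitation stops and drift inflates p again, the threshold comparison in claim 1 would fire the correction instruction.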
In some embodiments, the real-time optimization framework comprises a sliding-window optimization framework, and performing, based on the real-time optimization framework, the time-synchronization evaluation according to the visual image stream data and the measurement data to obtain the evaluation result for the time offset comprises the steps of defining a time-offset residual function in the sliding-window optimization framework, generating a pixel trajectory from the visual image stream data, generating a pose trajectory o