CN-120406527-B - Bionic robot action control method based on behavior control

Abstract

The invention discloses a bionic robot action control method based on behavior control, relating to the technical field of bionic robot control. The method comprises the following steps: establishing a global terrain coordinate system and completing multi-sensor calibration; starting an autonomous navigation task, with a vision sensor collecting images of the surrounding environment; processing the collected images and extracting their color and texture features; identifying the interfering objects in the images with an interfering-object identification module and extracting their characteristic information; calculating an interference index with an interference index analysis module from the characteristic information of the interfering objects; acquiring terrain environment information in real time with a terrain environment complexity assessment module and assessing the complexity of the terrain environment; and, with an action control decision module, making action control decisions based on the interference index and the terrain environment complexity assessment, and sending control instructions to the motion executing mechanism.

Inventors

  • LIU JUNHUI

Assignees

  • 智绘机器人科技(江阴)有限公司

Dates

Publication Date
2026-05-08
Application Date
2025-04-23

Claims (6)

  1. A bionic robot action control method based on behavior control, characterized by comprising the following steps: step S1, establishing a global terrain coordinate system, completing multi-sensor calibration, starting an autonomous navigation task, and starting to acquire surrounding environment images with a vision sensor; step S2, processing the acquired images and extracting their color and texture features; step S3, identifying the interfering objects in the images with the interfering-object identification module and extracting their characteristic information; step S4, calculating an interference index with the interference index analysis module from the characteristic information of the interfering objects; step S5, collecting terrain environment information in real time with the terrain environment complexity evaluation module and evaluating the complexity of the terrain environment; step S6, synthesizing the interference index and the terrain environment complexity evaluation in the action control decision module, making an action control decision, and sending a control instruction to the motion executing mechanism; step S7, driving the motors of the limb joints of the robot with the motion executing mechanism according to the control instruction to realize the corresponding actions; step S8, continuously repeating the above steps during the movement of the robot and adjusting the action control strategy in real time, so that the robot safely and accurately completes the autonomous navigation task.
In step S4, the interference index analysis module calculates the interference index from the characteristic information of the interfering object as follows: step S41, assigning a weight to each piece of characteristic information according to how strongly the interferent affects the robot's autonomous navigation task; step S42, calculating the interference index D through the formula D = Σ_{i=1}^{n} w_i · x_i, where D denotes the interference index, w_i the weight of the i-th feature, x_i the value of the i-th feature, and n the total number of features; step S43, evaluating the degree of influence of the interfering object from the calculated interference index D against a preset interference index threshold D_0, classifying the interferents into levels: when D < D_0 the interference is judged low, and when D ≥ D_0 the interference is judged high.
In step S5, the terrain environment complexity evaluation module collects terrain environment information in real time and evaluates the complexity of the terrain environment as follows: step S51, obtaining the image information produced by the previous steps, comprising the color histogram, the LBP histogram and the interfering-object identification result; step S52, extracting the color histogram entropy E and the texture complexity index T corresponding to the LBP histogram according to a preset algorithm; step S53, calculating the terrain environment complexity index C through the formula C = α·E + β·T, where α and β are the control parameters of the color histogram entropy and of the texture complexity index corresponding to the LBP histogram, respectively; step S54, when the terrain environment complexity index C exceeds a preset threshold C_0, judging the current terrain environment complex, otherwise judging it simple.
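The weighted-sum interference index of steps S41-S43 and the linear terrain-complexity index of step S53 can be sketched as follows; the function names, the example weights and the thresholds are illustrative assumptions, not values fixed by the claims:

```python
def interference_index(weights, values):
    """Step S42: weighted sum D = sum(w_i * x_i) over the n features."""
    if len(weights) != len(values):
        raise ValueError("one weight per feature value")
    return sum(w * x for w, x in zip(weights, values))

def classify_interference(d, threshold):
    """Step S43: compare D against the preset threshold D_0."""
    return "low" if d < threshold else "high"

def terrain_complexity(entropy, texture, alpha, beta):
    """Step S53: C = alpha * E + beta * T, with E the color-histogram
    entropy and T the LBP texture complexity index."""
    return alpha * entropy + beta * texture
```

For example, two equally weighted features with values 2 and 4 give D = 3.0, which a threshold D_0 = 5 classifies as low interference.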
  2. The bionic robot action control method based on behavior control of claim 1, wherein in step S2 the collected image is processed and its color and texture features are extracted as follows: step S21, converting the color image acquired by the camera into a gray image by weighted averaging, with the calculation expression Gray = 0.299·R + 0.587·G + 0.114·B, where R, G and B are the pixel values of the red, green and blue channels of the color image and Gray is the pixel value of the gray image; step S22, denoising the gray image with a Gaussian filter algorithm, setting the Gaussian kernel to G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²)), where σ is the standard deviation of the Gaussian kernel and (x, y) are the coordinates of the pixel, and convolving the Gaussian kernel with the image to obtain the denoised image; step S23, converting the image into the HSV color space with the standard conversion V = max(R, G, B), S = (V − min(R, G, B)) / V, and H computed piecewise according to which of the three channels attains the maximum; step S24, carrying out histogram statistics on the hue component in the HSV color space, dividing the hue value range into a number of bins, counting the pixels in each bin to obtain a color histogram, and extracting the main color features of the image from the color histogram; step S25, selecting for each pixel in the image a 3×3 neighborhood centered on that pixel, taking the gray value of the central pixel of the neighborhood as the threshold and comparing it with the gray values of the 8 surrounding neighborhood pixels: when the gray value of a neighborhood pixel is greater than or equal to that of the central pixel, its position is marked 1, otherwise 0; repeating until every pixel has an 8-bit binary number, and converting that binary number into a decimal number used as the LBP value of the pixel; step S26, counting the distribution of the different LBP values over the image to obtain an LBP histogram, and extracting the texture features of the image from the LBP histogram.
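The graying of step S21 and the LBP coding of step S25 can be sketched as below; the neighbour ordering in the LBP code is a common convention that the claim does not fix, and the function names are illustrative:

```python
def to_gray(r, g, b):
    """Step S21: weighted-average graying, Gray = 0.299R + 0.587G + 0.114B."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def lbp_value(img, y, x):
    """Step S25: 8-bit LBP code for pixel (y, x) of a 2-D gray image.
    A neighbour whose gray value is >= the centre contributes a 1 bit."""
    center = img[y][x]
    # clockwise from the top-left neighbour (assumed ordering)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for dy, dx in offsets:
        code = (code << 1) | (1 if img[y + dy][x + dx] >= center else 0)
    return code
```

On the 3×3 patch [[1,2,3],[4,5,6],[7,8,9]] the centre 5 yields the bit pattern 00011110, i.e. LBP value 30 under this ordering.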
  3. The bionic robot action control method based on behavior control of claim 1, wherein in step S3 the interfering objects in the image are identified and their characteristic information is extracted as follows: step S31, obtaining the image after image preprocessing, color feature extraction and texture feature extraction; step S32, establishing a template database of interfering objects, in which samples of each template under different angles and illumination conditions are sorted and summarized; step S33, dividing the image into a number of subregions with a template matching algorithm, comparing each subregion of the image with the templates in the template library and calculating their similarity, measured by the normalized cross-correlation coefficient NCC = Σ (I − Ī)(T − T̄) / √( Σ (I − Ī)² · Σ (T − T̄)² ), where I is a pixel value of the subregion, Ī the pixel mean of the subregion, T the corresponding pixel value of the template and T̄ the pixel mean of the template; when the similarity exceeds a preset threshold, the subregion is judged to contain an interfering object, and the interfering objects are preliminarily classified according to the similarity matching result; step S34, extracting the characteristic information of the identified interfering-object regions, comprising the color and texture features of the interfering object and its position and size in the image.
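The normalized cross-correlation of step S33 can be sketched as follows for two equal-length pixel lists; the function name is illustrative:

```python
from math import sqrt

def ncc(region, template):
    """Step S33: normalized cross-correlation coefficient between a
    subregion and a template, both given as flat pixel lists.
    Returns a value in [-1, 1]; 1 means a perfect linear match."""
    n = len(region)
    mr = sum(region) / n                     # subregion pixel mean
    mt = sum(template) / n                   # template pixel mean
    num = sum((r - mr) * (t - mt) for r, t in zip(region, template))
    den = sqrt(sum((r - mr) ** 2 for r in region) *
               sum((t - mt) ** 2 for t in template))
    return num / den if den else 0.0
```

Perfectly correlated patches score 1.0 and perfectly anti-correlated patches score −1.0, so a similarity threshold close to 1 demands a near-exact match.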
  4. The bionic robot action control method based on behavior control of claim 1, wherein the action control decision in step S6 is specifically: when the interference index is at the high-interference level and the terrain environment is complex, the robot bypasses the interfering object; when the interference index is at the low-interference level and the terrain environment is simple, the robot directly steps over or onto the interfering object; when the interference index is at the low-interference level but the terrain environment is complex, it is further analyzed whether the interfering object conceals a potential pit hazard: if such a hazard is judged to exist, the object is handled carefully, otherwise the robot passes through normally.
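The decision table of this claim can be sketched as follows; the return labels are illustrative, and the high-interference / simple-terrain case is not fixed by the claim, so bypassing is assumed here as the conservative default:

```python
def action_decision(interference_high, terrain_complex, pit_risk=False):
    """Claim-4 decision logic. pit_risk is the outcome of the claim-5
    analysis and is only consulted in the low-interference / complex case."""
    if interference_high and terrain_complex:
        return "bypass"                      # go around the interferent
    if not interference_high and not terrain_complex:
        return "traverse"                    # step over or onto it
    if not interference_high and terrain_complex:
        return "careful" if pit_risk else "pass"
    # high interference, simple terrain: not specified by the claim;
    # bypassing is assumed as the safe default
    return "bypass"
```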
  5. The bionic robot action control method based on behavior control of claim 4, wherein whether the interfering object conceals a potential pit hazard is analyzed and judged as follows: step A, extracting terrain elevation mutation regions from the depth map data of the vision sensor with a gradient threshold method, and marking a suspected pit boundary wherever the depth difference of adjacent pixels exceeds a preset threshold; step B, analyzing surface temperature anomaly regions in combination with infrared thermal imaging data, and judging a potential pit where the temperature gradient distribution differs from the surrounding terrain by more than a preset threshold; step C, projecting the detected pit features into the global terrain coordinate system and recording the set of pit center coordinates P_k, where k = 1, 2, …, n; step D, calculating the three-dimensional spatial distance between adjacent pits and traversing all pit pairs to generate a distance matrix; step E, calculating a pit distribution density index ρ from the distance matrix; when ρ exceeds a preset pit distribution density threshold ρ_0, it is determined that the interfering object must be considered to conceal a potential pit hazard.
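Steps C-E can be sketched as below. The patent does not give the exact density formula, so the inverse of the mean pairwise distance is assumed here as one plausible density index; the function name and threshold are illustrative:

```python
from math import sqrt

def pit_density(centers, threshold):
    """Claim-5 steps C-E sketch: pairwise 3-D distances between pit
    centres, then a density index rho taken as the inverse of the mean
    pairwise distance (an assumption; the patent leaves the formula
    unspecified). Returns (rho, rho_exceeds_threshold)."""
    dists = []
    for j in range(len(centers)):
        for k in range(j + 1, len(centers)):
            dists.append(sqrt(sum((a - b) ** 2
                              for a, b in zip(centers[j], centers[k]))))
    if not dists:
        return 0.0, False
    rho = 1.0 / (sum(dists) / len(dists))    # closer pits -> higher density
    return rho, rho > threshold
```

Two pit centres one unit apart give rho = 1.0, which exceeds a threshold of 0.5 and would trigger the pit-hazard determination.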
  6. The bionic robot action control method based on behavior control of claim 4, wherein, if a potential pit hazard is judged to exist, the careful handling is as follows: controlling the foot end of the robot to make tentative contact with the edge of the interfering object; monitoring the fluctuation of the ground reaction force in real time with a six-dimensional force sensor, and judging that a virtual-support risk exists when the vertical stiffness coefficient falls below a preset threshold; detecting the surface vibration spectrum with a piezoelectric film sensor, and confirming the presence of a pit when the dominant frequency component is offset by more than a preset amount.
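The two checks of this claim can be sketched as threshold comparisons; the parameter names k_min, f_ref and f_tol stand in for thresholds the patent leaves unspecified:

```python
def careful_handling_checks(stiffness, k_min, main_freq, f_ref, f_tol):
    """Claim-6 sketch: flag a virtual-support risk when the measured
    vertical stiffness coefficient falls below k_min, and confirm a pit
    when the dominant surface-vibration frequency drifts more than
    f_tol away from the reference frequency f_ref."""
    virtual_support_risk = stiffness < k_min
    pit_confirmed = abs(main_freq - f_ref) > f_tol
    return virtual_support_risk, pit_confirmed
```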

Description

Bionic robot action control method based on behavior control

Technical Field

The invention relates to the technical field of bionic robot control, in particular to a bionic robot action control method based on behavior control.

Background

With the continuous development of robot technology, bionic robots are becoming widely used in fields such as disaster relief, complex-terrain inspection, military reconnaissance and field operations. Compared with traditional wheeled or tracked mobile robots, a bionic robot imitates the limb structure and movement patterns of animals, has stronger terrain adaptability and flexibility, and is particularly suited to executing tasks in rugged, uneven, obstacle-dense and otherwise complex environments. However, in a complex natural environment robots face multiple challenges, such as obstacle avoidance, path selection, and terrain perception and adaptation, and an intelligent response to the environment is difficult to achieve by relying solely on preset rules or path-planning algorithms. At present, some bionic robots navigate by combining path planning with obstacle-avoidance control, but a comprehensive consideration of multidimensional factors, such as the types of interfering objects in the environment, the degree of their influence and the complexity of the terrain environment, is often neglected. For example, when the robot faces interfering objects such as grass, stones or fallen leaves, the system can only make a simple judgment on whether an interfering object is present or not, and lacks both a deeper analysis of the object's specific characteristics and a corresponding choice of response strategy.
Meanwhile, most current bionic robot control strategies are based on rigid control logic, lack animal-like behavioral decision-making capability, and have difficulty adjusting their action strategy dynamically as the environment changes, so the motion efficiency and safety of the bionic robot in uncertain environments are low. For example, when encountering an occluded field of view or a potential pit, conventional robots lack a mechanism linking tentative action with perception, and run the risk of missteps, falls and the like. Therefore, a bionic robot control method with behavioral decision-making capability is urgently needed: one that can identify the characteristics of interfering objects in the environment, evaluate their degree of interference, and dynamically adjust the motion control strategy in combination with a terrain-complexity analysis.

Disclosure of Invention

The invention aims to provide a bionic robot action control method based on behavior control, so as to solve the problems described in the background above.
In order to solve the above technical problems, the invention provides a bionic robot action control method based on behavior control, comprising the following steps: step S1, establishing a global terrain coordinate system, completing multi-sensor calibration, starting an autonomous navigation task, and starting to acquire surrounding environment images with a vision sensor; step S2, processing the acquired images and extracting their color and texture features; step S3, identifying the interfering objects in the images with the interfering-object identification module and extracting their characteristic information; step S4, calculating an interference index with the interference index analysis module from the characteristic information of the interfering objects; step S5, collecting terrain environment information in real time with the terrain environment complexity evaluation module and evaluating the complexity of the terrain environment; step S6, synthesizing the interference index and the terrain environment complexity evaluation in the action control decision module, making an action control decision, and sending a control instruction to the motion executing mechanism; step S7, driving the motors of the limb joints of the robot with the motion executing mechanism according to the control instruction to realize the corresponding actions; step S8, continuously repeating the above steps during the movement of the robot and adjusting the action control strategy in real time, so that the robot safely and accurately completes the autonomous navigation task.
According to the above technical scheme, in step S2 the collected image is processed and its color and texture features are extracted as follows: step S21, converting the color image acquired by the camera into a gray image by weighted averaging, with the calculation expression Gray = 0.299R + 0.587G + 0.114B, where R, G and B are the pixel values of the red, green and blue channels of the color image and Gray is the pixel value of the gray image; step S22, denoising the gray image with a Gaussian filter algorithm, setting a Gaussian kernel as G(x, y), a