CN-121977610-A - Multi-mobile stage accurate positioning method

CN 121977610 A

Abstract

The invention relates to the technical field of attitude calibration and discloses a multi-mobile-stage accurate positioning method. A plurality of contact-point position measuring units are arranged at the bottom or edge of each independently movable stage module to obtain the three-dimensional coordinates of several contact points at the module's landing position. The normal vector of the local support plane is fitted from these coordinates, and the module's attitude deflection is calculated using a reference direction vector carried by the module. The module's attitude vector and spatial position vector relative to the reference attitude are then substituted into a relative rigid-body transformation model to obtain the module's accurate position and three-dimensional attitude in the stage's global coordinate system. Because the calculation is based entirely on real on-site contact-point data, high-accuracy splicing and positioning among multiple modules is achieved under complex ground conditions.

Inventors

  • FENG HUA
  • SUN TAO
  • WU LIFENG
  • HE HAIYA

Assignees

  • 浙江大丰实业股份有限公司

Dates

Publication Date
2026-05-05
Application Date
2026-01-04

Claims (9)

  1. A multi-mobile-stage accurate positioning method, characterized by comprising the following steps: taking each movable stage as a movable module, arranging an IMU, a contact sensor, a short-range finder and a camera on each movable module, presetting a reference coordinate system, and performing on-site self-calibration to obtain the relative positions of the sensors, a contact response map and the camera's initial intrinsic and extrinsic parameters; constructing a module-bottom contact point set from the relative height differences obtained by mapping the contact-unit pressures; determining the module's local plane normal from the contact point set by geometric fitting; collecting camera pixel offsets, and mapping them in closed form into in-plane displacement corrections based on the local contact surface; integrating the IMU acceleration over a short time window to obtain a short-term velocity as a redundant measurement; feeding the three-dimensional corresponding point sets of adjacent modules into an analytic registration to obtain the relative rigid-body transformation, and algebraically compensating the ranging timing deviation; and mapping the obtained relative rigid-body transformation to specific actuator commands through an initialized and self-calibrated actuator response, and issuing those commands.
  2. The method of claim 1, wherein taking each movable stage as a movable module, arranging an IMU, a contact sensor, a short-range finder and a camera on each movable module, presetting a reference coordinate system, and performing on-site self-calibration to obtain the relative positions of the sensors, a contact response map and the camera's initial intrinsic and extrinsic parameters comprises: arranging on each movable module an IMU with a 3-axis accelerometer and a 3-axis gyroscope, contact sensors at 3 non-collinear contact points, a short-range finder facing the edge of the adjacent module, a camera facing the module edge, and a drive odometer counting travelled mileage and wheel speed; for each movable module, while the modules are stationary, controlling the short-range finders between modules to measure distances, visually observing a preset common feature point, and solving the relative positions of the sensors in a preset module coordinate system from the ranging results and the visual projections; applying known static loads to each contact sensor, recording the pressure readings and the corresponding height changes measured visually, and obtaining each contact sensor's height-versus-pressure mapping function by linear fitting; and obtaining preliminary camera-to-module-plane transformation parameters by imaging attached visual markers with a calibrated geometric layout and applying a standard DLT.
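The height-versus-pressure mapping in claim 2 is a per-sensor linear fit. A minimal sketch in Python; the pressure/height calibration pairs below are hypothetical illustration values, not data from the patent:

```python
import numpy as np

# Hypothetical calibration data for one contact sensor: raw pressure
# readings under known static loads, and the corresponding height
# changes measured visually (metres).
pressure = np.array([100.0, 200.0, 300.0, 400.0])
height = np.array([0.0012, 0.0024, 0.0035, 0.0048])

# Linear least-squares fit gives this sensor's pressure-to-height mapping.
slope, intercept = np.polyfit(pressure, height, 1)

def pressure_to_height(p):
    """Map a raw pressure reading to a relative height difference (m)."""
    return slope * p + intercept
```

In operation, each sensor's fitted mapping converts a live pressure reading into the relative height difference used to build the contact point set of claim 3.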
  3. The method of claim 1, wherein constructing the module-bottom contact point set from the relative height differences obtained by mapping the contact-unit pressures comprises: converting the pressure value measured by each contact sensor into a relative height difference through the mapping function obtained in the initialization step, and combining the height difference with the sensor's known relative installation position to form a three-dimensional position vector of each contact point in the module coordinate system; collecting accelerometer samples while the module is stationary and averaging them to obtain the inertial measurement unit's estimate of the gravity direction, and mapping this gravity vector into the module body frame through the coordinate transformation of the inertial measurement unit relative to the module body, yielding an inertially estimated vertical vector; constructing a plane normal in the camera coordinate system from observations of at least three non-collinear feature points on the module plane, and mapping this normal into the module body frame through the camera's rotation relative to the module body frame; computing the angle between the inertially estimated vertical vector and the camera-derived normal, and computing a consistency threshold from the angle variances of the two estimates; when the angle is at or below the threshold, combining the inertial vertical vector and the camera normal with equal weight and normalizing the result as the module body's vertical unit vector; and when the angle exceeds the threshold, taking the inertial vertical vector as the module body's vertical unit vector and raising an observation-inconsistency alarm for manual review.
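The consistency check and equal-weight fusion described in claim 3 can be sketched as follows. This is a hypothetical helper, not the patent's implementation; in particular the threshold is passed in directly rather than derived from the angle variances:

```python
import numpy as np

def fuse_vertical(v_imu, n_cam, threshold_rad):
    """Merge the IMU gravity estimate with the camera-derived plane normal.

    Returns (unit vertical vector, alarm flag). When the angle between
    the two estimates is within the threshold they are combined with
    equal weight and renormalized; otherwise the IMU estimate is kept
    and an observation-inconsistency alarm is raised.
    """
    v_imu = np.asarray(v_imu, float)
    n_cam = np.asarray(n_cam, float)
    v_imu = v_imu / np.linalg.norm(v_imu)
    n_cam = n_cam / np.linalg.norm(n_cam)
    angle = np.arccos(np.clip(v_imu @ n_cam, -1.0, 1.0))
    if angle <= threshold_rad:
        fused = v_imu + n_cam
        return fused / np.linalg.norm(fused), False  # fused, no alarm
    return v_imu, True                               # fall back, alarm
```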
  4. The method of claim 1, wherein determining the module's local plane normal from the contact point set by geometric fitting comprises: selecting the three-dimensional positions of three non-collinear contact points in the module coordinate system; computing the vector differences between the selected contact points, obtaining the plane normal direction by their cross product, and normalizing the result to a unit normal, which is the normal vector of the module's local support plane.
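The normal computation in claim 4 is a standard cross product over two edge vectors of the contact triangle; a minimal sketch:

```python
import numpy as np

def local_plane_normal(p1, p2, p3):
    """Unit normal of the local support plane through three non-collinear
    contact points given in the module coordinate system."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # normal direction of the plane
    length = np.linalg.norm(n)
    if length < 1e-12:
        raise ValueError("contact points are (nearly) collinear")
    return n / length                # unit normal
```

With contact points whose height components come from the pressure mapping of claim 3, a nearly level module yields a normal close to (0, 0, 1).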
  5. The method of claim 1, wherein collecting camera pixel offsets and mapping them in closed form into in-plane displacement corrections based on the local contact surface comprises: acquiring, with the camera, the pixel offset of the adjacent boundary while the module moves; converting the pixel offset into a three-dimensional in-plane displacement correction by analytic geometry, using the camera intrinsics obtained by on-site self-calibration, the camera's height in the module coordinate system, and the module plane height determined by the contact point set; the camera height being taken from the camera extrinsics of the initialization, and the plane height using the height component of the contact points' three-dimensional position vectors.
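Under a simple pinhole model with the camera looking down at the module plane from a known height (an assumption the claim does not spell out), the pixel-to-displacement conversion of claim 5 reduces to a scale factor per axis:

```python
def pixel_to_plane_offset(du_px, dv_px, fx, fy, cam_height):
    """Convert a pixel offset of a feature lying on the module plane into
    an in-plane metric displacement.

    fx, fy are focal lengths in pixels from on-site self-calibration;
    cam_height is the camera's height above the module plane, i.e. the
    extrinsic height minus the contact-point height component.
    """
    return du_px * cam_height / fx, dv_px * cam_height / fy
```

For example, a 10-pixel shift seen by a 500-pixel-focal-length camera 1 m above the plane corresponds to a 20 mm in-plane displacement.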
  6. The method of claim 1, wherein integrating the IMU acceleration over a short time window to obtain a short-term velocity as a redundant measurement comprises: within a preset short time window, removing the gravity component from the IMU-measured acceleration and integrating over time to obtain a short-term velocity estimate; and accumulating with the short-term velocity of the previous window as the initial value, this initial value being zero at initialization.
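A minimal sketch of the windowed integration in claim 6, assuming the gravity vector in the body frame is already known (e.g. from the vertical estimate of claim 3) and using simple rectangle-rule integration:

```python
import numpy as np

def short_term_velocity(accel_samples, g_body, dt, v_prev=None):
    """Integrate gravity-compensated accelerometer samples over a short
    window.

    v_prev is the short-term velocity at the end of the previous window
    and is taken as zero at initialization, as in the claim.
    """
    a = np.asarray(accel_samples, float) - np.asarray(g_body, float)
    v0 = np.zeros(3) if v_prev is None else np.asarray(v_prev, float)
    return v0 + a.sum(axis=0) * dt   # rectangle-rule time integration
```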
  7. The method of claim 1, wherein feeding the three-dimensional corresponding point sets of adjacent modules into an analytic registration to obtain the relative rigid-body transformation, and algebraically compensating the ranging timing deviation, comprises: identifying each pair of adjacent modules as module A and module B, detecting their shared boundary, and outputting a set of corresponding pixel points on the boundary; converting each pixel point into a three-dimensional local point in the module plane coordinate system, using each module's camera intrinsics and the field-calibrated true distance from the imaging surface to the camera's optical center; obtaining each module's position in the global coordinate system by forward integration and rotation transformation, starting from the initialized position, the displacement from short-term velocity integration, and the in-plane displacement correction given by vision; integrating the module's angular velocity over the short window to obtain the corresponding local rotation and constructing the module's pose transformation matrix, so as to project the local three-dimensional points into the global coordinate system; and, once the corresponding global point sets of module A and module B are obtained, computing the centroids of the corresponding points, forming the covariance matrix, and solving the rotation matrix and translation vector by singular value decomposition, thereby obtaining the relative rigid-body transformation analytically.
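The centroid/covariance/SVD solution in claim 7 is the classic Kabsch (Umeyama) registration; a self-contained sketch:

```python
import numpy as np

def rigid_registration(A, B):
    """Find rotation R and translation t minimising ||R @ a_i + t - b_i||
    for corresponding N x 3 point sets A and B, via centroids, the
    covariance matrix and a singular value decomposition."""
    A = np.asarray(A, float)
    B = np.asarray(B, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)      # centroids of both sets
    H = (A - ca).T @ (B - cb)                    # 3x3 covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Feeding in the corresponding global point sets of modules A and B yields their relative rigid-body transformation in closed form, with no iterative optimisation.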
  8. The method of claim 7, wherein feeding the three-dimensional corresponding point sets of adjacent modules into an analytic registration to obtain the relative rigid-body transformation, and algebraically compensating the ranging timing deviation, further comprises: performing a time-series analysis on the ranging sequence measured directly by the short-range sensor; in the presence of communication clock jitter, solving the time offset algebraically from the ranging values of adjacent time samples and the motion prior provided by the short-term velocity; and correcting the ranging value at the intermediate time point according to the algebraically solved time offset.
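One possible reading of the algebraic compensation in claim 8: with an approximately constant range rate across three consecutive samples, the mid-point range predicted from its neighbours exposes the clock offset of the middle sample. This is a sketch under that constant-velocity assumption; the function and variable names are ours, not the patent's:

```python
def ranging_time_offset(d_prev, d_next, d_mid_measured, range_rate):
    """Estimate the clock offset of the middle ranging sample.

    Under a constant range rate the true mid-point range is the mean of
    the neighbouring samples; any residual in the measured value is
    attributed to a time offset delta_t = residual / range_rate, and the
    measurement is corrected accordingly.
    """
    d_mid_expected = 0.5 * (d_prev + d_next)
    if abs(range_rate) < 1e-12:
        return 0.0, d_mid_measured   # no motion: offset unobservable
    delta_t = (d_mid_measured - d_mid_expected) / range_rate
    return delta_t, d_mid_measured - range_rate * delta_t
```

The range rate here would come from the short-term velocity of claim 6 projected onto the ranging direction.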
  9. The method of claim 1, wherein mapping the obtained relative rigid-body transformation to specific actuator commands through an initialized and self-calibrated actuator response and issuing those commands comprises: before executing control commands, calibrating the actuator response matrix by sending several groups of known control commands to the module actuators and recording the module's pose change after each command; stacking the recorded pose-change vectors as columns of one matrix and the corresponding control command vectors as columns of another, and computing the pseudo-inverse of the actuator response matrix by matrix operations, thereby establishing a linear mapping between actuator input and pose change; after obtaining the relative rotation and translation of adjacent modules, converting the rotation part into the corresponding small-angle rotation vector, combining it with the translation, and converting the result into a specific actuator command vector through the inverse mapping of the response matrix; and issuing the actuator command vector to the module actuators to drive the module to complete the required displacement and angle adjustment.
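The response-matrix calibration and pseudo-inverse mapping of claim 9 can be sketched as follows. The 3-DOF command space (x, y, small-angle yaw) and all numbers are illustrative assumptions; a true response matrix is used only to simulate the pose changes recorded during calibration:

```python
import numpy as np

# Hypothetical true actuator response, used here only to simulate the
# pose changes that calibration would record.
M_true = np.array([[1.2, 0.1, 0.0],
                   [0.0, 0.9, 0.0],
                   [0.05, 0.0, 1.1]])

# Calibration: send known command vectors (columns of U) and record the
# resulting pose-change vectors (columns of Y).
U = np.eye(3)
Y = M_true @ U

# Response matrix via pseudo-inverse: pose_change ~= M_est @ command.
M_est = Y @ np.linalg.pinv(U)

# Inverse mapping: desired pose change -> actuator command vector.
target = np.array([0.10, -0.05, 0.02])   # dx (m), dy (m), yaw (rad)
command = np.linalg.pinv(M_est) @ target
```

Issuing `command` should then produce approximately the desired pose change; in practice the loop would be closed by re-measuring the pose and repeating.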

Description

Multi-mobile stage accurate positioning method

Technical Field

The invention relates to the technical field of attitude calibration, in particular to a multi-mobile-stage accurate positioning method.

Background

With the wide application of large-scale performance stages, immersive display platforms and modular exhibition equipment, the approach of splicing several independently movable stage modules into an integral stage structure has gradually been adopted for multi-mobile stages. In such a system, each module must achieve high-precision spatial positioning and attitude alignment after reaching its designated position; otherwise stage joints may become misaligned, separated or tilted, compromising mechanical safety and stage performance. Moreover, traditional schemes position the module as a whole and lack perception of the real contact-point state at the module's bottom or edge, so they cannot accurately describe the true local geometry at the module's landing position, which causes attitude-angle estimation errors. In the prior art, attitude identification methods based on an IMU or a lidar do exist, but the IMU readily accumulates drift under the rigid impacts and micro-vibrations of a stage environment, and the lidar struggles to acquire the true landing-point information of a module edge in heavily occluded scenes, so neither can effectively solve the millimetre-level attitude registration required when modules are butted and spliced. In view of the above, a positioning method is needed that directly uses the actual contact-point information of a module at its target position to compute the module's attitude and spatial position.

Disclosure of Invention

The invention provides a multi-mobile stage accurate positioning method that addresses the problems mentioned in the background.
The invention provides a multi-mobile stage accurate positioning method, which comprises the following steps: taking each movable stage as a movable module, arranging an IMU, a contact sensor, a short-range finder and a camera on each movable module, presetting a reference coordinate system, and performing on-site self-calibration to obtain the relative positions of the sensors, a contact response map and the camera's initial intrinsic and extrinsic parameters; constructing a module-bottom contact point set from the relative height differences obtained by mapping the contact-unit pressures; determining the module's local plane normal from the contact point set by geometric fitting; collecting camera pixel offsets, and mapping them in closed form into in-plane displacement corrections based on the local contact surface; integrating the IMU acceleration over a short time window to obtain a short-term velocity as a redundant measurement; feeding the three-dimensional corresponding point sets of adjacent modules into an analytic registration to obtain the relative rigid-body transformation, and algebraically compensating the ranging timing deviation; and mapping the obtained relative rigid-body transformation to specific actuator commands through an initialized and self-calibrated actuator response, and issuing those commands.
Optionally, taking each movable stage as a movable module, arranging an IMU, a contact sensor, a short-range finder and a camera on each movable module, presetting a reference coordinate system, and performing on-site self-calibration to obtain the relative positions of the sensors, a contact response map and the camera's initial intrinsic and extrinsic parameters includes: arranging on each movable module an IMU with a 3-axis accelerometer and a 3-axis gyroscope, contact sensors at 3 non-collinear contact points, a short-range finder facing the edge of the adjacent module, a camera facing the module edge, and a drive odometer counting travelled mileage and wheel speed; for each movable module, while the modules are stationary, controlling the short-range finders between modules to measure distances, visually observing a preset common feature point, and solving the relative positions of the sensors in a preset module coordinate system from the ranging results and the visual projections; applying known static loads to each contact sensor, recording the pressure readings and the corresponding height changes measured visually, and obtaining each contact sensor's height-versus-pressure mapping function by linear fitting; and obtaining preliminary camera-to-module-plane transformation parameters by imaging attached visual markers with a calibrated geometric layout and applying a standard DLT. Optionally, the constructing a module bottom contact