CN-121985295-A - Indoor BLE equipment three-dimensional positioning and visualization method
Abstract
The invention relates to the field of augmented reality, and in particular to a three-dimensional positioning and visualization method for indoor BLE devices. The method comprises the following steps: determining global coordinates of a Bluetooth device in a base-station spatial coordinate system based on position information of a signal base station; obtaining global coordinates of a visualization device in the same base-station spatial coordinate system based on that position information; and acquiring field-of-view spatial coordinates of the Bluetooth device within the field of view of the visualization device according to the visualization global coordinates and the Bluetooth global coordinates. By obtaining both the Bluetooth device's and the visualization device's global coordinates in the base-station spatial coordinate system and converting the Bluetooth global coordinates into field-of-view spatial coordinates, the invention decouples the positioning end from the visualization end, making the equipment more flexible to deploy and use.
Inventors
- ZHAO TIANCI
- SHEN XINGFA
- ZHANG SHAOBO
Assignees
- Hangzhou Dianzi University (杭州电子科技大学)
Dates
- Publication Date: 2026-05-05
- Application Date: 2026-01-13
Claims (10)
- 1. An indoor BLE device three-dimensional positioning and visualization method, characterized by comprising the following steps: step S10, determining global coordinates of a Bluetooth device in a base-station spatial coordinate system based on position information of a signal base station; step S20, obtaining global coordinates of a visualization device in the base-station spatial coordinate system based on the position information; and step S30, acquiring field-of-view spatial coordinates of the Bluetooth device within the field of view of the visualization device according to the visualization global coordinates and the Bluetooth global coordinates.
- 2. The three-dimensional positioning and visualization method according to claim 1, characterized in that step S30 comprises: establishing a visualization-device coordinate system according to the field of view of the visualization device; and acquiring the field-of-view spatial coordinates of the Bluetooth device in the visualization-device coordinate system based on the visualization global coordinates and the Bluetooth-device global coordinates.
- 3. The three-dimensional positioning and visualization method according to claim 2, characterized in that step S30 further comprises: updating the field-of-view spatial coordinates according to the pose of the visualization device.
- 4. The three-dimensional positioning and visualization method according to claim 3, wherein updating the field-of-view spatial coordinates according to the pose of the visualization device comprises: acquiring inertial-sensor data of the visualization device; and updating the field-of-view spatial coordinates based on the inertial-sensor data.
- 5. The three-dimensional positioning and visualization method of claim 1, further comprising: calculating pixel coordinates from the field-of-view spatial coordinates according to the visualization parameters of the visualization device; and performing visual display of the Bluetooth device according to the field-of-view spatial coordinates and field-of-view image information of the visualization device.
- 6. The method according to claim 5, wherein visually displaying the Bluetooth device according to the field-of-view spatial coordinates and field-of-view image information of the visualization device comprises: acquiring the field-of-view image information; acquiring a target mask and a corresponding pixel-level depth map based on the field-of-view image information; obtaining the average depth of the target mask based on the target mask and the pixel-level depth map; acquiring the relative distance between the visualization device and the Bluetooth device through wireless positioning; calculating a distance confidence from the average depth and the relative distance; and visually displaying the Bluetooth device according to the distance confidence.
- 7. The three-dimensional positioning and visualization method of claim 6, wherein acquiring a target mask and a corresponding pixel-level depth map based on the field-of-view image information comprises: processing the field-of-view image information using a monocular depth-estimation technique to obtain the target mask and the pixel-level depth map.
- 8. The three-dimensional positioning and visualization method of claim 6, wherein calculating a distance confidence from the average depth and the relative distance comprises: calculating the absolute value of the difference between the average depth and the relative distance to obtain an absolute depth difference; and computing the distance confidence from the absolute depth difference.
- 9. The three-dimensional positioning and visualization method according to claim 8, wherein visually displaying the Bluetooth device according to the distance confidence comprises: acquiring the image confidence of the field-of-view image information; multiplying the distance confidence by the image confidence to obtain a fusion confidence for the target mask; and visually displaying the Bluetooth device corresponding to the highest fusion confidence in the field of view.
- 10. The three-dimensional positioning and visualization method of claim 7, wherein obtaining the average depth based on the target mask and the pixel-level depth map comprises: performing a matching calculation between the pixel-level depth map and the target mask to obtain the average depth.
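Step S30 of the claims amounts to a rigid-body transform from the base-station (world) frame into the visualization device's view frame. The patent gives no formulas, so the sketch below is only one plausible reading: the function name, the NumPy representation, and the example rotation matrix are all assumptions.

```python
import numpy as np

def global_to_view(p_ble_world, p_device_world, R_device):
    """Transform a BLE tag's global (base-station frame) coordinates into
    the visualization device's view-space frame.

    p_ble_world    : (3,) Bluetooth device position in the base-station frame
    p_device_world : (3,) visualization device position in the same frame
    R_device       : (3, 3) rotation (world -> view), e.g. from the device's
                     pose estimate, updated from inertial-sensor data
    """
    p_ble_world = np.asarray(p_ble_world, dtype=float)
    p_device_world = np.asarray(p_device_world, dtype=float)
    # Translate into the device-centered frame, then rotate into view space.
    return np.asarray(R_device, dtype=float) @ (p_ble_world - p_device_world)

# Illustrative: device yawed 90 degrees relative to the base-station frame.
R = np.array([[0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
p_view = global_to_view([2.0, 3.0, 1.0], [0.0, 1.0, 0.0], R)
# p_view now holds the field-of-view spatial coordinates of the tag.
```

Under this reading, updating the field-of-view coordinates "according to the pose" (claims 3 and 4) reduces to recomputing `R_device` from the inertial-sensor data and reapplying the same transform.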
Description
Indoor BLE equipment three-dimensional positioning and visualization method

Technical Field

The invention relates to the field of augmented reality, and in particular to a three-dimensional positioning and visualization method for indoor BLE devices.

Background

With the development of the Internet of Things and mobile computing, demand for indoor positioning is growing rapidly. Compared with schemes such as UWB, Wi-Fi, and RFID, Bluetooth offers low power consumption, low cost, a mature ecosystem, and wide terminal adoption, and has become one of the important technical routes for indoor positioning. Existing Bluetooth positioning is mainly based on RSSI fingerprinting or simple multilateration, and its display form usually remains a 2D heat map or top-view trajectory. Such methods struggle to express spatial information such as three-dimensional height differences and occlusion relationships; in a complex three-dimensional environment the user still needs additional information to find a target, and the interaction burden is heavy. Given these limitations, the industry urgently needs a new positioning method that can output a stable three-dimensional position, overlay the state and spatial position of the displayed device in real time within the user's current viewing angle, and support multi-target matching, occlusion judgment, and out-of-view prompts, so as to reduce the user's cognitive burden and improve retrieval and inspection efficiency.

Disclosure of Invention

The invention aims to provide a three-dimensional positioning and visualization method for indoor BLE devices.
The technical scheme adopted by the invention to solve the above technical problem is an indoor BLE device three-dimensional positioning and visualization method comprising the following steps: step S10, determining global coordinates of a Bluetooth device in a base-station spatial coordinate system based on position information of a signal base station; step S20, obtaining global coordinates of a visualization device in the base-station spatial coordinate system based on the position information; and step S30, acquiring field-of-view spatial coordinates of the Bluetooth device within the field of view of the visualization device according to the visualization global coordinates and the Bluetooth global coordinates. Optionally, step S30 includes: establishing a visualization-device coordinate system according to the field of view of the visualization device; and acquiring the field-of-view spatial coordinates of the Bluetooth device in the visualization-device coordinate system based on the visualization global coordinates and the Bluetooth-device global coordinates. Optionally, step S30 further includes: updating the field-of-view spatial coordinates according to the pose of the visualization device. Optionally, updating the field-of-view spatial coordinates according to the pose of the visualization device includes: acquiring inertial-sensor data of the visualization device; and updating the field-of-view spatial coordinates based on the inertial-sensor data. Optionally, the method includes: calculating pixel coordinates from the field-of-view spatial coordinates according to the visualization parameters of the visualization device; and performing visual display of the Bluetooth device according to the field-of-view spatial coordinates and field-of-view image information of the visualization device.
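The optional step of "calculating pixel coordinates from the field-of-view spatial coordinates according to the visualization parameters of the visualization device" is consistent with a standard pinhole camera projection. The patent does not specify the camera model, so the sketch below, including the intrinsic parameters `fx`, `fy`, `cx`, `cy`, is an assumption:

```python
def view_to_pixel(p_view, fx, fy, cx, cy):
    """Project a point in view-space coordinates to pixel coordinates
    with a pinhole model; returns None when the point lies behind the
    image plane (e.g. for an out-of-view prompt instead of an overlay).

    fx, fy : focal lengths in pixels (visualization parameters, assumed)
    cx, cy : principal point in pixels (assumed)
    """
    x, y, z = p_view
    if z <= 0:  # behind the camera: the tag is not in the field of view
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# A tag 2 m in front of the device, slightly right and above center.
uv = view_to_pixel((0.5, -0.25, 2.0), fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```

The `None` branch gives one natural hook for the out-of-view prompt mentioned in the background: a tag whose projection falls behind the camera or outside the image bounds can be flagged with a directional indicator instead of an overlay.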
Optionally, performing visual display of the Bluetooth device according to the field-of-view spatial coordinates and field-of-view image information of the visualization device includes: acquiring the field-of-view image information; acquiring a target mask and a corresponding pixel-level depth map based on the field-of-view image information; obtaining the average depth of the target mask based on the pixel-level depth map; acquiring the relative distance between the visualization device and the Bluetooth device through wireless positioning; calculating a distance confidence from the average depth and the relative distance; and visually displaying the Bluetooth device according to the distance confidence. Optionally, acquiring the target mask and the corresponding pixel-level depth map based on the field-of-view image information includes: processing the field-of-view image information using a monocular depth-estimation technique to obtain the target mask and the pixel-level depth map. Optionally, calculating a distance confidence from the average depth and the relative distance includes: calculating the absolute value of the difference between the average depth and the relative distance to obtain an absolute depth difference; and computing the distance confidence from the absolute depth difference. Optionally, visually displaying the Bluetooth device according to the distance confidence includes: acquiring the image confidence of the view
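The confidence pipeline described above (average depth over the target mask, absolute depth difference against the radio-ranged distance, fusion with an image confidence, and selection of the best mask) can be sketched as follows. The patent does not state how the absolute depth difference is mapped to a confidence, so the Gaussian mapping and its `sigma` are assumptions; only the masked average, the absolute difference, and the multiplicative fusion are taken from the text.

```python
import numpy as np

def mask_average_depth(depth_map, mask):
    """Match the pixel-level depth map against the boolean target mask
    and average the covered depths (the 'matching calculation')."""
    return float(depth_map[mask].mean())

def distance_confidence(avg_depth, rf_distance, sigma=0.5):
    """Map the absolute depth difference to a confidence in (0, 1].
    The Gaussian kernel and sigma are assumptions; the patent only says
    the confidence is computed from the absolute depth difference."""
    abs_diff = abs(avg_depth - rf_distance)
    return float(np.exp(-(abs_diff ** 2) / (2.0 * sigma ** 2)))

def fused_confidence(avg_depth, rf_distance, image_conf, sigma=0.5):
    """Fusion confidence = distance confidence x image confidence."""
    return distance_confidence(avg_depth, rf_distance, sigma) * image_conf

# Two candidate masks; BLE ranging reports 3.0 m. The mask whose average
# depth agrees with the radio distance wins the display slot.
candidates = [
    {"avg_depth": 3.1, "image_conf": 0.90},
    {"avg_depth": 4.8, "image_conf": 0.95},
]
best = max(candidates,
           key=lambda c: fused_confidence(c["avg_depth"], 3.0, c["image_conf"]))
```

Note how the fusion resolves ambiguity: the second mask has the higher image confidence, but its depth disagrees with the wireless range, so the first mask is selected for display.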