
CN-121979379-A - AR perception interaction system and method based on integration of computing network

CN121979379A

Abstract

The invention belongs to the technical field of industrial collaborative augmented reality and relates to an AR perception interaction system and method based on the integration of computing and networking. The system comprises a multi-modal perception acquisition module, a preliminary spatial relationship calculation module, a global collaborative feature construction module, a collaborative pattern recognition module, a differentiated task allocation module, an adaptive virtual-real fusion display module, and an off-network self-sustaining collaboration module. The preliminary spatial relationship calculation module generates a preliminary spatial relationship vector characterizing each device's local position; the global collaborative feature construction module constructs a global collaborative behavior feature matrix; the collaborative pattern recognition module generates a unique computing-power and precision strategy instruction; the differentiated task allocation module dynamically reorganizes augmented reality content to generate and issue a differentiated collaborative rendering task package; the adaptive virtual-real fusion display module performs adaptive virtual-real fusion display; and the off-network self-sustaining collaboration module serves as the sole positioning reference and collaboration center for other off-network devices on site. The invention addresses the problems that high-precision collaboration requirements are difficult to meet, position deviations accumulate across multiple devices, and the virtual-real fusion effect is poor.

Inventors

  • HUANG CHENGUANG
  • CHEN KAI
  • ZHOU ZILU
  • GUI ZHENGRONG
  • LI BING
  • TAN YIDAO
  • WANG ZIHAN
  • WANG WENYUAN

Assignees

  • China Construction Fourth Engineering Bureau Co., Ltd. (中国建筑第四工程局有限公司)

Dates

Publication Date
2026-05-05
Application Date
2025-12-22

Claims (10)

  1. An AR perception interaction system based on the integration of computing and networking, characterized by comprising the following modules: a multi-modal perception acquisition module, executed in parallel by a plurality of augmented reality devices, which acquires a multi-modal raw perception stream containing a first-perspective environment image, physiological index data, and an encrypted low-frequency magnetic field signal; a preliminary spatial relationship calculation module, executed locally by each augmented reality device, which generates a preliminary spatial relationship vector characterizing the device's local position from the encrypted low-frequency magnetic field signal in the multi-modal raw perception stream; a global collaborative feature construction module, executed by the on-site edge server, which constructs a global collaborative behavior feature matrix combining a global digital twin map and a physiological state time-series model from the preliminary spatial relationship vectors and physiological index data received from all augmented reality devices; a collaborative pattern recognition module, executed by the on-site edge server, which analyzes the global collaborative behavior feature matrix to recognize a group behavior pattern, determines a unique collaboration mode level by matching the group behavior pattern against a preset rule base, and generates a unique computing-power and precision strategy instruction; a differentiated task allocation module, executed by the on-site edge server, which dynamically reorganizes the augmented reality content according to the computing-power and precision strategy instruction to generate and issue a differentiated collaborative rendering task package; an adaptive virtual-real fusion display module, executed locally by each augmented reality device, which receives and parses the differentiated collaborative rendering task package and performs adaptive virtual-real fusion display according to the package content and the first-perspective environment image; and an off-network self-sustaining collaboration module, executed automatically by any augmented reality device whose communication with the on-site edge server is interrupted, which switches that device to a temporary master control node and enables a high-power magnetic field emission mode to construct an independent miniature spatial field, serving as the sole positioning reference and collaboration center for the other off-network devices on site.
  2. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein acquiring the multi-modal raw perception stream containing the first-perspective environment image, physiological index data and encrypted low-frequency magnetic field signal comprises the steps of: driving an image sensor on each augmented reality device to capture the first-perspective environment image; activating a wearable sensing unit connected to the augmented reality device to collect the physiological index data; starting a magnetic induction chipset integrated in the augmented reality device to receive encrypted low-frequency magnetic field signals, each containing a unique device identity code, transmitted by other augmented reality devices; and aligning the first-perspective environment image, the physiological index data and the encrypted low-frequency magnetic field signal under a shared time reference to form the multi-modal raw perception stream.
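The alignment step in claim 2 can be sketched as nearest-neighbor matching of sensor samples against camera frames under the shared time reference. The patent does not specify the matching rule; the `Sample` type, the per-frame nearest-sample search, and the 50 ms tolerance below are illustrative assumptions.

```python
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class Sample:
    t: float        # timestamp on the shared time reference (seconds)
    payload: object  # image frame, heart-rate reading, or magnetic packet

def align_streams(frames, physio, magnetic, tol=0.05):
    """For each camera frame, pick the nearest physiological and
    magnetic-field samples within `tol` seconds (assumed tolerance);
    frames with no match in either stream are dropped."""
    def nearest(samples, t):
        # samples are assumed sorted by timestamp
        times = [s.t for s in samples]
        i = bisect_left(times, t)
        cands = samples[max(0, i - 1):i + 1]
        best = min(cands, key=lambda s: abs(s.t - t), default=None)
        return best if best and abs(best.t - t) <= tol else None

    fused = []
    for f in frames:
        p, m = nearest(physio, f.t), nearest(magnetic, f.t)
        if p and m:
            fused.append((f, p, m))  # one multimodal record per frame
    return fused
```

One record per camera frame keeps the stream rate bounded by the slowest high-value sensor (the camera), which is a common choice for multimodal fusion, though the patent leaves this open.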
  3. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein generating the preliminary spatial relationship vector characterizing the device's local position comprises the steps of: parsing the encrypted low-frequency magnetic field signal and decoding the unique device identity code of the source device; calculating the device's relative position and attitude with respect to one or more other devices via a triangulation algorithm, based on the intensity and phase difference of the magnetic field signals received from different sources; and packaging the calculated relative position and attitude information into the preliminary spatial relationship vector.
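The claim only names "a triangulation algorithm" over signal intensity and phase. One plausible reading, sketched below under stated assumptions: a magnetic dipole's far field falls off roughly as 1/r³, so intensity gives a range estimate, and ranges to several identified peers can be combined by linearized 2-D multilateration. The 1/r³ model, the calibration constant `b0`, and the 2-D restriction are all assumptions, not claim language.

```python
def field_strength_to_distance(b, b0=1.0):
    """Magnetic dipole far-field intensity falls off roughly as 1/r^3,
    so r ≈ (b0 / b)^(1/3). b0 is the calibrated intensity at 1 m
    (assumed constant per transmitter)."""
    return (b0 / b) ** (1.0 / 3.0)

def trilaterate_2d(anchors, dists):
    """Linearized 2-D multilateration: subtract the first circle
    equation from the others and solve the resulting linear system
    by normal equations (adequate for a sketch)."""
    (x0, y0), d0 = anchors[0], dists[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], dists[1:]):
        rows.append((2 * (xi - x0), 2 * (yi - y0)))
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

Phase difference, which the claim also mentions, would typically refine attitude rather than range; that part is omitted here.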
  4. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein constructing the global collaborative behavior feature matrix combining the global digital twin map and the physiological state time-series model comprises the steps of: constructing, from the preliminary spatial relationship vectors, a global digital twin map describing the physical positions and orientations of all devices in a unified coordinate system; normalizing the physiological index data received from each device and establishing a physiological state time-series model reflecting the physiological states of the collaborating group and of each individual; and combining the global digital twin map and the physiological state time-series model into the global collaborative behavior feature matrix.
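A minimal sketch of the feature-matrix construction: per-device z-score normalization of a physiological series, then one matrix row per device combining twin-map pose with the latest normalized reading. The row layout and the choice of z-scoring are assumptions; the claim only requires normalization and a combined matrix.

```python
def zscore(series):
    """Per-device normalization of a physiological time series
    (z-score; the patent does not fix the normalization method)."""
    n = len(series)
    mu = sum(series) / n
    sd = (sum((v - mu) ** 2 for v in series) / n) ** 0.5 or 1.0
    return [(v - mu) / sd for v in series]

def build_feature_matrix(poses, physio):
    """One row per device: [x, y, heading, latest z-scored reading].
    `poses` maps device id -> (x, y, heading) in the unified twin-map
    frame; `physio` maps device id -> raw heart-rate series.
    Sorting by device id fixes a stable row order."""
    matrix = []
    for dev in sorted(poses):
        x, y, heading = poses[dev]
        matrix.append([x, y, heading, zscore(physio[dev])[-1]])
    return matrix
```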
  5. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein determining the unique collaboration mode level comprises the steps of: extracting a group behavior pattern from the global collaborative behavior feature matrix, the group behavior pattern comprising the aggregation state of the devices in physical space and the synchronization state of the corresponding operators' physiological index data; and matching the extracted group behavior pattern against a preset rule base to determine the unique collaboration mode level.
  6. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein dynamically reorganizing the augmented reality content to generate and issue the differentiated collaborative rendering task package comprises the steps of: if the collaboration mode level is high-precision focused collaboration, the on-site edge server completes high-precision spatial computation and rendering and generates a final rendered video stream as the content of the differentiated collaborative rendering task package; and if the collaboration mode level is conventional dispersed operation, only a lightweight three-dimensional model and coordinate data are issued as the content of the differentiated collaborative rendering task package.
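The two-way policy switch in claim 6 can be sketched as follows. The level names, the dictionary package format, and the `render_on_server` stub are illustrative; the patent specifies only the branching behavior (server-rendered video stream versus lightweight model plus coordinates).

```python
def render_on_server(scene):
    """Placeholder for the edge server's high-precision render pass;
    here it just tags the scene id as an encoded stream."""
    return f"h264:{scene['id']}"

def build_task_package(mode_level, scene):
    """Server-side allocation policy (names are assumptions).
    High-precision focus: server renders and streams video.
    Dispersed operation: ship a lightweight model + coordinates and
    let each device render locally."""
    if mode_level == "high_precision_focus":
        return {"kind": "video_stream",
                "payload": render_on_server(scene)}        # heavy path
    if mode_level == "dispersed_operation":
        return {"kind": "model_and_coords",
                "payload": {"model": scene["lod_model"],   # lightweight
                            "coords": scene["anchors"]}}
    raise ValueError(f"unknown collaboration level: {mode_level}")
```

The device-side branch in claim 7 mirrors this switch: a `video_stream` package is composited pixel-aligned onto the camera image, while a `model_and_coords` package is rendered locally first.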
  7. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein performing adaptive virtual-real fusion display according to the package content and the first-perspective environment image comprises the steps of: if the content of the differentiated collaborative rendering task package is a final rendered video stream, performing pixel-level aligned superposition of that content onto the first-perspective environment image; and if the content is the lightweight three-dimensional model and coordinate data, invoking local computing resources to complete the rendering and fusing the rendering result with the first-perspective environment image for display.
  8. The AR perception interaction system based on the integration of computing and networking according to claim 1, wherein switching to a temporary master control node and enabling the high-power magnetic field emission mode to construct an independent miniature spatial field serving as the sole positioning reference and collaboration center for other off-network devices on site comprises the steps of: after communication with the on-site edge server is interrupted, designating one of the off-network devices according to a preset election mechanism and switching its role from conventional node to temporary master control node; the temporary master control node enabling the high-power magnetic field emission mode and actively constructing an independent miniature spatial field; and other off-network devices located within the miniature spatial field automatically stopping connection attempts to the main server and instead using the temporary master control node as their sole positioning reference and collaboration center.
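Claim 8 requires only "a preset election mechanism". A common way to realize one without any message exchange is a deterministic rule every device evaluates identically; the battery-level criterion with device-id tie-break below is an assumption for illustration.

```python
def elect_master(offline_devices):
    """Deterministic election among off-network devices: every device
    applies the same rule (here: highest remaining battery, device id
    as tie-break), so all devices agree on the winner without voting.
    The criterion itself is an assumption; the patent only requires a
    preset election mechanism."""
    return max(offline_devices, key=lambda d: (d["battery"], d["id"]))

def on_link_lost(devices, my_id):
    """Role transition after losing the edge-server link."""
    master = elect_master(devices)
    if master["id"] == my_id:
        return "temporary_master"  # enable high-power field emission
    return "follower"              # lock onto the master's micro-field
```

A deterministic rule avoids split-brain among devices that can all sense each other's magnetic beacons but have lost the server.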
  9. The AR perception interaction system based on the integration of computing networks according to claim 8, wherein generating the unique computing-power and precision strategy instruction comprises the steps of: if the collaboration mode level is individual high-risk early warning, the on-site edge server generates and issues a high-priority alarm instruction to the target augmented reality device; and after receiving the high-priority alarm instruction, the target augmented reality device triggers a local alarm interface comprising audible, visual and tactile feedback.
  10. An AR perception interaction method based on the integration of computing and networking, characterized by comprising the following steps: S1, executed in parallel by a plurality of augmented reality devices, acquiring a multi-modal raw perception stream comprising a first-perspective environment image, physiological index data and an encrypted low-frequency magnetic field signal; S2, executed locally by each augmented reality device, generating a preliminary spatial relationship vector characterizing the device's local position from the encrypted low-frequency magnetic field signal in the multi-modal raw perception stream; S3, executed by the on-site edge server, constructing a global collaborative behavior feature matrix combining a global digital twin map and a physiological state time-series model from the preliminary spatial relationship vectors and physiological index data received from all augmented reality devices; S4, executed by the on-site edge server, analyzing the global collaborative behavior feature matrix to recognize a group behavior pattern, determining a unique collaboration mode level by matching the group behavior pattern against a preset rule base, and generating a unique computing-power and precision strategy instruction; S5, executed by the on-site edge server, dynamically reorganizing the augmented reality content according to the computing-power and precision strategy instruction to generate and issue a differentiated collaborative rendering task package; S6, executed locally by each augmented reality device, receiving and parsing the differentiated collaborative rendering task package and performing adaptive virtual-real fusion display according to the package content and the first-perspective environment image; and S7, executed automatically by any augmented reality device whose communication with the on-site edge server is interrupted, switching to a temporary master control node and enabling a high-power magnetic field emission mode to construct an independent miniature spatial field, serving as the sole positioning reference and collaboration center for the other off-network devices on site.

Description

AR perception interaction system and method based on integration of computing network

Technical Field

The invention belongs to the technical field of industrial collaborative augmented reality and relates to an AR perception interaction system and method based on the integration of computing and networking.

Background

Collaborative work on today's industrial sites faces multiple challenges. Traditional augmented reality devices serve a single user and lack support for multi-person collaboration: when several operators must complete a precision task together, the devices often have no effective position-sensing or state-synchronization mechanism, so collaboration is inefficient and deviations arise easily. Meanwhile, in complex industrial environments GPS signals are often shielded and Wi-Fi signals are unstable, and existing positioning methods based on signal strength or time difference cannot meet the centimeter-level precision required for collaborative docking. On-site network connections are often unreliable, and any interruption can paralyze the whole collaboration system. With the wide application of AR technology in sectors such as industry, construction and energy, AR glasses serve as front-end sensing and interaction devices, and their data processing and coordination capabilities have become key to system performance.
However, the prior art shares several common problems. Traditional AR systems depend on cloud processing, so data must travel between the terminal and a remote server, causing large interaction delays that hurt real-time performance. Collaboration is difficult because multiple AR devices lack a unified data source and coordination mechanism, making shared viewpoints and multi-user information synchronization hard to achieve. Computing power is insufficient: AR glasses have limited local compute and struggle to support AI tasks such as complex image recognition, behavior analysis and semantic understanding. Networks are unstable: bandwidth and reliability cannot be guaranteed in a public-network environment, which undermines the reliability of the AR system. The solution commonly adopted in the industry relies on a centralized cloud-server processing mode: all devices upload raw data to a remote server for unified processing and decision-making, and the server computes each device's position, evaluates the collaboration state, and then issues rendering instructions to each device. This approach can in theory guarantee global information consistency, but in practice it suffers severe network latency; in industrial sites with weak signals in particular, the round-trip delay of uplink upload and downlink download can reach hundreds of milliseconds, seriously degrading precision docking operations with strict real-time requirements. A new system architecture is therefore needed that supports efficient deployment and real-time collaboration of large-scale AR device fleets. In summary, the main drawbacks of the traditional centralized server processing mode are that low-precision positioning schemes cannot meet high-precision collaboration requirements, and position deviations accumulate across multiple devices, resulting in a poor virtual-real fusion effect.
Disclosure of Invention

In a first aspect, the present invention provides an AR perception interaction system based on the integration of computing and networking, adopting the following technical scheme. The system comprises the following modules: a multi-modal perception acquisition module, executed in parallel by a plurality of augmented reality devices, which acquires a multi-modal raw perception stream containing a first-perspective environment image, physiological index data and an encrypted low-frequency magnetic field signal; a preliminary spatial relationship calculation module, executed locally by each augmented reality device, which generates a preliminary spatial relationship vector characterizing the device's local position from the encrypted low-frequency magnetic field signal in the multi-modal raw perception stream; a global collaborative feature construction module, executed by the on-site edge server, which constructs a global collaborative behavior feature matrix combining a global digital twin map and a physiological state time-series model from the preliminary spatial relationship vectors and physiological index data received from all augmented reality devices; and a collaborative pattern recognition module, executed by the on-site edge server, which analyzes the global collaborative behavior feature matrix to recognize a group behavior pattern and determines a unique collaboration mode level according to the matching result