CN-121979395-A - AR cultural tourism attraction five-sense immersive navigation interactive system
Abstract
The AR cultural tourism attraction five-sense immersive navigation interactive system comprises an environment state sensing module, a user state sensing module, an intelligent content generation module, a multi-modal sensory feedback driving module and a central cooperative control module. The environment state sensing module collects geospatial signals and visual scene signals around the user in real time; the user state sensing module collects the user's biometric signals and behavior gesture signals in real time; the intelligent content generation module receives the geospatial, visual scene, biometric and behavior gesture signals and performs fusion computation through a preset affective computing model and a context awareness model; the multi-modal sensory feedback driving module receives the immersive scenario control signal and synchronously generates the corresponding physical stimulus signals; and the central cooperative control module coordinates the synchronous timing of data flow and control flow among the modules and provides scenario script configuration signals and sensory linkage logic signals to the intelligent content generation module. The system addresses the problems that existing AR navigation systems offer a single-dimensional experience and lack immersion and emotional resonance.
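The five-module architecture summarized in the abstract can be read as a simple signal pipeline: sensing produces environment and user signals, content generation fuses them, feedback drives the stimuli. The sketch below is purely illustrative; every class, field and threshold (e.g. the heart-rate cutoff) is an assumption for demonstration, not something specified in the patent.

```python
from dataclasses import dataclass

# Illustrative signal bundles; all field names are assumptions, not patent terms.
@dataclass
class EnvironmentSignals:
    geo: tuple          # (latitude, longitude) from the positioning unit
    scene_labels: list  # labels produced by the visual recognition unit

@dataclass
class UserSignals:
    heart_rate: float   # beats per minute from the biosensor
    eda: float          # electrodermal activity, microsiemens
    head_yaw: float     # head rotation from the inertial measurement unit

def generate_scenario_control(env: EnvironmentSignals, user: UserSignals) -> dict:
    """Toy stand-in for the intelligent content generation module: fuses
    environment and user state into one immersive scenario control signal."""
    # Hypothetical arousal heuristic standing in for the affective computing model.
    aroused = user.heart_rate > 100 or user.eda > 8.0
    return {
        "visual": env.scene_labels,                    # audio-visual enhancement content
        "audio": "dramatic" if aroused else "calm",
        "haptic_intensity": 0.8 if aroused else 0.3,   # multi-sense linkage instruction
    }

env = EnvironmentSignals(geo=(24.48, 118.08), scene_labels=["ancient_gate"])
user = UserSignals(heart_rate=110.0, eda=9.2, head_yaw=15.0)
control = generate_scenario_control(env, user)
```

A downstream multi-modal feedback module would then decode `control` into rendering, audio and actuator commands.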
Inventors
- LUO LIHUA
- HUA CONGLI
- WANG KANG
Assignees
- 汉合天辰(厦门)科技股份有限公司
Dates
- Publication Date: 2026-05-05
- Application Date: 2026-02-27
Claims (10)
- 1. An AR cultural tourism attraction five-sense immersive navigation interactive system, comprising: an environment state sensing module for collecting geospatial signals and visual scene signals around a user in real time; a user state sensing module for collecting biometric signals and behavior gesture signals of the user in real time; an intelligent content generation module for receiving the geospatial signal, the visual scene signal, the biometric signal and the behavior gesture signal, performing fusion computation through a preset affective computing model and a context awareness model, and generating an immersive scenario control signal containing audio-visual enhancement content and multi-sense linkage instructions; a multi-modal sensory feedback driving module for receiving the immersive scenario control signal and synchronously generating corresponding visual enhancement signals, spatial audio signals and physical stimulus signals for driving tactile and olfactory feedback devices; and a central cooperative control module for coordinating the synchronous timing of data flow and control flow among the modules and providing a scenario script configuration signal and a sensory linkage logic signal to the intelligent content generation module.
- 2. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the environment state sensing module comprises a global satellite positioning system receiving unit and a visual recognition unit; the receiving unit receives satellite signals to generate a geospatial signal representing the user's precise geographic coordinates; the visual recognition unit, equipped with an image sensor and a deep learning model, continuously captures and analyzes real-scene images in front of the user's view in real time and recognizes specific identifiers, architectural features or preset visual markers in the scene, thereby generating a visual scene signal describing the current scene content; the geospatial signal and the visual scene signal are transmitted together to the intelligent content generation module as the core basis for context awareness and content retrieval.
- 3. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the user state sensing module integrates a biosensor and an inertial measurement unit; the biosensor non-invasively collects biometric signals such as heart rate and electrodermal activity to indirectly reflect the user's emotional arousal or tension; the inertial measurement unit continuously monitors the rotation angle, movement speed and acceleration of the user's head to generate behavior gesture signals describing the user's current posture and attention focus; these signals are uploaded synchronously to the intelligent content generation module, providing quantitative input for the affective computing model to judge the user's real-time immersion state and emotional feedback.
- 4. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the intelligent content generation module is preset with an association database and a machine learning engine; the association database stores multimedia scenario content and multi-sense linkage logic bound to different geographic coordinates and visual scenes; the machine learning engine runs the affective computing model to perform fusion analysis of the received biometric signals and behavior gesture signals and outputs a prediction of the user's current emotional state; according to the prediction, the module dynamically selects scenario content from the association database or adaptively adjusts its presentation intensity and interaction rhythm, and finally synthesizes a unified immersive scenario control signal.
- 5. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the multi-modal sensory feedback driving module comprises a graphics rendering unit, an audio processing unit and a physical effect driving unit; the graphics rendering unit receives the visual part of the immersive scenario control signal and generates virtual augmented-reality images or information annotations superimposed on the real-world view, forming the visual enhancement signal; the audio processing unit generates three-dimensional surround sound with a sense of spatial orientation, i.e. the spatial audio signal, according to the audio instructions in the signal; the physical effect driving unit decodes the tactile and olfactory instructions in the signal into control commands that drive a tactile simulation device worn by the user to release vibration or thermal stimuli and control olfactory devices deployed in specific areas of the attraction to release the corresponding scent molecules, together forming the physical stimulus signal.
- 6. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the central cooperative control module is provided with a high-precision clock synchronization unit and a data distribution management unit; the clock synchronization unit broadcasts a unified timestamp signal to all other modules in the system to ensure strict timing consistency across the whole process from signal perception and content generation to feedback execution; the data distribution management unit monitors the communication links among the modules, forwards the scenario script configuration signal and the sensory linkage logic signal according to a preset priority scheduling strategy, and manages the control flow interaction between the intelligent content generation module and the multi-modal sensory feedback driving module, thereby ensuring the synchrony and fluency of the multi-sensory experience.
- 7. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the deep learning model adopted by the visual recognition unit is a convolutional neural network trained on a large corpus of scenic-area image data; the model can recognize targets of multiple categories and scales, simultaneously identifying static objects in the scene, including ancient building components, sculptures and signs, and dynamic objects, including pedestrians, vehicles and animals; the recognition results are filled into the visual scene signal as rich semantic information, greatly enhancing the depth of the intelligent content generation module's understanding of complex real environments and the accuracy of its content matching.
- 8. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the biosensor combines an optical heart rate sensor with a galvanic skin response sensor; the optical heart rate sensor extracts heart rate variability signals by illuminating the user's skin surface and analyzing changes in reflected light intensity; the galvanic skin response sensor captures physiological reactions caused by emotional fluctuation by measuring minute changes in skin conductance; after filtering and feature extraction, the two signals together form the biometric signal, providing multidimensional data input for the affective computing model to improve the accuracy of emotional state judgment.
- 9. The AR cultural tourism attraction five-sense immersive navigation interactive system according to claim 1, wherein the scenario content and linkage logic stored in the association database are organized as scene cards; each scene card is uniquely associated with one or more geographic coordinate intervals and visual identifiers, and encapsulates an audiovisual media file, parameters describing the type and intensity of the tactile feedback that can be triggered in the scene, and a code specifying the scent type; the intelligent content generation module obtains the specific sensory linkage instructions by querying the matched scene card and compiles them into the immersive scenario control signal.
- 10. The AR text tourist attraction five-sense immersive navigation interactive system according to claim 1, wherein the touch simulation device connected with the physical effect driving unit is wrist-wearing equipment, a linear resonance actuator and a thermoelectric semiconductor refrigerating and heating sheet are integrated in the touch simulation device, the linear resonance actuator can generate mechanical vibration with different frequencies and amplitudes according to received control commands to simulate touch texture or impact feeling, and the thermoelectric semiconductor refrigerating and heating sheet can quickly change the surface temperature of the touch simulation device to simulate warmth of sunlight irradiation or shade in a cave, so that abundant thermal touch feedback experience is provided for users.
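The scene-card organization of claim 9 and the emotion-adaptive selection of claim 4 can be illustrated together: look up a card by geographic interval and visual identifier, then scale its haptic intensity by the predicted arousal. This is a minimal sketch; the field names, the 1.5× boost and the 0.7 arousal threshold are all hypothetical values chosen for illustration, not from the patent.

```python
from dataclasses import dataclass

# Illustrative "scene card" per claim 9; every field name is an assumption.
@dataclass
class SceneCard:
    geo_interval: tuple    # ((lat_min, lon_min), (lat_max, lon_max))
    visual_ids: frozenset  # visual identifiers that can trigger this card
    media_file: str        # audiovisual media reference
    haptic_params: dict    # tactile feedback type and intensity
    scent_code: str        # code for the scent to release

def match_card(cards, lat, lon, visual_id):
    """Query the association database: return the first card whose geographic
    interval contains the user and whose identifiers include visual_id."""
    for card in cards:
        (lat0, lon0), (lat1, lon1) = card.geo_interval
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1 and visual_id in card.visual_ids:
            return card
    return None

def to_control_signal(card, arousal):
    """Per claim 4: adapt presentation intensity to the predicted emotional state."""
    boost = 1.5 if arousal > 0.7 else 1.0  # hypothetical adjustment rule
    intensity = round(card.haptic_params["intensity"] * boost, 2)
    return {"media": card.media_file,
            "haptic": {**card.haptic_params, "intensity": intensity},
            "scent": card.scent_code}

cards = [SceneCard(geo_interval=((24.0, 118.0), (25.0, 119.0)),
                   visual_ids=frozenset({"stone_lion"}),
                   media_file="gate_story.mp4",
                   haptic_params={"type": "vibration", "intensity": 0.4},
                   scent_code="incense")]
card = match_card(cards, 24.5, 118.5, "stone_lion")
signal = to_control_signal(card, arousal=0.8)
```

The resulting `signal` plays the role of the immersive scenario control signal handed to the multi-modal feedback driving module.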
Description
AR cultural tourism attraction five-sense immersive navigation interactive system

Technical Field

The invention relates to the technical field of augmented reality and intelligent tourism, in particular to a five-sense immersive navigation interactive system for AR cultural tourism attractions.

Background

Current tourist attraction navigation is gradually shifting from traditional signboards, paper maps and manual explanation to digital and intelligent forms, and augmented reality, with its ability to superimpose virtual information on the real world, is regarded as a particularly promising direction. Existing AR navigation systems, usually based on smartphones or dedicated glasses, determine the user's position through a global satellite positioning system or image recognition technology and then trigger and display preset digital content, such as three-dimensional restorations of historical figures, reconstructions of the original appearance of ancient sites, or image-text and audio-video introductions; this enriches the presentation of information to a certain extent and improves the interest and informativeness of the tour. However, such systems have notable limitations: the experience is confined to the visual and simple auditory layers, interaction is mainly screen touch, users find it hard to feel genuinely present in the scene, the lack of sensory dimensions leaves the experience shallow, and deep emotional resonance and memorable moments are hard to form. Although academia and industry have explored multisensory interaction technologies, such as haptic feedback gloves in virtual reality, motion seats in 4D cinemas and scent devices, these are mostly at the experimental stage or applied in isolation, lacking a systematic framework to blend them deeply into augmented reality navigation in outdoor mobile scenes.
How to perform high-precision, low-latency synchronous control of multiple sensory stimuli so that they adaptively match a dynamically changing real environment and user state, and how to solve the interference and coordination problems of deploying such technology in public open spaces, remain technical bottlenecks that have not been effectively solved in this field. There is therefore a market need for an intelligent navigation solution that comprehensively engages the five senses and provides a highly immersive, personalized experience, breaking through the experience ceiling of current technology.

Disclosure of Invention

In view of the shortcomings of the prior art, the invention aims to provide an AR cultural tourism attraction five-sense immersive navigation interactive system to solve the problems that existing AR navigation systems offer a single-dimensional experience and lack immersion and emotional resonance. Through a central cooperative control module, the invention integrates environment and user state sensing data; the intelligent content generation module dynamically generates control signals according to an affective computing model and a context awareness model; and the multi-modal sensory feedback devices are driven to synchronously output visual, auditory, tactile, olfactory and other multi-sensory stimuli, creating an immersive interactive experience that is deeply fused with the real environment, personalized for each visitor and emotionally rich.
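The synchronization bottleneck described above is, at its core, a scheduling problem: stimuli with very different actuation latencies must land on the user at the same instant, referenced to the unified timestamp broadcast by the clock synchronization unit. The sketch below illustrates timestamp-aligned dispatch under the assumption of a single shared clock and fixed per-channel latencies; the latency values and channel names are illustrative assumptions, not figures from the patent.

```python
# Minimal sketch of timestamp-aligned multi-sensory dispatch.
# Per-channel actuation latencies (ms) are illustrative assumptions.
CHANNEL_LATENCY_MS = {"visual": 16, "audio": 5, "haptic": 30, "olfactory": 120}

def schedule(commands, target_ts_ms):
    """Given per-channel commands and a unified target timestamp (as broadcast
    by the high-precision clock synchronization unit), compute when each
    channel must be triggered so that all stimuli land at target_ts_ms
    together. Returns (fire_time_ms, channel, command) tuples, earliest first."""
    return sorted(
        (target_ts_ms - CHANNEL_LATENCY_MS[ch], ch, cmd)
        for ch, cmd in commands.items()
    )

plan = schedule({"visual": "show_overlay", "audio": "play_spatial",
                 "haptic": "vibrate", "olfactory": "release_scent"},
                target_ts_ms=10_000)
# The slowest channel (olfactory) must be triggered first.
```

In a real deployment the fixed latencies would be replaced by per-device calibration, and the data distribution management unit would issue each command over its link at the computed fire time.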
The invention provides an AR cultural tourism attraction five-sense immersive navigation interactive system, comprising: an environment state sensing module for collecting geospatial signals and visual scene signals around a user in real time; a user state sensing module for collecting the user's biometric signals and behavior gesture signals in real time; an intelligent content generation module that receives the geospatial signal, the visual scene signal, the biometric signal and the behavior gesture signal, performs fusion computation through a preset affective computing model and a context awareness model, and generates an immersive scenario control signal containing audio-visual enhancement content and multi-sense linkage instructions; a multi-modal sensory feedback driving module that receives the immersive scenario control signal and synchronously generates corresponding visual enhancement signals, spatial audio signals and physical stimulus signals for driving tactile and olfactory feedback devices; and a central cooperative control module that coordinates the synchronous timing of data flow and control flow among the modules and provides a scenario script configuration signal and a sensory linkage logic signal to the intelligent content generation module. In one embodiment of the invention, the environment state sensing module comprises a gl