CN-122023740-A - Immersive thinking education scene simulation experience system based on augmented reality
Abstract
The invention provides an immersive thinking education scene simulation experience system based on augmented reality. The system comprises: a scene construction module for generating dynamic virtual thinking education scenes; an augmented reality presentation module for fusing virtual objects with the real environment in three-dimensional space; a multi-modal interaction and response module for collecting gesture, voice, and physiological signals, recognizing user intent, and driving dynamic scene feedback; and a learning evaluation and personalization adaptation module for generating quantitative evaluations and dynamically adjusting scene content or difficulty according to user behavior data, thereby realizing an immersive, personalized thinking education experience. The invention dynamically adapts scene content and difficulty to user behavior data, improving learning outcomes and the personalized experience.
Inventors
- LU MINGJIE
Assignees
- 四川交通职业技术学院
Dates
- Publication Date: 2026-05-12
- Application Date: 2026-01-30
Claims (10)
- 1. An immersive thinking education scene simulation experience system based on augmented reality, comprising: a scene construction module for generating and managing virtual thinking education scenes with dynamic plots according to a thinking education subject; an augmented reality presentation module for acquiring real environment information, performing three-dimensional spatial fusion and rendering of virtual objects in the virtual thinking education scene with the real environment information, and generating and outputting an augmented reality picture; a multi-modal interaction and response module for collecting various interaction inputs of a user, performing semantic understanding and intent recognition on the interaction inputs, generating corresponding interaction instructions, and driving the virtual thinking education scene to produce dynamic feedback according to the interaction instructions; and a learning evaluation and personalization adaptation module for generating a quantitative learning evaluation result from multidimensional behavior data of the user during the experience, and dynamically adjusting the presentation content or difficulty parameters of subsequent virtual thinking education scenes based on the quantitative learning evaluation result.
- 2. The augmented reality-based immersive thinking education scene simulation experience system of claim 1, wherein the scene construction module comprises: a dynamic parameterized scene generation unit for dynamically combining models from a three-dimensional model database into a virtual scene containing interactive virtual objects, according to the thinking education subject and preset scene element parameters, the scene element parameters comprising historical-period characteristic parameters and environmental atmosphere parameters; and an adjustable narrative network unit for constructing a directed-graph-based thinking education narrative logic network, the narrative logic network comprising a plurality of scene nodes and scene branch paths connecting the scene nodes, each scene branch path being associated with one logic judgment condition, wherein the adjustable narrative network unit activates the corresponding logic judgment conditions according to user profile information or real-time interaction results to determine the current scene advancement path.
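The branch-gating mechanism of claim 2 can be illustrated with a minimal sketch: a directed graph whose edges carry logic judgment conditions evaluated against a user profile or real-time interaction context. All class, method, and condition names below are hypothetical illustrations, not taken from the patent.

```python
class NarrativeNetwork:
    """Toy directed-graph narrative network with condition-gated branch paths."""

    def __init__(self):
        # scene node -> list of (logic judgment condition, next scene node)
        self.edges = {}

    def add_branch(self, node, condition, next_node):
        self.edges.setdefault(node, []).append((condition, next_node))

    def advance(self, node, context):
        """Return the target of the first branch whose condition holds for `context`."""
        for condition, next_node in self.edges.get(node, []):
            if condition(context):
                return next_node
        return node  # no branch fires: stay at the current scene node


net = NarrativeNetwork()
# branch order encodes priority; the catch-all lambda acts as a default path
net.add_branch("intro", lambda ctx: ctx["quiz_score"] >= 0.8, "advanced_scene")
net.add_branch("intro", lambda ctx: True, "review_scene")

print(net.advance("intro", {"quiz_score": 0.9}))  # advanced_scene
print(net.advance("intro", {"quiz_score": 0.5}))  # review_scene
```

The same structure extends naturally to profile-driven conditions (e.g. gating a branch on a user-profile field instead of a quiz score).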
- 3. The augmented reality-based immersive thinking education scene simulation experience system of claim 2, wherein the augmented reality presentation module comprises: a multi-source sensor fusion registration unit for synchronously acquiring real environment images captured by an image sensor and motion data captured by an inertial measurement unit, computing high-precision three-dimensional structure information of the user's position and pose and of the real environment based on a visual SLAM algorithm and the motion data, and generating a spatial registration matrix; a virtual-real lighting fusion and occlusion processing unit for placing virtual objects in the three-dimensional structure of the real environment according to the spatial registration matrix, computing the lighting and shadow effects of the virtual objects in real time based on an illumination estimate of the real environment, and computing and rendering the real-time occlusion relationship between the virtual objects and real-world objects; and an adaptive rendering output unit for dynamically adjusting the rendering resolution and frame rate of the augmented reality picture according to the performance parameters of the augmented reality display device and the user's viewpoint movement speed, and outputting the final rendered picture.
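As a rough illustration of the adaptive rendering output unit in claim 3, the following sketch scales resolution and frame rate based on device load and viewpoint speed. The thresholds and scaling factors are illustrative assumptions, not values specified in the patent.

```python
def adapt_rendering(base_res=(1920, 1080), target_fps=90,
                    gpu_load=0.5, viewpoint_speed=0.0):
    """Heuristic sketch: degrade resolution/frame rate under load or fast motion.

    gpu_load is a 0..1 utilization estimate; viewpoint_speed is in m/s.
    """
    scale = 1.0
    if gpu_load > 0.85:          # device near its compute budget
        scale *= 0.75
    if viewpoint_speed > 1.5:    # fast motion: fine detail is less perceptible
        scale *= 0.8
    width, height = (int(d * scale) for d in base_res)
    # keep a minimum acceptable interactive frame rate (30 fps assumed here)
    fps = target_fps if scale == 1.0 else max(30, int(target_fps * scale))
    return width, height, fps


print(adapt_rendering())                 # full quality on an idle device
print(adapt_rendering(gpu_load=0.9))     # reduced resolution and frame rate
```

A production system would drive the same decision from the claim's full constraint set (device type, battery, compute load, bandwidth); this sketch isolates only the load/motion heuristic.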
- 4. The augmented reality-based immersive thinking education scene simulation experience system of claim 1, wherein the multi-modal interaction and response module comprises: a multi-channel input sensing unit for simultaneously acquiring the user's gesture input, voice input, and physiological signal input captured by a biosensor, forming a raw interaction data stream; an intent recognition and instruction generation unit for performing fusion analysis on the raw interaction data stream, the fusion analysis comprising recognizing the operation type and target virtual object corresponding to the gesture input, recognizing keywords and command intent in the voice input, and analyzing the emotional state reflected by the physiological signal input; and a scene dynamic response unit for invoking a physics engine to perform real-time physical simulation updates on the state of virtual objects according to the interaction instructions, or for driving virtual characters in the virtual thinking education scene to give dialogue and behavioral feedback and triggering plot advancement in the adjustable narrative network unit.
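A toy sketch of the fusion analysis in claim 4, combining the three input channels into one interaction instruction. The keyword table, the voice-over-gesture priority, and the 15% heart-rate arousal threshold are illustrative assumptions, not details from the patent.

```python
def fuse_interaction(gesture, voice_text, heart_rate, baseline_hr=70.0):
    """Combine gesture, voice, and physiological channels into one instruction.

    gesture: (operation, target_object) tuple or None
    voice_text: recognized speech transcript
    heart_rate: current average heart rate in bpm
    """
    keyword_commands = {"open": "OPEN", "rotate": "ROTATE", "next": "ADVANCE"}
    # voice keywords take priority; fall back to the gesture channel
    command = next((cmd for word, cmd in keyword_commands.items()
                    if word in voice_text.lower()), None)
    if command is None and gesture is not None:
        command = gesture[0].upper()
    target = gesture[1] if gesture else None
    # crude emotional-state proxy from the physiological channel
    arousal = "high" if heart_rate > 1.15 * baseline_hr else "normal"
    return {"command": command, "target": target, "arousal": arousal}


print(fuse_interaction(("grab", "statue"), "please rotate it", 85.0))
```

A real intent recognizer would replace the keyword table with a trained model; the point here is only the shape of the fused instruction that drives the scene dynamic response unit.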
- 5. The augmented reality-based immersive thinking education scene simulation experience system of claim 1, wherein the learning evaluation and personalization adaptation module comprises: a multi-dimensional behavior data acquisition unit for continuously recording, as multi-dimensional behavior data, the user's operation sequences, key choice nodes, task completion times, the emotional tendency of voice content, and the physiological index curves monitored by the biosensor during the experience; a multi-dimensional evaluation model calculation unit for computing in parallel, from the multi-dimensional behavior data, quantitative indices of the user in the knowledge cognition, value judgment, emotional acceptance, and behavioral practice dimensions; and a personalization strategy adaptation unit for generating personalized scene adjustment parameters for the current user according to the quantitative indices output by the multi-dimensional evaluation model calculation unit and a preset adaptation rule base, and feeding the personalized scene adjustment parameters back to the scene construction module and the augmented reality presentation module.
- 6. The augmented reality-based immersive thinking education scene simulation experience system of claim 5, wherein in the multi-dimensional evaluation model calculation unit, the emotion input degree E, the quantitative index of the emotional acceptance dimension, is calculated as: E = (α·A + β·P + γ·V)·e^(−λ·Δt), wherein A denotes the attention focusing index, obtained from the ratio of the dwell time of the user's gaze on key virtual objects to the total duration; P denotes the normalized physiological response fluctuation index, P = |HR_avg − HR_base| / HR_base, where HR_avg − HR_base is the difference between the average heart rate during the experience and the baseline heart rate, and HR_base is the baseline heart rate; V denotes the voice emotion positivity index, obtained from acoustic feature analysis of the user's speech; α, β, and γ are the weight coefficients of the attention focusing index, the physiological response fluctuation index, and the voice emotion positivity index, respectively, with α + β + γ = 1; λ is the time decay coefficient; and Δt is the time difference between the current evaluation moment and the scene's key trigger moment.
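Claim 6 describes a weighted combination of three channel indices with an exponential time decay. A direct sketch follows; the default weights (which must sum to 1) and the decay constant are illustrative assumptions, not values given in the patent.

```python
import math


def emotion_input_degree(attention, hr_avg, hr_base, voice_positivity,
                         alpha=0.4, beta=0.3, gamma=0.3, decay=0.05, dt=0.0):
    """E = (alpha*A + beta*P + gamma*V) * exp(-decay*dt), per claim 6.

    P is the normalized heart-rate fluctuation |hr_avg - hr_base| / hr_base.
    Default weights and decay constant are illustrative assumptions.
    """
    assert abs(alpha + beta + gamma - 1.0) < 1e-9, "weights must sum to 1"
    p = abs(hr_avg - hr_base) / hr_base
    return (alpha * attention + beta * p
            + gamma * voice_positivity) * math.exp(-decay * dt)


# perfect attention and voice positivity, heart rate at baseline, no decay
print(emotion_input_degree(1.0, 70.0, 70.0, 1.0))  # 0.7
```

With `dt > 0` the decay factor discounts stale measurements, so an evaluation taken long after the scene's key trigger contributes less.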
- 7. The augmented reality-based immersive thinking education scene simulation experience system of claim 6, wherein the learning evaluation and personalization adaptation module further comprises a comprehensive evaluation report generation unit for: receiving each dimension's quantitative index output by the multi-dimensional evaluation model calculation unit; calculating the comprehensive learning effect evaluation value S under the current thinking education subject by the following formula: S = Σ_{i=1}^{N} w_i(t)·Q_i + δ·Σ_j M_j, wherein Q_i denotes the quantitative index value of the i-th dimension, and N is the total number of dimensions; w_i(t) is the dynamic weight of the i-th dimension at time t, adaptively adjusted according to the teaching emphasis of the thinking education subject and the user's historical performance; M_j denotes the semantic fitness score between the user's selection at key choice point j and the preset ideal answer; δ is the choice influence coefficient; and generating a quantitative learning evaluation result combining graphics and text based on the comprehensive learning effect evaluation value S and the dimension quantitative indices.
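The comprehensive score of claim 7 is a weighted sum over dimension indices plus a choice-point bonus. A minimal sketch, assuming the dynamic weights have already been evaluated at time t and normalized to sum to 1; the default δ is an illustrative value:

```python
def comprehensive_score(indices, weights, choice_scores, delta=0.1):
    """S = sum_i w_i * Q_i + delta * sum_j M_j, per claim 7.

    indices: quantitative index Q_i per evaluation dimension
    weights: dynamic weight w_i(t) per dimension (assumed normalized here)
    choice_scores: semantic fitness M_j per key choice point
    delta: choice influence coefficient (illustrative default)
    """
    assert len(indices) == len(weights), "one weight per dimension"
    assert abs(sum(weights) - 1.0) < 1e-9, "weights assumed normalized"
    return sum(w * q for w, q in zip(weights, indices)) + delta * sum(choice_scores)


# two dimensions weighted equally, one key choice point answered ideally
print(comprehensive_score([0.8, 0.6], [0.5, 0.5], [1.0]))  # 0.8
```

In the claimed system the weights would shift with the teaching emphasis and the user's history; here they are fixed inputs for clarity.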
- 8. The augmented reality-based immersive thinking education scene simulation experience system of claim 1, further comprising a scene generation algorithm module, comprising: a multi-modal teaching resource analysis unit for receiving and analyzing input raw thinking education materials, the raw materials comprising text cases, historical video footage, audio files, and picture materials; performing entity recognition, event extraction, and sentiment analysis on text, key-scene and character-action recognition on video and pictures, and speech-to-text conversion and emotion labeling on audio, to generate a structured multi-modal thinking education knowledge graph; a narrative logic automatic construction unit, connected to the multi-modal teaching resource analysis unit, for learning the structural features and plot transition patterns of high-quality historical narrative logic networks with a graph neural network model based on the multi-modal thinking education knowledge graph, and automatically generating candidate narrative logic networks and their corresponding plot-driving rule sets for new thinking education subjects; and a dynamic script and scene parameter generation unit, connected to the narrative logic automatic construction unit, for invoking a pre-trained large language model according to the selected candidate narrative logic network to generate dynamic dialogue scripts and scene description texts consistent with the historical background and character settings, and outputting a virtual scene configuration parameter set matching the dynamic dialogue scripts and scene description texts, the parameter set being provided to the scene construction module.
- 9. The augmented reality-based immersive thinking education scene simulation experience system of claim 1 or 8, further comprising a cloud collaboration and persistence module, comprising: a distributed user state management unit for creating and maintaining an independent state container for each online user session, the state container storing in real time the user's interaction context, the current snapshot of the virtual thinking education scene, and the real-time evaluation intermediate data generated by the learning evaluation and personalization adaptation module; a mass behavior data analysis and model optimization unit for receiving and persistently storing all anonymized multi-dimensional behavior data and the corresponding quantitative learning evaluation results, forming a historical training data set; and a security and permission management unit for managing user identity authentication, data access permissions, and access levels for different thinking education scene contents, the unit performing end-to-end encrypted data transmission and applying differential privacy protection to all stored personalized data.
- 10. The augmented reality-based immersive thinking education scene simulation experience system of claim 3 or 9, wherein the adaptive rendering output unit is specifically configured to: construct a rendering resource constraint model according to the type, remaining battery level, real-time computing load, and network bandwidth of the augmented reality display device; and, under the rendering resource constraint model and with the goal of maintaining a preset minimum acceptable interactive frame rate, dynamically allocate rendering resources via an optimization algorithm, the optimization comprising: rendering virtual objects located in the user's visual focus area and strongly related to the current plot advancement with high-precision models and real-time dynamic lighting and shadows; rendering virtual objects at the edge of the field of view or in the secondary background with level-of-detail (LOD) models and static shadow maps; and, when high-speed viewpoint movement or a shortage of system resources is detected, automatically reducing the rendering resolution of non-key virtual objects and enabling asynchronous time warping for frame compensation; wherein the pictures output by the adaptive rendering output unit are adapted to various augmented reality display devices, including optical see-through head-mounted displays, video see-through head-mounted displays, and mobile smart terminals with depth sensing capability.
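The allocation policy in claim 10 can be sketched as a greedy budget pass: in-focus, plot-relevant objects get high-detail rendering while the resource budget holds, and everything else falls back to a low-detail LOD pass. Field names, costs, and the greedy strategy itself are illustrative assumptions, not the patent's optimization algorithm.

```python
def plan_lod(objects, budget):
    """Greedy sketch of rendering-resource allocation under a budget.

    objects: list of (name, in_focus, plot_relevant, cost_high, cost_low)
    budget: total rendering-resource budget (arbitrary units)
    Returns a dict mapping object name -> "high" or "low" detail level.
    """
    plan, spent = {}, 0.0
    # consider focus + plot-relevant objects first, as the claim prioritizes
    for name, in_focus, relevant, cost_hi, cost_lo in sorted(
            objects, key=lambda o: o[1] and o[2], reverse=True):
        if in_focus and relevant and spent + cost_hi <= budget:
            plan[name], spent = "high", spent + cost_hi
        else:
            plan[name], spent = "low", spent + cost_lo
    return plan


scene = [("hero", True, True, 5.0, 1.0), ("tree", False, False, 5.0, 1.0)]
print(plan_lod(scene, budget=5.0))  # {'hero': 'high', 'tree': 'low'}
```

The claimed system would solve this as a constrained optimization against frame rate, battery, and bandwidth; the greedy pass only shows the prioritization order that the optimization encodes.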
Description
Immersive thinking education scene simulation experience system based on augmented reality

Technical Field

The invention relates to the technical field of augmented reality, and in particular to an immersive thinking education scene simulation experience system based on augmented reality.

Background

In the current digital age, thinking education is an important means of cultivating correct values, yet it faces problems such as a single traditional teaching mode and low student participation. With the development of Augmented Reality (AR) technology, its application in the educational field is receiving attention. By fusing virtual information with the real environment, AR brings an immersive experience to the user and can effectively improve the interest and interactivity of learning. However, when applying AR to thinking education, the prior art often lacks deep integration of teaching content and personalized adaptation. For example, most systems provide only static virtual scenes and cannot dynamically generate and adjust scene content according to teaching subjects; meanwhile, their processing of user interaction is simplistic and cannot achieve deep fusion and semantic understanding of multi-modal interaction inputs, which limits the user experience. In addition, existing systems are weak in learning effect evaluation and lack a multidimensional behavior data analysis and dynamic adjustment mechanism, making it difficult to meet the learning needs of different students.
During the implementation of the embodiments of the invention, at least the following problems or defects of the prior art were found: first, virtual scenes with dynamic plots cannot be generated dynamically according to thinking education subjects, making teaching content difficult to update; second, deep fusion and semantic understanding of users' multi-modal interaction inputs are lacking, so accurate interactive response cannot be achieved; third, learning effect evaluation is not comprehensive enough, and teaching content and difficulty cannot be adjusted dynamically according to user behavior data, making it difficult to meet personalized learning needs.

Disclosure of Invention

The invention provides an immersive thinking education scene simulation experience system based on augmented reality, comprising: a scene construction module for generating and managing virtual thinking education scenes with dynamic plots according to a thinking education subject; an augmented reality presentation module for acquiring real environment information, performing three-dimensional spatial fusion and rendering of virtual objects in the virtual thinking education scene with the real environment information, and generating and outputting an augmented reality picture; a multi-modal interaction and response module for collecting various interaction inputs of a user, performing semantic understanding and intent recognition on the interaction inputs, generating corresponding interaction instructions, and driving the virtual thinking education scene to produce dynamic feedback according to the interaction instructions; and a learning evaluation and personalization adaptation module for generating a quantitative learning evaluation result from multidimensional behavior data of the user during the experience, and dynamically adjusting the presentation content or difficulty parameters of subsequent virtual thinking education scenes based on the quantitative learning evaluation result. Further, the scene construction module includes: a dynamic parameterized scene generation unit for dynamically combining models from a three-dimensional model database into a virtual scene containing interactive virtual objects, according to the thinking education subject and preset scene element parameters, the scene element parameters comprising historical-period characteristic parameters and environmental atmosphere parameters; and an adjustable narrative network unit for constructing a directed-graph-based thinking education narrative logic network, the narrative logic network comprising a plurality of scene nodes and scene branch paths connecting the scene nodes, each scene branch path being associated with one logic judgment condition, wherein the adjustable narrative network unit activates the corresponding logic judgment conditions according to user profile information or real-time interaction results to determine the current scene advancement path. Further, the augmented reality presentation module includes: a multi-source sensor fusion registration unit for synchronously acquiring real environment images captured by an image sensor