CN-122024556-A - Virtual surgery simulation training system based on VR/AR technology and interaction method thereof
Abstract
The invention discloses a virtual surgery simulation training system based on VR/AR technology, together with an interaction method for the system, in the technical field of virtual surgery simulation training. The system builds a multi-user, cross-region immersive collaborative surgery training environment that supports role-division collaboration, addressing the lack of team-oriented training. Multidimensional quantitative evaluation based on an expert knowledge graph covers operational compliance, safety and technical points, and yields personalized improvement suggestions. Clinical-grade interaction precision is achieved by integrating force/haptic feedback, gesture tracking and eye-movement analysis. Both VR and AR modes are supported, with the AR mode achieving sub-millimeter registration via SLAM. Soft-tissue physics is simulated in real time with a hybrid finite-element and mass-spring algorithm. Case complexity is adjusted adaptively to form a closed training-optimization loop, and the system records training data and generates multi-dimensional reports.
Inventors
- Lu Zhenhan
- Zhu Tao
- Zhu Min
Assignees
- Anhui University of Science and Technology (安徽理工大学)
Dates
- Publication Date: 2026-05-12
- Application Date: 2026-03-23
Claims (9)
- 1. A virtual surgery simulation training system based on VR/AR technology, comprising a multi-mode perception interaction system, a virtual-real fusion presentation system, a physical simulation system, an evaluation and guidance system and a collaborative training system, characterized in that: the multi-mode perception interaction system collects and processes the user's hand gestures and the spatial position and motion track of each surgical instrument, captures force changes during surgical operation through a force feedback device, fuses multi-modal information such as vision, hearing and touch, and transmits tissue feedback (such as cutting resistance and suture tension) and environmental sound effects of the virtual surgical scene to the user in real time, achieving natural, accurate and immersive interaction between the user and the virtual surgical environment; the virtual-real fusion presentation system accurately superimposes the dynamic changes of virtual surgical tissue generated by the physical simulation system onto a real surgical scene or training model; the physical simulation system simulates in real time the mechanical characteristics and dynamic response of the biological tissues in the virtual surgical scene, dynamically adjusting simulation parameters according to operation data from the multi-mode perception interaction system; the evaluation and guidance system performs multidimensional quantitative evaluation of the user's operation behavior, decision logic and skill mastery during training, and provides real-time intelligent guidance by combining an expert knowledge base with an operation standardization system; and the collaborative training system realizes multi-user, cross-region immersive collaborative surgery training, simulating the role division and collaboration of a real surgical team.
- 2. The virtual surgery simulation training system based on VR/AR technology according to claim 1, wherein the multi-modal perception interaction system comprises a force touch feedback device, a multi-modal gesture tracking module and an eye-movement tracking and gaze analysis module; the force touch feedback device simulates the mechanical feedback of cutting, suturing and grasping operations, the multi-modal gesture tracking module achieves millimeter-level tracking of hand gestures, and the eye-movement tracking and gaze analysis module captures the operator's gaze point in real time for attention analysis, assessment of attention to the surgical area, and adaptive scene-rendering optimization.
- 3. The virtual surgery simulation training system based on VR/AR technology according to claim 2, wherein the force touch feedback device is a variable-stiffness actuator that incorporates a six-dimensional force/torque sensor, achieving a force control accuracy of 0.1 N and a position accuracy of 0.5 mm, and the multi-modal gesture tracking module fuses data-glove, infrared optical tracking and computer vision technologies.
- 4. The virtual surgery simulation training system based on VR/AR technology according to claim 1, wherein the virtual-real fusion presentation system comprises a display module, a variable-focus display module and a rendering module; the display module supports bimodal switching between VR and AR, the variable-focus display module adjusts the display focal length in real time according to the fixation depth captured by the eye-movement tracking module, and the rendering module dynamically adjusts the illumination intensity, color temperature and shadow distribution of the virtual scene based on real illumination parameters acquired by an ambient-light sensor.
- 5. The virtual surgery simulation training system based on VR/AR technology according to claim 4, wherein in AR mode the display module employs a SLAM-based spatial anchoring technique to achieve sub-millimeter registration of the virtual anatomy with a solid model or real patient image.
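Whatever SLAM front end supplies the point correspondences, the anchoring step in claim 5 ultimately reduces to a least-squares rigid alignment between virtual-anatomy landmarks and their tracked real-world positions. A minimal sketch of that step (the standard Kabsch/SVD solution, with illustrative data; not the patent's full registration pipeline):

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) with R @ src_i + t ~ dst_i.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. virtual
    anatomy landmarks and their tracked real-world positions.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Hypothetical example: recover a known rotation about z plus a shift.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.01, -0.02, 0.005])        # metres
src = np.random.default_rng(0).normal(size=(8, 3))
dst = src @ R_true.T + t_true
R_est, t_est = rigid_register(src, dst)
residual = np.linalg.norm(src @ R_est.T + t_est - dst, axis=1).max()
```

With clean correspondences the residual is at numerical precision; in practice the registration error is dominated by tracking noise in `dst`, which is what the sub-millimeter requirement constrains.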
- 6. The virtual surgery simulation training system based on VR/AR technology according to claim 1, wherein the physical simulation system comprises a tissue modeling module, a coupling simulation module and an interaction model; the tissue modeling module is based on a hybrid algorithm combining the Finite Element Method (FEM) with a mass-spring model and, together with a biomechanical parameter database, realizes real-time physical simulation of soft-tissue cutting, tearing and deformation; the coupling simulation module simulates the flow characteristics of body fluids and their physical interaction with tissues; and the interaction model establishes a library of interaction parameters between surgical instruments and different biological tissues, enabling accurate simulation of complex physical processes such as instrument energy transmission and thermal-injury diffusion.
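Claim 6's hybrid tissue model pairs FEM with a mass-spring network. The mass-spring half can be sketched as a damped, semi-implicit (symplectic) Euler integrator over a set of particles and springs; the stiffness, damping, mass and time-step values below are illustrative, not biomechanical parameters from the patent:

```python
import numpy as np

def step(pos, vel, springs, rest, k, c, mass, dt):
    """One semi-implicit Euler step of a damped mass-spring mesh.

    pos, vel : (N, 3) particle positions / velocities
    springs  : (M, 2) particle index pairs, rest : (M,) rest lengths
    k, c     : spring stiffness and damping; mass : per-particle mass
    """
    force = np.zeros_like(pos)
    i, j = springs[:, 0], springs[:, 1]
    d = pos[j] - pos[i]
    length = np.linalg.norm(d, axis=1, keepdims=True)
    dirn = d / length
    # Hooke restoring force plus damping along the spring axis
    rel_v = ((vel[j] - vel[i]) * dirn).sum(axis=1, keepdims=True)
    f = (k * (length - rest[:, None]) + c * rel_v) * dirn
    np.add.at(force, i, f)       # pull i toward j when stretched
    np.add.at(force, j, -f)      # equal and opposite on j
    vel = vel + dt * force / mass
    pos = pos + dt * vel
    return pos, vel

# Illustrative: a single stretched spring relaxing toward its rest length.
pos = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
vel = np.zeros_like(pos)
springs = np.array([[0, 1]])
rest = np.array([1.0])
for _ in range(2000):
    pos, vel = step(pos, vel, springs, rest, k=50.0, c=2.0, mass=0.1, dt=0.005)
length = float(np.linalg.norm(pos[1] - pos[0]))
```

A production tissue model would add bending/volume constraints, collision handling and the FEM coupling the claim describes; this sketch only shows the core mass-spring update.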
- 7. The virtual surgery simulation training system based on VR/AR technology according to claim 1, wherein the assessment and guidance system comprises an expert knowledge-graph construction module, a semantic understanding module, a multi-dimensional assessment module and an adaptive adjustment module; the expert knowledge-graph construction module builds, from a large corpus of real surgery videos and expert annotations, knowledge graphs covering standard surgical flows, key steps, risk points and complication handling; the semantic understanding module applies deep learning to perform real-time semantic analysis of the operator's gesture tracks, instrument movements and tissue interactions, identifying the current surgical step and operation intent; the multi-dimensional assessment module scores performance in real time against the expert knowledge graph and generates personalized training reports and improvement suggestions; and the adaptive adjustment module dynamically adjusts case complexity, complication incidence and time pressure according to the operator's skill level, achieving progressive skill training.
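One simple way to realize the track-versus-standard-path comparison in claim 7 is a nearest-point deviation metric mapped onto a score. The scoring curve and the `tol` tolerance below are hypothetical illustrations, not the patent's scoring rules:

```python
import numpy as np

def deviation_score(traj, reference, tol=0.005):
    """Score a recorded instrument trajectory against a reference path.

    For each trajectory sample, take the distance to the nearest
    reference sample; map the mean deviation to a 0-100 score, where
    `tol` (metres) is the deviation that costs roughly half the marks.
    """
    d = np.linalg.norm(traj[:, None, :] - reference[None, :, :], axis=2)
    mean_dev = d.min(axis=1).mean()
    return 100.0 / (1.0 + mean_dev / tol), mean_dev

# Hypothetical planned incision line vs. two attempts
ref = np.stack([np.linspace(0, 0.05, 50), np.zeros(50), np.zeros(50)], axis=1)
good = ref + np.array([0.0, 0.0005, 0.0])   # 0.5 mm parallel offset
bad = ref + np.array([0.0, 0.01, 0.0])      # 10 mm off the planned line
s_good, _ = deviation_score(good, ref)
s_bad, _ = deviation_score(bad, ref)
```

Richer assessments would add time-warping (e.g. DTW) for pacing differences and per-step weighting from the knowledge graph; this sketch covers only the geometric deviation term.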
- 8. The virtual surgery simulation training system based on VR/AR technology according to claim 1, wherein the collaborative training system comprises a network synchronization module, a role assignment and authority management module, and a communication module; the network synchronization module adopts a QUIC-based data transmission mechanism to achieve millisecond-level synchronization of multi-user virtual surgery scenes, instrument operation tracks, dynamic tissue changes and force feedback data; the role assignment and authority management module presets different operation authorities and function access scopes based on surgical team roles; and the communication module integrates real-time voice interaction, virtual gesture instruction recognition and text message pushing to support instant communication among participants.
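Independent of the QUIC transport named in claim 8, multi-user scene synchronization needs a reconciliation rule so that late or reordered packets never roll shared state backwards. A minimal latest-wins, sequence-numbered sketch (the class and field names are hypothetical, not from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class SharedSceneState:
    """Latest-wins replication of per-object state (a sketch, not QUIC).

    Each update carries a monotonically increasing sequence number per
    object; stale or reordered packets are dropped instead of applied.
    """
    objects: dict = field(default_factory=dict)   # obj_id -> (seq, state)

    def apply_update(self, obj_id, seq, state):
        cur = self.objects.get(obj_id)
        if cur is None or seq > cur[0]:
            self.objects[obj_id] = (seq, state)
            return True       # applied
        return False          # stale duplicate / reordered: ignored

scene = SharedSceneState()
scene.apply_update("scalpel", 1, {"pos": (0.10, 0.20, 0.05)})
scene.apply_update("scalpel", 3, {"pos": (0.11, 0.20, 0.05)})
late = scene.apply_update("scalpel", 2, {"pos": (0.105, 0.20, 0.05)})
```

In a real deployment each unreliable-but-ordered channel (instrument pose, tissue deltas, force data) would map to its own stream, with the reliable stream reserved for events that must never be dropped, such as permission changes.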
- 9. A virtual surgery simulation training interaction method based on the system of any one of claims 1-8, comprising the steps of: Step S1, training-scene initialization and personalized configuration: receiving the surgery type, case difficulty and training mode (single-user or multi-user cooperation) selected by the user, loading the corresponding virtual patient anatomical model, pathological features and surgical instrument configuration, and automatically adjusting the initial difficulty parameters and key attention areas according to the user's historical training data; Step S2, multi-modal sensing data acquisition and fusion: acquiring the six-dimensional force/torque data applied by the operator through the force touch feedback device, acquiring hand gestures, finger bending and instrument-holding states through the gesture tracking system, acquiring gaze-point tracks, pupil diameters and blink frequencies through the eye-movement tracking module, and performing spatio-temporal alignment and fusion of the multi-source heterogeneous data with a Kalman filtering algorithm; Step S3, physical simulation calculation and virtual-real fusion rendering: based on the interactive input of step S2, driving the physical simulation layer to compute physical processes such as tissue deformation, cutting separation and fluid flow, invoking the rendering pipeline corresponding to the current display mode (VR or AR) to generate visual feedback, and, in AR mode, acquiring the environment image in real time and superimposing the virtual anatomical structure on the real scene through an image registration algorithm; Step S4, operation semantic understanding and real-time evaluation: extracting spatio-temporal features from the operator's continuous action sequence, identifying the current surgical step, comparing the operation track with the standard path in the expert knowledge graph, detecting deviating operations, computing an operation score in real time, and triggering instant early warning and intervention when a high-risk operation (such as accidentally injuring an important blood vessel) is detected; Step S5, intelligent feedback and adaptive adjustment: providing differentiated tactile prompts (such as increased resistance in a risk area) through the force touch feedback device according to the evaluation result, superimposing guidance information (such as standard incision lines and highlighting of the safe operation area) on the visual interface, and dynamically adjusting the difficulty and emphasis of subsequent training content to form closed-loop training optimization; and Step S6, recording the complete operation process data, including motion tracks, mechanical interactions, physiological indexes and evaluation scores, generating a multi-dimensional training report including a skill radar chart, weak-link analysis and improvement suggestions, and supporting three-dimensional playback of the operation process and remote expert review.
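Step S2 of the method fuses multi-source sensor data with Kalman filtering. A minimal 1-D constant-velocity Kalman filter tracking one instrument coordinate from noisy position samples is sketched below; the noise variances and motion model are illustrative choices, not calibrated device values from the patent:

```python
import numpy as np

def kalman_track(measurements, dt=0.01, q=1e-9, r=1e-6):
    """1-D constant-velocity Kalman filter over position measurements.

    State x = [position, velocity]; q and r are illustrative process
    and measurement noise variances.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.zeros(2)
    P = np.eye(2)                           # large initial uncertainty
    estimates = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new position sample
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

# Illustrative: instrument moving at 0.05 m/s, sampled with 1 mm noise
rng = np.random.default_rng(1)
t = np.arange(0, 2, 0.01)
truth = 0.05 * t
z = truth + rng.normal(0, 0.001, size=t.size)
est = kalman_track(z)
err_raw = np.abs(z - truth).mean()
err_kf = np.abs(est[50:] - truth[50:]).mean()  # skip the convergence transient
```

The full system would run a multi-dimensional variant of this filter per sensor stream and time-align the streams before the update step; this sketch shows only the core predict/update cycle.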
Description
Virtual surgery simulation training system based on VR/AR technology and interaction method thereof

Technical Field
The invention relates to the technical field of virtual surgery simulation training, and in particular to a virtual surgery simulation training system based on VR/AR technology and an interaction method thereof.

Background
Virtual surgery simulation training uses computer technology to simulate a real surgical process, creating a realistic virtual environment through techniques such as three-dimensional reconstruction, real-time rendering and tactile feedback, and thereby helping medical professionals improve their surgical skills and decision-making ability. Surgery simulation training in the prior art has the following shortcomings: it mostly adopts a single-user training mode without multi-user remote collaborative training, failing to meet the collaborative training needs of a surgical team; and it mostly relies on simple objective indexes such as operation time and path length, lacking an intelligent evaluation system based on expert knowledge graphs, so that surgical compliance, safety and technical points cannot be accurately evaluated or individually guided in real time. The invention therefore provides a virtual surgery simulation training system based on VR/AR technology and an interaction method thereof.

Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a virtual surgery simulation training system based on VR/AR technology and an interaction method thereof, so as to solve the above problems.
In order to achieve the above aim, the virtual surgery simulation training system based on VR/AR technology comprises a multi-mode perception interaction system, a virtual-real fusion presentation system, a physical simulation system, an evaluation and guidance system and a collaborative training system. The multi-mode perception interaction system collects and processes the user's hand gestures and the spatial position and motion track of each surgical instrument, captures force changes during surgical operation through a force feedback device, fuses multi-modal information such as vision, hearing and touch, and transmits tissue feedback (such as cutting resistance and suture tension) and environmental sound effects of the virtual surgical scene to the user in real time, achieving natural, accurate and immersive interaction between the user and the virtual surgical environment. The virtual-real fusion presentation system accurately superimposes the dynamic changes of virtual surgical tissue generated by the physical simulation system onto a real surgical scene or training model. The physical simulation system simulates in real time the mechanical characteristics and dynamic response of the biological tissues in the virtual surgical scene, dynamically adjusting simulation parameters according to operation data from the multi-mode perception interaction system. The evaluation and guidance system performs multidimensional quantitative evaluation of the user's operation behavior, decision logic and skill mastery during training, and provides real-time intelligent guidance by combining an expert knowledge base with an operation standardization system. The collaborative training system realizes multi-user, cross-region immersive collaborative surgery training, simulating the role division and collaboration of a real surgical team.

Preferably, the multi-mode perception interaction system comprises a force touch feedback device, a multi-modal gesture tracking module and an eye-movement tracking and gaze analysis module; the force touch feedback device simulates the mechanical feedback of cutting, suturing and grasping operations, the multi-modal gesture tracking module achieves millimeter-level tracking of hand gestures, and the eye-movement tracking and gaze analysis module captures the operator's gaze point in real time for attention analysis, assessment of attention to the surgical area, and adaptive scene-rendering optimization.

Preferably, the force touch feedback device adopts a variable-stiffness actuator incorporating a six-dimensional force/torque sensor, achieving a force control accuracy of 0.1 N and a position accuracy of 0.5 mm, and the multi-modal gesture tracking module fuses data-glove, infrared optical tracking and computer vision technologies.

Preferably, the virtual-real fusion presentation system comprises a display module, a variable-focus display module and a rendering module; the display module supports bimodal switching between VR and AR, and the variable-focus display module adjusts the display focal length in real time according to the fixation depth captured by the eye-movement tracking module