CN-122006058-A - Emotion intervention method based on wearable BCI and interactive mixed augmented reality

CN 122006058 A

Abstract

The invention discloses an emotion intervention method based on a wearable BCI (brain-computer interface) and interactive mixed augmented reality. An interactive, personalized virtual reality sand table environment is constructed, and AIGC is combined to expand the sand tool library and generate sand tools on demand, improving interaction smoothness, personalized expression capacity and immersive experience. At the same time, electroencephalogram signals are acquired and analyzed in real time to evaluate the user's emotional state objectively and continuously, so that active interaction and objective monitoring are combined in the emotion intervention process. The method supports both instant state evaluation of a single intervention and long-term effect evaluation across multiple interventions, has good application value in emotion regulation, improvement of psychological function and tracking of the intervention process, reduces dependence on manual observation and subjective report, and helps improve the engagement, persistence and practical value of interventions.

Inventors

  • ZHAO Sha
  • DONG Fenghe
  • PAN Gang
  • LI Shijian

Assignees

  • Zhejiang University (浙江大学)

Dates

Publication Date
2026-05-12
Application Date
2026-04-15

Claims (10)

  1. A method of emotion intervention based on a wearable BCI and interactive mixed augmented reality, comprising the steps of: (1) constructing an interactive virtual reality sand table intervention scene, completing three-dimensional modeling of the sand table space in the scene, configuring the sand table and the sand tool display area, and presetting interactive operation elements for scene construction; (2) constructing an expandable three-dimensional sand tool library based on AIGC technology, and generating personalized sand tools on demand; (3) synchronously acquiring EEG and fNIRS signals of a subject throughout the intervention process through a wearable BCI device, and synchronously acquiring a head movement inertial signal as a reference to remove movement artifacts from the EEG and fNIRS signals; (4) setting a structured emotion intervention paradigm to collect multimodal data, including questionnaire information, EEG and fNIRS signals, sand table scene layout and interview information, before intervention, after intervention, and at the corresponding follow-up stage; (5) comprehensively evaluating the emotion intervention effect based on the multimodal data obtained across multiple interventions, and analyzing the subject's emotion change trend and sand table interaction style characteristics over the course of multiple interventions.
  2. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein the interactive virtual reality sand table intervention scene in step (1) is an immersive virtual room scene constructed with three-dimensional modeling software; a central desktop and a rectangular sand table are arranged inside the scene, the sand table surface is covered with a sand layer, and the inner edge of the sand table is set to blue to simulate the 'water area boundary' of a physical sand table; storage shelves for sand tools are arranged on both sides of the sand table; rich interactive operation elements are preset in the constructed scene: the viewpoint is moved and rotated through a handle controller from a first-person perspective, water areas are added to or deleted from the sand table surface with a brush tool and an eraser tool, and sand tools are selected from the storage shelves, placed into the sand table, and continuously adjusted in posture and scaled in size after placement.
  3. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein the personalized sand tools in step (2) are automatically generated by an AIGC model, and the three-dimensional sand tool library comprises at least the following 11 basic categories: characters, animals, vegetation, buildings, vehicles, fences and markers, natural objects, fantasy objects, spiritual and mystical objects, landscape components, and household objects; the generation prompts of the AIGC model follow emotional design constraints, including soft, coordinated color matching with moderate contrast; clear outlines, reasonable proportions, avoidance of sharp structures, and consistent scale for shape and appearance; balanced layout and stable structure for the spatial arrangement; and consistency of overall visual-semantic style with avoidance of violent or uncomfortable appearances. The generation prompt rules are also used to generate sand tools on demand during the intervention process, realizing dynamic expansion of the sand tool library and personalized symbolic expression; sand tools generated by the AIGC are screened for structural integrity, appearance consistency and emotional suitability, and those with abnormal structure, abnormal appearance or unsuitable emotional expression are removed, forming the three-dimensional sand tool library used for intervention.
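The emotional design constraints in this claim can be expressed as a reusable prompt template. The sketch below is illustrative only: the constraint wording paraphrases the claim, and the `build_sand_tool_prompt` helper and its template are assumptions, not the patent's implementation.

```python
# Illustrative prompt construction for AIGC sand tool generation.
# The constraint list paraphrases the emotional design rules in claim 3;
# the function name and prompt template are hypothetical.

EMOTIONAL_DESIGN_CONSTRAINTS = [
    "soft, coordinated color matching with moderate contrast",
    "clear outlines, reasonable proportions, no sharp structures",
    "balanced layout and stable spatial structure",
    "consistent overall visual style, no violent or uncomfortable appearance",
]

# The 11 basic categories listed in claim 3.
BASIC_CATEGORIES = {
    "characters", "animals", "vegetation", "buildings", "vehicles",
    "fences and markers", "natural objects", "fantasy objects",
    "spiritual and mystical objects", "landscape components",
    "household objects",
}

def build_sand_tool_prompt(category: str, description: str) -> str:
    """Compose a generation prompt that embeds the design constraints."""
    if category not in BASIC_CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    rules = "; ".join(EMOTIONAL_DESIGN_CONSTRAINTS)
    return (f"Generate a 3D sand tool in category '{category}': {description}. "
            f"Follow these design constraints: {rules}.")
```

Generated assets would then pass the screening step described in the claim (structural integrity, appearance consistency, emotional suitability) before entering the library.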
  4. The method for emotion intervention based on the wearable BCI and interactive mixed augmented reality of claim 1, wherein the wearable BCI device in step (3) is a forehead-attached wireless device comprising an EEG acquisition module, an fNIRS acquisition module and an inertial acquisition module, wherein the EEG and fNIRS acquisition modules are integrated in the same attached structure in a stacked manner to realize synchronous acquisition of EEG and fNIRS signals from the forehead attachment area, and the inertial acquisition module comprises an accelerometer and a gyroscope for acquiring head movement inertial signals, providing a reference for subsequent motion artifact suppression in the EEG and fNIRS signals.
  5. The method of claim 4, wherein in step (3) EEG signals are collected from four electrodes, Fpz, AF7, AF8 and M2, placed according to the international 10-20 system, with M1 as the reference electrode and a sampling rate of 250 Hz; fNIRS signals are collected by 8 optical probes over the bilateral forehead area at dual wavelengths of 735 nm and 850 nm with a sampling rate of 25 Hz; and reference signals at 90.91 Hz from the virtual reality head-mounted display and at 12.5 Hz from the wearable BCI device are synchronously collected during the intervention to adaptively remove artifacts from the EEG signals.
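One common way to use such reference signals is an adaptive filter that predicts the motion artifact from the reference and subtracts it. The least-mean-squares (LMS) sketch below is a generic illustration under that assumption; the patent does not name the algorithm, and the step size and filter order here are placeholders.

```python
def lms_artifact_removal(eeg, ref, mu=0.01, order=4):
    """Adaptive FIR filter: predict the artifact from `ref`, subtract it.

    The prediction error is the cleaned EEG sample. `mu` (step size)
    and `order` (number of taps) are illustrative values.
    """
    w = [0.0] * order            # filter weights
    cleaned = []
    for n in range(len(eeg)):
        # Tap vector: current and past reference samples (zero-padded).
        x = [ref[n - k] if n - k >= 0 else 0.0 for k in range(order)]
        est = sum(wi * xi for wi, xi in zip(w, x))   # artifact estimate
        e = eeg[n] - est                             # cleaned sample
        cleaned.append(e)
        # LMS weight update toward minimizing the squared error.
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]
    return cleaned
```

With a zero reference the filter leaves the signal untouched, which is the desired behavior when no motion occurs.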
  6. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein the emotion intervention paradigm in step (4) comprises four phases: pre-intervention, intervention, post-intervention and follow-up; the intervention phase lasts two weeks with four intervention sessions per week, whose themes are, in order, energy, travel, connection and birth; each session comprises theme guidance, 20 minutes of virtual reality sand table construction, and review and reflection, the review and reflection comprising guided reflection and a semi-structured interview; after each intervention the sand table scene layout and interview information are saved; resting-state EEG and fNIRS signals of the subject are acquired in the pre-intervention phase, the post-intervention phase, and before and after each intervention, task-state EEG and fNIRS signals are continuously acquired during virtual reality sand table construction, state and trait questionnaire scales of the subject are collected at the same time, and information is collected again in the follow-up phase to evaluate the sustained intervention effect.
  7. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality of claim 1, wherein the questionnaire information in step (4) comprises two types, trait questionnaire scales and state questionnaire scales, and the emotion intervention effect is comprehensively evaluated based on the scale scores. The trait questionnaire scales collected in the pre-intervention, post-intervention and follow-up stages comprise: PHQ and BDI, whose scores assess the degree of depression, with higher scores indicating more severe depressive symptoms; GAD, BAI and STAI-T, whose scores evaluate anxiety levels, with higher scores indicating more severe anxiety; CD-RISC, whose score assesses psychological resilience, with a higher score indicating a higher level of resilience and recovery; FS, whose score evaluates the degree of fatigue, with higher scores indicating more severe fatigue; SRSS, whose score evaluates the degree of sleep problems, with higher scores indicating more pronounced sleep problems; and ACS, whose score evaluates attention control ability, with a higher score indicating stronger attention control. The state questionnaire scales collected before and after each intervention session of the intervention phase comprise: SAM, whose score evaluates the subject's current valence and arousal state; NRS, whose scores evaluate the instantaneous intensity of pleasure, happiness, calm, relaxation, anger, disgust, fear, anxiety, sadness and dizziness, with higher scores representing higher corresponding emotion or state intensities; PANAS, whose scores evaluate current positive and negative affect levels, with a higher positive affect score indicating stronger positive emotion and a higher negative affect score indicating stronger negative emotion; and STAI-S, whose score evaluates the level of state anxiety, with a higher score indicating more severe current anxiety.
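The scales in claim 7 can be summarized as two lookup tables grouped by when they are collected. This is a paraphrase of the claim for illustration, not part of the patent:

```python
# Questionnaire scales from claim 7, keyed by the construct each measures.

TRAIT_SCALES = {  # collected pre-intervention, post-intervention and at follow-up
    "PHQ": "depression",
    "BDI": "depression",
    "GAD": "anxiety",
    "BAI": "anxiety",
    "STAI-T": "anxiety",
    "CD-RISC": "psychological resilience",
    "FS": "fatigue",
    "SRSS": "sleep problems",
    "ACS": "attention control",
}

STATE_SCALES = {  # collected before and after each intervention session
    "SAM": "valence and arousal",
    "NRS": "instantaneous emotion intensity",
    "PANAS": "positive and negative affect",
    "STAI-S": "state anxiety",
}
```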
  8. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein in step (5) the acquired EEG and fNIRS signals are processed and electroencephalogram features are extracted; specifically, the EEG signals are first band-pass filtered at 1-45 Hz and the first and last 10 seconds of each signal segment are discarded, the EEG signals are then divided into 4-second time windows with 50% overlap and noisy segments are removed according to amplitude and standard deviation thresholds, after which features are extracted from the AF7 and AF8 channels on the basis of frequency band power, including kurtosis, HFD, detrended fluctuation analysis parameters, the power and power ratio of each frequency band, Hjorth complexity and mobility, sample entropy, spectral entropy and differential entropy; the fNIRS signals, after conversion and removal of head motion artifacts, yield optical density changes, which are converted into hemoglobin concentration changes according to the modified Beer-Lambert law and filtered at 0.01-0.1 Hz, obtaining the change of total hemoglobin concentration over time.
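The windowing and one of the feature families in claim 8 can be sketched as follows. The 4-second, 50%-overlap segmentation at 250 Hz matches the claim; the Hjorth parameters are one of the listed features, and the helper names are illustrative.

```python
import math

def windows(signal, fs=250, win_s=4.0, overlap=0.5):
    """Split a 1-D signal into fixed-length windows with 50% overlap."""
    n = int(fs * win_s)            # samples per window (1000 at 250 Hz)
    step = int(n * (1 - overlap))  # hop size (500 samples)
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, step)]

def _var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def hjorth(x):
    """Hjorth activity, mobility and complexity of one window."""
    dx = [b - a for a, b in zip(x, x[1:])]    # first difference
    ddx = [b - a for a, b in zip(dx, dx[1:])] # second difference
    activity = _var(x)
    mobility = math.sqrt(_var(dx) / activity)
    complexity = math.sqrt(_var(ddx) / _var(dx)) / mobility
    return activity, mobility, complexity
```

In the claimed pipeline these features would be computed per window on the AF7 and AF8 channels after the thresholding step has discarded noisy windows.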
  9. The method for emotion intervention based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein the comprehensive evaluation of the emotion intervention effect in step (5) comprises two modes: long-term pre- and post-intervention evaluation and single pre- and post-intervention evaluation; the long-term mode analyzes the cumulative effect of multiple interventions, evaluating the overall change trend of the subject's emotional state by comparing questionnaire information and resting-state electroencephalogram features collected in the initial stage and after multiple interventions; the single mode analyzes the change in the subject's state before and after a single sand table intervention, evaluating the instant influence of a single intervention on the subject's emotional level and related physiological state by comparing questionnaire information and resting-state and task-state electroencephalogram features collected before and after that intervention.
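For the single pre-/post-intervention comparison, a paired statistic over questionnaire scores is a natural starting point. The patent does not name a statistical test, so the paired t statistic below is only one plausible choice, sketched for illustration.

```python
import math

def paired_t(pre, post):
    """Paired t statistic for pre- vs post-intervention scale scores.

    A negative value with depression/anxiety scales (where lower is
    better) would indicate improvement after the intervention.
    """
    d = [b - a for a, b in zip(pre, post)]   # per-subject differences
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)
```

The same comparison could be run on resting-state EEG features; the long-term mode would instead pair the initial-stage and final-stage measurements.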
  10. The emotion intervention method based on the wearable BCI and interactive mixed augmented reality according to claim 1, wherein the analysis of sand table interaction style characteristics in step (5) performs group identification of subjects on the basis of multimodal data; specifically, the narrative text, sand table scene layout information, electroencephalogram features and questionnaire information of the subject during the sand table intervention are first extracted and corresponding feature representations are constructed, wherein the narrative text is encoded into narrative semantic features by a text embedding model, the sand table scene layout information is characterized as layout features by scene spatial distribution and object operation descriptors, the electroencephalogram features represent the subject's neural activity state during the intervention, and the questionnaire information represents the subject's emotional state through scale scores; then dimensionality reduction is applied to the features of each modality, and the reduced multimodal features are concatenated and fused into a joint feature representation characterizing the subject's sand table interaction style; finally, a clustering algorithm is applied to the subjects on the basis of the joint feature representations to distinguish their sand table interaction styles.
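The final clustering step can be illustrated with a minimal k-means over the concatenated joint feature vectors. The patent does not specify the clustering algorithm, so k-means is just one plausible choice, and this pure-Python version is a sketch rather than a production implementation.

```python
import random

def kmeans(points, k=2, iters=20, seed=0):
    """Minimal k-means for grouping joint feature vectors.

    `points` is a list of equal-length feature vectors (lists of floats),
    e.g. the fused narrative/layout/EEG/questionnaire representations.
    Returns (labels, centers).
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # initialize from the data
    def nearest(p):
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(p, centers[c])))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[nearest(p)].append(p)     # assignment step
        centers = [                          # update step (keep empty groups)
            [sum(col) / len(g) for col in zip(*g)] if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return [nearest(p) for p in points], centers
```

In the claimed pipeline the inputs would be the dimension-reduced, concatenated multimodal features, and the resulting labels would distinguish subjects' sand table interaction styles.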

Description

Emotion intervention method based on wearable BCI and interactive mixed augmented reality

Technical Field

The invention belongs to the technical field of biological signal processing and psychological intervention, and particularly relates to an emotion intervention method based on a wearable BCI (brain-computer interface) and interactive mixed augmented reality.

Background

Emotional state has an important influence on human mental health and is closely related to affect-related disorders such as depression, anxiety and fatigue. These mental health disorders not only affect people's daily life and working state but also bring a significant social burden, which highlights the importance of developing effective and generalizable emotion-regulating interventions. Traditional emotion intervention approaches can be roughly divided into pharmaceutical intervention, neuromodulation intervention and behavioral intervention. Pharmaceutical intervention mainly acts by influencing neurochemical processes in the brain, but may cause dependence or side effects such as headache and dry mouth; neuromodulation intervention (such as electrical and magnetic stimulation) regulates neural activity by applying stimulation to specific neural sites, but generally depends on medical settings and may raise safety concerns, making it difficult to meet daily use requirements. Behavioral intervention, in contrast, alters an individual's emotional and behavioral responses through structured psychological strategies and is safer and less costly than the other intervention methods. However, many behavioral interventions still depend on delivery by professionals and fall short in convenience for daily use and in popularization, so a portable, wearable behavioral intervention solution is needed.
With the development of digital technology, interventions based on mixed augmented reality have gradually become a behavioral intervention mode with application potential. The virtual reality environment offers immersion, controllability and repeatability, which helps realize a standardized intervention flow, reduce dependence on specific intervention venues and psychological specialists, and improve the convenience of intervention. However, most existing virtual reality interventions mainly present passive content; the degree of user interaction is limited, and the sense of participation and sustained engagement easily decline during long-term use. To improve user engagement, more and more research has begun to focus on virtual reality intervention forms that support active interaction. The sand table is introduced into the virtual reality scene as a psychological intervention form with expressive and reflective properties: by constructing sand table scenes in the virtual environment, people can externalize their inner thoughts into editable scene elements and carry out iterative expression and reflection through continuous adjustment and reconstruction, thereby supporting emotion regulation. However, existing virtual reality sand table intervention technology has key limitations, specifically the following. Interaction limitation: the sand tool (miniature model) library in existing virtual reality sand table systems is usually constrained by factors such as copyright, storage, cost, and manual screening and maintenance; its scale is small and it is difficult to support rich symbolic expression, and when users cannot find a suitable object while constructing a scene, the expression process is easily interrupted and interaction continuity is damaged.
In addition, some existing systems lack efficient and adjustable operation modes and suffer from heavy operational burden and complex interaction, further weakening sustained participation and immersive experience. Personalization limitation: scene elements in existing virtual reality sand tables are mostly predefined content, with large numbers of sand tools assembled directly from common off-the-shelf resources and lacking customized design oriented to individual expression needs; these objects often carry preset semantics that do not fully match the stories and images users want to express, so users can only express themselves approximately, personalized expression capability is insufficient and narrative detail is limited, which affects the intervention effect. Lack of objective, real-time assessment: in existing virtual reality sand table intervention research, emotional state assessment usually depends on therapist observation and participants' subjective reports, and lacks th