KR-20260066912-A - APPARATUS AND METHOD FOR RECOGNIZING EMOTIONS BASED ON BRAIN WAVE USING BRAIN NETWORK COMMUNITY OF USER

KR 20260066912 A

Abstract

The present invention relates to a brainwave-based emotion recognition apparatus and method using a user's brain network communities. According to one embodiment, the apparatus comprises: a data collection unit that collects brainwave data, i.e., signals measuring the electrical activity of the brain, through multiple electrodes attached to the user's scalp, wherein multiple regions of interest (ROIs) are interconnected to form multiple communities; a data preprocessing unit that processes the collected brainwave data into input data by performing source localization, functional parcellation, and differential entropy (DE) calculation; a feature extraction unit that extracts global features by capturing the intrinsic relationships between the multiple ROIs in the input data and thereby learning brain activity characteristics across the entire brain, and extracts local features by learning emotion-related functional features within the functional communities of the input data; and an emotion recognition processing unit that integrates the global features and the local features into integrated features using multiple transformers and a loss function, and, based on the integrated features, recognizes emotions by considering both the emotion-related functional features within each of the communities and the interactions between the communities.
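The differential entropy (DE) features mentioned above are conventionally computed per frequency band under a Gaussian assumption, where DE reduces to 0.5·ln(2πeσ²). The following is a minimal NumPy/SciPy sketch of that general computation on toy data, not the patent's implementation; the sampling rate, channel count, and band boundaries are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter (SOS form for stability)."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x, axis=-1)

def differential_entropy(x):
    """DE of a Gaussian signal segment: 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x, axis=-1))

fs = 200                                    # sampling rate (Hz), illustrative
rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 2 * fs))      # toy data: 4 channels, 2 s

# Conventional EEG bands (Hz); the exact cut-offs are illustrative
bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}
de_features = np.stack([differential_entropy(bandpass(eeg, lo, hi, fs))
                        for lo, hi in bands.values()])
print(de_features.shape)   # (5, 4): one DE value per (band, channel)
```

Accumulating such (band × channel) DE maps over preset time windows yields the kind of input tensor the preprocessing unit described above produces.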

Inventors

  • 이원희 (Lee Won-hee)
  • 김태성 (Kim Tae-sung)

Assignees

  • 경희대학교 산학협력단 (Kyung Hee University Industry-Academic Cooperation Foundation)

Dates

Publication Date
2026-05-12
Application Date
2024-11-05

Claims (14)

  1. A brainwave-based emotion recognition apparatus comprising: a data collection unit that collects brainwave data, i.e., signals measuring the electrical activity of the brain, through multiple electrodes attached to a user's scalp, wherein multiple regions of interest (ROIs) are interconnected to form multiple communities; a data preprocessing unit that processes the collected brainwave data into input data by performing source localization, functional parcellation, and differential entropy (DE) calculation on the data; a feature extraction unit that extracts global features by capturing the implicit relationships between the multiple ROIs in the input data and learning brain activity characteristics across the entire brain, and extracts local features by learning emotion-related functional features within the functional communities of the input data; and an emotion recognition processing unit that integrates the global features and the local features into integrated features using a plurality of transformers and a loss function, and recognizes an emotion based on the integrated features by considering both the emotion-related functional features within each of the communities and the interactions between the communities.
  2. The apparatus of claim 1, wherein the data preprocessing unit performs the source localization by performing brainwave source analysis to restore the source electrical signals generated in each brain region and estimating them as source-level brainwave data with increased spatial expressiveness of brain activity.
  3. The apparatus of claim 2, wherein the data preprocessing unit performs the functional parcellation by grouping the source-level brainwave data into functional regions using a plurality of atlases and classifying the regions into a plurality of functional communities.
  4. The apparatus of claim 3, wherein the data preprocessing unit performs the differential entropy calculation for a plurality of frequency bands at preset time intervals on the source-level brainwave data, accumulates the calculation results, and processes them into the input data.
  5. The apparatus of claim 1, wherein the feature extraction unit, using a Spectral-Spatial Attention Module (SSAM) composed of a spectral Squeeze-and-Excitation (SE) block and a spatial SE block, compresses input features along the frequency and spatial dimensions through an average pooling layer in each SE block, learns weights through a fully connected layer and a sigmoid function, and outputs graph data in which features for recognizing emotions in the frequency and spatial domains are emphasized by multiplying the input features by the weights.
  6. The apparatus of claim 5, wherein the feature extraction unit, through a Pearson-correlation-coefficient-based Dynamical Graph Convolutional Network (PDGCN) that identifies interactions between different brain regions using an adjacency matrix whose nodes are the ROIs in the graph data, extracts the local features by capturing functional characteristics only from the ROIs within each of the plurality of communities, and extracts the global features by learning latent relationships across the plurality of communities.
  7. The apparatus of claim 1, wherein, in an integration transformer composed of Multi-Head Cross-Attention (MHCA) and a Feed-Forward Neural Network (FFNN) among the plurality of transformers, the cross-attention (CA) of the MHCA uses the local features as queries and the global features as keys and values to learn and output the global features weighted according to their relevance to each community, and the emotion recognition processing unit integrates the local features and the global features by concatenating the output with the local features, passing the result through the FFNN, and performing a weighted sum based on community characteristics.
  8. The apparatus of claim 7, wherein, in a fusion transformer composed of Multi-Head Self-Attention (MHSA) and an FFNN among the plurality of transformers, the self-attention (SA) of the MHSA captures the relationships between community-specific features to model the emotion processing formed through the complex interactions of the plurality of communities, and the emotion recognition processing unit recognizes the emotion based on the result output through the FFNN.
  9. The apparatus of claim 1, wherein the emotion recognition processing unit reduces redundant information between the global features and the local features using a difference loss function and a cross-entropy-based classification loss function included in the loss function.
  10. A brainwave-based emotion recognition method comprising: collecting, in a data collection unit, brainwave data, i.e., signals measuring the electrical activity of the brain, through multiple electrodes attached to a user's scalp, wherein multiple regions of interest (ROIs) are interconnected to form multiple communities; processing, in a data preprocessing unit, the collected brainwave data into input data by performing source localization, functional parcellation, and differential entropy (DE) calculation on the data; extracting, in a feature extraction unit, global features by capturing the implicit relationships between the multiple ROIs in the input data and learning brain activity characteristics across the entire brain, and local features by learning emotion-related functional features within the functional communities of the input data; and integrating, in an emotion recognition processing unit, the global features and the local features into integrated features using a plurality of transformers and a loss function, and recognizing an emotion based on the integrated features by considering both the emotion-related functional features within each of the communities and the interactions between the communities.
  11. The method of claim 10, wherein processing the collected brainwave data into the input data comprises: performing the source localization by performing brainwave source analysis to restore the source electrical signals generated in each brain region and estimating them as source-level brainwave data with increased spatial expressiveness of brain activity; performing the functional parcellation by grouping the source-level brainwave data into functional regions using a plurality of atlases and classifying the regions into a plurality of functional communities; and performing the differential entropy calculation for a plurality of frequency bands at preset time intervals on the source-level brainwave data, accumulating the calculation results, and processing them into the input data.
  12. The method of claim 10, wherein extracting the global features and the local features comprises: using a Spectral-Spatial Attention Module (SSAM) composed of a spectral Squeeze-and-Excitation (SE) block and a spatial SE block, compressing input features along the frequency and spatial dimensions through an average pooling layer in each SE block, learning weights through a fully connected layer and a sigmoid function, and outputting graph data in which features for recognizing emotions in the frequency and spatial domains are emphasized by multiplying the input features by the weights; and through a Pearson-correlation-coefficient-based Dynamical Graph Convolutional Network (PDGCN) that identifies interactions between different brain regions using an adjacency matrix whose nodes are the ROIs in the graph data, extracting the local features by capturing functional characteristics only from the ROIs within each of the plurality of communities, and extracting the global features by learning latent relationships across the plurality of communities.
  13. The method of claim 10, wherein integrating the global features and the local features into the integrated features and recognizing the emotion comprises: in an integration transformer composed of Multi-Head Cross-Attention (MHCA) and a Feed-Forward Neural Network (FFNN) among the plurality of transformers, using, by the cross-attention (CA) of the MHCA, the local features as queries and the global features as keys and values, concatenating the output with the local features, passing the result through the FFNN, and performing a weighted sum based on community characteristics to integrate the local features and the global features; and in a fusion transformer composed of Multi-Head Self-Attention (MHSA) and an FFNN among the plurality of transformers, capturing, by the self-attention (SA) of the MHSA, the relationships between community-specific features to model the emotion processing formed through the complex interactions of the plurality of communities, and recognizing the emotion based on the result output through the FFNN.
  14. The method of claim 10, wherein integrating the global features and the local features into the integrated features and recognizing the emotion comprises reducing redundant information between the global features and the local features using a difference loss function and a cross-entropy-based classification loss function included in the loss function.
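Claims 5 and 12 describe squeeze-and-excitation (SE) gating: average-pool the input, learn weights through a fully connected layer and a sigmoid, then multiply the input by the weights. Below is a minimal NumPy sketch of one such SE gate along a single (e.g. spatial) axis, with random stand-in weights; the dimensions and ReLU bottleneck are illustrative assumptions, not the patent's disclosed architecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_gate(x, w_reduce, w_restore):
    """Squeeze-and-Excitation along the first axis of x.

    Squeeze: average-pool each channel to one scalar.
    Excite:  two dense layers (ReLU bottleneck, then sigmoid) produce a
             per-channel weight in (0, 1).
    Scale:   multiply the input by the weights, emphasizing informative
             channels and suppressing the rest.
    """
    squeezed = x.mean(axis=1)                                   # (channels,)
    gates = sigmoid(w_restore @ np.maximum(w_reduce @ squeezed, 0.0))
    return x * gates[:, None]

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 16))               # e.g. 8 channels x 16 features
w_reduce = rng.standard_normal((2, 8)) * 0.5   # bottleneck (reduction ratio 4)
w_restore = rng.standard_normal((8, 2)) * 0.5
out = se_gate(x, w_reduce, w_restore)          # same shape as x, re-weighted
```

Applying one such gate along the frequency axis and another along the spatial axis gives the spectral/spatial pair the SSAM describes.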
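Claims 7 and 13 describe cross-attention in which the local (community) features act as queries and the global features as keys and values, after which the output is concatenated with the local features. Here is a single-head sketch of that mechanism; head counts, token dimensions, and the FFNN stage are assumptions or omissions relative to the claimed MHCA.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(local, global_, wq, wk, wv):
    """Single-head cross-attention: local features query the global features,
    so the output is global information weighted by its relevance to each
    local (community) token."""
    q, k, v = local @ wq, global_ @ wk, global_ @ wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

rng = np.random.default_rng(3)
d = 8
local = rng.standard_normal((4, d))      # e.g. 4 community-level tokens
global_ = rng.standard_normal((10, d))   # e.g. 10 whole-brain tokens
wq, wk, wv = (rng.standard_normal((d, d)) * 0.3 for _ in range(3))
attended = cross_attention(local, global_, wq, wk, wv)
# As in the claims, the attended output is concatenated with the local
# features before a feed-forward stage (omitted here).
fused = np.concatenate([local, attended], axis=-1)
```

The fusion transformer of claims 8 and 13 then applies self-attention over such per-community fused tokens to capture cross-community interactions.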
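Claims 9 and 14 combine a cross-entropy classification loss with a difference loss that suppresses redundancy between the global and local features. The patent does not give the formula, so the sketch below uses one common formulation of a difference loss (the squared Frobenius norm of the cross-correlation between row-normalized feature sets) purely as an illustration; the batch sizes, class count, and α weighting are hypothetical.

```python
import numpy as np

def difference_loss(global_feats, local_feats):
    """||G^T L||_F^2 over row-normalized features; driving this toward zero
    pushes the global and local features to carry non-redundant information."""
    g = global_feats / (np.linalg.norm(global_feats, axis=1, keepdims=True) + 1e-8)
    l = local_feats / (np.linalg.norm(local_feats, axis=1, keepdims=True) + 1e-8)
    return float(np.sum((g.T @ l) ** 2))

def cross_entropy(logits, labels):
    """Standard classification loss over emotion classes."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(-log_probs[np.arange(len(labels)), labels].mean())

rng = np.random.default_rng(4)
g = rng.standard_normal((16, 8))          # batch of global features
l = rng.standard_normal((16, 8))          # batch of local features
logits = rng.standard_normal((16, 3))     # e.g. 3 emotion classes
labels = rng.integers(0, 3, size=16)
alpha = 0.1                               # loss weighting, a hyperparameter
total_loss = cross_entropy(logits, labels) + alpha * difference_loss(g, l)
```

Minimizing the weighted sum trains the classifier while discouraging the two feature streams from duplicating each other.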

Description

Apparatus and Method for Recognizing Emotions Based on Brainwaves Using a User's Brain Network Community

The present invention relates to a brainwave-based emotion recognition apparatus and method using a user's brain network communities, and more specifically, to a technology that can effectively recognize a user's emotions by identifying complex brain activity patterns based on multiple brain network communities and exploiting the interactions between the functional roles of the multiple brain regions involved in emotion processing.

Emotion recognition technology is developing rapidly owing to its potential applications in fields such as mental health diagnosis, user experience evaluation, and Human-Computer Interaction (HCI). Human emotions manifest in diverse ways, including voice, facial expressions, and biosignals. Among these, biosignals are actively used as indicators for gauging emotion because they are difficult to control consciously. In particular, electroencephalography (EEG), which records brain waves, measures the brain's electrical activity through electrodes attached to the scalp; because it is easy to measure and inexpensive, numerous studies on EEG-based emotion recognition have been reported.

However, the electrical signals generated in the brain are distorted as they pass through the various tissues of the head, such as the skull and cerebrospinal fluid, before reaching the scalp. EEG therefore has low spatial resolution, which makes it difficult to develop deep-learning-based emotion recognition models that exploit functional characteristics tied to brain location.
Because of this, it is difficult with EEG to identify the exact brain region in which a signal originated, which impairs the ability to capture meaningful spatial information and to understand functional relationships in the brain. Nevertheless, existing EEG-based AI models for emotion recognition predict emotions without accounting for EEG's low spatial resolution. Furthermore, the human brain forms functional communities composed of interconnected regions of interest (ROIs). These communities process emotions through mutual interaction, yet most EEG-based emotion recognition models do not actively exploit the emotion-related functional characteristics of the various communities.

FIG. 1 is a diagram illustrating a brainwave-based emotion recognition apparatus using a user's brain network communities according to an embodiment of the present invention. FIGS. 2A to 3 illustrate the operation of a data preprocessing unit of the apparatus according to an embodiment. FIGS. 4A to 4C illustrate the operation of a feature extraction unit of the apparatus according to an embodiment. FIGS. 5 and 6 illustrate the operation of an emotion recognition processing unit of the apparatus according to an embodiment. FIG. 7 illustrates the use of a loss function in the apparatus according to an embodiment. FIGS. 8A and 8B illustrate the performance of the apparatus according to an embodiment. FIG. 9 is a diagram illustrating a brainwave-based emotion recognition method using a user's brain network communities according to an embodiment of the present invention.
Hereinafter, various embodiments of this document are described with reference to the attached drawings. The embodiments and the terms used therein are not intended to limit the technology described herein to specific embodiments, and should be understood to include various modifications, equivalents, and/or substitutions of those embodiments. In describing the various embodiments below, detailed descriptions of related known functions or configurations are omitted where they could unnecessarily obscure the essence of the invention. Furthermore, the terms used below are defined in view of their functions in the various embodiments and may vary depending on the intentions or practices of users or operators; their definitions should therefore be based on the content of this specification as a whole. In the description of the drawings, similar reference numerals may be used for similar components. A singular expression may include a plural expression unless the context clearly indicates otherwise.