CN-122004863-A - Electroencephalogram emotion recognition method based on GCN-BiLSTM hybrid neural network
Abstract
The invention discloses an electroencephalogram (EEG) emotion recognition method based on a GCN-BiLSTM hybrid neural network, relating to the technical field of artificial intelligence. The method comprises: S1, obtaining original EEG data and preprocessing it; S2, constructing an EEG channel graph structure reflecting the connection relations among EEG channels according to their spatial distribution when the EEG data set was collected; S3, inputting the EEG channel graph structure into a graph convolutional network to obtain an EEG spatial feature sequence; S4, describing the dynamic variation characteristics of the EEG signals in the time dimension with a bidirectional long short-term memory network; S5, constructing an emotion classification module based on the output of the bidirectional long short-term memory network, judging the emotion state corresponding to the input EEG signals, and outputting the emotion recognition result. The invention reduces the dependence on complex manual feature engineering, improves the stability and applicability of EEG emotion recognition, and has good engineering realizability and value for popularization and application.
Inventors
- ZHU WENZHAO
- WANG ZHIWEI
- ZHANG RUIXI
- CHEN XIAOLING
- XIE PING
Assignees
- Yanshan University (燕山大学)
Dates
- Publication Date
- 20260512
- Application Date
- 20260401
Claims (8)
- 1. An EEG emotion recognition method based on a GCN-BiLSTM hybrid neural network, characterized by comprising the following specific steps:
  Step S1, acquiring original EEG data from a public data set and preprocessing it to obtain standardized EEG data;
  Step S2, constructing an EEG channel graph structure reflecting the connection relations between EEG channels according to their spatial distribution during acquisition of the EEG data set;
  Step S3, inputting the EEG channel graph structure into a graph convolutional network to obtain an EEG spatial feature sequence reflecting the global spatial dependency relations;
  Step S4, inputting the spatial feature sequence output by the graph convolutional network into a bidirectional long short-term memory network, and describing the dynamic variation characteristics of the EEG signal in the time dimension through joint modeling of forward and reverse temporal information;
  Step S5, constructing an emotion classification module based on the output of the bidirectional long short-term memory network, judging the emotion state corresponding to the input EEG signal, and outputting the emotion recognition result.
- 2. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 1, wherein in step S1 the EEG signals are preprocessed, including denoising, filtering, artifact suppression, signal normalization, and segmentation, and the preprocessed EEG signals are divided into a plurality of time slices.
- 3. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 2, wherein in step S2 each EEG channel is regarded as a graph node and the connection relationship between EEG channels is represented as graph edges, forming an EEG channel graph structure that represents the spatial topological characteristics of the EEG, specifically as follows:
  Step S21, letting the EEG data set comprise N EEG channels in total, the node set is:
    V = {v_1, v_2, ..., v_N};
  wherein V is the node set of the EEG channels, v_i is the i-th EEG channel, corresponding to one graph node, i = 1, 2, ..., N, N is the number of EEG channels, and i is an integer;
  Step S22, assigning corresponding spatial coordinates to each EEG channel according to its spatial position on the scalp in the EEG acquisition system:
    p_i = (x_i, y_i, z_i);
  wherein p_i represents the position coordinates of the i-th EEG channel in three-dimensional space;
  Step S23, defining, based on the spatial distance between channels, the distance between any two EEG channels v_i and v_j as:
    d_ij = ||p_i - p_j||_2;
  wherein d_ij is the distance between EEG channels v_i and v_j, p_j is the position coordinate of the j-th EEG channel in three-dimensional space, ||·||_2 is the L2 norm, and j is an integer;
  Step S24, constructing the connection relations between EEG channels according to the spatial distances between them, and defining the adjacency matrix of the graph:
    A_ij = f(d_ij), if d_ij ≤ θ;
    A_ij = 0, otherwise;
  wherein A is the adjacency matrix of the EEG channel graph, A_ij is the element in the i-th row and j-th column of the adjacency matrix, θ is the distance threshold, and f(·) is a weight function of distance.
- 4. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 3, wherein in step S3 a convolution operation is performed on the EEG signal features over the graph structure to mine the spatial correlation characteristics among different EEG channels, specifically as follows:
  Step S31, the EEG channel graph obtained in step S2 is expressed as:
    G = (V, E);
  wherein G is the EEG channel graph and E is the set of connection relations between channels;
  Step S32, the features of the preprocessed EEG signals are expressed as:
    X ∈ R^(N×F);
  wherein X is the feature matrix of the preprocessed EEG signals and F is the feature dimension of each channel;
  Step S33, in the graph convolutional network, an identity matrix I is introduced when normalizing the graph structure to obtain an adjacency matrix with self-connections:
    A~ = A + I;
  wherein A~ is the adjacency matrix with self-connections;
  Step S34, calculating the corresponding degree matrix:
    D~_ii = Σ_j A~_ij;
  wherein D~_ii is the degree of the i-th electrode node taking self-connections into account, and A~_ij is the connection weight from the i-th node to the j-th node in the adjacency matrix with self-connections;
  Step S35, obtaining the single-layer graph convolution operation formula:
    H^(l+1) = σ( D~^(-1/2) A~ D~^(-1/2) H^(l) W^(l) );
  wherein H^(l) is the input feature matrix of the l-th graph convolutional layer, and when l = 0, H^(0) = X; W^(l) is the trainable weight parameter matrix of the l-th layer, D~ is the degree matrix, and σ(·) represents a nonlinear activation function.
- 5. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 4, wherein step S4 specifically comprises:
  Step S41, letting the spatial feature sequence output by the graph convolutional network in step S3 over consecutive time slices be expressed as:
    S = {s_1, s_2, ..., s_T}, s_t ∈ R^d;
  wherein S is the collection of spatial features output by the graph convolutional network over consecutive time slices, s_t is the EEG spatial feature vector corresponding to the t-th time step, T is the length of the time sequence, d is the dimension of the spatial features, and t = 1, 2, ..., T;
  Step S42, modeling the forward long short-term memory network, in which the spatial feature sequence is input into the network in temporal order and the hidden-state update process is expressed as:
    h_t^f = LSTM_f(s_t, h_(t-1)^f);
  wherein h_t^f is the hidden state of the forward network at time step t, s_t is the spatial feature vector input into the network at the current time step t, and LSTM_f(·) is the mapping function of the forward long short-term memory network;
  Step S43, modeling the reverse long short-term memory network, in which the spatial feature sequence is input into the network in reverse temporal order and the hidden-state update process is expressed as:
    h_t^b = LSTM_b(s_t, h_(t+1)^b);
  wherein h_t^b is the hidden state of the reverse network at time step t, and LSTM_b(·) is the mapping function of the reverse long short-term memory network;
  Step S44, performing joint modeling of the forward and reverse temporal information, and fusing the outputs of the forward and reverse long short-term memory networks to obtain the bidirectional temporal feature representation:
    h_t = F(h_t^f, h_t^b);
  wherein h_t is the EEG temporal feature vector corresponding to the t-th time step and F(·) is the feature fusion function.
- 6. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 5, wherein step S5 specifically comprises:
  Step S51, letting the output features of the bidirectional long short-term memory network in step S4 over the time sequence be expressed as:
    H = {h_1, h_2, ..., h_T};
  wherein H is the temporal feature sequence and h_T is the EEG temporal feature vector corresponding to the T-th time step;
  Step S52, aggregating the temporal feature sequence to obtain a global temporal feature vector:
    g = Agg(h_1, h_2, ..., h_T);
  wherein g is the global temporal feature vector and Agg(·) is the temporal feature aggregation function;
  Step S53, constructing an emotion classification function based on the global temporal feature vector g, and judging the emotion state corresponding to the EEG signal.
- 7. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 6, wherein in step S53 the emotion state discrimination formula is:
    ŷ = Φ(g), ŷ ∈ R^C;
  wherein ŷ is the emotion classification prediction result vector, C is the predefined number of emotion categories, ŷ_c is the predicted result that the input EEG signal belongs to the c-th emotion state, c = 1, 2, ..., C, and Φ(·) represents the emotion classification mapping function.
- 8. The EEG emotion recognition method based on the GCN-BiLSTM hybrid neural network according to claim 7, wherein the final emotion recognition result is determined from the prediction result vector according to:
    y* = argmax_c ŷ_c;
  wherein y* is the emotion category label obtained by discrimination and argmax(·) is the maximum-value index function.
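The preprocessing of claim 2 can be illustrated with a short sketch. This is not taken from the patent: the denoising, filtering, and artifact-suppression stages are omitted, and per-channel z-scoring is assumed for the unspecified "signal normalization"; only normalization and segmentation into time slices are shown.

```python
import numpy as np

def preprocess(eeg, win_len):
    """Z-score each channel over the whole recording, then cut the signal
    into non-overlapping time slices of win_len samples (claim 2).
    eeg: (channels, samples) -> returns (n_slices, channels, win_len)."""
    mu = eeg.mean(axis=1, keepdims=True)
    sd = eeg.std(axis=1, keepdims=True) + 1e-8   # avoid division by zero
    z = (eeg - mu) / sd                          # assumed normalization: z-score
    n = (z.shape[1] // win_len) * win_len        # drop the incomplete tail
    slices = z[:, :n].reshape(z.shape[0], -1, win_len)
    return slices.transpose(1, 0, 2)             # (n_slices, channels, win_len)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 1000))            # e.g. 32 channels, 1000 samples
slices = preprocess(eeg, win_len=128)            # 7 complete slices of 128 samples
```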
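The channel-graph construction of claim 3 (steps S21–S24) can be sketched as follows. The patent leaves the distance weight function f(·) unspecified; a Gaussian kernel exp(-d²/(2σ²)) is assumed here purely for illustration, as are the threshold and width values.

```python
import numpy as np

def build_adjacency(coords, theta=1.0, sigma=0.5):
    """Thresholded, distance-weighted adjacency matrix (claim 3).

    coords : (N, 3) array of electrode positions p_i = (x_i, y_i, z_i) (S22).
    theta  : distance threshold; channels with d_ij > theta get A_ij = 0 (S24).
    sigma  : width of the assumed Gaussian weight f(d) = exp(-d^2 / (2 sigma^2)).
    """
    diff = coords[:, None, :] - coords[None, :, :]   # pairwise p_i - p_j
    dist = np.linalg.norm(diff, axis=-1)             # S23: d_ij = ||p_i - p_j||_2
    adj = np.where(dist <= theta,
                   np.exp(-dist**2 / (2 * sigma**2)), 0.0)
    np.fill_diagonal(adj, 0.0)       # self-connections are added later (step S33)
    return adj

# Three toy electrode positions: the first two are close, the third is far away.
coords = np.array([[0.0, 0.0, 0.0],
                   [0.5, 0.0, 0.0],
                   [3.0, 0.0, 0.0]])
A = build_adjacency(coords, theta=1.0, sigma=0.5)
```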
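The single-layer graph convolution of claim 4 (steps S33–S35) is the standard symmetrically normalized form; the sketch below assumes ReLU for the unspecified activation σ(·).

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer (steps S33-S35):
    H_next = sigma(D~^(-1/2) (A + I) D~^(-1/2) H W), ReLU assumed for sigma."""
    A_tilde = A + np.eye(A.shape[0])             # S33: adjacency with self-connections
    deg = A_tilde.sum(axis=1)                    # S34: degree of each node
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))     # D~^(-1/2)
    H_next = D_inv_sqrt @ A_tilde @ D_inv_sqrt @ H @ W
    return np.maximum(H_next, 0.0)               # ReLU activation

# Tiny example: 3 channels in a path graph, 4 input features, 2 output features.
rng = np.random.default_rng(1)
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
H0 = rng.standard_normal((3, 4))                 # H^(0) = X (step S35)
W0 = rng.standard_normal((4, 2))
H1 = gcn_layer(H0, A, W0)
```

With no edges and an identity weight matrix the layer reduces to ReLU(X), which is a quick sanity check on the normalization.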
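The bidirectional recurrence of claim 5 (steps S42–S44) can be outlined as below. To keep the sketch short, a plain tanh recurrent cell stands in for the full LSTM gating, and concatenation is assumed for the feature fusion function; both are simplifications of the patent's BiLSTM.

```python
import numpy as np

def recur(S, Wx, Wh, reverse=False):
    """One recurrent pass h_t = tanh(s_t Wx + h_prev Wh), processed in
    temporal order (S42) or reverse temporal order (S43)."""
    T = S.shape[0]
    d_h = Wh.shape[0]
    h = np.zeros(d_h)
    out = np.zeros((T, d_h))
    order = reversed(range(T)) if reverse else range(T)
    for t in order:
        h = np.tanh(S[t] @ Wx + h @ Wh)
        out[t] = h
    return out

def bidirectional_features(S, Wx_f, Wh_f, Wx_b, Wh_b):
    """S44: fuse forward and backward hidden states; concatenation is
    assumed for the fusion function, h_t = [h_t^f ; h_t^b]."""
    Hf = recur(S, Wx_f, Wh_f, reverse=False)     # forward pass
    Hb = recur(S, Wx_b, Wh_b, reverse=True)      # backward pass
    return np.concatenate([Hf, Hb], axis=-1)

rng = np.random.default_rng(2)
T, d_in, d_h = 5, 4, 3                           # toy sequence of 5 time steps
S = rng.standard_normal((T, d_in))               # spatial features from the GCN
Wx_f, Wx_b = rng.standard_normal((2, d_in, d_h))
Wh_f, Wh_b = rng.standard_normal((2, d_h, d_h))
H = bidirectional_features(S, Wx_f, Wh_f, Wx_b, Wh_b)
```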
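Finally, the classification head of claims 6–8 (steps S51–S53) can be sketched as follows. Mean pooling over time is assumed for the unspecified aggregation function Agg(·), and a linear layer followed by softmax for the classification mapping Φ(·); both are illustrative choices, not mandated by the claims.

```python
import numpy as np

def classify(H, Wc, bc):
    """Steps S52-S53 plus claims 7-8, with assumed Agg = mean pooling and
    Phi = linear layer + softmax. H: (T, d) temporal feature sequence."""
    g = H.mean(axis=0)                           # S52: global temporal feature vector
    logits = g @ Wc + bc
    e = np.exp(logits - logits.max())            # numerically stable softmax
    y_hat = e / e.sum()                          # claim 7: prediction vector over C emotions
    return y_hat, int(np.argmax(y_hat))          # claim 8: y* = argmax_c y_hat_c

rng = np.random.default_rng(3)
H = rng.standard_normal((5, 6))                  # T = 5 time steps, d = 6 features
Wc = rng.standard_normal((6, 4))                 # C = 4 emotion categories (hypothetical)
bc = np.zeros(4)
y_hat, label = classify(H, Wc, bc)
```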
Description
Electroencephalogram emotion recognition method based on GCN-BiLSTM hybrid neural network

Technical Field

The invention relates to the technical field of artificial intelligence, and in particular to an electroencephalogram (EEG) emotion recognition method based on a GCN-BiLSTM hybrid neural network.

Background

Emotion is a core factor influencing human perception, cognition, and social behavior. Developing intelligent systems capable of accurately understanding and actively responding to human emotion is a key direction in the field of Human-Computer Interaction (HCI) for making interaction more humanized and intelligent, and emotion recognition technology provides the core technical support for this direction. The EEG signal, as a core physiological index reflecting the emotional state of the human body, has the notable advantages of objectivity and real-time availability; emotion recognition based on EEG signals has therefore become a research hotspot, with broad application prospects in emerging fields such as wearable devices, affective computing, and intelligent human-computer interaction.
Current EEG emotion recognition technology mainly falls into three types of schemes, each with clear technical shortcomings. Methods based on machine learning have a clear structure and low computational complexity, but depend heavily on manual feature extraction, have difficulty characterizing the complex nonlinear relations of EEG signals, and show a sharp drop in recognition accuracy in cross-subject scenarios. Among methods based on a single deep learning model, CNNs excel only at spatial feature extraction and model temporal dependencies insufficiently, while RNNs can capture temporal features but are weak at spatial feature expression; a single model cannot account for both the spatial distribution and the temporal dynamics of EEG signals. Methods based on complex network structures or feature fusion strategies attempt to enhance feature expression capability, but do not solve the core technical problems, and their generalization capability and engineering practicability remain limited.
The common pain points of existing EEG emotion recognition technology are even more prominent and have become key factors limiting its practical deployment. On one hand, the multi-channel nature of EEG means that complex spatial associations and topological relations exist between electrodes; existing methods cannot model this characteristic effectively, leading to insufficient spatial feature expression capability. On the other hand, emotional states exhibit clear temporal dynamic evolution, and the prior art does not fully characterize the long-term dependencies of EEG in the time dimension, making it difficult to accurately reflect the dynamic changes of emotion. In addition, some methods pursue performance with overly complicated network structures, which increases model training difficulty, reduces generalization capability, and prevents adaptation to practical application scenarios with limited resources and high real-time requirements. Therefore, there is a need for an EEG emotion recognition method that can effectively model both the spatial correlation characteristics and the temporal dynamic characteristics of EEG signals while combining recognition performance with engineering realizability.

Disclosure of Invention

The invention aims to provide an EEG emotion recognition method based on a GCN-BiLSTM hybrid neural network, which effectively models the multi-channel spatial correlation characteristics of EEG signals and characterizes their temporal dynamic characteristics, thereby improving the stability, applicability, and engineering realizability of EEG emotion recognition while keeping the model structure simple, and realizing effective recognition of emotion states.
In order to achieve the above purpose, the invention provides an EEG emotion recognition method based on a GCN-BiLSTM hybrid neural network, comprising the following steps: Step S1, acquiring original EEG data from a public data set and preprocessing it to obtain standardized EEG data; Step S2, constructing an EEG channel graph structure reflecting the connection relations between EEG channels according to their spatial distribution during acquisition of the EEG data set; Step S3, inputting the EEG channel graph structure into a graph convolutional network (GCN) to