
CN-116184397-B - Non-contact emotion recognition method based on frequency modulation continuous wave radar


Abstract

The invention discloses a non-contact emotion recognition method based on a frequency modulated continuous wave radar, belonging to the field of non-contact emotion monitoring within the upgrading of traditional industries by high and new technology. The method mainly comprises the following steps: (1) a transmitting antenna array transmits frequency modulated continuous wave signals and a receiving antenna array receives the echoes; (2) the echo signal from the chest wall of the monitored subject is extracted from the radar echo signals by the MVDR algorithm; (3) the motion signal of the chest wall is demodulated from the phase of the radar echo signal; (4) a respiratory signal is extracted from the chest wall motion signal by a filter; (5) the durations of successive heartbeats are extracted from the chest wall motion signal; (6) physiological features for emotion recognition are calculated; and (7) a random forest machine learning model is trained to realize emotion recognition. The invention adopts microwaves as the detection medium and offers convenience of use, low environmental requirements, rapid measurement, and comparatively high accuracy.

Inventors

  • LIU TAO
  • HAN XIANGYU
  • ZHAI QIAN
  • HAN YI

Assignees

  • Zhejiang University (浙江大学)

Dates

Publication Date
2026-05-05
Application Date
2023-03-15

Claims (10)

  1. A non-contact emotion recognition method based on a frequency modulated continuous wave radar, characterized by comprising the following steps: S1, a transmitting antenna array transmits periodic linear frequency modulated waves toward the region where a monitored object is located, and the corresponding echo signals are acquired through a receiving antenna array; S2, constructing a virtual antenna array according to the geometric characteristics of the transmitting antenna array and the receiving antenna array, and extracting the radar echo signal arriving from the direction of the monitored object from the echo signals acquired in step S1 by an MVDR beamforming method based on the virtual antenna array; S3, demodulating the chest wall motion signal of the monitored object from the phase of the radar echo signal; S4, extracting a respiratory signal from the chest wall motion signal through a filter; S5, extracting the durations of successive heartbeats from the chest wall motion signal of the monitored object by a non-contact heart rate variability monitoring method based on the frequency modulated continuous wave radar; S6, calculating time-domain features, frequency-domain features, and nonlinear-domain features based on the chest wall motion signal of step S3, the respiratory signal of step S4, and the durations of successive heartbeats of step S5, and constructing a feature vector for emotion recognition; and S7, training a random forest machine learning model for emotion recognition based on the feature vector of step S6, the model being used to complete the emotion classification task. The nonlinear-domain features include: the approximate entropy, the first-order detrended fluctuation analysis coefficient, and the second-order detrended fluctuation analysis coefficient of each of the chest wall motion signal, the respiratory signal, and the series of successive heartbeat durations; and, for the Poincaré scatter plot of the successive heartbeat durations, the short-axis standard deviation SD1, the long-axis standard deviation SD2, the product of SD1 and SD2, and the ratio of SD1 to SD2.
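The last four nonlinear features are the standard Poincaré-plot descriptors of the beat-to-beat intervals. As an illustrative sketch, not the patent's implementation (the `poincare_features` helper and the synthetic interval series are assumptions):

```python
import numpy as np

def poincare_features(ibi):
    """Poincare-plot descriptors of a beat-to-beat interval series.

    SD1 measures short-term variability (spread perpendicular to the
    identity line of the scatter plot), SD2 long-term variability
    (spread along it).
    """
    x, y = ibi[:-1], ibi[1:]              # successive interval pairs
    sd1 = np.std((y - x) / np.sqrt(2))    # short-axis standard deviation
    sd2 = np.std((y + x) / np.sqrt(2))    # long-axis standard deviation
    return sd1, sd2, sd1 * sd2, sd1 / sd2

rng = np.random.default_rng(0)
ibi = 0.8 + 0.05 * rng.standard_normal(300)   # synthetic intervals, seconds
sd1, sd2, prod, ratio = poincare_features(ibi)
```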
  2. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein the chirp frequency of the linear frequency modulated wave ranges from 77 GHz to 81 GHz, the single-chirp duration is 50 μs, the fast-time sampling frequency is 4 MHz with 128 sampling points, and the slow-time sampling frequency is 100 Hz.
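From these chirp parameters a few derived radar quantities follow directly. In the sketch below the chirp-slope and maximum-range figures assume the 4 GHz sweep occupies the full 50 μs, which the claim does not state explicitly:

```python
# Derived quantities for the chirp configuration of claim 2
c = 3e8                          # speed of light, m/s
f_start, f_stop = 77e9, 81e9
B = f_stop - f_start             # sweep bandwidth: 4 GHz
range_res = c / (2 * B)          # range resolution: c / 2B

T_chirp = 50e-6                  # single-chirp duration
fs_fast = 4e6                    # fast-time sampling frequency
n_samples = 128                  # fast-time samples per chirp
sample_span = n_samples / fs_fast    # fast-time window inside the chirp

slope = B / T_chirp              # assumed chirp slope (full-sweep assumption)
max_beat = fs_fast / 2           # max unambiguous beat frequency (real ADC)
max_range = max_beat * c / (2 * slope)
```

With these numbers the range resolution works out to 3.75 cm and the sampled window to 32 μs of the 50 μs chirp.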
  3. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein the transmitting antenna array comprises 3 transmitting antennas with a fixed horizontal spacing between adjacent transmitting antennas, the receiving antenna array comprises 4 receiving antennas with a fixed horizontal spacing between adjacent receiving antennas, and the virtual antenna array comprises 12 virtual antennas located in the same plane; taking the virtual antenna numbered 1 as the origin of coordinates, the coordinates of the virtual antennas numbered 2-12 are defined in terms of λ, wherein λ is the wavelength of the chirped wave.
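The 3 × 4 MIMO geometry can be illustrated by placing a virtual element at the vector sum of each transmit and receive position. The λ/2 receive spacing and 2λ transmit spacing used here are assumptions typical of such arrays, not values from the claim:

```python
import numpy as np

lam = 3e8 / 79e9                 # wavelength at the 77-81 GHz band centre
d_rx = lam / 2                   # assumed receive spacing
d_tx = 4 * d_rx                  # assumed transmit spacing (= 2 * lam)

tx = np.array([[i * d_tx, 0.0] for i in range(3)])
rx = np.array([[i * d_rx, 0.0] for i in range(4)])

# each (tx, rx) pair contributes one virtual element at tx + rx
virtual = np.array([p + q for p in tx for q in rx])
virtual -= virtual[0]            # virtual antenna 1 as coordinate origin
```

Under these spacings the 12 virtual elements fall on a uniform line with λ/2 pitch, which is what makes the beamforming of claim 5 straightforward.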
  4. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 3, wherein the MVDR beamforming method in step S2 uses only the echo signals received by the virtual antennas numbered 1-8 in the virtual antenna array.
  5. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein the MVDR beamforming method in step S2 is specifically as follows: S2-1, denote the ADC-converted radar echo signals received by the 8 virtual antennas of the virtual antenna array as x1(t), x2(t), ..., x8(t), and record the received signal vector as x(t) = [x1(t), x2(t), ..., x8(t)]^T; S2-2, based on the pre-measured direction angle θ of the monitored object relative to the virtual antenna array, construct the steering vector a(θ) for beamforming; S2-3, based on the steering vector a(θ) of step S2-2, create the optimal weight vector w = R^(-1)a(θ) / (a(θ)^H R^(-1) a(θ)), wherein the matrix R is the covariance matrix of x(t), calculated as R = E[x(t) x(t)^H]; S2-4, based on the optimal weight vector w of step S2-3, extract the radar echo signal from the direction of the monitored object as y(t) = w^H x(t).
  6. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein step S3 is specifically as follows: S3-1, the radar echo signal y(t) extracted in step S2 is a complex exponential signal with the mathematical expression y(t) = A·exp(j(2πft + φ(t))), wherein A is a constant representing the amplitude of the signal, f represents the frequency of the signal, and φ(t) represents the phase of the signal; S3-2, solve for the phase φ(t) of y(t) of step S3-1 by means of the arctangent function and unwrap it; S3-3, based on the result of step S3-2, obtain the chest wall motion signal x(t) of the monitored subject according to the relationship φ(t) = 4πx(t)/λ between the phase and the chest wall motion, i.e. x(t) = λφ(t)/(4π), wherein λ is the wavelength of the chirped wave.
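The arctangent demodulation of S3-1 to S3-3 can be sketched as follows; the 79 GHz carrier and the synthetic breathing-like displacement are assumptions for illustration:

```python
import numpy as np

lam = 3e8 / 79e9          # assumed carrier wavelength (band centre of claim 2)
fs = 100.0                # slow-time sampling rate from claim 2

# synthetic complex echo whose phase carries a 2 mm chest-wall motion
t = np.arange(0, 10, 1 / fs)
x_true = 2e-3 * np.sin(2 * np.pi * 0.25 * t)   # breathing-like displacement
y = np.exp(1j * 4 * np.pi * x_true / lam)      # phase modulation of S3-1

phi = np.unwrap(np.angle(y))   # arctangent demodulation + unwrapping (S3-2)
x_est = lam * phi / (4 * np.pi)                # phase-to-displacement (S3-3)
```

Note that a 2 mm displacement already swings the phase by several radians at these wavelengths, so the unwrapping step is essential.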
  7. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein the filter is a Butterworth band-pass filter with an order of 20 and a passband covering the respiratory frequency band.
  8. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein the non-contact heart rate variability monitoring method based on the frequency modulated continuous wave radar in step S5 is specifically as follows: S8-1, transmitting periodic linear frequency modulated waves toward the area where the monitored object is located, and acquiring the corresponding radar echo signals with a receiving antenna; S8-2, preprocessing the acquired radar echo signals by frequency mixing and fast Fourier transform in sequence; S8-3, extracting the position of the heart region of the monitored object from the preprocessed radar echo signal; S8-4, extracting the motion information of the heart region of the monitored object from the position obtained in step S8-3 by a phase correlation method, and calculating an acceleration signal of the heart region from the motion information; S8-5, smoothing the obtained acceleration signal by calculating its short-time average power, and estimating the positions of the dividing points between heartbeats by a peak detection method; S8-6, generating an acceleration template signal of a single heartbeat based on the dividing-point positions estimated in step S8-5, and then precisely segmenting the acceleration signal obtained in step S8-4 with the acceleration template signal to obtain the interval of each heartbeat; and S8-7, calculating heart rate variability indexes based on the heartbeat intervals obtained in step S8-6, thereby realizing non-contact heart rate variability monitoring.
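The smoothing-plus-peak-detection idea of S8-5 can be sketched on a synthetic heart acceleration signal; the signal model, window length, and peak-detection thresholds below are assumptions, and the template-matching refinement of S8-6 is omitted:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0
t = np.arange(0, 30, 1 / fs)
hr = 1.2                                   # assumed heart rate: 72 bpm

# synthetic heart-region acceleration: one sharp oscillatory burst per beat
acc = np.exp(-0.5 * ((t % (1 / hr)) / 0.02) ** 2) * np.sin(2 * np.pi * 20 * t)

# S8-5: smooth via short-time average power, then detect beat boundaries
win = int(0.1 * fs)                        # assumed 100 ms power window
power = np.convolve(acc ** 2, np.ones(win) / win, mode="same")
peaks, _ = find_peaks(power, distance=int(0.5 * fs), height=0.1 * power.max())

ibi = np.diff(peaks) / fs                  # beat-to-beat durations, seconds
```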
  9. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein step S6 is specifically as follows: based on the chest wall motion signal of step S3, the respiratory signal of step S4, and the durations of successive heartbeats of step S5, compute 26 time-domain features, 23 frequency-domain features, and 13 nonlinear features, and assemble them into the feature vector of the sample set. The time-domain features include: the variance of the signal; the mean of the absolute value of the first-order difference and its normalized counterpart; the mean of the absolute value of the second-order difference and its normalized counterpart; the mean and variance of the time intervals between adjacent peak points and between adjacent valley points; the inverse of the mean of the time intervals between adjacent peak points and between adjacent valley points; the mean, the inverse of the mean, and the standard deviation of the signal; the root mean square of the first-order difference; the skewness and kurtosis of the signal; and the proportion of terms of the first-order difference whose absolute value exceeds a given threshold. The frequency-domain features include: the mean amplitude of the frequency components of the spectrum within each of several frequency bands; the ratio between the mean amplitudes of the frequency components of two bands; the sum of the amplitudes of the frequency components within each of several bands; the frequency with the largest amplitude among the frequency components within each of several bands; and the ratio between the sums of the amplitudes of the frequency components of two bands.
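A few of the time-domain descriptors above can be sketched as follows; this is illustrative only, and the `time_domain_features` helper, the 0.01 difference threshold, and the test signal are assumptions rather than values from the patent:

```python
import numpy as np
from scipy.signal import find_peaks

def time_domain_features(sig, fs, diff_thresh=0.01):
    """A handful of claim-9-style time-domain descriptors (illustrative)."""
    d1 = np.diff(sig)                    # first-order difference
    peaks, _ = find_peaks(sig)
    valleys, _ = find_peaks(-sig)
    pp = np.diff(peaks) / fs             # intervals between adjacent peaks
    vv = np.diff(valleys) / fs           # intervals between adjacent valleys
    return {
        "var": np.var(sig),
        "std": np.std(sig),
        "mean_abs_d1": np.mean(np.abs(d1)),
        "mean_abs_d1_norm": np.mean(np.abs(d1)) / np.std(sig),
        "rms_d1": np.sqrt(np.mean(d1 ** 2)),
        "mean_pp": pp.mean(), "var_pp": pp.var(), "rate_pp": 1 / pp.mean(),
        "mean_vv": vv.mean(), "var_vv": vv.var(), "rate_vv": 1 / vv.mean(),
        "frac_large_d1": np.mean(np.abs(d1) > diff_thresh),
    }

fs = 100.0
t = np.arange(0, 60, 1 / fs)
feats = time_domain_features(np.sin(2 * np.pi * 0.25 * t), fs)
```

For a pure 0.25 Hz sine the peak-interval descriptors recover the 4 s period directly, which is a convenient sanity check on the implementation.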
  10. The non-contact emotion recognition method based on the frequency modulated continuous wave radar according to claim 1, wherein in the random forest machine learning model of step S7, the number of decision trees is 200, and every child node generated when a node branches is required to contain at least 2 samples.
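This configuration maps naturally onto scikit-learn's `RandomForestClassifier` via `n_estimators=200` and `min_samples_leaf=2`; the feature matrix below is a synthetic stand-in for the claim-9 feature vectors, not patent data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# synthetic stand-in: 200 samples of a 62-dimensional feature vector
# (26 time-domain + 23 frequency-domain + 13 nonlinear features)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 62))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy "emotion" labels

# claim 10: 200 trees, every child node must keep at least 2 samples
clf = RandomForestClassifier(n_estimators=200, min_samples_leaf=2,
                             random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)            # training accuracy on the toy data
```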

Description

Non-contact emotion recognition method based on frequency modulation continuous wave radar

Technical Field

The invention belongs to the field of non-contact emotion monitoring within the upgrading of traditional industries by high and new technology, and in particular relates to a non-contact emotion recognition method based on a frequency modulated continuous wave radar.

Background

In recent years, research on non-contact emotion recognition has grown, and a variety of non-contact emotion recognition methods have been developed. Traditional emotion recognition is mainly achieved by monitoring electroencephalography (EEG) or facial expression. In practice, however, both have significant limitations. For EEG, the monitored subject must wear an electrode cap to acquire the signals; the cumbersome donning procedure, which requires professional expertise, greatly restricts the settings in which this approach can be used, and the discomfort caused by the cap can itself alter the subject's emotional state. Facial-expression recognition places requirements on ambient brightness and is susceptible to deliberate deception by the monitored subject, such as a forced smile masking the true emotion. These limitations greatly restrict the use of emotion recognition in everyday life, even though accurately recognizing the emotion of a monitored subject is of great significance in modern life.
For example, applied to psychological diagnosis and treatment, emotion recognition can help a clinician grasp the emotional state of a patient more accurately; applied to the psychological assessment of soldiers, it can support better-targeted counseling and improve combat readiness; and applied to home scenarios, it allows smart appliances to adjust their working state, such as air-conditioning temperature or light brightness, according to the occupant's mood. On the other hand, physiological features (e.g., respiration and heart rate variability) are harder to manipulate subjectively than facial expressions, so emotion recognition based on such physiological features is, in theory, more accurate. Moreover, when a frequency modulated continuous wave radar is used, measurement of these physiological features is unaffected by ambient lighting conditions. Accordingly, the emotion recognition method based on the frequency modulated continuous wave radar provided by the invention adopts microwaves as the detection medium and offers convenience of use, low environmental requirements, rapid measurement, and comparatively high accuracy.

Disclosure of Invention

To address the low measurement accuracy and restricted conditions of use of existing non-contact emotion recognition methods, the invention provides a non-contact emotion recognition method based on a frequency modulated continuous wave radar.
The specific technical scheme adopted by the invention is as follows. The invention provides a non-contact emotion recognition method based on a frequency modulated continuous wave radar, comprising the following steps: S1, a transmitting antenna array transmits periodic linear frequency modulated waves toward the region where a monitored object is located, and the corresponding echo signals are acquired through a receiving antenna array; S2, constructing a virtual antenna array according to the geometric characteristics of the transmitting antenna array and the receiving antenna array, and extracting the radar echo signal arriving from the direction of the monitored object from the echo signals acquired in step S1 by an MVDR beamforming method based on the virtual antenna array; S3, demodulating the chest wall motion signal of the monitored object from the phase of the radar echo signal; S4, extracting a respiratory signal from the chest wall motion signal through a filter; S5, extracting the durations of successive heartbeats from the chest wall motion signal of the monitored object by a non-contact heart rate variability monitoring method based on the frequency modulated continuous wave radar; S6, calculating time-domain, frequency-domain, and nonlinear-domain features based on the chest wall motion signal of step S3, the respiratory signal of step S4, and the durations of successive heartbeats of step S5, and constructing a feature vector for emotion recognition; and S7, training a random forest machine learning model for emotion recognition based on the feature vector of step S6.