
CN-122025141-A - Real-time analysis method for facial micro-expressions, limb behaviors, and psychology

CN 122025141 A

Abstract

The invention relates to the technical field of psychological analysis and discloses a method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology. A psychological analysis model is built containing five modules: a user real-time dynamic acquisition module, a user real-time dynamic preprocessing module, a user psychological real-time analysis module, a user psychological real-time evaluation module, and a user psychological improvement module. The acquisition module collects raw data, and the preprocessing module preprocesses it. The analysis module combines the raw data acquired in real time with the user's historical data to calculate a comprehensive characteristic value of facial micro-expressions, a comprehensive characteristic value of limb behaviors, and a comprehensive mood index. The evaluation module assesses the user's current mood from these results, and the improvement module provides personalized psychological-state improvement measures based on the current assessment result, the current environmental conditions, and the user's personal information.
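The five-module pipeline described in the abstract can be sketched as follows. The patent discloses no code, so every function name, weight, threshold, and data value below is an illustrative assumption, not part of the invention as claimed.

```python
# Illustrative sketch of the five-module pipeline from the abstract.
# All names, weights, thresholds, and data values are assumptions.

def acquire():
    """Acquisition module: raw multi-modal samples (dummy values)."""
    return {"face": [0.42, 0.51, 0.47], "body": [0.30, 0.28, 0.33]}

def preprocess(raw):
    """Preprocessing module: cleaning/standardization, here a simple mean-centering."""
    out = {}
    for channel, samples in raw.items():
        mean = sum(samples) / len(samples)
        out[channel] = [s - mean for s in samples]
    return out

def analyze(data, history):
    """Analysis module: fuse real-time data with historical data into a mood index."""
    face_feature = sum(abs(s) for s in data["face"]) + 0.1 * history.get("face", 0.0)
    body_feature = sum(abs(s) for s in data["body"]) + 0.1 * history.get("body", 0.0)
    return 0.6 * face_feature + 0.4 * body_feature  # fusion weights are assumed

def evaluate(mood_index, positive=0.5, negative=0.2):
    """Assessment module: map the comprehensive mood index to a mood class."""
    if mood_index > positive:
        return "positive"
    if mood_index < negative:
        return "negative"
    return "neutral"

def improve(mood_class, environment, profile):
    """Improvement module: pick a measure from the assessment plus context."""
    if mood_class == "negative":
        return f"suggest a calming activity for {profile['name']} in a {environment} setting"
    return "no intervention needed"

history = {"face": 0.4, "body": 0.3}
mood = evaluate(analyze(preprocess(acquire()), history))
print(improve(mood, "quiet", {"name": "user"}))
```

The per-module decomposition mirrors the abstract's architecture; only the internal arithmetic is invented for the sake of a runnable example.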

Inventors

  • ZHAO YUE
  • ZHANG YONGREN

Assignees

  • 安徽宝葫芦信息科技集团股份有限公司

Dates

Publication Date
2026-05-12
Application Date
2026-02-03

Claims (10)

  1. A method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology, characterized by comprising the following steps: step one, establishing a psychological analysis model and setting up in the model a user real-time dynamic acquisition module, a user real-time dynamic preprocessing module, a user psychological real-time analysis module, a user psychological real-time assessment module, and a user psychological state improvement module; step two, the user real-time dynamic acquisition module acquires facial micro-expression and limb behavior data of the user in real time through multi-modal sensors; step three, the user real-time dynamic preprocessing module cleans, denoises, and standardizes the acquired raw data; step four, the user psychological real-time analysis module performs comprehensive calculation based on the facial micro-expression and limb behavior data acquired in real time, combined with the user's historical data; step five, the user psychological real-time assessment module assesses the user's current mood according to the calculation results; step six, the user psychological state improvement module provides personalized psychological-state improvement measures based on the current assessment result, combined with the current environmental conditions and the user's personal information.
  2. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 1, wherein the user real-time dynamic acquisition module comprises a facial micro-expression data unit and a limb behavior data unit.
  3. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 2, wherein the facial micro-expression data unit acquires facial micro-expression data, including fine changes of the eyes, eyebrows, mouth, and ears, through a multispectral camera and a biological radar.
  4. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 2, wherein the limb behavior data unit acquires limb behavior data, including hand gestures, leg postures, upper body posture, lower body posture, and head and foot movements, through a 3D depth sensor.
  5. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 1, wherein the user psychological real-time analysis module comprises a facial micro-expression analysis unit, a limb behavior analysis unit, and a comprehensive mood analysis unit.
  6. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 5, wherein the facial micro-expression analysis unit calculates a facial micro-expression comprehensive characteristic value from the facial micro-expression data according to a formula [omitted in the source] whose variables denote: the facial micro-expression comprehensive characteristic value; the eye micro-movement frequency per unit time, together with its minimum and maximum values per unit time; the glabellar muscle contraction frequency per unit time, together with its minimum and maximum values per unit time; the mouth-corner angular displacement amplitude; the number of ear retractions; the respective weighting coefficients of the eye, eyebrow, mouth, and ear components in the comprehensive characteristic value; and a time decay factor.
  7. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 5, wherein the limb behavior analysis unit calculates a limb behavior comprehensive characteristic value from the limb behavior data according to a formula [omitted in the source], in which k ∊ {gesture, leg, upper body posture, lower body posture, head, foot} and the variables denote the limb behavior comprehensive characteristic value, the motion amplitude of each part, its motion frequency per unit time, and the part weighting coefficient.
  8. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 5, wherein the comprehensive mood analysis unit calculates a comprehensive mood index from the facial micro-expression comprehensive characteristic value and the limb behavior comprehensive characteristic value according to a formula [omitted in the source] whose variables denote: the comprehensive mood index; the facial micro-expression comprehensive characteristic value; the limb behavior comprehensive characteristic value; the respective fusion weights of the facial micro-expression and limb behavior modalities; and an environmental correction term.
  9. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 1, wherein the user psychological real-time assessment module classifies the user's current mood based on the comprehensive mood index as follows: (1) when the comprehensive mood index exceeds a first set threshold, the mood is judged to be positive; (2) when the comprehensive mood index lies between a second set threshold and the first set threshold, inclusive, the mood is judged to be neutral; (3) when the comprehensive mood index falls below the second set threshold, the mood is judged to be negative.
  10. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology according to claim 1, wherein the user psychological state improvement module recommends differentiated improvement measures according to the mood classification result of the user psychological real-time assessment module, in combination with environmental conditions and the user's personal information.
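The formulas in claims 7-9 appear only as images in the original and are not reproduced in the text. The sketch below is therefore one plausible weighted-sum reading of those claims, not the patented formulas: a per-part amplitude-times-frequency sum for the limb feature (claim 7), a two-modality fusion with an environmental correction term (claim 8), and a two-threshold mood classification (claim 9). All weights and thresholds are assumed.

```python
# Assumed weighted-sum reading of claims 7-9; the patent's actual formulas
# are images and are not reproduced in the source text.

def limb_feature(parts, weights):
    """Claim 7 sketch: weighted sum over body parts of amplitude x frequency."""
    return sum(weights[k] * amp * freq for k, (amp, freq) in parts.items())

def mood_index(face_value, limb_value, w_face=0.6, w_limb=0.4, env_correction=0.0):
    """Claim 8 sketch: fuse the two modal feature values plus an environmental term."""
    return w_face * face_value + w_limb * limb_value + env_correction

def classify(index, t_high=0.7, t_low=0.3):
    """Claim 9: positive above the upper threshold, negative below the lower one."""
    if index > t_high:
        return "positive"
    if index < t_low:
        return "negative"
    return "neutral"

# Dummy per-part (motion amplitude, frequency per unit time) pairs over the
# claim-7 part set {gesture, leg, upper body, lower body, head, foot}.
parts = {
    "gesture": (0.8, 1.2),
    "leg": (0.4, 0.5),
    "upper_body": (0.6, 0.9),
    "lower_body": (0.3, 0.4),
    "head": (0.5, 1.0),
    "foot": (0.2, 0.6),
}
weights = {k: 1.0 / len(parts) for k in parts}  # equal weights, assumed

b = limb_feature(parts, weights)
e = mood_index(face_value=0.5, limb_value=b, env_correction=-0.05)
print(classify(e))
```

The environmental correction term is modeled here as a simple additive offset; the patent only names the term without specifying its form.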

Description

Real-time analysis method for facial micro-expressions, limb behaviors, and psychology

Technical Field

The invention relates to the technical field of psychological analysis, and in particular to a method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology.

Background

In modern society, people face various pressures and emotional challenges, and mental health problems are attracting increasing attention. Traditional emotion analysis methods rely mainly on subjective self-assessment or post hoc interviews, making it difficult to capture the user's emotional state objectively and in real time. In recent years, with the rapid development of computer vision, sensor technology, and artificial intelligence, emotion analysis based on facial micro-expressions and limb behaviors has become a research hotspot. Facial micro-expressions and limb behaviors are important nonverbal cues of emotional expression and can reflect a person's true psychological state. However, most existing analysis methods focus only on single-modality data, lack comprehensive analysis of multi-modal data, and fail to fully consider the influence of environmental factors on emotion. Developing a method capable of accurately analyzing the user's psychological state in real time therefore has significant scientific and practical value.

Disclosure of Invention

(I) Technical problems solved

Aiming at the defects of the prior art, the invention provides a method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology. It offers multi-modal data fusion, environment-adaptive compensation, real-time dynamic evaluation, and personalized intervention, solving the problems that traditional methods depend on single-modality data, ignore environmental interference, and cannot capture true emotional states in real time.
(II) Technical scheme

To achieve the above purpose, the invention provides the following technical scheme. The method for the real-time analysis of facial micro-expressions, limb behaviors, and psychology comprises the following steps: step one, establishing a psychological analysis model and setting up in the model a user real-time dynamic acquisition module, a user real-time dynamic preprocessing module, a user psychological real-time analysis module, a user psychological real-time assessment module, and a user psychological state improvement module; step two, the user real-time dynamic acquisition module acquires facial micro-expression and limb behavior data of the user in real time through multi-modal sensors; step three, the user real-time dynamic preprocessing module cleans, denoises, and standardizes the acquired raw data; step four, the user psychological real-time analysis module performs comprehensive calculation based on the facial micro-expression and limb behavior data acquired in real time, combined with the user's historical data; step five, the user psychological real-time assessment module assesses the user's current mood according to the calculation results; step six, the user psychological state improvement module provides personalized psychological-state improvement measures based on the current assessment result, combined with the current environmental conditions and the user's personal information.

Preferably, the user real-time dynamic acquisition module comprises a facial micro-expression data unit and a limb behavior data unit. Preferably, the facial micro-expression data unit acquires facial micro-expression data, including fine changes of the eyes, eyebrows, mouth, and ears, through a multispectral camera and a biological radar.
Preferably, the limb behavior data unit acquires limb behavior data, including hand gestures, leg postures, upper body posture, lower body posture, and head and foot movements, through a 3D depth sensor. Preferably, the user psychological real-time analysis module comprises a facial micro-expression analysis unit, a limb behavior analysis unit, and a comprehensive mood analysis unit. Preferably, the facial micro-expression analysis unit calculates a facial micro-expression comprehensive characteristic value from the facial micro-expression data according to a formula [omitted in the source] whose variables denote: the facial micro-expression comprehensive characteristic value; the eye micro-movement frequency per unit time, together with its minimum and maximum values per unit time; the glabellar muscle contraction frequency per unit time, together with its minimum and maximum values per unit time; the mouth-corner angular displacement amplitude; the number of ear retractions; the respective weighting coefficients of the eye, eyebrow, mouth, and ear components in the comprehensive characteristic value; and a time decay factor.
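The facial-feature formula itself is an image in the original and is not reproduced in the text. The sketch below is an illustrative assumption consistent with the listed variables: the eye and eyebrow frequencies are min-max normalized using their stated per-unit-time extremes, combined with the mouth and ear terms under the four weighting coefficients, and scaled by an exponential time decay. None of these functional forms are disclosed in the patent.

```python
import math

# Hedged sketch of the claim-6 facial feature: min-max normalization and
# exponential decay are illustrative assumptions, not the patented formula.

def normalize(value, lo, hi):
    """Scale a frequency into [0, 1] using its per-unit-time min and max."""
    return (value - lo) / (hi - lo) if hi > lo else 0.0

def facial_feature(eye_freq, eye_min, eye_max,
                   brow_freq, brow_min, brow_max,
                   mouth_amplitude, ear_retractions,
                   w_eye=0.3, w_brow=0.3, w_mouth=0.2, w_ear=0.2,
                   decay=0.1, elapsed=0.0):
    """Weighted sum of eye, eyebrow, mouth, and ear terms with time decay."""
    eye_term = normalize(eye_freq, eye_min, eye_max)
    brow_term = normalize(brow_freq, brow_min, brow_max)
    score = (w_eye * eye_term + w_brow * brow_term
             + w_mouth * mouth_amplitude + w_ear * ear_retractions)
    return score * math.exp(-decay * elapsed)  # decay form is assumed

f = facial_feature(eye_freq=12, eye_min=5, eye_max=20,
                   brow_freq=3, brow_min=1, brow_max=6,
                   mouth_amplitude=0.4, ear_retractions=1,
                   elapsed=2.0)
print(round(f, 4))
```

Normalizing each frequency against its observed extremes keeps heterogeneous facial channels on a comparable scale before weighting, which is one common reason such min/max variables appear alongside a weighted feature value.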