
JP-7854861-B2 - Emotion estimation device, emotion estimation method, emotion estimation program, and emotion estimation system

JP7854861B2

Inventors

  • 長谷川 渉
  • 村下 君孝
  • 加藤 徹洋
  • 橋本 和真

Assignees

  • 株式会社デンソーテン

Dates

Publication Date
2026-05-07
Application Date
2022-06-03

Claims (5)

  1. An emotion estimation device that estimates emotions, the device having a controller, wherein the controller: acquires a user's first biosignal and second biosignal; calculates a first index value based on the acquired first biosignal; calculates a second index value based on the acquired second biosignal; until a predetermined number of coordinate values, each composed of the first index value and the second index value at the same time point, have been calculated, estimates the user's emotion based on a pattern of change of the coordinate values; after the predetermined number of coordinate values have been calculated, calculates a first correction value and a second correction value using the average of the first index values and the average of the second index values over the calculated time points; calculates a first corrected index value by subtracting the first correction value from the first index value; calculates a second corrected index value by subtracting the second correction value from the second index value; and estimates the user's emotion based on the first corrected index value and the second corrected index value.
  2. An emotion estimation method performed by a controller, the method comprising: acquiring a user's first biosignal and second biosignal; calculating a first index value based on the acquired first biosignal; calculating a second index value based on the acquired second biosignal; until a predetermined number of coordinate values, each composed of the first index value and the second index value at the same time point, have been calculated, estimating the user's emotion based on a pattern of change of the coordinate values; after the predetermined number of coordinate values have been calculated, calculating a first correction value and a second correction value using the average of the first index values and the average of the second index values over the calculated time points; calculating a first corrected index value by subtracting the first correction value from the first index value; calculating a second corrected index value by subtracting the second correction value from the second index value; and estimating the user's emotion based on the first corrected index value and the second corrected index value.
  3. An emotion estimation program that causes a computer to execute a process of: acquiring a user's first biosignal and second biosignal; calculating a first index value based on the acquired first biosignal; calculating a second index value based on the acquired second biosignal; until a predetermined number of coordinate values, each composed of the first index value and the second index value at the same time point, have been calculated, estimating the user's emotion based on a pattern of change of the coordinate values; after the predetermined number of coordinate values have been calculated, calculating a first correction value and a second correction value using the average of the first index values and the average of the second index values over the calculated time points; calculating a first corrected index value by subtracting the first correction value from the first index value; calculating a second corrected index value by subtracting the second correction value from the second index value; and estimating the user's emotion based on the first corrected index value and the second corrected index value.
  4. An emotion estimation system comprising an emotion estimation device that estimates emotions and a terminal device, wherein the terminal device has a terminal controller, and the terminal controller: receives a user's biosignals from a biosensor; transmits the received biosignals to the emotion estimation device; and receives estimated-emotion information transmitted from the emotion estimation device; the emotion estimation device has an emotion estimation controller, and the emotion estimation controller: acquires the user's first biosignal and second biosignal from the terminal device; calculates a first index value based on the acquired first biosignal; calculates a second index value based on the acquired second biosignal; until a predetermined number of coordinate values, each composed of the first index value and the second index value at the same time point, have been calculated, estimates the user's emotion based on a pattern of change of the coordinate values; after the predetermined number of coordinate values have been calculated, calculates a first correction value and a second correction value using the average of the first index values and the average of the second index values over the calculated time points; calculates a first corrected index value by subtracting the first correction value from the first index value; calculates a second corrected index value by subtracting the second correction value from the second index value; estimates the user's emotion based on the first corrected index value and the second corrected index value; and transmits estimated-emotion information concerning the estimated emotion to the terminal device; and the terminal device outputs information indicating the user's emotion estimated by the emotion estimation device.
  5. An emotion estimation system comprising a game device on which a user plays a game, an emotion estimation device that estimates emotions, and a terminal device that manages the game, wherein the game device has a game machine controller, and the game machine controller: measures the user's biosignals using a biosensor; transmits the measured biosignals to the terminal device; receives estimated-emotion information transmitted from the terminal device; and provides the received estimated-emotion information to the user; the terminal device has a terminal controller, and the terminal controller: receives the user's biosignals from the game device; transmits the received biosignals to the emotion estimation device; receives estimated-emotion information transmitted from the emotion estimation device; and transmits the received estimated-emotion information to the game device; the emotion estimation device has an emotion estimation controller, and the emotion estimation controller: acquires the user's first biosignal and second biosignal from the terminal device; calculates a first index value based on the acquired first biosignal; calculates a second index value based on the acquired second biosignal; until a predetermined number of coordinate values, each composed of the first index value and the second index value at the same time point, have been calculated, estimates the user's emotion based on a pattern of change of the coordinate values; after the predetermined number of coordinate values have been calculated, calculates a first correction value and a second correction value using the average of the first index values and the average of the second index values over the calculated time points; calculates a first corrected index value by subtracting the first correction value from the first index value; calculates a second corrected index value by subtracting the second correction value from the second index value; estimates the user's emotion based on the first corrected index value and the second corrected index value; and transmits estimated-emotion information concerning the estimated emotion to the terminal device; and the terminal device outputs information indicating the user's emotion estimated by the emotion estimation device.
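The correction step recited in the claims can be sketched in code. The following is an illustrative sketch only, not the patented implementation: the class name, the method names, and the default threshold `n_required` are assumptions, and the "pattern of change" estimation used before the threshold is reached is omitted.

```python
class IndexCorrector:
    """Illustrative sketch of the claimed baseline correction: collect
    (first index, second index) coordinate values over time; once a
    predetermined number have been calculated, derive correction values
    as the averages over those time points and subtract them."""

    def __init__(self, n_required=30):
        self.n_required = n_required  # predetermined number (assumed value)
        self.first_history = []
        self.second_history = []
        self.first_correction = None
        self.second_correction = None

    def add(self, first_index, second_index):
        """Record one coordinate value and return a (possibly corrected)
        coordinate for emotion estimation."""
        self.first_history.append(first_index)
        self.second_history.append(second_index)
        if len(self.first_history) < self.n_required:
            # Not enough samples yet: per the claims, emotion would be
            # estimated from the pattern of change of the raw coordinates.
            return (first_index, second_index)
        if self.first_correction is None:
            # Correction values: averages of the index values over the
            # calculated time points.
            self.first_correction = sum(self.first_history) / len(self.first_history)
            self.second_correction = sum(self.second_history) / len(self.second_history)
        # Corrected index value = index value minus its correction value.
        return (first_index - self.first_correction,
                second_index - self.second_correction)
```

Centering each axis on the user's own average in this way removes per-user baseline offsets, so that the same psychological-plane regions can be reused across users.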

Description

The present invention relates to an emotion estimation device, an emotion estimation method, an emotion estimation program, and an emotion estimation system.

A technique is known for estimating a subject's emotions by applying information obtained from the subject's cardiac waveform (electrocardiogram) to the Russell circumplex model (see, for example, Patent Document 1).

Patent Document 1: Japanese Patent Publication No. 2019-63324

Figure 1 shows an example of the configuration of the estimation system according to the embodiment.
Figure 2 shows an example of a server configuration according to this embodiment.
Figure 3 shows an example of an emotion estimation model (psychological plane).
Figure 4 shows an example of an emotion estimation model that uses indicators of physical condition as parameters.
Figure 5 shows an example of a sensor table and a psychological plane table.
Figure 6 shows an example of an emotion change table.
Figure 7 illustrates the method for extracting emotion types.
Figure 8 shows an example of an evaluation screen and a diagram illustrating the usage of information related to emotions.
Figure 9 shows an example of an evaluation screen and a diagram illustrating the usage of information related to emotions.
Figure 10 illustrates a method for calibrating an emotion estimation model.
Figure 11 illustrates the training method for machine learning models.
Figure 12 is a flowchart showing the model generation process.
Figure 13 is a flowchart showing the emotion estimation process for estimating emotional states.

Embodiments of the emotion estimation device, emotion estimation method, and emotion estimation system are described in detail below with reference to the attached drawings. The present invention is not, however, limited to the embodiments described below.

First, the estimation system according to the embodiment is explained using Figure 1. Figure 1 is a diagram showing an example configuration of the estimation system according to the embodiment.
As shown in Figure 1, the estimation system 1 includes a server 10, a terminal device 20, and sensors 31a, 32a, 31b, and 32b. The estimation system 1 estimates the emotions of subjects U02a and U02b. The server 10 is an example of an emotion estimation device.

The subjects U02a and U02b are, for example, e-sports players, and the estimation system 1 estimates their emotions while they are playing a video game. In this description of the embodiment, to make the explanation concrete and easier to understand, an e-sports application scenario is assumed and explained, including state transitions and the like.

The results of the emotion estimation are used, for example, in the mental training of subjects U02a and U02b for e-sports. For instance, if subject U02 experiences unfavorable emotions (anxiety, anger, etc.) during video game play, it is determined that intensive training corresponding to that emotional state is necessary, and the emotion information of subjects U02a and U02b estimated by the estimation system 1 is used in that training.

Furthermore, in various types of e-sports, subjects U02a and U02b participate in cooperative and competitive games. By displaying the emotional state of each player in such games, more advanced gameplay becomes possible, such as changing game tactics according to a player's emotional state. It also enhances the enjoyment of watching games, since spectators can understand the emotional state of each player.

As another application example, the subjects may be patients in a medical institution. In this case, the emotions estimated by the estimation system 1 are used for examinations, treatments, and the like. For example, if a patient (subject) is feeling anxious, medical staff can provide support such as counseling.

Furthermore, the subjects may be students in an educational institution.
In this case, the emotions estimated by the estimation system 1 are used to improve lesson content. For example, if a student (subject) finds a lesson boring, the teacher can improve the lesson content to make it more engaging.

Furthermore, the subjects may be vehicle drivers. In this case, the emotions estimated by the estimation system 1 are used to promote safe driving. For example, if a subject (driver) is not adequately focused while driving, the in-vehicle system can display a message encouraging them to concentrate on driving.

Furthermore, the subjects may be viewers of content such as videos and music. In this case, the emotions estimated by the estimation system 1 are used to create further content. For example, a video content provider can create a highlight reel by compiling scenes that viewers (subjects) found enjoyable. In this way, estimated emotions can be used for a variety of purposes.

Returning to the system description, the server 10 and the terminal device 20 are connected via a network N. For example, the network N is the In
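The Russell circumplex model referenced in the description places emotions on a two-dimensional psychological plane. A minimal, hypothetical quadrant classifier is sketched below; the axis names (valence, arousal) and the four emotion labels are illustrative assumptions commonly associated with the circumplex model, not values taken from this patent, whose actual model uses the corrected index values and the tables shown in Figures 3 to 6.

```python
def classify_emotion(valence, arousal):
    """Coarse quadrant mapping on a Russell-style psychological plane.

    valence: pleasant (positive) vs. unpleasant (negative), baseline-corrected
    arousal: activated (positive) vs. deactivated (negative), baseline-corrected
    """
    if arousal >= 0:
        return "excited" if valence >= 0 else "distressed"
    return "relaxed" if valence >= 0 else "depressed"
```

In such a scheme, the baseline correction described in the claims matters because the quadrant boundaries sit at zero: subtracting each user's average index values re-centers their coordinates so the same boundaries apply to every user.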