JP-2026074671-A - Emotion estimation device, emotion estimation program, training data generation method, and training data generation program

JP 2026074671 A

Abstract

[Problem] To provide an emotion estimation device, an emotion estimation program, a training data generation method, and a training data generation program that can improve the accuracy of emotion estimation. [Solution] The emotion estimation device according to the embodiment is an emotion estimation device that estimates the emotion information of a vehicle driver based on a facial image of the driver, and has a controller. The controller acquires information on external factors that affect the facial image, and estimates the emotion information based on the facial image and the information on external factors. [Selection Diagram] Figure 1B

Inventors

  • 橋本 和真
  • 加藤 徹洋
  • 村下 君孝

Assignees

  • DENSO TEN Limited (株式会社デンソーテン)

Dates

Publication Date
2026-05-07
Application Date
2024-10-21

Claims (8)

  1. An emotion estimation device that estimates emotion information of a vehicle driver based on a facial image of the driver, the emotion estimation device comprising a controller, wherein the controller acquires information on an external factor affecting the facial image and estimates the emotion information based on the facial image and the external factor information.
  2. The emotion estimation device according to claim 1, wherein the external factor information is encoded information indicating whether an external factor affecting the facial image has occurred.
  3. The emotion estimation device according to claim 1, wherein the controller estimates the emotion information based on facial feature quantities obtained by quantifying features in the facial image.
  4. The emotion estimation device according to claim 1, wherein, if the external factor information indicates that an external factor affecting the facial image has occurred, the controller estimates that there has been no change in the driver's emotion.
  5. The emotion estimation device according to claim 4, wherein the controller acquires the turning state of the vehicle, and the external factor information indicates that an external factor has occurred if the turning state of the driver's face direction, determined from the facial image, matches the turning state of the vehicle.
  6. An emotion estimation program executed by a controller of an emotion estimation device that estimates emotion information of a vehicle driver based on a facial image of the driver, the program causing the controller to acquire information on an external factor affecting the facial image and to estimate the emotion information based on the facial image and the external factor information.
  7. A training data generation method executed by a controller of a training data generation device that generates training data for a learning model that estimates emotion information of a vehicle driver based on a facial image of the driver, the method comprising acquiring information on an external factor affecting the facial image and generating the training data based on the facial image and the external factor information.
  8. A training data generation program executed by a controller of a training data generation device that generates training data for a learning model that estimates emotion information of a vehicle driver based on a facial image of the driver, the program causing the controller to estimate emotion information based on biometric information of a subject, acquire a facial image of the subject, acquire information on an external factor affecting the facial image of the subject, and generate the training data with the emotion information as the target variable and the facial image and the external factor information as explanatory variables.
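The external-factor handling in claims 2, 4, and 5 can be illustrated with a minimal sketch. All names, thresholds, and the classifier interface below are illustrative assumptions, not part of the publication:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    face_yaw_rate: float     # deg/s, estimated from consecutive face images (assumed)
    vehicle_yaw_rate: float  # deg/s, from vehicle state information (assumed)

def external_factor_flag(frame: Frame, tol: float = 5.0) -> int:
    """Claim 5 idea: if the driver's face turns together with the vehicle,
    treat the head movement as an external factor. The result is encoded
    as 1/0 per claim 2; the tolerance is a hypothetical value."""
    return int(abs(frame.face_yaw_rate - frame.vehicle_yaw_rate) <= tol)

def estimate_emotion(face_features: list[float], flag: int, model=None) -> str:
    """Claim 4 idea: when an external factor has occurred, assume no change
    in the driver's emotion instead of trusting the model output."""
    if flag == 1:
        return "no_change"
    # Hypothetical trained model over facial feature quantities (claim 3);
    # any classifier could stand in here.
    return model.predict(face_features) if model else "neutral"
```

A frame where the face turns at roughly the vehicle's yaw rate would set the flag and suppress the emotion update; a face turn with the vehicle going straight would not.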

Description

This invention relates to an emotion estimation device, an emotion estimation program, a training data generation method, and a training data generation program.

Conventionally, there are technologies that estimate a driver's emotions from facial expressions and other characteristics. Among these, techniques have been proposed that use AI (Artificial Intelligence) models to estimate a driver's emotions from facial images. For example, Patent Document 1 discloses a technology that identifies the orientation of the face from a facial image and estimates emotions based on that orientation.

Patent Document 1: Japanese Patent No. 7358956

  • Figure 1A is an explanatory diagram of the emotion estimation process according to the embodiment.
  • Figure 1B is an explanatory diagram of the emotion estimation process according to the embodiment.
  • Figure 2 shows an example of the configuration of an emotion estimation device according to the embodiment.
  • Figure 3 shows an example of training data generated during the learning process.
  • Figure 4 shows an example of inference input data generated during the inference process.
  • Figure 5 is a flowchart showing the processing procedure of the learning process performed by the emotion estimation device according to the embodiment.
  • Figure 6 is a flowchart showing the processing procedure of the inference process performed by the emotion estimation device according to the embodiment.

The emotion estimation device, emotion estimation program, training data generation method, and training data generation program according to the following embodiments will be described in detail with reference to the attached drawings. However, this invention is not limited to the embodiments described below.

First, the emotion estimation process according to the embodiment will be explained using Figures 1A and 1B. Figures 1A and 1B are explanatory diagrams of the emotion estimation process according to the embodiment.
The emotion estimation process according to the embodiment is performed by the emotion estimation device 1. As shown in Figure 1A, the emotion estimation process inputs inference input data into an emotion estimation model and obtains emotion information from the model. The inference input data includes, for example, a facial image of the driver captured by an in-vehicle camera. The emotion information obtained through the emotion estimation process is output to various in-vehicle devices and vehicle control systems and used for driver assistance, such as providing driving advice and vehicle control.

Next, using Figure 1B, the learning process of the emotion estimation model and the inference process using the trained emotion estimation model will be explained. The learning and inference processes are performed by the emotion estimation device 1 according to a learning mode or an estimation mode set, for example, by user operation. Note that the learning process may be handled by a learning device and the inference process by an emotion estimation device; that is, each process may be performed by a separate device. Below, an example is given in which the emotion estimation device 1 performs both the learning and inference processes.

(Learning process) First, the learning process will be explained using Figure 1B. The learning process is executed when the learning mode is set by user operation. Specifically, as shown in Figure 1B, the learning process generates emotion information D5 by performing emotion estimation processing C2 based on the driver's biological information (such as brain waves and heart rate) D1 detected by the biosensor C1. Emotion estimation processing C2 estimates two emotion indices that indicate the driver's mental and physical state based on biological information such as brain waves and heart rate.
Specifically, the emotion indices are the arousal level of the central nervous system and the activity level of the autonomic nervous system. For example, the arousal level is calculated from the β/α wave ratio of the electroencephalogram, and the activity level is calculated from the standard deviation of the heart-rate LF (Low Frequency) component (the low-frequency component of the heart-rate waveform signal). Emotion (type) information is then estimated from the arousal level and the activity level. Specifically, a matrix table is constructed with the arousal level and the activity level as its two axes, and matrix table data is created that stores the emotion information corresponding to each cell defined by an arousal level and an activity level in the matrix table. The matrix table data is then searched using the arousal and activity levels calculated from the biological information D1, and the search result is determined to be the emotion information D5.

Based on the in-vehicle camera image (driver's face image) D2 captured by the in-vehicle camera C3, the exterior camera image (vehicle surroundings image) D3 captured by the exterior camera C4, and the vehicle state information (other factor informat
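The matrix-table lookup described above can be sketched as follows. The table contents, binning thresholds, and emotion labels are illustrative assumptions; the publication does not specify concrete values:

```python
import statistics

# Hypothetical 3x3 matrix table: rows indexed by arousal level,
# columns by activity level (low / medium / high).
EMOTION_TABLE = [
    ["drowsy", "calm",    "relaxed"],
    ["bored",  "neutral", "content"],
    ["tense",  "alert",   "excited"],
]

def level(value: float, low: float, high: float) -> int:
    """Bin a continuous index into low/medium/high (0/1/2)."""
    return 0 if value < low else (1 if value < high else 2)

def emotion_label(beta: float, alpha: float, lf_samples: list[float]) -> str:
    """Arousal level from the EEG beta/alpha ratio; activity level from the
    standard deviation of heart-rate LF samples. Thresholds are illustrative."""
    arousal = beta / alpha
    activity = statistics.stdev(lf_samples)
    return EMOTION_TABLE[level(arousal, 0.8, 1.2)][level(activity, 0.5, 1.5)]
```

In the learning process, a label produced this way would serve as the emotion information D5, i.e. the target variable of the generated training data.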