
JP-2026076256-A - Video and image analysis system

JP 2026076256 A

Abstract

[Problem] In situations where online communication is the primary mode of interaction, to objectively evaluate that communication in order to make it more efficient. [Solution] The disclosed system includes: a video acquisition unit that, for each of multiple users, acquires video captured of the user during an online session; an analysis unit that analyzes changes in the user's biological reactions based on the video acquired by the video acquisition unit; a character information identification unit that identifies character information having attributes corresponding to the analysis results of the analysis unit; and an output unit that outputs the identified character information. [Selected Figure] Figure 1
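The four units named in the abstract can be sketched as a simple pipeline. This is a hedged, minimal illustration only: every class, function, and mapping below is an assumption for explanatory purposes, not the patent's actual implementation.

```python
from dataclasses import dataclass

# Illustrative sketch of the four claimed units; all names and the
# emotion-to-character mapping are assumptions, not the patented method.

@dataclass
class AnalysisResult:
    user_id: str
    emotion: str        # e.g. "positive", "negative", "neutral" (assumed labels)
    intensity: float    # assumed 0.0 .. 1.0 scale

def acquire_video(user_id: str) -> list[bytes]:
    """Video acquisition unit: returns frames captured of one user (stubbed)."""
    return [b"frame0", b"frame1"]

def analyze_biological_response(user_id: str, frames: list[bytes]) -> AnalysisResult:
    """Analysis unit: derives a change in biological reaction from the frames.
    A real system would run facial-expression or vital-sign analysis; stubbed here."""
    return AnalysisResult(user_id=user_id, emotion="positive", intensity=0.8)

def identify_character(result: AnalysisResult) -> str:
    """Character information identification unit: picks character information
    whose attributes correspond to the analysis result (table is an assumption)."""
    table = {"positive": "cheerful_character",
             "negative": "downcast_character",
             "neutral": "calm_character"}
    return table[result.emotion]

def output_character(result: AnalysisResult) -> str:
    """Output unit: emits the identified character information."""
    return identify_character(result)

# One pass over several users in an online session.
characters = {uid: output_character(analyze_biological_response(uid, acquire_video(uid)))
              for uid in ["user_a", "user_b"]}
print(characters)  # {'user_a': 'cheerful_character', 'user_b': 'cheerful_character'}
```

The point of the sketch is only the data flow: per-user video in, biological-reaction analysis, then character information out in place of the raw analysis.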

Inventors

  • 神谷 渉三

Assignees

  • I'mbesideyou Inc. (株式会社I’mbesideyou)

Dates

Publication Date
2026-05-11
Application Date
2026-01-22

Claims (6)

  1. A video analysis system that analyzes users' reactions based on video captured of each user during an online session, regardless of whether the user is displayed on screen, in an environment where multiple users conduct online sessions, the system comprising: a video acquisition unit that, for each of the multiple users, acquires video captured of the user during the online session; an analysis unit that analyzes changes in the user's biological reactions based on the video acquired by the video acquisition unit; a character information identification unit that identifies character information having attributes corresponding to the analysis results of the analysis unit; and an output unit that outputs the identified character information.
  2. The video analysis system according to claim 1, wherein the character information includes character object information, and the output unit outputs the character object information.
  3. The video analysis system according to claim 2, wherein the output unit changes the output mode of the character object information according to the information on the change in biological reaction analyzed by the analysis unit.
  4. The video analysis system according to any one of claims 1 to 3, wherein the character information identification unit identifies the character information based on attributes of the user who is the subject of analysis.
  5. The video analysis system according to any one of claims 1 to 4, wherein the output unit does not output information about the user who is the subject of analysis.
  6. The video analysis system according to claim 5, wherein the output unit outputs the character information, together with information relating to the analysis results, to a terminal of a user other than the user who is the subject of analysis; the system further comprises a feedback information acquisition unit that acquires feedback information, input at the other user's terminal, on the information relating to the analysis results; and the output unit outputs a notification based on the acquired feedback information to the terminal of the user associated with the character information.
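Claims 5 and 6 describe an anonymized-output and feedback loop: the analyzed user is never identified to the other participants, yet their feedback is routed back to that user's terminal via the character. The sketch below illustrates that routing under stated assumptions; all function names and data shapes are hypothetical, not taken from the patent.

```python
# Hedged sketch of the claims 5-6 flow; names and payload shapes are
# illustrative assumptions only.

def output_to_other_terminals(subject_id: str, character: str, analysis: str,
                              all_user_ids: list[str]) -> dict[str, dict]:
    """Output unit (claims 5-6): sends character info plus analysis results to
    every terminal EXCEPT the analyzed user's, omitting the user's identity."""
    payload = {"character": character, "analysis": analysis}  # no subject_id inside
    return {uid: payload for uid in all_user_ids if uid != subject_id}

def acquire_feedback(terminal_inputs: dict[str, str]) -> list[str]:
    """Feedback information acquisition unit: collects feedback entered at the
    other users' terminals for the analysis-result information."""
    return list(terminal_inputs.values())

def notify_subject(character_owner: str, feedback: list[str]) -> tuple[str, str]:
    """Output unit: routes a notification based on the feedback back to the
    terminal of the user associated with the character."""
    return (character_owner, f"{len(feedback)} feedback message(s) received")

sent = output_to_other_terminals("user_a", "cheerful_character", "engagement rising",
                                 ["user_a", "user_b", "user_c"])
fb = acquire_feedback({"user_b": "nice!", "user_c": "agreed"})
print(notify_subject("user_a", fb))  # ('user_a', '2 feedback message(s) received')
```

Note the design point the claims turn on: the payload sent to other terminals carries only the character and the analysis, so feedback reaches the subject without their identity ever being disclosed.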

Description

This invention relates to a video analysis system that analyzes participants' biological reactions based on video captured during online sessions conducted by multiple participants.

Techniques are known for analyzing the emotions others feel in response to a speaker's remarks (see, for example, Patent Document 1). Techniques are also known for analyzing changes in a subject's facial expression over a long period and estimating the emotions experienced during that time (see, for example, Patent Document 2). Techniques are known for identifying the factors that most influenced a change in emotion (see, for example, Patent Documents 3 to 5), for comparing a subject's usual facial expression with their current expression and issuing an alert when the expression is gloomy (see, for example, Patent Document 6), and for comparing a subject's normal (expressionless) face with their current expression to determine the degree of their emotions (see, for example, Patent Documents 7 to 9). Techniques are also known for analyzing organizational emotions and the atmosphere within a group as perceived by individuals (see, for example, Patent Documents 10 and 11).

Patent Document 1: Japanese Patent Publication No. 2019-58625
Patent Document 2: Japanese Patent Publication No. 2016-149063
Patent Document 3: Japanese Patent Publication No. 2020-86559
Patent Document 4: Japanese Patent Publication No. 2000-76421
Patent Document 5: Japanese Patent Publication No. 2017-201499
Patent Document 6: Japanese Patent Publication No. 2018-112831
Patent Document 7: Japanese Patent Publication No. 2011-154665
Patent Document 8: Japanese Patent Publication No. 2012-8949
Patent Document 9: Japanese Patent Publication No. 2013-300
Patent Document 10: Japanese Patent Publication No. 2011-186521
Patent Document 11: WO 15/174426 publication

The figures show:

  • An overall diagram of the system according to an embodiment of the present invention.
  • An example of a functional block diagram of an evaluation terminal according to an embodiment of the present invention.
  • Functional configuration example 1 of the evaluation terminal according to an embodiment of the present invention.
  • Functional configuration example 2 of the evaluation terminal according to an embodiment of the present invention.
  • Functional configuration example 3 of the evaluation terminal according to an embodiment of the present invention.
  • An example of a screen display based on functional configuration example 3 in Figure 6.
  • Another example of a screen display based on functional configuration example 3 in Figure 6.
  • Another arrangement of functional configuration example 3 of the evaluation terminal according to an embodiment of the present invention.
  • A further arrangement of functional configuration example 3 of the evaluation terminal according to an embodiment of the present invention.
  • An example of the system configuration according to this embodiment.
  • An example of the functional configuration of the system according to this embodiment.
  • An example of a list of analysis result data with character information attached.
  • An example of data output by the output unit.
  • An example of the display mode of the screen shown on the evaluator's terminal by the output unit according to this embodiment.
  • A flowchart showing an example of the processing flow of the system according to this embodiment.

The embodiments of this disclosure are described below. This disclosure comprises the following configurations.
(Item 1) A video analysis system that analyzes users' reactions based on video captured of each user during an online session, regardless of whether the user is displayed on screen, in an environment where multiple users conduct online sessions, the system comprising: a video acquisition unit that, for each of the multiple users, acquires video captured of the user during the online session; an analysis unit that analyzes changes in the user's biological reactions based on the video acquired by the video acquisition unit; a character information identification unit that identifies character information having attributes corresponding to the analysis results of the analysis unit; and an output unit that outputs the identified character information. (Item 2) The video analysis system according to item 1, wherein the character information includes character object information, and the output unit outputs the character object information. (Item 3) The video analysis system according to item 2, wherein the output unit changes the output mode of the character object information according to the information on the change in biological reaction analyzed by the analysis unit. (Item 4) A video analysis system according to any one of items 1 to