
US-20260127738-A1 - EMOTION IDENTIFICATION OF INDIVIDUALS


Abstract

A computer-implemented method, system, and computer program product for emotion recognition. A deep convolutional neural network is trained to identify an emotion from images of facial expressions. Furthermore, an application in a computing device (e.g., mobile computing device, such as a smartphone) is utilized to capture an image of an individual (e.g., individual that is neurotypical, individual on the autism spectrum). The captured image of the individual is then analyzed using the trained deep convolutional neural network. The emotional state of the individual is then classified based on the analysis of the captured image of the individual using the trained deep convolutional neural network. The classified emotional state is then conveyed to a user (e.g., neurotypical individual, autistic individual) via an emoticon (emotional icon), such that the emoticon reflects the classified emotional state.
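
To make the claimed pipeline concrete, the sketch below shows one plausible inference path: a trained convolutional network classifies a captured face image into one of seven emotions and maps the result to an emoticon. The model file name, 224x224 input size, label order, and emoticon mapping are illustrative assumptions, not details from the disclosure.

    # Illustrative inference sketch (not from the disclosure): the model file name,
    # input size, emotion label order, and emoticon mapping are assumptions.
    import numpy as np
    import tensorflow as tf

    # Hypothetical label order for the seven emotions; the disclosure does not specify one.
    EMOTIONS = ["anger", "disgust", "fear", "happiness", "neutral", "sadness", "surprise"]
    EMOTICONS = {"anger": "😠", "disgust": "🤢", "fear": "😨", "happiness": "😊",
                 "neutral": "😐", "sadness": "😢", "surprise": "😮"}

    def classify_and_convey(image_path, model_path="emotion_cnn.h5"):
        """Classify the emotional state in a captured image and return an emoticon."""
        model = tf.keras.models.load_model(model_path)            # trained deep CNN
        img = tf.keras.utils.load_img(image_path, target_size=(224, 224))
        x = tf.keras.utils.img_to_array(img)[np.newaxis, ...] / 255.0
        probs = model.predict(x, verbose=0)[0]                    # seven-class softmax output
        emotion = EMOTIONS[int(np.argmax(probs))]
        return EMOTICONS[emotion]                                  # conveyed to the user

    if __name__ == "__main__":
        print(classify_and_convey("captured_face.jpg"))            # hypothetical captured image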

Inventors

  • Maria Resendiz
  • Damian Valles
  • Md Inzamam Ul Haque
  • Rezwan Matin
  • Tamima Rashid

Assignees

  • TEXAS STATE UNIVERSITY

Dates

Publication Date
2026-05-07
Application Date
2025-11-04

Claims (20)

  1. A computer-implemented method for emotion recognition, the method comprising: training a deep convolutional neural network to identify an emotion from images of facial expressions; capturing an image of an individual from a computing device; analyzing said captured image of said individual using said trained deep convolutional neural network; classifying an emotional state of said individual based on said analysis of said captured image of said individual using said trained deep convolutional neural network; and conveying said classified emotional state to a user of said computing device via an emoticon.
  2. The method as recited in claim 1, wherein said deep convolutional neural network is trained on a sample data set comprising images of individuals expressing seven emotions photographed from five different angles.
  3. The method as recited in claim 2, wherein said deep convolutional neural network comprises one of the following: Naïve-CNN, VGG16, EfficientNetV2, and MobileNetV2.
  4. The method as recited in claim 1, wherein said computing device comprises a mobile computing device.
  5. The method as recited in claim 1, wherein said individual corresponds to an individual on an autism spectrum.
  6. The method as recited in claim 1, wherein said individual corresponds to an individual that is neurotypical.
  7. The method as recited in claim 1, wherein said user corresponds to an individual on an autism spectrum.
  8. The method as recited in claim 1, wherein said user corresponds to an individual that is neurotypical.
  9. A computer program product for emotion recognition, the computer program product comprising one or more computer readable storage mediums having program code embodied therewith, the program code comprising programming instructions for: training a deep convolutional neural network to identify an emotion from images of facial expressions; capturing an image of an individual from a computing device; analyzing said captured image of said individual using said trained deep convolutional neural network; classifying an emotional state of said individual based on said analysis of said captured image of said individual using said trained deep convolutional neural network; and conveying said classified emotional state to a user of said computing device via an emoticon.
  10. The computer program product as recited in claim 9, wherein said deep convolutional neural network is trained on a sample data set comprising images of individuals expressing seven emotions photographed from five different angles.
  11. The computer program product as recited in claim 10, wherein said deep convolutional neural network comprises one of the following: Naïve-CNN, VGG16, EfficientNetV2, and MobileNetV2.
  12. The computer program product as recited in claim 9, wherein said computing device comprises a mobile computing device.
  13. The computer program product as recited in claim 9, wherein said individual corresponds to an individual on an autism spectrum.
  14. The computer program product as recited in claim 9, wherein said individual corresponds to an individual that is neurotypical.
  15. The computer program product as recited in claim 9, wherein said user corresponds to an individual on an autism spectrum.
  16. The computer program product as recited in claim 9, wherein said user corresponds to an individual that is neurotypical.
  17. A system, comprising: a memory for storing a computer program for emotion recognition; and a processor connected to the memory, wherein the processor is configured to execute program instructions of the computer program comprising: training a deep convolutional neural network to identify an emotion from images of facial expressions; capturing an image of an individual from a computing device; analyzing said captured image of said individual using said trained deep convolutional neural network; classifying an emotional state of said individual based on said analysis of said captured image of said individual using said trained deep convolutional neural network; and conveying said classified emotional state to a user of said computing device via an emoticon.
  18. The system as recited in claim 17, wherein said deep convolutional neural network is trained on a sample data set comprising images of individuals expressing seven emotions photographed from five different angles.
  19. The system as recited in claim 18, wherein said deep convolutional neural network comprises one of the following: Naïve-CNN, VGG16, EfficientNetV2, and MobileNetV2.
  20. The system as recited in claim 17, wherein said computing device comprises a mobile computing device.

Description

GOVERNMENT INTERESTS

This invention was made with government support under Grant Numbers 2150135 and 2231794 awarded by the National Science Foundation. The government has certain rights in the invention.

TECHNICAL FIELD

The present disclosure relates generally to emotion recognition, and more particularly to identifying emotions of individuals, including individuals on the autism spectrum.

BACKGROUND

Emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. Use of technology to help people with emotion recognition is a relatively nascent research area. Generally, the most effective systems employ a multimodal approach, such as by analyzing various human expressions in context. For example, existing techniques focus on automating the recognition of facial expressions from video, spoken expressions from audio, written expressions from text, and physiology as measured by wearables. The accuracy of emotion recognition is usually improved when it combines the analysis of human expressions from multimodal forms, such as text, physiology, audio, or video. Different emotion types are detected through the integration of information from facial expressions, body movement and gestures, and speech. The technology is said to contribute to the emergence of the so-called emotional or emotive Internet.

The existing approaches in emotion recognition to classify certain emotion types can be generally grouped into three main categories: knowledge-based techniques, statistical methods, and hybrid approaches. Unfortunately, the developmental process for these existing emotion recognition and teaching technologies fails to include the autistic perspective. Autism spectrum disorder (ASD) is a neurodevelopmental condition marked by challenges in social communication and a tendency towards repetitive, restrictive patterns of behavior and interests. Furthermore, ASD involves autistic individuals having a range of support needs. There is inconclusive information regarding how autistic individuals interpret and learn emotions. As a result, existing technological models are built without this crucial data, making them largely neurotypical-centric. Hence, there is not currently a means for bidirectional teaching to provide information to neurotypical individuals about how autistic individuals learn emotions and vice versa.

SUMMARY

In one embodiment of the present disclosure, a computer-implemented method for emotion recognition comprises training a deep convolutional neural network to identify an emotion from images of facial expressions. The method further comprises capturing an image of an individual from a computing device. The method additionally comprises analyzing the captured image of the individual using the trained deep convolutional neural network. Furthermore, the method comprises classifying an emotional state of the individual based on the analysis of the captured image of the individual using the trained deep convolutional neural network. Additionally, the method comprises conveying the classified emotional state to a user of the computing device via an emoticon. Other forms of the embodiment of the computer-implemented method described above are in a system and in a computer program product.
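
The claims name Naïve-CNN, VGG16, EfficientNetV2, and MobileNetV2 as candidate network architectures and a training set of seven emotions photographed from five angles. As a rough illustration of how such a network might be trained, the sketch below assumes a MobileNetV2 backbone with ImageNet transfer learning in TensorFlow/Keras; the directory layout, image size, and hyperparameters are illustrative assumptions, not details from the disclosure.

    # Hedged training sketch: assumes a MobileNetV2 backbone (one of the candidates
    # named in the claims), an ImageNet-pretrained feature extractor, and a folder
    # layout of train_images/<emotion>/*.jpg -- all illustrative assumptions.
    import tensorflow as tf

    NUM_EMOTIONS = 7  # seven emotions, per the claims

    train_ds = tf.keras.utils.image_dataset_from_directory(
        "train_images",                      # hypothetical dataset location
        image_size=(224, 224),
        batch_size=32,
    )

    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet")
    base.trainable = False                   # freeze the pretrained backbone

    model = tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # scale pixels to [-1, 1]
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(NUM_EMOTIONS, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=10)
    model.save("emotion_cnn.h5")             # reused by the inference sketch above
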
The foregoing has outlined rather generally the features and technical advantages of one or more embodiments of the present disclosure in order that the detailed description of the present disclosure that follows may be better understood. Additional features and advantages of the present disclosure will be described hereinafter which may form the subject of the claims of the present disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

A better understanding of the present disclosure can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:

FIG. 1 illustrates an embodiment of the present disclosure of a computing environment for practicing the principles of the present disclosure;

FIG. 2 is a diagram of the software components used by the computer to identify emotions of individuals, including individuals on the autism spectrum, in accordance with an embodiment of the present disclosure; and

FIG. 3 is a flowchart of a method for assisting neurotypical people in identifying emotions from non-neurotypical people (e.g., people with ASD) and vice-versa in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION

As stated above, emotion recognition is the process of identifying human emotion. People vary widely in their accuracy at recognizing the emotions of others. Use of technology to help people with emotion recognition is a relatively nascent research area. Generally, the most effective systems employ a multimodal approach, such as by analyzing various human expressions in context. For example, existing techniques focus on automating the recognition of facial expressions