KR-20260063954-A - Method and Device for Displaying Emotion Map Animation
Abstract
An emotion map animation display device and method are disclosed. The disclosed method comprises the steps of: (a) maintaining a database that stores per-unit-time emotion indices for preset unit regions; (b) receiving information on an emotion display time interval and an emotion index update time interval set by a user, and obtaining information on the unit times included in the emotion display interval; (c) displaying a user-interface-based map on a user screen and obtaining information on the unit regions included in the map displayed on the user screen; and (d) painting colors corresponding to the emotion indices of the unit regions included in the map using a painting function, based on the information stored in the database and the emotion index update time interval, wherein step (d) paints, at each emotion index update time, the color corresponding to the emotion index of the unit time following the current unit time. According to the disclosed device and method, changes in emotion over time can be provided adaptively according to various user requirements, and even while emotions over time are being displayed, the emotion display can adapt to sudden changes in the display method, such as the user zooming in or out of the map.
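The painting function mentioned in step (d) maps each unit region's emotion index to a display color. The patent does not specify a particular color scale, so the following is only an illustrative sketch, assuming a hypothetical index range of [-1, 1] and a blue-to-red scale (both assumptions, not from the source):

```python
def emotion_to_color(index: float, lo: float = -1.0, hi: float = 1.0) -> str:
    """Map an emotion index in [lo, hi] to an RGB hex color.

    Hypothetical color scale: blue for the most negative index, red for
    the most positive, gray in the middle. The patent does not specify
    the actual scale or index range; both are illustrative assumptions.
    """
    # Clamp the index into [lo, hi], then normalize to [0, 1].
    t = (min(max(index, lo), hi) - lo) / (hi - lo)
    r = round(255 * t)        # red component grows with positive emotion
    b = round(255 * (1 - t))  # blue component grows with negative emotion
    g = 128                   # fixed mid-level green component
    return f"#{r:02x}{g:02x}{b:02x}"
```

A real implementation would apply such a function to each unit region's polygon when repainting the map at every emotion index update time.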
Inventors
- 최용석
- 김근태
- 이범석
- 김현우
Assignees
- 한양대학교 산학협력단
Dates
- Publication Date: 2026-05-07
- Application Date: 2024-10-31
Claims (10)
- A method for displaying an emotion map animation, performed on a computing device including a processor and a memory, the method comprising: (a) maintaining a database that stores per-unit-time emotion indices for preset unit regions; (b) receiving information on an emotion display time interval and an emotion index update time interval set by a user, and obtaining information on the unit times included in the emotion display interval; (c) displaying a user-interface-based map on a user screen and obtaining information on the unit regions included in the map displayed on the user screen; and (d) painting colors corresponding to the emotion indices of the unit regions included in the map using a painting function, based on the information stored in the database and the emotion index update time interval, wherein step (d) paints, at each emotion index update time, the color corresponding to the emotion index of the unit time following the current unit time.
- The method of claim 1, wherein step (c) determines the map area to be displayed on the screen based on the user's zoom in/out and position movement, and displays the map.
- The method of claim 1, wherein step (d) comprises: extracting per-unit-time emotion index information for the user-requested unit regions from the database and loading it into a queue, using the information on the unit regions included in the map and the unit times included in the emotion display interval; and performing painting based on the emotion indices of the unit regions for each unit time, which are sequentially taken from the queue at each emotion index update time interval under the control of a set-timeout module.
- The method of claim 3, wherein the emotion index information of the unit regions is loaded into the queue in the order of the unit times included in the emotion display interval.
- The method of claim 3, further comprising: determining whether a user interrupt has occurred that changes the region in which the emotion index is displayed; clearing the information loaded in the queue when the user interrupt occurs; and extracting from the database the per-unit-time emotion index information of the unit regions corresponding to the display region changed by the user, and loading it into the queue.
- An emotion map animation display device comprising: a processor; and a memory connected to the processor, wherein the processor performs: (a) maintaining a database that stores per-unit-time emotion indices for preset unit regions; (b) receiving information on an emotion display time interval and an emotion index update time interval set by a user, and obtaining information on the unit times included in the emotion display interval; (c) displaying a user-interface-based map on a user screen and obtaining information on the unit regions included in the map displayed on the user screen; and (d) painting colors corresponding to the emotion indices of the unit regions included in the map using a painting function, based on the information stored in the database and the emotion index update time interval, wherein step (d) paints, at each emotion index update time, the color corresponding to the emotion index of the unit time following the current unit time.
- The device of claim 6, wherein step (c) determines the map area to be displayed on the screen based on the user's zoom in/out and position movement, and displays the map.
- The device of claim 6, wherein step (d) comprises: extracting per-unit-time emotion index information for the user-requested unit regions from the database and loading it into a queue, using the information on the unit regions included in the map and the unit times included in the emotion display interval; and performing painting based on the emotion indices of the unit regions for each unit time, which are sequentially taken from the queue at each emotion index update time interval under the control of a set-timeout module.
- The device of claim 8, wherein the emotion index information of the unit regions is loaded into the queue in the order of the unit times included in the emotion display interval.
- The device of claim 8, further comprising: determining whether a user interrupt has occurred that changes the region in which the emotion index is displayed; clearing the information loaded in the queue when the user interrupt occurs; and extracting from the database the per-unit-time emotion index information of the unit regions corresponding to the display region changed by the user, and loading it into the queue.
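The queue-driven display of claims 3-5 (and 8-10) can be sketched as follows. This is an illustrative sketch, not the patented implementation: the emotion information database is mocked as a dict keyed by (unit_time, region), the set-timeout module is replaced by an explicit step() call so the sequencing logic can be shown without a real timer, and painting is recorded rather than drawn.

```python
from collections import deque

# Hypothetical in-memory stand-in for the emotion information database:
# maps (unit_time, region) -> emotion index. The real schema is not
# given in this abstract.
DB = {
    (0, "A"): 0.2, (0, "B"): -0.5,
    (1, "A"): 0.6, (1, "B"): -0.1,
    (2, "A"): 0.9, (2, "B"): 0.3,
}

class EmotionMapAnimator:
    def __init__(self, db):
        self.db = db
        self.queue = deque()  # frames of per-region emotion indices
        self.painted = []     # record of "painted" frames, for illustration

    def load(self, regions, unit_times):
        """Claims 3-4: enqueue one frame per unit time, in time order."""
        self.queue.clear()
        for t in sorted(unit_times):
            frame = {r: self.db[(t, r)] for r in regions if (t, r) in self.db}
            self.queue.append((t, frame))

    def step(self):
        """One tick of the (hypothetical) set-timeout module: dequeue the
        next unit time's frame and paint it. Returns False when done."""
        if not self.queue:
            return False
        t, frame = self.queue.popleft()
        self.painted.append((t, frame))  # stand-in for actual map painting
        return True

    def on_user_interrupt(self, new_regions, unit_times):
        """Claims 5 and 10: on a zoom/pan that changes the displayed
        region, clear the queue and reload frames for the new regions."""
        self.queue.clear()
        self.load(new_regions, unit_times)
```

A real implementation would invoke step() from a timer (e.g. window.setTimeout in a browser) at each emotion index update interval, and on_user_interrupt() from the map's zoom/pan event handlers.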
Description
The present invention relates to an emotion map animation display device and method, and more specifically, to a device and method for displaying changes in emotion over time as an animation.

Various methods have been studied to analyze regional sentiment information from social media data. Sentiment mapping, a method of displaying emotions on a map, is an active area of research because it allows the emotional state of each region to be identified from local social media data. In particular, various artificial intelligence models are used to analyze social media data.

There is demand for time-varying emotion maps that not only display emotion maps for specific periods but also show chronological changes in emotion through animation; however, existing research on time-varying emotion maps has generally generated and provided these maps to users as videos. Emotion maps provided as video, however, have difficulty satisfying diverse user needs. While users require the ability to set the time intervals and regions in which emotions are displayed, video-based emotion maps unilaterally display emotions changing at preset time intervals. In addition, a time-varying emotion map provided in video format cannot respond at all to sudden user actions such as zooming in/out of the map or moving the display area.

FIG. 1 is a block diagram illustrating the overall structure of an emotion animation display device according to one embodiment of the present invention. FIG. 2 is a diagram showing the field structure of an emotion information database according to an embodiment of the present invention. FIG. 3 is a flowchart illustrating the operation of a user operation interpretation module and a map generation module according to an embodiment of the present invention. FIG. 4 is a diagram showing the result of displaying an emotion index that changes over time in an emotion animation according to an embodiment of the present invention. FIG. 5 is a diagram showing the overall flow of an emotion animation display method according to an embodiment of the present invention. FIG. 6 is a flowchart showing the operation of the emotion animation display method when a user interrupt occurs during display of the emotion animation.

Hereinafter, specific embodiments of the present invention will be described with reference to the drawings. The following detailed description is provided to facilitate a comprehensive understanding of the methods, devices, and/or systems described herein. However, it is merely illustrative, and the present invention is not limited thereto. In describing the embodiments of the present invention, detailed descriptions of known technology related to the present invention will be omitted where they would unnecessarily obscure the essence of the embodiments. The terms used below are defined in consideration of their functions in the present invention and may vary depending on the intentions or practices of users or operators; such definitions should therefore be interpreted based on the content of this specification as a whole. The terms used in the detailed description are intended merely to describe specific embodiments and should not be limiting. Unless explicitly stated otherwise, singular expressions include the plural.
In this description, expressions such as "include" or "comprise" refer to certain characteristics, numbers, steps, actions, elements, parts thereof, or combinations thereof, and should not be interpreted as excluding the existence or possibility of one or more other characteristics, numbers, steps, actions, elements, parts thereof, or combinations thereof.

FIG. 1 is a block diagram illustrating the overall structure of an emotion animation display device according to one embodiment of the present invention. Referring to FIG. 1, an emotion animation display device according to one embodiment of the present invention includes an emotion information database (100), a user operation information interpretation module (110), a map/display area information generation module (120), and an emotion animation display module (130).

Regional emotion information is stored in the emotion information database (100). According to one embodiment of the present invention, regional content can be collected from SNS and various online communities, regional emotion information can be obtained by analyzing the collected content, and the obtained emotion information is stored in the emotion information database (100). For example, texts posted on social media and various online communities can be input into a generative