KR-20260065486-A - A METHOD FOR GENERATING AVATAR ANIMATION BASED ON GAME INTERACTION AND AN ELECTRONIC DEVICE ON WHICH THE METHOD IS IMPLEMENTED

KR20260065486A

Abstract

According to one embodiment of the present disclosure, an avatar interaction method linked to a game environment may be provided, comprising: obtaining, by an input module, a user input for controlling a first character corresponding to a content provider providing broadcast content within a game environment executed by a game program; detecting at least one event associated with an action of the first character based on the user input; obtaining, based on avatar action matching information, first action information of a first avatar corresponding to the detected event, the first avatar being implemented to reproduce the content provider; determining a first character interaction directed according to the detected event, and determining a first avatar interaction configured such that the first avatar performs a predetermined action based on the first action information; and transmitting, through a broadcasting platform, broadcast content including the game environment in which the first character interaction is executed and a first avatar animation in which the first avatar interaction is implemented, wherein the first avatar interaction is executed in conjunction with the first character interaction.
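The claimed flow can be sketched as a minimal event loop. This is an illustrative sketch only; every name, event, and action below is an assumption invented for the example, not taken from the patent:

```python
# Sketch of the claimed flow: a user input controls the in-game character,
# a matching event is detected, and a pre-matched avatar action (the "first
# avatar interaction") replaces the tracking-based action (the "second avatar
# interaction") while it plays. All names and data are hypothetical.

# Avatar action matching information: event type -> avatar action.
AVATAR_ACTION_MATCHING = {
    "jump": "avatar_jump_celebration",
    "victory": "avatar_cheer",
}

def detect_event(user_input):
    """Map a raw user input to one of the pre-stored events, if any."""
    return {"space": "jump", "win": "victory"}.get(user_input)

def build_frame(user_input, tracked_action):
    """Return the avatar action for this broadcast frame.

    While an event-driven (first) avatar interaction is active, the
    tracking-based (second) interaction is suppressed.
    """
    event = detect_event(user_input)
    if event is not None and event in AVATAR_ACTION_MATCHING:
        return AVATAR_ACTION_MATCHING[event]   # first avatar interaction
    return tracked_action                      # second avatar interaction

print(build_frame("space", "tracked_idle"))      # avatar_jump_celebration
print(build_frame("move_left", "tracked_idle"))  # tracked_idle
```

The key design point the sketch highlights is the priority rule: the event-driven animation wins over the tracking-based one whenever an event matches.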

Inventors

  • 기준수
  • 이성철
  • 김광진

Assignees

  • 주식회사 스콘

Dates

Publication Date
2026-05-08
Application Date
2025-07-23

Claims (8)

  1. An avatar interaction method linked to a game environment, comprising: obtaining, by an input module, a user input for controlling a first character corresponding to a content provider providing broadcast content within a game environment executed by a game program; detecting at least one event associated with an action of the first character based on the user input; obtaining, based on avatar action matching information, first action information of a first avatar corresponding to the detected event, the first avatar being implemented to reproduce the content provider; determining a first character interaction directed according to the detected event, and determining a first avatar interaction configured such that the first avatar performs a predetermined action based on the first action information; transmitting, through a broadcasting platform, broadcast content including the game environment in which the first character interaction is executed and a first avatar animation in which the first avatar interaction is implemented; determining a second avatar interaction that reproduces an action of the content provider by tracking the action of the content provider; and generating the first avatar animation including the first avatar interaction linked to the first character interaction within the game environment and the second avatar interaction that reproduces the action of the content provider, wherein the first avatar interaction is executed in conjunction with the first character interaction, and the first avatar animation is generated such that the second avatar interaction is not performed while the first avatar interaction is being performed.
  2. The avatar interaction method of claim 1, wherein detecting the at least one event comprises detecting, among a plurality of pre-stored events, at least one event corresponding to the user input.
  3. The avatar interaction method of claim 1, wherein detecting the at least one event comprises: obtaining information about a first game program being executed among a plurality of game programs; determining, based on the information about the first game program, a first event list corresponding to the first game program, the first event list including a plurality of events that control a character during execution of the first game program; and detecting the at least one event based on the first event list.
  4. The avatar interaction method of claim 1, wherein obtaining the first action information comprises obtaining, when an event corresponding to a specific action of the first character is detected, first action information that instructs the first avatar to perform an action corresponding to the specific action of the first character.
  5. The avatar interaction method of claim 1, wherein obtaining the first action information comprises: confirming that a first setting value has been assigned to the detected event; identifying, in response to the confirming, matching information corresponding to the detected event based on avatar action matching information that defines a relationship between an event type and an avatar action; and obtaining the first action information based on the identified matching information.
  6. The avatar interaction method of claim 1, further comprising checking whether a setting value corresponding to a state in which the detected event activates the first avatar interaction has been assigned.
  7. The avatar interaction method of claim 1, wherein the first character interaction is determined to cause the first character to perform a first action based on a first action command generated in response to the detected event.
  8. The avatar interaction method of claim 1, wherein the first avatar interaction is determined such that the first avatar performs the same action as the first character.
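The dependent claims above (per-game event lists, a setting value gating each event, and matching-information lookup) can be sketched together as one lookup function. All data and names below are hypothetical, chosen only to illustrate the claimed checks:

```python
# Illustrative sketch of the checks in claims 3, 5, and 6.
# All tables and identifiers are invented for the example.

EVENT_LISTS = {  # claim 3: event list per game program
    "game_a": ["jump", "attack"],
    "game_b": ["drift", "boost"],
}

# claim 6: setting value indicating whether an event activates
# the first avatar interaction.
EVENT_SETTINGS = {"jump": True, "attack": False, "drift": True, "boost": True}

MATCHING_INFO = {  # claim 5: event type -> avatar action
    "jump": "avatar_jump",
    "attack": "avatar_swing",
    "drift": "avatar_lean",
    "boost": "avatar_fist_pump",
}

def first_action_info(game_id, event):
    """Return first action information, or None if the event is not
    in the running game's event list, is disabled, or has no match."""
    if event not in EVENT_LISTS.get(game_id, []):  # claim 3: filter by game
        return None
    if not EVENT_SETTINGS.get(event):              # claim 6: setting check
        return None
    return MATCHING_INFO.get(event)                # claim 5: matching lookup

print(first_action_info("game_a", "jump"))    # avatar_jump
print(first_action_info("game_a", "attack"))  # None (setting disabled)
print(first_action_info("game_b", "jump"))    # None (not in game_b's list)
```

Ordering the checks this way means the matching table is only consulted for events that are both valid for the running game and enabled by the user.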

Description

A method for generating avatar animation based on game interaction and an electronic device on which the method is implemented

The present disclosure relates to a method for implementing avatar interaction linked to a game environment and an electronic device in which such a method is implemented. More specifically, it relates to a user-tracking-based 3D avatar interaction method capable of responding in real time to events occurring within a game by linking with a user's game environment.

With recent advances in virtual reality (VR) and augmented reality (AR) technologies, 3D avatar implementation based on user tracking is garnering attention. Avatars that reflect users' movements and facial expressions in real time are being developed using technologies such as face sensing, hand tracking, and motion tracking, significantly enhancing immersion in virtual environments. These technologies are used in various fields, including entertainment, education, and social media.

However, existing avatar interaction technologies focus primarily on direct synchronization between the user and the avatar, which limits their ability to link in-game events with avatar movements. In other words, the movements of the in-game character and the avatar occur separately, failing to provide a consistent experience to the user and potentially degrading gameplay immersion. Furthermore, implementing real-time avatar reactions to game events can introduce latency, negatively affecting the user experience. To overcome these limitations, real-time avatar rendering technology that links the game environment with the avatar rendering program is required.

FIG. 1 illustrates a system for providing an avatar interaction method linked to a game environment according to various embodiments. FIG. 2 illustrates the configuration of an electronic device included in the exemplary system of FIG. 1 according to various embodiments. FIG. 3 illustrates the detailed operation of a system for providing an avatar interaction method linked to a game environment according to various embodiments. FIG. 4 is a flowchart illustrating an example of an electronic device performing an avatar interaction method linked to a game environment according to various embodiments. FIG. 5 illustrates a method for an electronic device to detect an event according to various embodiments. FIG. 6 illustrates examples of information stored by an electronic device to determine an action according to an event, according to various embodiments. FIG. 7 illustrates a specific method for an electronic device to acquire first action information according to various embodiments. FIG. 8 illustrates a specific method for an electronic device to determine a first character interaction according to various embodiments.

Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In describing the embodiments, technical details that are well known in the art to which the present disclosure belongs and are not directly related to the present disclosure are omitted. This is intended to convey the essence of the present disclosure more clearly, without obscuring it with unnecessary explanation. The embodiments described in this specification are intended to clearly explain the concept of the invention to those skilled in the art; the invention is therefore not limited to the embodiments described herein, and its scope should be interpreted to include modifications or variations that do not depart from that concept.
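The latency concern raised in the background above is commonly addressed by decoupling game-event detection from avatar rendering with a queue, so that detection never blocks on rendering. The following is a minimal sketch of that pattern under assumed names; it is not taken from the patent:

```python
import queue
import threading
import time

# Hypothetical decoupling of game-event detection from avatar rendering:
# the game thread enqueues events without waiting, and the render loop
# drains them at its own frame rate, falling back to the tracking-based
# pose when no event is pending.
events = queue.Queue()

def game_thread():
    """Simulated game side: emit two events, never blocking on rendering."""
    for e in ("jump", "victory"):
        events.put(e)
        time.sleep(0.01)

def render_frames(n_frames):
    """Simulated render loop: one queue poll per frame."""
    played = []
    for _ in range(n_frames):
        try:
            played.append(events.get_nowait())  # consume a pending event
        except queue.Empty:
            played.append("tracked_pose")       # fall back to tracking
        time.sleep(0.005)
    return played

t = threading.Thread(target=game_thread)
t.start()
frames = render_frames(8)
t.join()
print(frames)
```

Because the producer and consumer run at independent rates, a slow render frame delays only the playback of an event, not its detection.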
The terms used in this specification have been selected to be as widely used as possible in view of their functions in the present invention; however, these terms may vary depending on the intent of those skilled in the art to which the present invention pertains, precedents, or the emergence of new technologies. Where a specific term is defined and used with an arbitrary meaning, the meaning of that term is described separately. Accordingly, the terms used in this specification should be interpreted based on their actual meaning and the content of this specification as a whole, rather than merely their names.

The drawings attached to this specification are intended to facilitate the explanation of the present invention. The shapes depicted in the drawings may be exaggerated as necessary to aid understanding, and the present invention is therefore not limited by the drawings.

In this specification, each of the phrases such as “A or B”, “at least one of A and B”, “at least one of A or B”, “A, B or C”, “at least one of A, B and C”, and “at least one of A, B, or C” may include any one of the items listed together in the corresponding phrase, or all p