KR-20260063229-A - Method and apparatus for providing extended reality services
Abstract
The present invention discloses a method and apparatus for providing an extended reality service. According to the present invention, an extended reality service providing apparatus is provided, comprising: a processor; and a memory connected to the processor, wherein the memory stores program instructions that, when executed by the processor, acquire first emotion-related information including a first user's face, continuous motion, voice, and biometric information; determine the emotion of the first user using the acquired first emotion-related information and second emotion information of surrounding users; and determine the facial expression and motion of a virtual character according to the emotion of the first user.
Inventors
- 송광헌
- 이금탁
- 양승남
- 김창모
- 채화종
- 이진웅
Assignees
- 주식회사 피씨엔
Dates
- Publication Date: 2026-05-07
- Application Date: 2024-10-30
Claims (10)
- An extended reality service providing apparatus, comprising: a processor; and a memory connected to the processor, wherein the memory stores program instructions executed by the processor to: acquire first emotion-related information including a first user's face, continuous motion, voice, and biometric information; determine the emotion of the first user using the acquired first emotion-related information and second emotion information of surrounding users; and determine the facial expression and motion of a virtual character according to the emotion of the first user.
- The apparatus of claim 1, wherein the program instructions determine one or more inflection points in the continuous motion of the first user and determine the emotion of the first user by combining the motions before and after the one or more inflection points.
- The apparatus of claim 1, wherein the program instructions determine the emotion of the first user by analyzing the voice tone, speed, and intonation of the first user.
- The apparatus of claim 1, wherein the program instructions acquire biosignals of the first user, including heart rate, electrodermal activity (EDA), and body temperature, through a wearable device.
- The apparatus of claim 1, wherein the program instructions determine a plurality of candidate emotions of the first user using the acquired first emotion-related information, and determine one of the plurality of candidate emotions as the emotion of the first user using second emotion information predetermined for the surrounding users.
- The apparatus of claim 1, wherein the program instructions check whether a motion of the first user is a pre-registered emotion-related motion and, if it is an emotion-related motion, determine a user emotion corresponding to the motion.
- The apparatus of claim 1, wherein the program instructions determine whether a motion of the first user is an emotion-related motion by utilizing the hold duration after the end of the motion.
- The apparatus of claim 1, wherein the program instructions check whether there is a subsequent motion within a predetermined time if the motion of the first user is an emotion-related motion, and determine that the motion of the first user is not an emotion-related motion if the subsequent motion is an emotion-independent motion.
- A method for providing an extended reality service in a device including a processor and a memory, the method comprising: acquiring first emotion-related information including a first user's face, continuous motion, voice, and biometric information; determining the emotion of the first user using the acquired first emotion-related information and second emotion information of surrounding users; and determining the facial expression and motion of a virtual character according to the emotion of the first user.
- A computer program stored on a computer-readable recording medium for performing the method according to claim 9.
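Claims 1 and 5 together describe a pipeline that fuses multimodal signals from a first user into candidate emotions, selects one candidate using the second emotion information of surrounding users, and maps the result onto a virtual character. The following is a minimal illustrative sketch of that pipeline, not the patented implementation: the emotion labels, the scoring and thresholding scheme, and every function name (`candidate_emotions`, `refine_with_surrounding`, `avatar_action`) are assumptions made for illustration only.

```python
from collections import Counter

# Hypothetical emotion labels; the patent does not enumerate a specific set.
EMOTIONS = ["happy", "sad", "angry", "neutral"]

def candidate_emotions(face, motion, voice, bio):
    """Score each emotion across modalities and keep the top candidates (claim 5).

    Each argument is a dict mapping emotion -> confidence in [0, 1],
    standing in for the per-modality recognizers the claims assume.
    """
    totals = Counter()
    for modality in (face, motion, voice, bio):
        for emotion, score in modality.items():
            totals[emotion] += score
    # Keep every emotion within 90% of the best combined score (assumed rule).
    best = max(totals.values())
    return [e for e, s in totals.items() if s >= 0.9 * best]

def refine_with_surrounding(candidates, surrounding_emotions):
    """Pick one candidate using second emotion information of surrounding
    users (claim 5): prefer the candidate most common around the user."""
    context = Counter(surrounding_emotions)
    return max(candidates, key=lambda e: context.get(e, 0))

def avatar_action(emotion):
    """Map the determined emotion to a facial expression and motion of the
    virtual character (claim 1). The mapping table is illustrative."""
    table = {
        "happy": ("smile", "wave"),
        "sad": ("frown", "slump"),
        "angry": ("scowl", "cross_arms"),
        "neutral": ("rest", "idle"),
    }
    return table.get(emotion, ("rest", "idle"))

# Example per-modality confidences for one first user.
face = {"happy": 0.7, "neutral": 0.3}
motion = {"happy": 0.6, "sad": 0.1}
voice = {"happy": 0.5, "neutral": 0.4}
bio = {"happy": 0.4, "neutral": 0.5}

cands = candidate_emotions(face, motion, voice, bio)
emotion = refine_with_surrounding(cands, ["happy", "happy", "neutral"])
expression, action = avatar_action(emotion)
```

The surrounding-user step matters when the first user's modalities disagree: two near-tied candidates are disambiguated by the emotion already determined for nearby users, which is the role the claims assign to the second emotion information.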
Description
The present invention relates to a method and apparatus for providing extended reality services.

Recently, services related to extended reality, such as the metaverse, have been expanding beyond virtual reality. Conventional user interfaces for extended reality were extremely limited in reflecting user expressions (emotions) because they relied only on linguistic input devices such as keyboards and mice. In addition, professional users such as game developers and broadcasters employ multiple expensive devices to perform gesture recognition and facial expression recognition for recognizing user emotions, but this approach is difficult to apply to metaverse services for general users. Furthermore, with the recent diversification of video-based services, such as online classes and video-chat-based customer consultations, there is a growing need to provide customized services by automatically recognizing the other party's emotions through video analysis.

FIG. 1 is a functional block diagram illustrating the configuration of an extended reality service system according to one embodiment of the present invention. FIG. 2 is a flowchart schematically illustrating the process of providing an extended reality service according to one embodiment of the present invention. FIG. 3 is a flowchart illustrating a user emotion recognition process using the connectivity of user motions according to an embodiment of the present invention. FIG. 4 is a table illustrating emotion-related motion information according to an embodiment of the present invention. FIG. 5 is a flowchart illustrating the process of applying user emotion using intensity based on user facial expressions according to one embodiment of the present invention. FIG. 6 is a diagram illustrating the process of providing an extended reality service according to an embodiment of the present invention.
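The motion-connectivity idea of FIG. 3 and FIG. 4 corresponds to claims 6 through 8: a motion counts as emotional only if it is pre-registered, is held long enough after it ends, and is not immediately followed by an emotion-independent motion. The sketch below is a hypothetical rendering of those three gates; the registered-motion table, the threshold values, and the function name `emotion_from_motion` are all assumptions, not details from the patent.

```python
# Pre-registered emotion-related motions (claim 6); the mapping is illustrative.
EMOTION_MOTIONS = {"fist_pump": "happy", "head_drop": "sad", "arms_crossed": "angry"}

MIN_HOLD_SECONDS = 0.5   # hypothetical hold-duration threshold (claim 7)
FOLLOW_UP_WINDOW = 1.0   # hypothetical window for a subsequent motion (claim 8)

def emotion_from_motion(motion, hold_seconds, next_motion=None, gap_seconds=None):
    """Return the emotion for a motion, or None if the motion is judged
    emotion-independent under the duration and follow-up rules."""
    # Claim 6: only pre-registered motions can carry an emotion.
    if motion not in EMOTION_MOTIONS:
        return None
    # Claim 7: a pose held only briefly after the motion ends is not emotional.
    if hold_seconds < MIN_HOLD_SECONDS:
        return None
    # Claim 8: a quick emotion-independent follow-up motion cancels it.
    if (next_motion is not None and gap_seconds is not None
            and gap_seconds <= FOLLOW_UP_WINDOW
            and next_motion not in EMOTION_MOTIONS):
        return None
    return EMOTION_MOTIONS[motion]
```

Under these assumptions, a fist pump held for 0.8 s yields "happy", while the same gesture followed 0.3 s later by an unregistered motion such as reaching for a cup yields no emotion, reflecting the cancellation rule of claim 8.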
The present invention is capable of various modifications and may have various embodiments, and specific embodiments are illustrated in the drawings and described in detail below. However, this is not intended to limit the invention to the specific embodiments; the invention includes all modifications, equivalents, and substitutions that fall within its spirit and scope.

When one component is said to be "connected" or "coupled" to another component, it may be directly connected or coupled to that other component, but intervening components may also be present. In contrast, when one component is said to be "directly connected" or "directly coupled" to another component, there are no intervening components.

Terms such as "first" and "second" may be used to describe various components, but the components should not be limited by these terms; the terms are used solely to distinguish one component from another. For example, terms such as "first threshold" and "second threshold" described later may designate thresholds that are substantially different or partly identical; since expressing both with the single word "threshold" could cause confusion, "first," "second," and the like are used for convenience of distinction.

The terms used herein are merely for describing specific embodiments and are not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise.
In this specification, terms such as "comprising" or "having" indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should not be understood as precluding the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Furthermore, the components of the embodiments described with reference to each drawing are not limited to those embodiments and may be included in other embodiments within the scope of the technical spirit of the present invention. Multiple embodiments may also be re-implemented as a single integrated embodiment, even if a separate description is omitted.

In the description referring to the attached drawings, identical components are assigned the same or related reference numerals regardless of the drawing, and redundant descriptions thereof are omitted. In describing the present invention, where a detailed description of related prior art could unnecessarily obscure the essence of the present invention, such description is omitted.

FIG. 1 is a functional block diagram illustrating the configuration of an extended reality service providing