US-12616903-B2 - System and method for generating and displaying avatars

US 12616903 B2

Abstract

Among other things, embodiments of the present disclosure provide systems and methods for modifying avatar components of avatar datasets for multiple users, generating avatars based on the datasets, and displaying multiple avatars on a display screen of a graphical user interface.

Inventors

  • Jacob Edward Blackstock
  • David James Kennedy
  • Shahan Panth
  • Dorian Franklin Baldwin

Assignees

  • SNAP INC.

Dates

Publication Date
2026-05-05
Application Date
2024-02-20

Claims (20)

  1. A system for managing user representations across multiple digital environments, the system comprising:
     one or more processors; and
     one or more memory storage devices storing instructions thereon, which, when executed by the one or more processors, cause the system to perform operations comprising:
     maintaining, by a first computing device, a unified avatar profile for a user, the unified avatar profile comprising a dataset of avatar components including discrete facial and bodily features and their relative positions;
     receiving, at the first computing device, a plurality of environment identifiers corresponding to a plurality of digital environments in which the user participates;
     for each of the plurality of environment identifiers, generating, by the first computing device, an environment-specific avatar by selecting a subset of the avatar components from the unified avatar profile based on predefined criteria associated with the respective digital environment, wherein the predefined criteria include at least one of: a visual style of the digital environment, user interaction mode within the digital environment, and avatar display requirements of the digital environment;
     linking, for each environment-specific avatar, the avatar profile to a user account in the respective digital environment, wherein the linking enables automatic adaptation of the environment-specific avatar in response to updates made to the unified avatar profile;
     transmitting, from the first computing device to each of the plurality of digital environments, a dataset for the environment-specific avatar for rendering the environment-specific avatar within the respective digital environment, wherein the rendering includes adapting visual characteristics of the environment-specific avatar to match the visual style and display requirements of the respective digital environment; and
     updating, by the first computing device, the unified avatar profile in response to user-initiated changes to any of the environment-specific avatars, wherein the updating includes modifying the dataset of avatar components based on user-initiated modifications and synchronizing the user-initiated modifications across the plurality of digital environments to maintain consistency of the user representation.
  2. The system of claim 1, wherein the maintaining of the unified avatar profile further comprises storing customization settings for each of the avatar components, the customization settings including color, texture, and accessories, and wherein the generating of the environment-specific avatar includes applying the customization settings to the selected subset of avatar components to enhance the visual integration of the environment-specific avatar within the respective digital environment.
  3. The system of claim 1, wherein the transmitting of the dataset for the environment-specific avatar further comprises encoding the dataset with compatibility metadata to facilitate the rendering of discrete facial and bodily features of the environment-specific avatar according to the visual style and display requirements specific to each of the plurality of digital environments.
  4. The system of claim 1, wherein the updating of the unified avatar profile includes aggregating feedback data from the plurality of digital environments regarding user interactions with the environment-specific avatars, and wherein the feedback data influences the automatic adaptation of the environment-specific avatars by adjusting at least one of the avatar components to enhance user engagement within each respective digital environment.
  5. The system of claim 1, wherein the predefined criteria further include processing capabilities of a client computing device, and wherein the generating of the environment-specific avatar further comprises optimizing the selection of the subset of avatar components to ensure compatibility with the processing capabilities of the client computing device, thereby facilitating smooth rendering and interaction within each respective digital environment.
  6. The system of claim 1, wherein the linking of the avatar profile to a user account in the respective digital environment includes associating a unique avatar identifier with the user account, and wherein the unique avatar identifier enables retrieval and display of the environment-specific avatar across different user devices associated with the user account, thereby providing a consistent user representation irrespective of the user device utilized to access the digital environment.
  7. The system of claim 1, wherein the generating of the environment-specific avatar includes a step of simulating the avatar within the respective digital environment to ensure that the selected subset of avatar components interact correctly with virtual physics and environmental conditions of the digital environment, and wherein any inconsistencies detected during the simulation are addressed by automatically adjusting the avatar components prior to finalizing the environment-specific avatar for transmission.
  8. A computer-implemented method for managing user representations across multiple digital environments, the method comprising:
     maintaining, by a first computing device, a unified avatar profile for a user, the unified avatar profile comprising a dataset of avatar components including discrete facial and bodily features and their relative positions;
     receiving, at the first computing device, a plurality of environment identifiers corresponding to a plurality of digital environments in which the user participates;
     for each of the plurality of environment identifiers, generating, by the first computing device, an environment-specific avatar by selecting a subset of the avatar components from the unified avatar profile based on predefined criteria associated with the respective digital environment, wherein the predefined criteria include at least one of: a visual style of the digital environment, user interaction mode within the digital environment, and avatar display requirements of the digital environment;
     linking, for each environment-specific avatar, the avatar profile to a user account in the respective digital environment, wherein the linking enables automatic adaptation of the environment-specific avatar in response to updates made to the unified avatar profile;
     transmitting, from the first computing device to each of the plurality of digital environments, a dataset for the environment-specific avatar for rendering the environment-specific avatar within the respective digital environment, wherein the rendering includes adapting visual characteristics of the environment-specific avatar to match the visual style and display requirements of the respective digital environment; and
     updating, by the first computing device, the unified avatar profile in response to user-initiated changes to any of the environment-specific avatars, wherein the updating includes modifying the dataset of avatar components based on user-initiated modifications and synchronizing the user-initiated modifications across the plurality of digital environments to maintain consistency of the user representation.
  9. The computer-implemented method of claim 8, wherein the maintaining of the unified avatar profile further comprises storing customization settings for each of the avatar components, the customization settings including color, texture, and accessories, and wherein the generating of the environment-specific avatar includes applying the customization settings to the selected subset of avatar components to enhance the visual integration of the environment-specific avatar within the respective digital environment.
  10. The computer-implemented method of claim 8, wherein the transmitting of the dataset for the environment-specific avatar further comprises encoding the dataset with compatibility metadata to facilitate the rendering of discrete facial and bodily features of the environment-specific avatar according to the visual style and display requirements specific to each of the plurality of digital environments.
  11. The computer-implemented method of claim 8, wherein the updating of the unified avatar profile includes aggregating feedback data from the plurality of digital environments regarding user interactions with the environment-specific avatars, and wherein the feedback data influences the automatic adaptation of the environment-specific avatars by adjusting at least one of the avatar components to enhance user engagement within each respective digital environment.
  12. The computer-implemented method of claim 8, wherein the predefined criteria further include processing capabilities of a client computing device, and wherein the generating of the environment-specific avatar further comprises optimizing the selection of the subset of avatar components to ensure compatibility with the processing capabilities of the client computing device, thereby facilitating smooth rendering and interaction within each respective digital environment.
  13. The computer-implemented method of claim 8, wherein the linking of the avatar profile to a user account in the respective digital environment includes associating a unique avatar identifier with the user account, and wherein the unique avatar identifier enables retrieval and display of the environment-specific avatar across different user devices associated with the user account, thereby providing a consistent user representation irrespective of the user device utilized to access the digital environment.
  14. The computer-implemented method of claim 8, wherein the generating of the environment-specific avatar includes a step of simulating the avatar within the respective digital environment to ensure that the selected subset of avatar components interact correctly with virtual physics and environmental conditions of the digital environment, and wherein any inconsistencies detected during the simulation are addressed by automatically adjusting the avatar components prior to finalizing the environment-specific avatar for transmission.
  15. A non-transitory computer-readable storage medium storing instructions that, when executed by at least one processor, cause the at least one processor to perform operations for managing user representations across multiple digital environments, the operations comprising:
     maintaining, by a first computing device, a unified avatar profile for a user, the unified avatar profile comprising a dataset of avatar components including discrete facial and bodily features and their relative positions;
     receiving, at the first computing device, a plurality of environment identifiers corresponding to a plurality of digital environments in which the user participates;
     for each of the plurality of environment identifiers, generating, by the first computing device, an environment-specific avatar by selecting a subset of the avatar components from the unified avatar profile based on predefined criteria associated with the respective digital environment, wherein the predefined criteria include at least one of: a visual style of the digital environment, user interaction mode within the digital environment, and avatar display requirements of the digital environment;
     linking, for each environment-specific avatar, the avatar profile to a user account in the respective digital environment, wherein the linking enables automatic adaptation of the environment-specific avatar in response to updates made to the unified avatar profile;
     transmitting, from the first computing device to each of the plurality of digital environments, a dataset for the environment-specific avatar for rendering the environment-specific avatar within the respective digital environment, wherein the rendering includes adapting visual characteristics of the environment-specific avatar to match the visual style and display requirements of the respective digital environment; and
     updating, by the first computing device, the unified avatar profile in response to user-initiated changes to any of the environment-specific avatars, wherein the updating includes modifying the dataset of avatar components based on user-initiated modifications and synchronizing the user-initiated modifications across the plurality of digital environments to maintain consistency of the user representation.
  16. The non-transitory computer-readable storage medium of claim 15, wherein the maintaining of the unified avatar profile further comprises storing customization settings for each of the avatar components, the customization settings including color, texture, and accessories, and wherein the generating of the environment-specific avatar includes applying the customization settings to the selected subset of avatar components to enhance the visual integration of the environment-specific avatar within the respective digital environment.
  17. The non-transitory computer-readable storage medium of claim 15, wherein the transmitting of the dataset for the environment-specific avatar further comprises encoding the dataset with compatibility metadata to facilitate the rendering of discrete facial and bodily features of the environment-specific avatar according to the visual style and display requirements specific to each of the plurality of digital environments.
  18. The non-transitory computer-readable storage medium of claim 15, wherein the updating of the unified avatar profile includes aggregating feedback data from the plurality of digital environments regarding user interactions with the environment-specific avatars, and wherein the feedback data influences the automatic adaptation of the environment-specific avatars by adjusting at least one of the avatar components to enhance user engagement within each respective digital environment.
  19. The non-transitory computer-readable storage medium of claim 15, wherein the predefined criteria further include processing capabilities of a client computing device, and wherein the generating of the environment-specific avatar further comprises optimizing the selection of the subset of avatar components to ensure compatibility with the processing capabilities of the client computing device, thereby facilitating smooth rendering and interaction within each respective digital environment.
  20. The non-transitory computer-readable storage medium of claim 15, wherein the linking of the avatar profile to a user account in the respective digital environment includes associating a unique avatar identifier with the user account, and wherein the unique avatar identifier enables retrieval and display of the environment-specific avatar across different user devices associated with the user account, thereby providing a consistent user representation irrespective of the user device utilized to access the digital environment.
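The data flow recited in the independent claims — a single unified profile of avatar components, per-environment subsets selected against each environment's criteria, and synchronization of user-initiated changes back across all linked environments — can be illustrated with a minimal Python sketch. All class and function names here (`AvatarComponent`, `UnifiedAvatarProfile`, `DigitalEnvironment`, `generate_environment_avatar`) are hypothetical and not taken from the patent, and matching components to an environment by a single visual-style tag is a deliberate simplification of the claimed "predefined criteria":

```python
from dataclasses import dataclass, field

@dataclass
class AvatarComponent:
    name: str            # a discrete facial or bodily feature, e.g. "eyes"
    position: tuple      # the feature's relative position
    style_tags: set      # visual styles this component can be rendered in

@dataclass
class DigitalEnvironment:
    env_id: str
    visual_style: str                            # one predefined criterion
    avatars: dict = field(default_factory=dict)  # user_id -> component subset

def generate_environment_avatar(profile, env):
    """Select the subset of profile components matching the environment's style."""
    return {name: c for name, c in profile.components.items()
            if env.visual_style in c.style_tags}

@dataclass
class UnifiedAvatarProfile:
    user_id: str
    components: dict = field(default_factory=dict)  # name -> AvatarComponent

    def update_component(self, component, environments):
        """Apply a user-initiated change, then re-generate the environment-specific
        avatar in every linked environment to keep the representation consistent."""
        self.components[component.name] = component
        for env in environments:
            env.avatars[self.user_id] = generate_environment_avatar(self, env)
```

As a usage sketch: updating the profile with an "eyes" component tagged for both a cartoon and a realistic style, and a "hat" component tagged only for the cartoon style, leaves the cartoon environment rendering both components while the realistic environment receives only the eyes — the same single edit propagating to every linked environment.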

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 17/450,040, filed Oct. 5, 2021, which is a continuation of U.S. patent application Ser. No. 15/401,926, filed on Jan. 9, 2017, now issued as U.S. Pat. No. 11,229,849, which is a continuation of U.S. patent application Ser. No. 13/979,974, filed on Jul. 16, 2013, which is a U.S. national-phase application filed under 35 U.S.C. § 371 from International Application Serial No. PCT/CA2013/000454, filed on May 8, 2013 and published as WO 2013/166588 on Nov. 14, 2013, which claims the benefit of priority to U.S. Provisional Application Ser. No. 61/644,057, filed on May 8, 2012, each of which is incorporated herein by reference in its entirety.

FIELD OF THE INVENTION

The present invention relates generally to providing user representations in computing environments. The present invention further relates to managing a particular user's representation in avatar form in computing environments.

BACKGROUND OF THE INVENTION

Environments that use avatars to represent users typically provide their own avatar creation tools. An avatar created by a user in one online environment is usually confined to that environment, so for each new environment the user typically must create a separate, different avatar. To update characteristics of multiple avatars, the user must change each avatar separately within each environment, which can be time consuming. Despite the apparent inefficiency of such a system, having a multiplicity of avatars may serve a practical purpose. Just as in real life, digital users exist in multiple contexts and may require different identities in different environments: for example, one identity for work, another for family and friends, another for video games, and others for interests and hobbies. A different avatar in each situation allows the user to present a contextually relevant appearance.

Nevertheless, as the number of digital environments grows, the user is compelled to create and manage an ever-increasing number of avatars, which creates a disincentive to design a separate avatar for each new environment. This reduces the value of avatars generally and, for environments that use avatars, adds a barrier to adoption. There are mechanisms that attempt to solve this problem by enabling users to use the same avatar in multiple environments, such as the one disclosed by Mason et al. in U.S. patent application Ser. No. 12/279,643 (published as US 2010/0011422 A1). However, such mechanisms require an avatar to be rendered identically in each environment, and therefore fail to provide the avatar's fundamental benefits, which include giving the user a contextually relevant identity and giving each environment a consistent look and feel. Therefore, what is needed is a solution that addresses at least some of these limitations.

SUMMARY OF THE INVENTION

In accordance with an aspect of the present invention, the present disclosure relates to a system and method for providing avatars adaptable to multiple environments. An "avatar" means any representation of a user that may be manifested visually in an environment. For example, an avatar may be manifested as a character in a video game, a user profile picture in a social networking website, or an emoticon in a messaging application. An "environment" in this disclosure broadly means any environment where an avatar may be manifested. For example, an environment may be an avatar creation application, video game, social networking website, messaging application, smartphone address book, or any other application where a user may want to have a representation.

In accordance with an aspect of the present invention, more particularly, the system and method provide data about an avatar's components, such as discrete facial and bodily features, and the relative positions of the features to one another, for the purpose of rendering different versions of a user's avatar that may be visually adapted to suit multiple environments. In accordance with an aspect of the present invention, the system and method enable data about a single avatar to be adapted and re-rendered in multiple environments in a virtually unlimited number of ways. This allows service providers (such as operators of a service environment) to reap the benefits of providing avatars designed to visually match their environments, while relieving the user of the need to create a new avatar specifically for each service. The user can instead create an avatar in one environment and link the avatar data to a user account in a second environment. The avatar can then be automatically re-rendered with the second environment's art set using avatar data from the first environment. The second environment can therefore choose not to offer an avatar creation tool at all, yet still give the user the full experience of interacting in its environment with a relevant and personalized avatar. In accordance with