US-12620262-B2 - Using artificial entities for generating personalized responses
Abstract
Systems, methods and non-transitory computer readable media for generating and operating artificial entities are provided. Some disclosed embodiments may involve receiving information related to a source individual; generating an artificial entity associated with the source individual based on the received information; receiving data reflecting an interaction with the artificial entity; and determining a manner for the artificial entity to respond to the interaction based on the received information.
Inventors
- Ben Avi Ingel
- Ron Zass
Assignees
- Ben Avi Ingel
- Ron Zass
Dates
- Publication Date
- May 5, 2026
- Application Date
- June 2, 2025
Claims (20)
- 1 . A non-transitory computer readable medium containing instructions that when executed by at least one processor cause the at least one processor to perform operations for operating artificial entities, the operations comprising: receiving information related to a source individual; analyzing the received information to determine behavior patterns of the source individual when encountering a plurality of situations, wherein the plurality of situations includes a first situation including an interaction with a first individual that triggered first behavior patterns from the source individual and a second situation that triggered second behavior patterns from the source individual, wherein the first behavior patterns differ from the second behavior patterns; generating an artificial entity associated with the source individual based on the received information; receiving data reflecting a current situation that includes an interaction between the artificial entity and a target individual, wherein the current situation is closer to the first situation than the second situation; determining a manner for the artificial entity to respond to the current situation based on the first behavior patterns of the source individual, wherein a determination to use the first behavior patterns for determining the manner for the artificial entity to respond to the current situation is based on an association of the target individual to the first individual; and causing the artificial entity to respond to the current situation using the determined manner.
- 2 . The non-transitory computer readable medium of claim 1 , wherein the determined behavior patterns include a reaction of the source individual to a statement made by the first individual, and the determined manner for the artificial entity is associated with the reaction of the source individual.
- 3 . The non-transitory computer readable medium of claim 1 , wherein determining the manner for the artificial entity to respond to the current situation includes predicting a probable reaction of the source individual to the target individual in the current situation based on how the source individual behaved when interacting with the first individual.
- 4 . The non-transitory computer readable medium of claim 1 , wherein determining the manner for the artificial entity to respond to the current situation further includes determining context to the interaction between the artificial entity and the target individual, and using the determined context to determine the manner to respond to the current situation.
- 5 . The non-transitory computer readable medium of claim 1 , wherein the received information includes image data associated with the interaction of the source individual with the first individual, the determined behavior patterns include preferences of the source individual determined from the image data, and the manner for the artificial entity to respond to the current situation is based on the preferences of the source individual.
- 6 . The non-transitory computer readable medium of claim 1 , wherein the received information includes positioning data associated with the interaction of the source individual with the first individual, and the operations further include determining context of the interaction of the source individual with the first individual inferred from the positioning data, and the manner for the artificial entity to respond to the current situation is based on the determined context.
- 7 . The non-transitory computer readable medium of claim 1 , wherein the received information includes chat history with the first individual, the determined behavior patterns include at least one favorite emoji that the source individual uses when communicating with the first individual, and the manner for the artificial entity to respond to the current situation is based on the at least one favorite emoji of the source individual.
- 8 . The non-transitory computer readable medium of claim 1 , wherein the received information includes social media posts associated with the first individual, the determined behavior patterns include a style of the source individual inferred from the social media posts, and the manner for the artificial entity to respond to the current situation is based on the style of the source individual.
- 9 . The non-transitory computer readable medium of claim 1 , wherein the determined behavior patterns of the source individual are associated with at least one of the following: personal bias of the source individual, cultural influence associated with the source individual, a level of interest of the source individual in a particular topic, a level of expertise of the source individual in a particular topic, a communication style of the source individual, a level of openness of the source individual to new content suggested by the first individual, a level of trust of the source individual, or a level of politeness of the source individual toward the first individual.
- 10 . The non-transitory computer readable medium of claim 1 , wherein the second situation includes an additional interaction with the first individual, but the first situation and the second situation are characterized by different environmental or social factors, and the current situation has environmental or social factors closer to the first situation than to the second situation.
- 11 . The non-transitory computer readable medium of claim 1 , wherein the target individual is the first individual or someone from the same social circle as the first individual.
- 12 . The non-transitory computer readable medium of claim 1 , wherein the target individual is another artificial entity associated with the first individual.
- 13 . The non-transitory computer readable medium of claim 1 , wherein the association of the target individual to the first individual is determined based on at least one of social network friends, bibliographic details, professional connections, or a common trait between the target individual and the first individual.
- 14 . The non-transitory computer readable medium of claim 1 , wherein the operations further include using at least one psychometric model on the behavior patterns of the source individual when encountering a plurality of situations to determine a personality profile of the source individual, and based on the personality profile of the source individual determining the manner for the artificial entity to respond to the current situation.
- 15 . The non-transitory computer readable medium of claim 1 , wherein the second situation includes an additional interaction with a second individual, wherein the first individual is associated with a first level of intimacy greater than a second level of intimacy associated with the second individual, and the operations further include determining that the current situation is closer to the first situation than the second situation based on a determined level of intimacy associated with the target individual.
- 16 . The non-transitory computer readable medium of claim 15 , wherein the determined manner for the artificial entity to respond to the current situation involves using more casual language than it would have used when interacting with another individual associated with the second level of intimacy.
- 17 . The non-transitory computer readable medium of claim 15 , wherein the determined manner for the artificial entity to respond to the current situation involves sharing at least one private detail that it would not have shared when interacting with another individual associated with the second level of intimacy.
- 18 . The non-transitory computer readable medium of claim 1 , wherein the second situation includes an additional interaction with a second individual, wherein the first individual is associated with a first age and the second individual is associated with a second age that differs from the first age, and the operations further include determining that the current situation is closer to the first situation than the second situation based on an estimated age of the target individual, and wherein the determined manner for the artificial entity to respond to the current situation involves using age-appropriate language.
- 19 . The non-transitory computer readable medium of claim 1 , wherein causing the artificial entity to respond to the current situation includes using a generative model to generate a visual display of the artificial entity being an interactive representation of the source individual that mimics visual and auditory attributes of the source individual.
- 20 . The non-transitory computer readable medium of claim 1 , wherein the received information is part of a personal archive of the source individual stored in at least one data structure separated from a server configured to generate the artificial entity, and wherein the personal archive includes digital representations of at least some of the following: correspondence, image data, journals, certificates, audio recordings, social media content, family history records, career materials, personal projects, medical records, and food-related memos.
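Claim 1 describes a pipeline in which past situations, each with its own observed behavior patterns, are ranked against a current interaction, and the pattern set of the closest matching situation governs the artificial entity's response. The Python sketch below illustrates one way that situation-matching logic could work; the class names, the social-graph structure, and the scoring heuristic (association with the target first, then overlap of environmental or social factors, per claims 10, 11, and 13) are illustrative assumptions, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Situation:
    participant: str              # individual the source interacted with
    factors: set[str]             # environmental/social factors of the situation
    behavior_patterns: list[str]  # behaviors the source individual exhibited

@dataclass
class ArtificialEntity:
    source: str
    situations: list[Situation] = field(default_factory=list)

    def _associated(self, target: str, participant: str,
                    social_graph: dict[str, set[str]]) -> bool:
        # Target counts as "associated" with a past participant if it is the
        # same individual or belongs to the same social circle.
        return target == participant or target in social_graph.get(participant, set())

    def respond(self, target: str, current_factors: set[str],
                social_graph: dict[str, set[str]]) -> list[str]:
        # Rank past situations: association with the target dominates,
        # then overlap of environmental/social factors breaks ties.
        def score(s: Situation) -> tuple[int, int]:
            assoc = int(self._associated(target, s.participant, social_graph))
            overlap = len(current_factors & s.factors)
            return (assoc, overlap)

        closest = max(self.situations, key=score)
        # The "manner" of response is derived from that situation's patterns.
        return closest.behavior_patterns

entity = ArtificialEntity(source="Alice")
entity.situations.append(
    Situation("Bob", {"casual", "evening"}, ["informal tone", "uses emoji"]))
entity.situations.append(
    Situation("Boss", {"formal", "office"}, ["polite", "concise"]))
graph = {"Bob": {"Carol"}}  # Carol shares Bob's social circle
print(entity.respond("Carol", {"casual"}, graph))  # informal patterns apply
```

Because Carol is in Bob's social circle and the current factors overlap the first situation, the first behavior patterns are selected, mirroring the claim's requirement that the choice be "based on an association of the target individual to the first individual."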
Description
CROSS-REFERENCE TO RELATED APPLICATIONS This application is a continuation of U.S. patent application Ser. No. 18/813,903, filed on Aug. 23, 2024 (pending), which claims the benefit of priority of U.S. Provisional Patent Application No. 63/535,234 (filed on Aug. 29, 2023), U.S. Provisional Patent Application No. 63/549,534 (filed on Feb. 4, 2024), U.S. Provisional Patent Application No. 63/685,978 (filed on Aug. 22, 2024), and U.S. Provisional Patent Application No. 63/685,988 (filed on Aug. 22, 2024), the disclosures of which are incorporated herein by reference in their entirety. BACKGROUND OF THE INVENTION Technological Field Some disclosed embodiments generally relate to systems and methods for generating and operating artificial entities. Background Information In today's world, artificial entities based on the Generative Pre-trained Transformer (GPT) architecture and other natural language processing (NLP) models respond to users' questions using generic databases and their conversation records. Advances in technology have now made personalized artificial entities feasible. Personalized artificial entities can harness deep-learning algorithms to process an individual's data, be it text, audio, photos, or videos. By doing so, the personalized artificial entities can mirror or adjust to the unique cognitive traits, preferences, and manner of interactions of their source individuals. This provides the personalized artificial entities with the ability to offer functionalities with authenticity and engagement with the source individuals and other individuals. This technology can change how humans interact in digital environments, enabling new forms of communication, productivity, entertainment, and social engagement.
SUMMARY OF THE INVENTION Systems, methods and non-transitory computer readable media for generating and operating artificial entities are provided. Some disclosed embodiments may involve receiving information related to a source individual; generating an artificial entity associated with the source individual based on the received information; receiving data reflecting an interaction with the artificial entity; and determining a manner for the artificial entity to respond to the interaction based on the received information. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims. BRIEF DESCRIPTION OF DRAWINGS The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings: FIG. 1 is a block diagram illustrating a system that enables generation of artificial entities, consistent with some embodiments of the present disclosure. FIG. 2 is a block diagram of an exemplary computing device and exemplary server, consistent with some embodiments of the present disclosure. FIG. 3A is a diagram illustrating examples of input data of the system of FIG. 1, consistent with some embodiments of the present disclosure. FIG. 3B is a flowchart of an example process for generating and operating artificial entities, consistent with some embodiments of the present disclosure. FIG. 4A is an illustration of a first use case for using an artificial entity, consistent with some embodiments of the present disclosure. FIG. 4B is a flowchart of an example process associated with the first use case, consistent with some embodiments of the present disclosure. FIG. 5A is an illustration of a second use case for using an artificial entity, consistent with some embodiments of the present disclosure. FIG.
5B is a flowchart of an example process associated with the second use case, consistent with some embodiments of the present disclosure. FIG. 6A is an illustration of a third use case for using an artificial entity, consistent with some embodiments of the present disclosure. FIG. 6B is a flowchart of an example process associated with the third use case, consistent with some embodiments of the present disclosure. FIG. 7A is an illustration of a fourth use case for using an artificial entity, consistent with some embodiments of the present disclosure. FIG. 7B is a flowchart of an example process associated with the fourth use case, consistent with some embodiments of the present disclosure. FIGS. 8, 10, 12, 14, 16, 18, 20, 22, 24, and 26 are illustrations of different features of artificial entities, consistent with some embodiments of the present disclosure. FIGS. 9, 11, 13, 15, 17, 19, 21, 23, 25, and 27 are flowcharts of example processes for operating artificial entities according to different embodiments of the present disclosure. DETAILED DESCRIPTION OF THE INVENTION Exemplary embodiments are described with reference to the accompanying drawings