US-20260126852-A1 - INFORMATION PROCESSING APPARATUS, METHOD, PROGRAM, AND INFORMATION PROCESSING SYSTEM

US 20260126852 A1

Abstract

An information processing apparatus of the present invention includes an acquisition unit that acquires space information indicative of a position of a physical object in a first space around a first user, a space construction unit that constructs, on the basis of the space information, a shared space in which movements of the first user and a second user who exists in a second space different from the first space are reflected, and a determination unit that determines a position of the second user in the shared space.

Inventors

  • Yoshinori Ohashi

Assignees

  • SONY INTERACTIVE ENTERTAINMENT INC.

Dates

Publication Date
2026-05-07
Application Date
2026-01-05
Priority Date
2020-06-23

Claims (20)

  1. An information processing apparatus comprising: circuitry configured to: acquire space information of a first physical space and a second physical space, the space information being indicative of a position of a physical object in the first physical space around a first user; construct, based on the space information, a shared virtual space in which movements of the first user in the first physical space and movements of a second user who exists in the second physical space are reflected, the shared virtual space comprising a first virtual space corresponding to the first physical space; and determine a position of a virtual avatar of the second user in the shared virtual space, the position of the virtual avatar of the second user in the shared virtual space being outside of the first virtual space corresponding to the first physical space.
  2. The information processing apparatus of claim 1, wherein the position of the second user in the shared virtual space is determined based on the movements of the second user in the second physical space.
  3. The information processing apparatus of claim 1, wherein the circuitry is further configured to: acquire updated space information of the first physical space and the second physical space, wherein the updated space information indicates a position of the first user in the first physical space and a position of the second user in the second physical space; and update the shared virtual space based on the updated space information.
  4. The information processing apparatus of claim 1, wherein the circuitry is further configured to determine a size of the virtual avatar of the second user in the shared virtual space.
  5. The information processing apparatus of claim 1, wherein the shared virtual space is shared by a virtual avatar of the first user and the virtual avatar of the second user, and wherein the position of the physical object in the first physical space is reflected in the first virtual space.
  6. The information processing apparatus of claim 1, wherein the circuitry is further configured to: acquire scale information for setting a scale for the virtual avatar of the second user in the shared virtual space; and determine the scale for the virtual avatar of the second user based on the scale information.
  7. The information processing apparatus of claim 1, wherein the space information is further indicative of a color and a texture of the physical object in the first physical space, and wherein the circuitry is further configured to determine a color and a texture of a virtual object such that the color and the texture of the virtual object reflects the color and texture of the physical object in the first physical space.
  8. The information processing apparatus of claim 1, wherein, in a case where the virtual avatar of the second user is brought into contact with a contact portion of a first virtual space corresponding to the first physical space, the circuitry is further configured to change a state of the contact portion of the first virtual space corresponding to the first physical space.
  9. The information processing apparatus of claim 8, wherein the change of the state of the contact portion is destruction of the contact portion of the first virtual space, and, in a case where the virtual avatar of the second user is positioned at the destroyed contact portion, the circuitry is further configured to determine a position of the virtual avatar of the second user at the destroyed contact portion in such a manner that the virtual avatar of the second user falls from the position of the destroyed contact portion.
  10. A computer-implemented method comprising: acquiring space information of a first physical space and a second physical space, the space information being indicative of a position of a physical object in the first physical space around a first user; constructing, based on the space information, a shared virtual space in which movements of the first user in the first physical space and movements of a second user who exists in the second physical space are reflected, the shared virtual space comprising a first virtual space corresponding to the first physical space; and determining a position of a virtual avatar of the second user in the shared virtual space, the position of the virtual avatar of the second user in the shared virtual space being outside of the first virtual space corresponding to the first physical space.
  11. The computer-implemented method of claim 10, wherein the position of the second user in the shared virtual space is determined based on the movements of the second user in the second physical space.
  12. The computer-implemented method of claim 10, further comprising: acquiring updated space information of the first physical space and the second physical space, wherein the updated space information indicates a position of the first user in the first physical space and a position of the second user in the second physical space; and updating the shared virtual space based on the updated space information.
  13. The computer-implemented method of claim 10, further comprising determining a size of the virtual avatar of the second user in the shared virtual space.
  14. The computer-implemented method of claim 10, wherein the shared virtual space is shared by a virtual avatar of the first user and the virtual avatar of the second user, and wherein the position of the physical object in the first physical space is reflected in the first virtual space.
  15. The computer-implemented method of claim 10, further comprising: acquiring scale information for setting a scale for the virtual avatar of the second user in the shared virtual space; and determining the scale for the virtual avatar of the second user based on the scale information.
  16. The computer-implemented method of claim 10, wherein the space information is further indicative of a color and a texture of the physical object in the first physical space, and wherein the computer-implemented method further comprises determining a color and a texture of a virtual object such that the color and the texture of the virtual object reflects the color and texture of the physical object in the first physical space.
  17. The computer-implemented method of claim 10, wherein, in a case where the virtual avatar of the second user is brought into contact with a contact portion of a first virtual space corresponding to the first physical space, the computer-implemented method further comprises changing a state of the contact portion of the first virtual space corresponding to the first physical space.
  18. The computer-implemented method of claim 17, wherein the change of the state of the contact portion is destruction of the contact portion of the first virtual space, and, in a case where the virtual avatar of the second user is positioned at the destroyed contact portion, the computer-implemented method further comprises determining a position of the virtual avatar of the second user at the destroyed contact portion in such a manner that the virtual avatar of the second user falls from the position of the destroyed contact portion.
  19. A non-transitory, computer readable storage medium containing a computer program, which when executed by a computer, causes the computer to perform a method, comprising: acquiring space information of a first physical space and a second physical space, the space information being indicative of a position of a physical object in the first physical space around a first user; constructing, based on the space information, a shared virtual space in which movements of the first user in the first physical space and movements of a second user who exists in the second physical space are reflected, the shared virtual space comprising a first virtual space corresponding to the first physical space; and determining a position of a virtual avatar of the second user in the shared virtual space, the position of the virtual avatar of the second user in the shared virtual space being outside of the first virtual space corresponding to the first physical space.
  20. The non-transitory, computer readable storage medium of claim 19, wherein: the space information is further indicative of a color and a texture of the physical object in the first physical space, and wherein the method further comprises determining a color and a texture of a virtual object such that the color and the texture of the virtual object reflects the color and texture of the physical object in the first physical space; or in a case where the virtual avatar of the second user is brought into contact with a contact portion of a first virtual space corresponding to the first physical space, the method further comprises changing a state of the contact portion of the first virtual space corresponding to the first physical space.
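The central constraint of independent claims 1, 10, and 19 — the second user's avatar is placed in the shared virtual space but outside the first virtual space that mirrors the first user's physical room — can be sketched in code. The following is a purely illustrative sketch, not the patented implementation: all class names, field names, and the boundary-snapping rule for moving an avatar out of the first virtual space are hypothetical choices made for this example.

```python
from dataclasses import dataclass, field


@dataclass
class Bounds:
    """Axis-aligned floor-plan bounds of a physical space, in metres (hypothetical model)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


@dataclass
class SharedSpace:
    """Shared virtual space comprising a first virtual space that mirrors
    the first physical space (cf. claims 1 and 10)."""
    first_virtual_space: Bounds
    avatar_positions: dict = field(default_factory=dict)  # user id -> (x, y)


def construct_shared_space(first_space: Bounds) -> SharedSpace:
    """Construct the shared virtual space from acquired space information."""
    return SharedSpace(first_virtual_space=first_space)


def place_second_user(shared: SharedSpace, requested: tuple) -> tuple:
    """Determine the second user's avatar position, keeping it outside the
    first virtual space. The push-to-nearest-x-boundary rule here is an
    arbitrary example policy, not taken from the patent text."""
    x, y = requested
    fv = shared.first_virtual_space
    if fv.contains(x, y):
        # Move the avatar just beyond the nearer x-boundary of the room.
        if (x - fv.x_min) > (fv.x_max - x):
            x = fv.x_max + 1.0
        else:
            x = fv.x_min - 1.0
    shared.avatar_positions["user_b"] = (x, y)
    return (x, y)


room = Bounds(0.0, 4.0, 0.0, 3.0)       # first physical space, 4 m x 3 m
shared = construct_shared_space(room)
pos = place_second_user(shared, (1.0, 1.5))  # requested point lies inside the room
print(pos)                               # avatar ends up outside the first virtual space
```

The sketch only captures the positional constraint; the claimed system also reflects both users' movements in real time and, in the dependent claims, handles scale, color/texture, and contact-driven state changes, none of which are modeled here.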

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of U.S. patent application Ser. No. 18/788,232, accorded a filing date of Jul. 30, 2024, which is a continuation application of U.S. patent application Ser. No. 18/001,094, accorded a filing date of Dec. 8, 2022, now U.S. Pat. No. 12,086,302, which is a U.S. National Stage application of International Application No. PCT/JP2021/022794, filed Jun. 16, 2021, which claims priority to Japanese Patent Application No. 2020-107907, filed Jun. 23, 2020, and to Japanese Patent Application No. 2020-152094, filed Sep. 10, 2020, the entire disclosures of which are hereby incorporated by reference.

TECHNICAL FIELD

The present invention relates to an information processing apparatus, a method, a program, and an information processing system.

BACKGROUND ART

In recent years, a technology has been examined by which a movement of a body and a space are shared on a real-time basis by a plurality of users who are present in remote places spaced away from each other, in such a manner as to allow the users to have such an experience that they feel as if they were in the same place. For example, a technology called telexistence provides an environment in which an operation and so forth are performed on a real-time basis while a user is allowed to feel, through a head-mounted display or the like, as if something or some person in a remote place were present near the user.

SUMMARY

Technical Problem

For the technology described above, it is demanded to provide a novel viewing experience to a user. Taking the problem described above into consideration, it is one of the objects of the present invention to provide a novel viewing experience to a user.

Solution to Problem

In order to solve the problem described above, an information processing apparatus of one aspect of the present invention includes an acquisition unit that acquires space information indicative of a position of a physical object in a first space around a first user, a space construction unit that constructs, on the basis of the space information, a shared space that is shared by the first user and a second user who exists in a second space different from the first space and in which the position of the physical object in the first space is reflected, and a determination unit that determines a position of the second user in the shared space. It is to be noted that any combinations of the foregoing, as well as the components and representations of the present invention as they are converted between methods, apparatuses, programs, transitory or non-transitory storage media in which a program is stored, systems, and so forth, are also effective as aspects of the present invention.

Advantageous Effect of Invention

According to the present invention, a novel viewing experience can be provided to the user.

BRIEF DESCRIPTION OF DRAWINGS

FIG. 1 is an overview diagram of an information processing system.
FIG. 2 depicts an example of utilization of an embodiment by a user.
FIG. 3 is a functional block diagram of the information processing system.
FIG. 4A exemplifies an AR space video displayed by an HMD of a user A.
FIG. 4B exemplifies another AR space video displayed by the HMD of the user A.
FIG. 5 is a sequence diagram depicting a flow of processing in the information processing system.
FIG. 6A exemplifies an AR space video displayed by the HMD of the user A.
FIG. 6B exemplifies another AR space video displayed by the HMD of the user A.
FIG. 7 exemplifies an AR space video displayed by an HMD of a user B.
FIG. 8 is a sequence diagram depicting a flow of processing in the information processing system.
FIG. 9 is an overview diagram of the information processing system.
FIG. 10 is a functional block diagram of the information processing system.
FIG. 11 is a sequence diagram depicting a flow of processing in the information processing system.

DESCRIPTION OF EMBODIMENTS

First Embodiment

FIG. 1 is an overview diagram of an information processing system 10 according to an embodiment. The information processing system 10 of FIG. 1 includes a plurality of information processing terminals 100. The plurality of information processing terminals 100 are individually connected for data communication to each other through a communication network 5 such as the Internet. The information processing system 10 of the present embodiment includes two information processing terminals, i.e., an information processing terminal 100A used by a user A and another information processing terminal 100B used by a user B. However, the information processing system 10 is not limited to this and may include three or more information processing terminals 100. The information processing terminals 100A and 100B each include a control unit 11, a storage unit 12, a communication unit 13, and an interface unit 14. The information processing terminals 100A and 100B are each connected to a head-mounted display (HMD) 15, a stereo camera 16,