US-20260127877-A1 - SHARED AUGMENTED REALITY SYSTEM
Abstract
An augmented reality system to perform operations that include: accessing image data at a client device; determining a position of a user of the client device based on the image data; causing display of a projection that extends from the position of the user upon a presentation of the image data at the client device; detecting an intersection of the projection and a surface of an object, the intersection corresponding with a portion of the surface of the object; generating a request that includes an identification of the portion of the surface of the object at the client device; and presenting the portion of the surface of the object based on a graphical property of the projection at the client device in response to the request that includes the identification of the portion of the surface of the object.
Inventors
- David Li
- Isac Andreas Müller Sandvik
- Qi Pan
- Rastan Boroujerdi
- Kevin Yimeng Hong
- Peng DENG
- Piers George Cowburn
- Jonathan Tang
- Junjie Wei
Assignees
- SNAP INC.
Dates
- Publication Date: 2026-05-07
- Application Date: 2025-12-22
Claims (20)
- 1. A method comprising: detecting a plurality of client devices within a geo-fence that encompasses a location of interest; accessing a shared Augmented-Reality (AR) communication session associated with the location of interest responsive to detecting the plurality of client devices within the geo-fence, the shared AR communication session comprising AR content; identifying objects within the location of interest available for AR interaction within the shared AR communication session; and causing display of notifications at the client devices, the notifications including an identification of the shared AR communication session and the objects available for AR interaction.
- 2. The method of claim 1, wherein detecting the plurality of client devices within the geo-fence comprises receiving location data from the plurality of client devices, and wherein the location data comprises at least one of GPS coordinates, Wi-Fi positioning data, or Bluetooth beacon data.
- 3. The method of claim 1, wherein identifying objects within the location of interest comprises accessing a database of pre-registered objects associated with the location of interest, the database including metadata associated with each object.
- 4. The method of claim 1, wherein the notifications include selectable elements that, when selected, join a respective client device to the shared AR communication session.
- 5. The method of claim 1, further comprising: receiving user input from at least one of the client devices indicating selection of an object for AR interaction; and rendering AR content associated with the selected object on a display of the at least one client device as an overlay on a camera view of the selected object.
- 6. The method of claim 1, further comprising enabling users of the plurality of client devices to collaboratively interact with the objects within the shared AR communication session, wherein collaborative interaction comprises enabling multiple users to simultaneously manipulate a shared virtual object.
- 7. The method of claim 1, further comprising synchronizing AR content display across the plurality of client devices such that users see consistent AR content within the shared AR communication session.
- 8. The method of claim 1, further comprising detecting when a client device exits the geo-fence and removing the client device from the shared AR communication session.
- 9. The method of claim 1, wherein the shared AR communication session enables real-time communication between users of the plurality of client devices, the real-time communication comprising at least one of text messaging, voice communication, video communication, or gesture-based communication.
- 10. A system comprising: one or more processors; and memory storing instructions that, when executed by the one or more processors, cause the system to: detect a plurality of client devices within a geo-fence that encompasses a location of interest; access a shared Augmented-Reality (AR) communication session associated with the location of interest responsive to detecting the plurality of client devices within the geo-fence, the shared AR communication session comprising AR content; identify objects within the location of interest available for AR interaction within the shared AR communication session; and cause display of notifications at the client devices, the notifications including an identification of the shared AR communication session and the objects available for AR interaction.
- 11. The system of claim 10, wherein the instructions further cause the system to receive location data from the plurality of client devices, the location data comprising at least one of GPS coordinates, Wi-Fi positioning data, or Bluetooth beacon data.
- 12. The system of claim 10, wherein the instructions further cause the system to access a database of pre-registered objects associated with the location of interest, the database including spatial coordinates of each object within the location of interest.
- 13. The system of claim 10, wherein the notifications include selectable elements that, when selected, join a respective client device to the shared AR communication session, and wherein the instructions further cause the system to enable collaborative interaction between users of the plurality of client devices.
- 14. The system of claim 10, wherein the instructions further cause the system to synchronize AR content display across the plurality of client devices and detect when a client device exits the geo-fence and remove the client device from the shared AR communication session.
- 15. The system of claim 10, wherein the instructions further cause the system to render AR content associated with a selected object on a display of at least one client device as an overlay on a camera view of the selected object.
- 16. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising: detecting a plurality of client devices within a geo-fence that encompasses a location of interest; accessing a shared Augmented-Reality (AR) communication session associated with the location of interest responsive to detecting the plurality of client devices within the geo-fence, the shared AR communication session comprising AR content; identifying objects within the location of interest available for AR interaction within the shared AR communication session; and causing display of notifications at the client devices, the notifications including an identification of the shared AR communication session and the objects available for AR interaction.
- 17. The non-transitory computer-readable medium of claim 16, wherein detecting the plurality of client devices comprises receiving location data from the plurality of client devices, and wherein identifying objects comprises accessing a database of pre-registered objects associated with the location of interest.
- 18. The non-transitory computer-readable medium of claim 16, wherein the notifications include selectable elements that, when selected, join a respective client device to the shared AR communication session.
- 19. The non-transitory computer-readable medium of claim 16, wherein the operations further comprise: enabling users of the plurality of client devices to collaboratively interact with the objects within the shared AR communication session; and synchronizing AR content display across the plurality of client devices.
- 20. The non-transitory computer-readable medium of claim 16, wherein the operations further comprise enabling real-time communication between users of the plurality of client devices, the real-time communication comprising at least one of text messaging, voice communication, video communication, or gesture-based communication.
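The geo-fence detection recited in the claims above can be illustrated with a great-circle distance test against reported device positions. This is only a sketch under the assumption of a circular geo-fence and GPS-style (latitude, longitude) reports; the function and variable names are illustrative, not from the patent:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def devices_in_geofence(devices, center, radius_m):
    """Return ids of devices whose reported position lies inside the circular geo-fence."""
    return [
        dev_id
        for dev_id, (lat, lon) in devices.items()
        if haversine_m(lat, lon, center[0], center[1]) <= radius_m
    ]

# Two devices near the location of interest, one far away (hypothetical coordinates).
devices = {
    "a": (40.7580, -73.9855),
    "b": (40.7585, -73.9850),
    "c": (40.7000, -74.0100),
}
inside = devices_in_geofence(devices, center=(40.7580, -73.9855), radius_m=200)
```

When `inside` contains a plurality of devices, a server could then send each of them the notification identifying the shared AR session and the available objects.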
Description
CROSS-REFERENCE TO RELATED APPLICATION
This application is a continuation of U.S. patent application Ser. No. 18/203,876, filed May 31, 2023, which application is a continuation of U.S. patent application Ser. No. 17/584,946, filed Jan. 26, 2022, now issued as U.S. patent Ser. No. 11/776,256, which application is a continuation of U.S. patent application Ser. No. 17/119,597, filed Dec. 11, 2020, now issued as U.S. Pat. No. 11,263,459, which application is a continuation of U.S. patent application Ser. No. 16/833,087, filed Mar. 27, 2020, now issued as U.S. Pat. No. 10,956,743, all of which are incorporated by reference herein in their entirety.
TECHNICAL FIELD
Embodiments of the present disclosure relate generally to mobile computing technology and, more particularly, but not by way of limitation, to systems for generating and causing display of augmented reality media.
BACKGROUND
Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory inputs.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.
FIG. 1 is a block diagram showing an example messaging system for exchanging data (e.g., messages and associated content) over a network in accordance with some embodiments, wherein the messaging system includes an augmented reality system.
FIG. 2 is a block diagram illustrating further details regarding a messaging system, according to example embodiments.
FIG. 3 is a block diagram illustrating various modules of an augmented reality system, according to certain example embodiments.
FIG. 4 is a flowchart depicting a method of presenting a shared augmented reality interface, according to certain example embodiments.
FIG. 5 is a flowchart depicting a method of presenting a shared augmented reality interface, according to certain example embodiments.
FIG. 6 is a flowchart depicting a method of presenting augmented reality content in a shared augmented reality interface, according to certain example embodiments.
FIG. 7 is a flowchart depicting a method of presenting augmented reality content in a shared augmented reality interface, according to certain example embodiments.
FIG. 8 is an interface diagram depicting a shared augmented reality interface, according to certain example embodiments.
FIG. 9 is a block diagram illustrating a representative software architecture, which may be used in conjunction with various hardware architectures herein described and used to implement various embodiments.
FIG. 10 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
DETAILED DESCRIPTION
As discussed above, AR systems provide users with graphical user interfaces (GUIs) that display a live direct or indirect view of a physical, real-world environment, wherein elements of the view are augmented by computer-generated sensory inputs. For example, an AR interface may present media content at positions within a display of a view of a real-world environment, such that the media content appears to interact with elements in the real-world environment. Certain example embodiments discussed herein therefore provide an AR system to generate and present a "shared" AR experience, wherein multiple users may take part in the same AR session in real-time.
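The shared-session behavior described above (and recited in claims 6-8: collaborative manipulation, synchronized display, and removal on geo-fence exit) can be sketched as a versioned session that fans every state change out to all joined devices. This is a minimal in-memory model under assumed names; the patent does not specify this data structure:

```python
from dataclasses import dataclass, field

@dataclass
class SharedARSession:
    """Toy shared AR session: each state change gets a monotonically
    increasing version and is delivered to every member's inbox, so all
    devices can render consistent AR content."""
    members: set = field(default_factory=set)
    version: int = 0
    state: dict = field(default_factory=dict)
    inboxes: dict = field(default_factory=dict)

    def join(self, device_id):
        # New members receive a snapshot of the current session state.
        self.members.add(device_id)
        self.inboxes[device_id] = [(self.version, dict(self.state))]

    def leave(self, device_id):
        # E.g. invoked when a device exits the geo-fence (claim 8).
        self.members.discard(device_id)
        self.inboxes.pop(device_id, None)

    def update(self, object_id, transform):
        """Apply one user's manipulation of a shared object and fan it out."""
        self.version += 1
        self.state[object_id] = transform
        for device_id in self.members:
            self.inboxes[device_id].append((self.version, {object_id: transform}))

session = SharedARSession()
session.join("alice")
session.join("bob")
session.update("statue", {"pos": (1.0, 0.0, 2.0)})
session.leave("bob")
```

A production system would replace the inboxes with network delivery (e.g. WebSocket pushes) and resolve concurrent manipulations of the same object, but the version-stamped fan-out captures the consistency guarantee the claims describe.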
Accordingly, a shared AR system is disclosed which performs operations that include: accessing image data at a client device, the image data comprising a set of image features, the set of image features defining a surface of an object; determining a position of a user of the client device based on the set of image features; causing display of a projection that extends from the position of the user upon a presentation of the image data at the client device, the projection comprising a graphical property and having a trajectory based on the position of the user of the client device; detecting an intersection of the projection and the surface of the object based on the trajectory, the intersection corresponding with a portion of the surface of the object; generating a request that includes an identification of the portion of the surface of the object at the client device; and presenting the portion of the surface of the object based on the graphical property of the projection at the client device in response to the request that includes the identification of the portion of the surface of the object. In some embodiments, the display of the projection may be responsive to a user input received at the client device. For example, a user of the client device may provide a tactile input at the client device, wherein t
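The projection-and-intersection operation above is essentially a ray cast from the user's position along a trajectory until it meets a surface. Assuming the detected surface is locally planar, one way the intersection test could work is sketched below; all names and coordinates are illustrative, not taken from the patent:

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal, eps=1e-9):
    """Point where a ray (user position plus viewing trajectory) meets a
    planar surface, or None if the ray is parallel to or points away from it."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < eps:
        return None  # ray runs parallel to the surface
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / dot
    if t < 0:
        return None  # surface lies behind the user
    return tuple(o + t * d for o, d in zip(origin, direction))

# User standing at the origin (eye height 1.5 m) looking down +z toward a wall at z = 5.
hit = ray_plane_intersection(
    origin=(0.0, 1.5, 0.0),
    direction=(0.0, 0.0, 1.0),
    plane_point=(0.0, 0.0, 5.0),
    plane_normal=(0.0, 0.0, -1.0),
)
# hit == (0.0, 1.5, 5.0)
```

The returned point would identify the portion of the surface to include in the request, with the projection's graphical property (e.g. a color or texture) then applied to that portion on every participating device.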