
EP-3924073-B1 - ALIGNING LOCATION FOR A SHARED AUGMENTED REALITY EXPERIENCE


Inventors

  • CAHILL, Jason Matthew
  • OLAFSSON, Torfi Frans
  • MERRIAM, Jesse Dylan
  • PERSSON, Michael Meincke
  • SHUBER, Bradley Reid

Dates

Publication Date
2026-05-06
Application Date
2020-01-31

Claims (15)

  1. A second device (120), comprising: a computer comprising a camera, a processor and a memory having computer-executable instructions stored thereupon which, when executed by the processor, cause the second device to: perform augmented reality tracking to establish a relative coordinate space of the second device (510); recognize a spatial alignment image (210) displayed on a first device using the camera and receive an identifier of an augmented reality session (520); record a location of the second device within the coordinate space of the second device and a timestamp associated with a clock of the second device associated with recognition of the spatial alignment image (530); send a request for information to the first device, the request including the timestamp (540); in response to the request, receive information from the first device comprising the first device's location within a relative coordinate space of the first device at or about the timestamp, and, a spatial origin of the first device (550); calculate an offset between the coordinate space of the second device and the coordinate space of the first device to create a shared coordinate space based, at least in part, upon the received information (560); and utilize the shared coordinate space and the identifier to display the augmented reality session (570).
  2. The second device of claim 1, wherein the spatial alignment image is displayed at a predetermined size and with a plurality of features comprising predefined specific groupings of pixels of predefined colors and predefined intensities that allow the second device to determine its location in six degrees of position relative to the first device.
  3. The second device of claim 1, the memory having further computer-executable instructions stored thereupon which, when executed by the processor, cause the second device to: synchronize the clock of the second device to a clock of the first device.
  4. The second device of claim 1, wherein the session identifier is displayed on the first device and comprises a multi-dimensional barcode (220).
  5. The second device of claim 1, wherein the augmented reality session comprises a multi-party augmented reality building video game.
  6. The second device of claim 1, the memory having further computer-executable instructions stored thereupon which, when executed by the processor, cause the second device to: display a virtual object associated with the augmented reality session.
  7. The second device of claim 1, wherein the second device comprises a mobile phone.
  8. A method of creating a shared coordinate space in an augmented reality session between a first device (110) and a second device (120), each initially with disjoint relative coordinate spaces, comprising: by the first device, performing augmented reality tracking to establish a relative coordinate space of the first device (604); by the first device, displaying a spatial alignment image (210, 612); by the first device, storing location information regarding the first device and associated timestamps for at least a portion of a time the spatial alignment image is displayed (616); by the first device, receiving a request for information from the second device, the request including a timestamp associated with a clock of the second device associated with recognition of the spatial alignment image (632); and by the first device, providing location information regarding the first device at or about the timestamp, and, a spatial origin of the first device (636).
  9. The method of claim 8, further comprising: by the second device, recognizing the spatial alignment image displayed on the first device and receiving the identifier of the augmented reality session (620); by the second device, recording a location of the second device within the coordinate space of the second device and a timestamp associated with a clock of the second device associated with recognition of the spatial alignment image (624); by the second device, sending a request for information to the first device, the request including the timestamp (628); by the second device, in response to the request, receiving information from the first device comprising the first device's location within a relative coordinate space of the first device at or about the timestamp, and, the spatial origin of the first device (640); by the second device, calculating an offset between the coordinate space of the second device and the coordinate space of the first device to create the shared coordinate space based, at least in part, upon the received information (644); and by the second device, utilizing the shared coordinate space and the identifier to display the augmented reality session (648).
  10. The method of claim 8 or 9, wherein the spatial alignment image is displayed at a predetermined size and with a plurality of features comprising predefined specific groupings of pixels of predefined colors and predefined intensities that allow the second device to determine its location in six degrees of position relative to the first device.
  11. The method of claim 8 or 9, further comprising: synchronizing the clock of the second device to a clock of the first device.
  12. The method of claim 8 or 9, further comprising: displaying a virtual object associated with the augmented reality session on the first device.
  13. The method of claim 8 or 9, wherein the first device comprises a mobile phone.
  14. A computer storage media storing computer-readable instructions that when executed cause a computing device to perform the steps of the first device of the method of claim 8.
  15. The computer storage media of claim 14, wherein the spatial alignment image is displayed at a predetermined size and with a plurality of features comprising predefined specific groupings of pixels of predefined colors and predefined intensities that allow the second device to determine its location in six degrees of position relative to the first device, in particular, storing further computer-readable instructions that when executed cause a computing device to: display a virtual object associated with the augmented reality session on the first device.
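The second-device flow recited in claims 1 and 8 can be illustrated with a short sketch: the second device records its own pose when it recognizes the spatial alignment image, requests the first device's pose at (or about) that timestamp together with the first device's spatial origin, computes an offset between the two relative coordinate spaces, and thereafter expresses positions in the resulting shared space. The sketch below is translation-only for brevity (a full implementation would also resolve rotation from the six-degree-of-freedom estimate the alignment image enables, per claims 2, 10 and 15); all names are illustrative and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Position within a device's own relative coordinate space
    # (rotation omitted for brevity).
    x: float
    y: float
    z: float

def compute_offset(second_pose: Pose, first_pose: Pose, first_origin: Pose) -> Pose:
    """Translation that maps second-device coordinates into the shared
    space anchored at the first device's spatial origin (claim 1, 560)."""
    return Pose(
        (first_pose.x - first_origin.x) - second_pose.x,
        (first_pose.y - first_origin.y) - second_pose.y,
        (first_pose.z - first_origin.z) - second_pose.z,
    )

def to_shared(p: Pose, offset: Pose) -> Pose:
    """Express a point from the second device's space in the shared space."""
    return Pose(p.x + offset.x, p.y + offset.y, p.z + offset.z)
```

Once the offset is computed, both devices can place virtual objects of the augmented reality session at consistent real-world positions (claim 1, 570), despite having started with disjoint relative coordinate spaces.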

Description

BACKGROUND

Augmented reality (AR) systems such as video games display real-world images overlaid with a virtual experience (e.g., interactive three-dimensional objects). An AR system thus enables a participant to view real-world imagery in combination with context-relevant, computer-generated imagery. Imagery from the real world and computer-generated imagery are combined and presented to a user such that they appear to share the same physical space. In AR applications in which multiple participants share the same physical environment, inconsistent positioning of the computer-generated imagery relative to the real-world imagery can be a noticeable distraction that degrades the AR experience.

US 2018/0374269 A1 discloses automatic calibration of an augmented reality and virtual reality mobile device user interface. "SwiftShot: Creating a Game for Augmented Reality | Apple Developer Documentation", 19 January 2019, retrieved from the Internet: URL:https://web.archive.org/web/20190119151716/https://developer.apple.com/documentation/arkit/swiftshot_creating_a_game_for_augmented_reality, shows the set-up of a shared augmented reality space using ARKit. Ron Amadeo: "Google's ARCore 1.2 enables multiplayer AR across Android and iOS", 8 May 2018 (2018-05-08), retrieved from the Internet: URL:https://arstechnica.com/gadgets/2018/05/google-arcore-1-2-enables-shared-ar-experiences-across-android-and-ios/, shows how to establish a common coordinate system in a multiplayer augmented reality application by defining real-world objects as AR anchors.

SUMMARY

The invention is defined by the independent claims. The dependent claims concern optional features of some embodiments of the invention.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a functional block diagram that illustrates a system for creating a shared coordinate space in an augmented reality session between at least two devices with disjoint relative coordinate spaces.
Figs. 2 and 3 are exemplary user interfaces.
Fig. 4 is a flow chart that illustrates a method of creating a shared coordinate space in an augmented reality session between at least two devices with disjoint relative coordinate spaces by a first user gaming device.
Fig. 5 is a flow chart that illustrates a method of creating a shared coordinate space in an augmented reality session between at least two devices with disjoint relative coordinate spaces by a second user gaming device.
Figs. 6 and 7 are flow charts that illustrate a method of creating a shared coordinate space in an augmented reality session between at least two devices with disjoint relative coordinate spaces.
Fig. 8 is a functional block diagram that illustrates an exemplary computing system.

DETAILED DESCRIPTION

Various technologies pertaining to aligning location for a shared augmented reality experience are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It may be evident, however, that such aspect(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects. Further, it is to be understood that functionality that is described as being carried out by certain system components may be performed by multiple components. Similarly, for instance, a component may be configured to perform functionality that is described as being carried out by multiple components. The subject disclosure supports various products and processes that perform, or are configured to perform, various actions regarding aligning location for a shared augmented reality experience. What follows are one or more exemplary systems and methods.
Aspects of the subject disclosure pertain to the technical problem of aligning location for a shared augmented reality experience. The technical features associated with addressing this problem involve creating a shared coordinate space in an augmented reality session between a first user gaming device and a second user gaming device, each initially with disjoint relative coordinate spaces, using a displayed spatial alignment image and an AR session identifier. Accordingly, aspects of these technical features exhibit technical effects of more efficiently and effectively establishing a shared coordinate space for a plurality of AR devices.

Moreover, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from the context, the phrase "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, the phrase "X employs A or B" is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from the context to be directed to a singular form.
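The first device's side of the exchange (claim 8) stores its location and associated timestamps while the spatial alignment image is displayed, then answers the second device's request with the location "at or about" the requested timestamp. One plausible way to realize that lookup is a sorted, timestamped pose log queried for the nearest entry, as sketched below; this is an illustrative reading under the assumption that the device clocks have already been synchronized (claims 3 and 11), and the class and method names are invented for the example.

```python
import bisect

class PoseLog:
    """Timestamped record of a device's location kept while the spatial
    alignment image is displayed (illustrative sketch only)."""

    def __init__(self) -> None:
        self._times: list[float] = []   # monotonically increasing timestamps
        self._poses: list[tuple] = []   # positions recorded at those timestamps

    def record(self, timestamp: float, pose: tuple) -> None:
        """Append a (timestamp, pose) sample; timestamps must be increasing."""
        self._times.append(timestamp)
        self._poses.append(pose)

    def at_or_about(self, timestamp: float) -> tuple:
        """Return the stored pose whose timestamp is closest to the request."""
        i = bisect.bisect_left(self._times, timestamp)
        if i == 0:
            return self._poses[0]
        if i == len(self._times):
            return self._poses[-1]
        before, after = self._times[i - 1], self._times[i]
        if timestamp - before <= after - timestamp:
            return self._poses[i - 1]
        return self._poses[i]
```

Returning the nearest sample rather than requiring an exact match is one way to honor the claims' "at or about the timestamp" language when the recognition instant falls between two recorded samples.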