US-12626470-B1 - Artificial reality room capture realignment


Abstract

Example implementations are directed to recapturing or realigning a room or scene in an artificial reality (XR) environment as a three dimensional (3D) space. To initially capture a room, a user manually annotates, using an XR system, one or more walls around the XR system. The XR system then aligns or localizes itself relative to the room. Once a room is captured and stored, and the user in the future enters the room again, the XR system may not be able to align itself in the room due to, for example, lighting conditions. Rather than require the user to manually recapture every wall captured during the initial room capture, the XR system can select one wall for recapture and, after receiving an annotation or marking of the one wall from the user, use the selection to realign the XR system relative to the room.

Inventors

  • Shaik Shabnam Nizamudeen Basha
  • Lu Zhou
  • Eugene Lee
  • Kelly Rui-Ying Wang
  • Sony Nguyen
  • Hongbing Hu
  • Sean Finn

Assignees

  • META PLATFORMS TECHNOLOGIES, LLC

Dates

Publication Date
2026-05-12
Application Date
2023-07-03

Claims (20)

  1. A method for realignment of an artificial reality (XR) captured room for an XR system comprising one or more XR controllers, the method comprising: determining to realign an XR room corresponding to a real world room that previously has been captured by the XR system, the real world room comprising one or more real walls and the XR room comprising one or more XR walls that correspond to the one or more real walls; displaying, to a user, the XR room and identifying a first wall of the one or more XR walls; receiving, via at least one of the XR controllers, an annotation of the first wall indicating an alignment correction; and realigning, using the alignment correction from the annotation, the XR room to the real world room.
  2. The method of claim 1, wherein the identifying the first wall comprises generating and displaying a miniaturized three-dimensional model of the XR room that comprises a highlighted wall corresponding to the first wall.
  3. The method of claim 2, wherein the first wall is a key wall and the remaining one or more XR walls are mapped to the key wall.
  4. The method of claim 3, wherein the real world room comprises one or more real objects, and the XR room comprises one or more virtual XR objects that correspond to the one or more real objects, wherein the XR objects are mapped to the key wall.
  5. The method of claim 3, wherein the key wall is an initial wall that was captured when the real world room was previously captured or is a widest wall of the real world room.
  6. The method of claim 2, wherein the miniaturized three-dimensional model rotates to correspond to an orientation of a first XR controller and a second XR controller relative to the XR room.
  7. The method of claim 6, wherein the annotation of the first wall is received and implemented by the second XR controller.
  8. The method of claim 1, wherein the annotation comprises scanning a height and a width of the first wall using a second XR controller.
  9. The method of claim 8, wherein the scanning comprises casting a ray that extends from the second XR controller.
  10. The method of claim 1, further comprising: in response to the user entering the real world room, displaying a selectable list of previously captured real world rooms and receiving a selection of one of the previously captured real world rooms for the realignment.
  11. The method of claim 1, wherein the determining is in response to the user requesting the realignment.
  12. The method of claim 1, wherein the determining is in response to an automated determination that the XR room needs to be realigned with the real world room based on at least lighting conditions of the real world room.
  13. The method of claim 1, further comprising placing a second XR controller on a floor of the real world room to establish a floor level.
  14. A non-transitory computer-readable storage medium storing instructions that, when executed by a computing system, cause the computing system to perform a process for realignment of an artificial reality (XR) captured room for an XR system, the process comprising: determining to realign an XR room corresponding to a real world room that previously has been captured by the XR system, the real world room comprising one or more real walls and the XR room comprising one or more XR walls that correspond to the one or more real walls; displaying, to a user, the XR room and identifying a first wall of the one or more XR walls; receiving an annotation of the first wall indicating an alignment correction; and realigning, using the alignment correction from the annotation, the XR room to the real world room.
  15. The non-transitory computer-readable storage medium of claim 14, wherein the identifying the first wall comprises generating and displaying a miniaturized three-dimensional model of the XR room that comprises a highlighted wall corresponding to the first wall.
  16. The non-transitory computer-readable storage medium of claim 15, wherein the first wall is a key wall and the remaining one or more XR walls are mapped to the key wall.
  17. The non-transitory computer-readable storage medium of claim 16, wherein the real world room comprises one or more real objects, and the XR room comprises one or more XR objects that correspond to the one or more real objects, wherein the XR objects are mapped to the key wall.
  18. The non-transitory computer-readable storage medium of claim 14, wherein the identifying the first wall of the one or more XR walls comprises scanning a height and a width of the first wall.
  19. The non-transitory computer-readable storage medium of claim 14, wherein the process further comprises: in response to the user entering the real world room, displaying a selectable list of previously captured real world rooms and receiving a selection of one of the previously captured real world rooms for the realignment.
  20. A computing system for realignment of an artificial reality (XR) captured room for an XR system, the computing system comprising: one or more processors; and one or more memories storing instructions that, when executed by the one or more processors, cause the computing system to perform a process comprising: determining to realign an XR room corresponding to a real world room that previously has been captured by the XR system, the real world room comprising one or more real walls and the XR room comprising one or more XR walls that correspond to the one or more real walls; displaying, to a user, the XR room and identifying a first wall of the one or more XR walls; receiving an annotation of the first wall indicating an alignment correction; and realigning, using the alignment correction from the annotation, the XR room to the real world room.
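The single-wall realignment recited in claim 1 amounts to deriving a rigid correction (rotation plus translation) from one re-annotated key wall and propagating it to every other stored wall and object, which claims 3 and 4 recite as being mapped to the key wall. A minimal floor-plane (2D) sketch of that idea follows; the function names, coordinates, and the choice of representing a wall by its two floor-level endpoints are illustrative assumptions, not details drawn from the patent:

```python
import math

def rigid_transform_2d(stored_wall, annotated_wall):
    """Derive the rotation and translation that map the stored key wall
    onto the freshly annotated wall (floor-plane coordinates)."""
    (sx0, sy0), (sx1, sy1) = stored_wall
    (ax0, ay0), (ax1, ay1) = annotated_wall
    # Rotation: difference between the two walls' headings.
    theta = math.atan2(ay1 - ay0, ax1 - ax0) - math.atan2(sy1 - sy0, sx1 - sx0)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # Translation: map the stored wall's first endpoint onto the annotated one.
    tx = ax0 - (cos_t * sx0 - sin_t * sy0)
    ty = ay0 - (sin_t * sx0 + cos_t * sy0)
    return theta, (tx, ty)

def realign(point, theta, translation):
    """Apply the alignment correction to any stored wall endpoint
    or object anchor in the captured room."""
    x, y = point
    tx, ty = translation
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return (cos_t * x - sin_t * y + tx, sin_t * x + cos_t * y + ty)

# Stored key wall from the original capture, and the user's new annotation
# of the same wall (here rotated 90 degrees and shifted):
stored = ((0.0, 0.0), (4.0, 0.0))
annotated = ((1.0, 1.0), (1.0, 5.0))
theta, t = rigid_transform_2d(stored, annotated)
# Every other stored wall or object passes through the same correction:
other_corner = realign((4.0, 3.0), theta, t)
```

Because the remaining walls and objects are stored relative to the key wall, correcting the key wall's pose in this way carries the entire captured room back into registration with the real world room in one step.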

Description

TECHNICAL FIELD

The present disclosure is directed to artificial reality and to capturing a real world room and scene for use in artificial reality.

BACKGROUND

Artificial reality systems are becoming increasingly ubiquitous, with applications in many fields such as computer gaming, health and safety, industrial settings, and education. As a few examples, artificial reality systems are being incorporated into mobile devices, gaming consoles, personal computers, movie theaters, and theme parks. In general, artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Some types of artificial reality incorporate real world elements, such as real world rooms and spaces, by digitally reconstructing or capturing those elements as a three dimensional space or an artificial reality space. In addition to capturing the elements, it is necessary to align the artificial reality space to the corresponding real world space so that the location a user views in the artificial reality space is approximately the same location in the real world space.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram illustrating an overview of devices on which some implementations of the present technology can operate.
FIG. 2A is a wire diagram illustrating a virtual reality headset which can be used in some implementations of the present technology.
FIG. 2B is a wire diagram illustrating a mixed reality headset which can be used in some implementations of the present technology.
FIG. 2C is a wire diagram illustrating controllers which, in some implementations, a user can hold in one or both hands to interact with an artificial reality environment.
FIG. 3 is a block diagram illustrating an overview of an environment in which some implementations of the present technology can operate.
FIG. 4 is a block diagram illustrating components which, in some implementations, can be used in a system employing the disclosed technology.
FIG. 5 illustrates a user initially re-entering a previously captured room in example implementations.
FIG. 6 illustrates an example implementation of a workflow that instructs a user how to realign a previously captured room.
FIG. 7 illustrates that, as the user moves the controllers toward the highlighted wall, the room model rotates to align with the actual room in example implementations.
FIG. 8 illustrates the actual wall that corresponds to the highlighted wall coming into the user's view in example implementations.
FIG. 9 illustrates the process of outlining, marking, or annotating the actual wall in accordance with example implementations.
FIG. 10 illustrates that, after the wall is successfully marked or captured, the remaining walls and objects in the room are outlined in accordance with example implementations.
FIG. 11 illustrates an adjust room placement functionality that is entered after the key wall is successfully marked in accordance with example implementations.
FIG. 12 is a flow diagram illustrating a process used in some implementations for realigning a captured room.

The techniques introduced here may be better understood by referring to the following Detailed Description in conjunction with the accompanying drawings, in which like reference numerals indicate identical or functionally similar elements.

DETAILED DESCRIPTION

Aspects of the present disclosure are directed to recapturing or realigning a room, scene, or space that has previously been digitally captured as an artificial reality room or scene (i.e., the digital reconstruction of a real world physical room, scene, or space) without the need to progress through the entire original room capture workflow. Realignment may be necessary due to, for example, sensor drift, such as when the positional tracking of the headset and controllers by sensors becomes inaccurate.

Example implementations are directed to recapturing or realigning a room or scene in an artificial reality (XR) environment as a three dimensional (3D) space. To initially capture a room, a user manually annotates, using an XR system, one or more walls around the XR system. The XR system then aligns or localizes itself relative to the room. Once a room is captured and stored, and the user later enters the room again, the XR system may not be able to align itself in the room due to, for example, lighting conditions. Rather than require the user to manually recapture every wall captured during the initial room capture, the XR system can select one wall for recapture and, after receiving an annotation or marking of the one wall from the user, use the selection to realign the XR system relative to the room. In some implementations, the XR system projects a hand-tethered miniaturized “ghosted” 3D model of the room