US-12625681-B2 - Serializing and deserializing mixed reality experiences or portions thereof
Abstract
A portion of a source mixed reality (MR) experience is retrieved. Then, the portion of the source MR experience is used to generate a serialized representation including a hierarchy of tagged elements. The hierarchy of tagged elements includes a plurality of MR step elements collectively defining a procedure to be performed by a viewer of an MR experience. Each MR step element has child elements that include an MR step number indicating a position of the MR step in the MR procedure and an MR step ID element indicating an identity of the MR step. The serialized representation is deserialized to generate a portion of a target MR experience to be edited in an MR development tool. The portion of the target MR experience is usable to cause each MR step in the plurality of MR steps to be graphically represented in the MR development tool.
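The tagged step hierarchy described in the abstract can be sketched as follows. This is a minimal illustration using Python's standard library; the element names (`MRExperience`, `MRProcedure`, `MRStep`, `MRStepNumber`, `MRStepID`) are hypothetical stand-ins, not the actual schema defined in the patent's figures.

```python
import xml.etree.ElementTree as ET

def serialize_steps(step_ids):
    """Serialize an ordered list of MR procedure steps into a hierarchy
    of tagged elements. Each step element carries child elements for
    its position in the procedure and its identity, as the abstract
    describes."""
    root = ET.Element("MRExperience")
    procedure = ET.SubElement(root, "MRProcedure")
    for number, step_id in enumerate(step_ids, start=1):
        step = ET.SubElement(procedure, "MRStep")
        ET.SubElement(step, "MRStepNumber").text = str(number)
        ET.SubElement(step, "MRStepID").text = step_id
    return ET.tostring(root, encoding="unicode")

xml_text = serialize_steps(["attach-panel", "torque-bolts"])
print(xml_text)
```

A development tool could parse this representation back into per-step records to graphically represent each MR step, which is the deserialization direction the abstract describes.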
Inventors
- Julian Volyn
- Michael Davis
- TJ Southard
- Phillip Do
- Josh Zavaleta
- James Williams
- Marlo Brooke
- Scott Toppel
Assignees
- simpleAR, Inc.
Dates
- Publication Date: 2026-05-12
- Application Date: 2024-02-21
Claims (15)
- 1. A method in a computing system, the method comprising: obtaining a source mixed reality (MR) experience for a source platform; generating, based on the source MR experience, a serialized representation including human-readable code, wherein the human-readable code comprises a hierarchy of tagged elements that comprises: a schema version element; a computer vision tracking element; and a virtual object placement element that has child elements comprising: a position element that defines a position relative to an origin in the MR experience at which to place a virtual object, and a function element that defines a function to be applied to the MR experience in response to a placement of the virtual object; and deserializing the serialized representation, based on the schema version element, to generate a target MR experience for a target platform, wherein the target MR experience is usable to: place the virtual object at the position relative to the origin in the target MR experience, execute the function in response to the placement, and track, based on the computer vision tracking element, a position relative to the origin of a camera coupled to an MR device displaying the target MR experience.
- 2. The method of claim 1, wherein the source platform is different from the target platform.
- 3. The method of claim 1, wherein the hierarchy of tagged elements includes a target MR devices element specifying compatible MR devices for the MR experience.
- 4. The method of claim 1, wherein the hierarchy of tagged elements includes a three-dimensional (3D) animation element specifying a 3D animation to apply to the virtual object and the 3D animation element has child elements including: a file name element; a file type element; a clip name element; an autoplay behavior element; and a loop behavior element.
- 5. The method of claim 1, wherein the source MR experience includes a representation of a node graph that includes nodes that represent MR steps in an MR procedure.
- 6. The method of claim 1, wherein the target MR experience is configured to be modified using an MR development tool that includes a graphical programming environment.
- 7. The method of claim 1, wherein the source MR experience is configured to be modified using an MR development tool that includes a graphical programming environment.
- 8. The method of claim 1, wherein deserializing the serialized representation comprises mapping each tagged element in the hierarchy of tagged elements to a target element in the target MR experience.
- 9. The method of claim 1, wherein the serialized representation is packaged with media assets that the serialized representation references.
- 10. The method of claim 1, wherein the serialized representation is packaged with a plurality of media assets that the serialized representation references and the serialized representation is configured to be modified using an MR development tool that includes a graphical programming environment.
- 11. The method of claim 1, wherein the target platform is an MR viewing software compatible with a plurality of MR device types.
- 12. The method of claim 1, wherein the target MR experience includes a functionality not included in the source MR experience.
- 13. The method of claim 1, wherein the hierarchy of tagged elements further includes an interaction element specifying a function to apply to the source MR experience in response to an action by a user, the interaction element having child elements including: a button element; an audio trigger element; a dialog-driven choice element; an image placement element; and an object placement element.
- 14. The method of claim 1, wherein the computer vision tracking element specifies to track a physical object to be displayed as coupled to the virtual object.
- 15. The method of claim 1, wherein the serialized representation is deserialized into a plurality of target MR experiences for a plurality of corresponding target platforms.
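Claim 8 characterizes deserialization as mapping each tagged element to a target element in the target MR experience. A minimal sketch of that mapping idea is below; the mapping table and element names are illustrative assumptions, not the facility's actual schema or implementation.

```python
import xml.etree.ElementTree as ET

# Hypothetical mapping from source schema tags to target-experience fields.
TAG_TO_TARGET = {
    "SchemaVersion": "schema_version",
    "CVTracking": "tracking_mode",
    "Position": "object_position",
    "Function": "on_placement",
}

def deserialize(xml_text):
    """Walk the hierarchy of tagged elements and map each recognized
    element to a field of the target experience, in the spirit of
    claim 8. Unrecognized elements are skipped."""
    root = ET.fromstring(xml_text)
    target = {}
    for element in root.iter():
        field = TAG_TO_TARGET.get(element.tag)
        if field is not None:
            target[field] = (element.text or "").strip()
    return target

sample = """<MRExperience>
  <SchemaVersion>1.2</SchemaVersion>
  <CVTracking>image-target</CVTracking>
  <VirtualObject>
    <Position>0.5 0.0 1.25</Position>
    <Function>play-animation</Function>
  </VirtualObject>
</MRExperience>"""
print(deserialize(sample))
```

In this sketch the target is a flat dictionary; a real target platform would map elements onto its own object model, and per claim 15 the same serialized representation could be deserialized for several target platforms by supplying a different mapping for each.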
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. App. No. 63/515,503, filed Jul. 25, 2023, and entitled “CONVERTING OR GENERATING IMRSA STANDARD TAGGED AUGMENTED REALITY EXPERIENCE,” which is hereby incorporated by reference in its entirety. In cases where the present application conflicts with a document incorporated by reference, the present application controls.

BACKGROUND

Mixed reality (MR) experiences typically include performance of certain core functionalities, including establishing pose and tracking pose such that virtual artifacts are properly displayed with respect to features in a physical environment around a viewer.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a network diagram showing an environment in which the facility operates.
FIG. 2 is a block diagram showing some of the components typically incorporated in at least some of the computer systems and other devices on which the facility operates.
FIG. 3 is a flow diagram showing a process used by the facility in some embodiments to serialize and deserialize a mixed reality experience.
FIG. 4 is a schema key diagram showing schema notation used to denote characteristics of elements in schema depicted herein.
FIGS. 5A and 5B are schema diagrams showing schema used by the facility in some embodiments to serialize and deserialize vision tracking information for a mixed reality experience.
FIG. 5C is an excerpt of a serialized mixed reality experience according to the schema shown in FIG. 5B.
FIGS. 6A, 6B, and 6C are schema diagrams showing schema used by the facility in some embodiments to serialize and deserialize various functionality for a mixed reality experience.
FIGS. 7A, 7B, and 7C are schema diagrams showing schema used by the facility in some embodiments to serialize and deserialize various functionality for a mixed reality experience.
FIG. 8 is a flow diagram showing a process used by the facility in some embodiments to serialize and deserialize a mixed reality experience having MR steps.
FIGS. 9A and 9B are schema diagrams showing schema used by the facility in some embodiments to serialize and deserialize step information for a mixed reality experience.
FIGS. 10A and 10B are schema diagrams showing schema used by the facility in some embodiments to serialize and deserialize multimedia information and interaction information for a mixed reality experience.

DETAILED DESCRIPTION

Mixed reality experiences developed and deployed across a variety of platforms using conventional techniques suffer from a lack of portability. Although MR experiences perform common core functionalities, the functionality of one mixed reality experience is often not portable to other mixed reality experiences or mixed reality platforms. The inventors have recognized that conventional techniques for reusing functionality between mixed reality experiences and platforms are often unsatisfactory. A developer might create a mixed reality experience, or a portion thereof, for one platform, developing, for example, assets for camera tracking or for controlling object behavior. These assets may be useful on other mixed reality platforms, but manually porting software from the first platform to another is often the only option for reusing mixed reality assets across platforms. Manual porting generally involves a developer attempting to recreate similar functionality on a different platform, a laborious and error-prone task that requires significant developer time and computing resources. This limits the number of mixed reality experiences that can be developed and inhibits deploying mixed reality experiences across various platforms.
In response to recognizing these disadvantages, the inventors have conceived and reduced to practice a software and/or hardware facility for serializing and deserializing mixed reality (MR) experiences or portions thereof (“the facility”). First, the facility receives a source MR experience for a source platform. The facility then generates, based on the source MR experience, a serialized representation including human-readable code. The human-readable code includes a hierarchy of tagged elements that includes a version indicator and a type of computer vision tracking to be used in an MR experience. The hierarchy of tagged elements further includes an indication of a virtual object, the indication including a position relative to an origin in the MR experience at which to place the virtual object and a function to be applied to the MR experience in response to the placement. The facility then generates, based on the serialized representation and the version indicator, a target MR experience to be executed using a target platform. The target MR experience is usable to place the virtual object at the position relative to the origin in the target MR experience.
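The version indicator lets the deserializer choose handling appropriate to the schema the representation was written against. The sketch below shows one way such version dispatch might look; the element names, version strings, and per-version fields are assumptions for illustration, not the schemas defined in the facility's schema figures.

```python
import xml.etree.ElementTree as ET

# Hypothetical per-version parsers; real schema versions are defined by
# the facility's schema diagrams and are not reproduced here.
def parse_v1(root):
    # Assumed v1: records only the computer vision tracking type.
    return {"tracking": root.findtext("CVTracking")}

def parse_v2(root):
    # Assumed v2: adds a virtual-object position relative to the origin.
    pos = root.findtext("VirtualObject/Position")
    return {
        "tracking": root.findtext("CVTracking"),
        "position": [float(x) for x in pos.split()] if pos else None,
    }

PARSERS = {"1.0": parse_v1, "2.0": parse_v2}

def deserialize_versioned(xml_text):
    """Dispatch on the version indicator element before parsing the
    rest of the hierarchy, mirroring the facility's version-aware
    deserialization step."""
    root = ET.fromstring(xml_text)
    version = root.findtext("SchemaVersion")
    if version not in PARSERS:
        raise ValueError(f"unsupported schema version: {version}")
    return PARSERS[version](root)

doc = """<MRExperience>
  <SchemaVersion>2.0</SchemaVersion>
  <CVTracking>marker</CVTracking>
  <VirtualObject><Position>0.5 0.0 1.25</Position></VirtualObject>
</MRExperience>"""
print(deserialize_versioned(doc))
```

Dispatching on the version element up front is what allows older serialized representations to remain deserializable after the schema evolves, which supports generating target experiences for platforms other than the source platform.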