EP-3757726-B1 - METHODS AND APPARATUS FOR PROJECTING AUGMENTED REALITY ENHANCEMENTS TO REAL OBJECTS IN RESPONSE TO USER GESTURES DETECTED IN A REAL ENVIRONMENT
Inventors
- AGRAWAL, Ankur
- ANDERSON, Glen J.
- BAIR, Benjamin
- CHIERICHETTI, Rebecca
- DENMAN, Pete
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2020-03-19
Claims (11)
- An apparatus (200) for projecting augmented reality, AR, enhancements to real objects (116, 118, 120), comprising: an object detector (206) configured to detect one or more real objects (116, 118, 120) located in a real environment (100), based on depth data obtained from a sensor array (102) located within the real environment (100); a gesture detector (210) configured to detect a user gesture within the real environment (100), based on motion data obtained from the sensor array (102), the user gesture associated with a target real object from among the one or more real objects (116, 118, 120), wherein the user gesture represents a desired shape of a desired virtual drawing to be projected to the target real object; an enhancement determiner (216) configured to determine an AR enhancement based on the user gesture and the target real object, and to instruct a projector (104) to project the AR enhancement to the target real object, wherein the AR enhancement includes a virtual drawing having the shape corresponding to the desired shape of the desired virtual drawing made by the user gesture; and a physics simulator (224) configured to simulate physics associated with the user gesture and the target real object, the enhancement determiner (216) further configured to determine the AR enhancement based on the simulated physics, wherein the virtual drawing in the AR enhancement includes a virtual character interacting with the target real object, the virtual character to be selected from a plurality of candidate virtual characters, the selection to be based on simulated physics associated with the virtual drawing in relation to the target real object.
- The apparatus (200) of claim 1, further including a map generator (204) configured to generate a map of the real environment (100) based on the depth data, wherein the object detector (206) is configured to detect the one or more real objects (116, 118, 120) located in the real environment (100) based on the map.
- The apparatus (200) of claim 2, further including a target object determiner (214) configured to determine the target real object associated with the user gesture based on the user gesture and further based on at least one of the map or the one or more real objects (116, 118, 120).
- The apparatus (200) of claim 1, further including an object identifier (208) configured to semantically identify the one or more real objects (116, 118, 120).
- The apparatus (200) of claim 1, further including a gesture identifier (212) configured to semantically identify the user gesture.
- The apparatus (200) of claim 1, further including a context evaluator (222) configured to evaluate contextual information associated with at least one of the user gesture or the target real object, the enhancement determiner (216) to determine the AR enhancement based on the evaluated contextual information.
- The apparatus (200) of any of claims 1-6, wherein the AR enhancement further includes at least one of a virtual resizing of the target real object, or a virtual structure intersecting the target real object.
- A method performed by the apparatus of any of claims 1-7 for projecting augmented reality, AR, enhancements to real objects (116, 118, 120), the method comprising: detecting one or more real objects (116, 118, 120) located in a real environment (100) based on depth data obtained from a sensor array (102) located within the real environment (100); detecting a user gesture within the real environment (100) based on motion data obtained from the sensor array (102), the user gesture associated with a target real object from among the one or more real objects (116, 118, 120), wherein the user gesture represents a desired shape of a desired virtual drawing to be projected to the target real object; determining an AR enhancement based on the user gesture and the target real object, wherein the AR enhancement includes a virtual drawing having the shape corresponding to the desired shape of the desired virtual drawing made by the user gesture; instructing a projector to project the AR enhancement to the target real object; simulating physics associated with at least one of the user gesture and the target real object; and determining the AR enhancement based on the simulated physics, wherein the virtual drawing in the AR enhancement includes a virtual character interacting with the target real object, the virtual character to be selected from a plurality of candidate virtual characters, the selection to be based on simulated physics associated with the virtual drawing in relation to the target real object.
- The method of claim 8, further including: evaluating contextual information associated with at least one of the user gesture or the target real object; and determining the AR enhancement based on the evaluated contextual information.
- The method of any of claims 8-9, wherein the AR enhancement further includes at least one of a virtual resizing of the target real object, or a virtual structure intersecting the target real object.
- A non-transitory computer-readable storage medium comprising instructions that, when executed, cause one or more processors (134) of a machine to implement the method or apparatus (200) for projecting augmented reality, AR, enhancements to real objects (116, 118, 120) of any of the foregoing claims.
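The claimed apparatus can be read as a sensing-to-projection pipeline: depth data yields detected objects, motion data yields a gesture tracing a desired drawing shape, a target object is resolved, physics is simulated, and a virtual character is selected from candidates based on that simulation. The following is a minimal illustrative sketch of that data flow; all class names, heuristics, and numeric ratings are hypothetical and are not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical stand-ins for sensor-derived inputs (not from the patent).
@dataclass
class RealObject:
    name: str
    position: tuple  # (x, y, z) estimated from depth data

@dataclass
class Gesture:
    path: list       # sampled (x, y) points tracing the desired drawing shape
    speed: float     # derived from motion data

def detect_target_object(gesture, objects):
    """Resolve the target real object: here, the object nearest the gesture's end point."""
    gx, gy = gesture.path[-1]
    return min(objects, key=lambda o: (o.position[0] - gx) ** 2 + (o.position[1] - gy) ** 2)

def simulate_physics(gesture, target):
    """Toy physics score: faster, longer gestures imply a more energetic interaction."""
    return gesture.speed * len(gesture.path)

def select_character(physics_score, candidates):
    """Pick the candidate character whose energy rating best matches the simulation."""
    return min(candidates, key=lambda c: abs(c[1] - physics_score))

def determine_enhancement(gesture, objects, candidates):
    """Assemble the AR enhancement: drawing shape, target object, and chosen character."""
    target = detect_target_object(gesture, objects)
    score = simulate_physics(gesture, target)
    character, _ = select_character(score, candidates)
    return {"target": target.name, "shape": gesture.path, "character": character}

gesture = Gesture(path=[(0, 0), (1, 1), (2, 0)], speed=2.0)
objects = [RealObject("cup", (2.1, 0.2, 0.0)), RealObject("book", (9.0, 9.0, 0.0))]
candidates = [("turtle", 1.0), ("rabbit", 6.0)]  # (name, energy rating)
enhancement = determine_enhancement(gesture, objects, candidates)
```

In this toy run the gesture ends near the cup, so the cup becomes the target; the physics score (2.0 speed x 3 samples = 6.0) matches the "rabbit" candidate's rating, mirroring how claim 1 conditions character selection on simulated physics.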
Description
FIELD OF THE DISCLOSURE
This disclosure relates generally to augmented reality and, more specifically, to methods and apparatus for projecting augmented reality enhancements to real objects in response to user gestures detected in a real environment.

BACKGROUND
Augmented reality (AR) techniques can be implemented to alter an individual's perception of a real environment by enhancing one or more real object(s) located within the real environment with computer-generated perceptual information. In some known AR techniques, visual perceptual information is displayed on (e.g., projected onto) at least one real object located within the real environment. The displayed perceptual information can be constructive to (e.g., additive to) or destructive of (e.g., masking of) one or more physical features and/or properties of the real object and/or the real environment. AR techniques are typically performed in real-time, and are often based on semantic contexts associated with the real content (e.g., real structures, real objects, real individuals, etc.) of the real environment.

US 9563955 B1 describes techniques for efficiently identifying objects of interest in an environment and thereafter tracking the location and/or orientation of those objects. A system may analyze images captured by a camera to identify objects that may be represented in the images. These objects may be identified in the images based on their size, color, and/or other physical attributes. After identifying these potential objects, the system may define a region around each object for further inspection. Thereafter, portions of a depth map of the environment corresponding to these regions may be analyzed to determine whether any of the objects identified from the images are "objects of interest", that is, objects that the system has previously been instructed to track. These objects of interest may include portable projection surfaces, a user's hand, or any other physical object. The techniques identify these objects with reference to the respective depth signatures of these objects.

US 2019/0043267 A1 describes technologies for virtual attribute assignment which include a compute device. The compute device is configured to receive an attribute assignment command from a user and analyze the attribute assignment command to determine a user-selected virtual object, a user-referenced attribute of the user-selected virtual object, a user-selected real object, and a user-referenced attribute of the user-selected real object. Based on the attribute assignment command, the compute device is further configured to determine a state of the user-referenced attribute of the user-selected real object and update a state of the user-referenced attribute of the user-selected virtual object based on the state of the user-referenced attribute of the user-selected real object.

US 2019/0043260 A1 describes systems, apparatus, and computer-readable media for managing data storage for generating virtual bindings. A user may perform one or more gestures and/or voice commands to create virtual bindings with physical objects, where the created virtual bindings may take on attributes and create/perform actions based on attributes of the physical objects. A projection device may recognize the physical objects and cause the bindings and/or projected virtual objects to perform various actions in response to different user gestures and/or voice commands. Additionally, the system may instruct some physical objects (e.g., robots, electromechanical devices, etc.) in response to user gestures/voice commands to cause those physical devices to perform various actions.

US 2014/104274 A1 describes an augmented reality system which enables grasping of virtual objects, such as to stack virtual cubes or manipulate virtual objects in other ways. A user's hand or another real object is tracked in an augmented reality environment. The shape of the tracked real object is approximated using different types of particles, and the virtual objects are updated according to simulated forces exerted between the augmented reality environment and the particles.

SUMMARY OF INVENTION
The present invention is defined in the independent claims. Preferred features are recited in the dependent claims. In the following description, any embodiment referred to and not falling within the literal meaning of the claims is merely an example useful to the understanding of the invention. This is without prejudice to the question of realizations of the claimed invention by equivalent means, which is subject to national laws.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example real environment in which an example sensor array and an example projector may be implemented in accordance with teachings of this disclosure to project one or more AR enhancement(s) to one or more real object(s) located within the real environment in response to one or more user gesture(s) detected in the real environment.
FIG. 2 is a block diagr