US-12620052-B1 - Warping an image with a combination of warping functions
Abstract
A method includes obtaining an image that includes a plurality of pixels. The method includes identifying a first subset of the plurality of pixels that defines a particular geometry that a user of an electronic device is currently focusing on. The method includes warping the first subset of the plurality of pixels according to a first warping function that is based on a movement of the electronic device. The method includes warping a second subset of the plurality of pixels, that is different from the first subset, according to a second warping function that is different from the first warping function.
Inventors
- Seyedkoosha Mirhosseini
- Seyedpooya Mirhosseini
Assignees
- APPLE INC.
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2023-08-11
Claims (20)
- 1 . A method comprising: at an electronic device including one or more processors, a non-transitory memory, and an image sensor: obtaining an image that includes a plurality of pixels; identifying a first subset of the plurality of pixels that defines a particular geometry that a user of the electronic device is currently focusing on, wherein the particular geometry is a two-dimensional (2D) plane; warping the first subset of the plurality of pixels according to a first warping function that is based on a movement of the electronic device, wherein the first warping function is used to warp pixels defining the 2D plane; and warping a second subset of the plurality of pixels, that is different from the first subset, according to a second warping function that is different from the first warping function, wherein the second subset defines a geometry different from the 2D plane, and wherein the second warping function is used to warp pixels defining the geometry different from the 2D plane.
- 2 . The method of claim 1 , wherein: the one or more processors include a central processing unit and a graphics processing unit; warping the first subset comprises determining the first warping function for the first subset in the central processing unit and applying the first warping function on the first subset in a display pipeline; and warping the second subset comprises determining the second warping function for the second subset in the graphics processing unit and applying the second warping function on the second subset in the graphics processing unit.
- 3 . The method of claim 1 , wherein warping the first subset according to the first warping function requires a first amount of computing resources and warping the second subset according to the second warping function requires a second amount of computing resources that is greater than the first amount of computing resources.
- 4 . The method of claim 1 , wherein warping the second subset comprises: determining whether or not a warping criterion associated with the second warping function is satisfied; warping the second subset according to the second warping function in response to determining that the warping criterion associated with the second warping function is satisfied; and forgoing warping the second subset according to the second warping function in response to determining that the warping criterion associated with the second warping function is not satisfied.
- 5 . The method of claim 4 , wherein the warping criterion is satisfied when the movement of the electronic device exceeds a threshold amount of movement; and the warping criterion is not satisfied when the movement of the electronic device is below the threshold amount of movement.
- 6 . The method of claim 4 , wherein the warping criterion is satisfied when a difference between the image and a previous image is greater than a threshold; and the warping criterion is not satisfied when the difference between the image and the previous image is less than the threshold.
- 7 . The method of claim 1 , further comprising: determining the first warping function by constructing a homography function based on the movement of the electronic device; and generating an inverse homography function based on the homography function.
- 8 . The method of claim 7 , wherein the inverse homography function is a closed form inverse of the homography function.
- 9 . The method of claim 7 , wherein the inverse homography function is represented by a first warp matrix and a second warp matrix is an inverse of the first warp matrix.
- 10 . The method of claim 7 , wherein the homography function is a polynomial.
- 11 . The method of claim 7 , wherein the homography function is a linear equation.
- 12 . The method of claim 1 , wherein respective positions of pixels in the first subset do not change when the first subset is warped according to the first warping function; and wherein respective positions of at least some pixels in the second subset change when the second subset is warped according to the second warping function.
- 13 . The method of claim 1 , wherein the second warping function utilizes an interpolation function and the first warping function does not utilize the interpolation function.
- 14 . The method of claim 1 , wherein warping the first subset according to the first warping function does not result in sampling artifacts at locations corresponding to the first subset; and wherein warping the second subset according to the second warping function results in a sampling artifact at at least one location corresponding to the second subset.
- 15 . The method of claim 1 , further comprising: warping a third subset of the plurality of pixels that defines the particular geometry according to the first warping function.
- 16 . The method of claim 1 , wherein the second subset defines a three-dimensional (3D) object depicted in the image.
- 17 . The method of claim 1 , wherein the second subset represents an object that the user is currently not focusing on.
- 18 . An electronic device comprising: one or more processors; an image sensor; a non-transitory memory; and one or more programs stored in the non-transitory memory, which, when executed by the one or more processors, cause the device to: obtain an image that includes a plurality of pixels; identify a first subset of the plurality of pixels that defines a particular geometry that a user of the electronic device is currently focusing on, wherein the particular geometry is a two-dimensional (2D) plane; warp the first subset of the plurality of pixels according to a first warping function that is based on a movement of the electronic device, wherein the first warping function is used to warp pixels defining the 2D plane; and warp a second subset of the plurality of pixels, that is different from the first subset, according to a second warping function that is different from the first warping function, wherein the second subset defines a geometry different from the 2D plane, and wherein the second warping function is used to warp pixels defining the geometry different from the 2D plane.
- 19 . The device of claim 18 , wherein the second subset defines a three-dimensional (3D) object depicted in the image, and wherein the geometry different from the 2D plane is warped using the second warping function.
- 20 . A non-transitory memory storing one or more programs, which, when executed by one or more processors of an electronic device with an image sensor, cause the electronic device to: obtain an image that includes a plurality of pixels; identify a first subset of the plurality of pixels that defines a particular geometry that a user of the electronic device is currently focusing on, wherein the particular geometry is a two-dimensional (2D) plane; warp the first subset of the plurality of pixels according to a first warping function that is based on a movement of the electronic device, wherein the first warping function is used to warp pixels defining the 2D plane; and warp a second subset of the plurality of pixels, that is different from the first subset, according to a second warping function that is different from the first warping function, wherein the second subset defines a geometry different from the 2D plane, and wherein the second warping function is used to warp pixels defining the geometry different from the 2D plane.
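For illustration only, the two-function warping of claim 1 can be sketched as follows. This is a minimal NumPy sketch, not the patented implementation: the function names, the inverse-warping strategy, and the nearest-neighbor sampling are all assumptions, and the second warping function is left to the caller (e.g. a costlier depth-based reprojection).

```python
import numpy as np

def warp_two_subsets(image, plane_mask, homography, second_warp):
    # Warp the pixels on the focused 2D plane with a single 3x3
    # homography and delegate the remaining pixels to a different
    # warping function. All names here are illustrative.
    h, w = image.shape[:2]
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]
    ones = np.ones_like(xs)
    # Map each output pixel back to a source pixel (inverse warping
    # avoids holes in the warped result).
    pts = homography @ np.stack([xs, ys, ones]).reshape(3, -1).astype(float)
    sx = np.clip(np.rint(pts[0] / pts[2]).astype(int).reshape(h, w), 0, w - 1)
    sy = np.clip(np.rint(pts[1] / pts[2]).astype(int).reshape(h, w), 0, h - 1)
    # First subset: the 2D plane the user is currently focusing on.
    out[plane_mask] = image[sy[plane_mask], sx[plane_mask]]
    # Second subset: everything else, warped by a different function
    # supplied by the caller.
    out[~plane_mask] = second_warp(image, ~plane_mask)
    return out
```

With an identity homography and an identity second warp, the sketch returns the input image unchanged; in practice the homography would be derived from the device movement.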
Description
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of U.S. Provisional Patent App. No. 63/397,226, filed on Aug. 11, 2022, which is incorporated by reference in its entirety.
TECHNICAL FIELD
The present disclosure generally relates to warping an image with a combination of warping functions.
BACKGROUND
Some devices include an integrated camera and a display. The camera captures an image of a physical environment, and the display displays the image for a user to view. As the device moves, the device can warp the image to synthesize and present a view from a new point-of-view. Warping the image provides an appearance that information presented on the display is responsive to motion of the device. However, warping the image can be a resource-intensive operation.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the present disclosure can be understood by those of ordinary skill in the art, a more detailed description may be had by reference to aspects of some illustrative implementations, some of which are shown in the accompanying drawings.
FIG. 1A is a diagram of an example operating environment in accordance with some implementations.
FIG. 1B is a diagram of an example image being warped in accordance with some implementations.
FIG. 2 is a block diagram of a system that warps an image in accordance with some implementations.
FIG. 3 is a flowchart representation of a method of warping an image in accordance with some implementations.
FIG. 4 is a block diagram of a device that warps an image in accordance with some implementations.
In accordance with common practice the various features illustrated in the drawings may not be drawn to scale. Accordingly, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. In addition, some of the drawings may not depict all of the components of a given system, method or device.
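The background notes that a device can warp an image to synthesize a view from a new point-of-view. One common construction of a homography function from device movement, sketched here as an assumption rather than the disclosure's own method, is the pure-rotation reprojection H = K R K⁻¹, where K is the camera intrinsic matrix and R the rotation of the device between frames. Its closed-form inverse is simply the homography of the inverse rotation, since R⁻¹ = Rᵀ.

```python
import numpy as np

def rotation_homography(K, R):
    # Homography induced by a pure device rotation R: maps old-view
    # pixel coordinates to new-view pixel coordinates for pixels on a
    # 2D plane or distant scene. K is the camera intrinsic matrix.
    return K @ R @ np.linalg.inv(K)

def inverse_rotation_homography(K, R):
    # Closed-form inverse: the inverse homography is the homography
    # of the inverse rotation, and for a rotation matrix R^-1 = R^T.
    return K @ R.T @ np.linalg.inv(K)
```

Composing the two functions yields the identity matrix, which is one way the "closed form inverse of the homography function" could behave.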
Finally, like reference numerals may be used to denote like features throughout the specification and figures.
SUMMARY
Various implementations disclosed herein include devices, systems, and methods for warping an image. In some implementations, an electronic device includes one or more processors, a non-transitory memory and an image sensor. In various implementations, a method includes obtaining an image that includes a plurality of pixels. In some implementations, the method includes identifying a first subset of the plurality of pixels that defines a particular geometry that a user of the electronic device is currently focusing on. In some implementations, the method includes warping the first subset of the plurality of pixels according to a first warping function that is based on a movement of the electronic device. In some implementations, the method includes warping a second subset of the plurality of pixels, that is different from the first subset, according to a second warping function that is different from the first warping function. In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and one or more programs. In some implementations, the one or more programs are stored in the non-transitory memory and are executed by the one or more processors. In some implementations, the one or more programs include instructions for performing or causing performance of any of the methods described herein. In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions that, when executed by one or more processors of a device, cause the device to perform or cause performance of any of the methods described herein. In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and means for performing or causing performance of any of the methods described herein.
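Claims 4-6 recite a warping criterion that gates the second, costlier warp. The sketch below combines the two alternative criteria of claims 5 and 6 (device movement exceeding a threshold, or the frame differing sufficiently from the previous frame) purely for illustration; the threshold values and function name are hypothetical.

```python
import numpy as np

MOVEMENT_THRESHOLD = 0.5  # hypothetical units, e.g. degrees of rotation
DIFF_THRESHOLD = 4.0      # hypothetical mean absolute pixel difference

def should_warp_second_subset(movement, image, previous_image):
    # Gate the costlier second warp: skip it when the device barely
    # moved and the current frame barely differs from the previous one.
    if movement > MOVEMENT_THRESHOLD:
        return True
    diff = np.abs(image.astype(float) - previous_image.astype(float)).mean()
    return diff > DIFF_THRESHOLD
```

When the criterion is not satisfied, a device could forgo the second warp and reuse the prior result, saving the computing resources that claim 3 attributes to the second warping function.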
DESCRIPTION
Numerous details are described in order to provide a thorough understanding of the example implementations shown in the drawings. However, the drawings merely show some example aspects of the present disclosure and are therefore not to be considered limiting. Those of ordinary skill in the art will appreciate that other effective aspects and/or variants do not include all of the specific details described herein. Moreover, well-known systems, methods, components, devices and circuits have not been described in exhaustive detail so as not to obscure more pertinent aspects of the example implementations described herein.
A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic devices. The physical environment may include physical features such as a physical surface or a physical object. For example, the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment such as through sight, touch, hearing, taste, and smell. In contrast, an e