
US-12626457-B1 - View-invariant edge filtering for electronic devices

US 12626457 B1

Abstract

View-invariant edge filtering is provided. In one or more implementations, view-invariant edge filtering may provide anti-aliasing that is effective in reducing or eliminating perceivable aliasing across various viewing configurations of display content, including foveated rendering configurations and/or viewing configurations in which display content is movable and rotatable in three virtual dimensions.

Inventors

  • James A. McCombe

Assignees

  • Apple Inc.

Dates

Publication Date
2026-05-12
Application Date
2024-02-22

Claims (20)

  1. An electronic device, comprising: a memory; and one or more processors configured to: obtain display content to be displayed by the electronic device for three-dimensional viewing in a first viewing configuration and a second viewing configuration that is different than the first viewing configuration, the display content including a display object having an edge, wherein the first and second viewing configurations include a foveated rendering of the display content and a viewing angle of a three-dimensional perspective view of the display object; apply a view-invariant filtering operation to the edge of the display object; and display, in at least one of the first or second viewing configurations, the display content with the view-invariant filtering operation applied.
  2. The electronic device of claim 1, wherein the one or more processors are further configured to display, in at least the other of the first or second viewing configurations, the display content with the view-invariant filtering operation applied.
  3. The electronic device of claim 2, wherein the first viewing configuration comprises the foveated rendering of the display content, the second viewing configuration comprises the three-dimensional perspective view of the display content, and wherein the one or more processors are further configured to display the display content with the view-invariant filtering operation applied in the at least one of the first or second viewing configurations by displaying the display content with the view-invariant filtering operation applied concurrently in both the first and second viewing configurations.
  4. The electronic device of claim 1, wherein the view-invariant filtering operation is applied to generate feathered rendered pixel values within a fixed width margin in a rendered pixel value space.
  5. The electronic device of claim 4, wherein the one or more processors are configured to display the display content with the view-invariant filtering operation applied in the at least one of the first and second viewing configurations by displaying each of the feathered rendered pixel values that are within the fixed width margin in the rendered pixel value space, using a first number of display pixels of a display of the electronic device, and by displaying at least one other rendered pixel value outside the fixed width margin in the rendered pixel value space with a second number of display pixels different from the first number of display pixels.
  6. The electronic device of claim 5, wherein the second number of display pixels is one display pixel and the first number of display pixels is more than one display pixel.
  7. The electronic device of claim 4, wherein the one or more processors are configured to apply the view-invariant filtering operation at least in part by determining alpha values of the feathered rendered pixel values using a predetermined filter function, and wherein, for each feathered rendered pixel value, the predetermined filter function is a function of a distance of the feathered rendered pixel value to the edge of the display object.
  8. The electronic device of claim 7, wherein the one or more processors are configured to apply the view-invariant filtering operation by: obtaining a distance, to the edge of the display object, of a location of a fragment of the display content in a display content space; obtaining a derivative of the distance in the display content space with respect to the distance in the rendered pixel value space; setting a filter width of the predetermined filter function in the display content space using the derivative and the fixed width margin in the rendered pixel value space; and determining one of the alpha values of a respective one of the feathered rendered pixel values corresponding to the fragment of the display content, using the distance in the display content space as an input to the predetermined filter function with the filter width in the display content space.
  9. A non-transitory computer-readable medium storing instructions which, when executed by one or more processors, cause the one or more processors to: obtain display content to be displayed by a device for three-dimensional viewing in any of a plurality of viewing configurations, the display content including a display object having an edge, wherein the plurality of viewing configurations include a foveated rendering of the display content and a viewing angle of a three-dimensional perspective view of the display object; apply, by the device, a filtering operation to the edge of the display object, the filtering operation having an invariance across the plurality of viewing configurations; and display, by the device in at least one of the plurality of viewing configurations, the display content with the filtering operation applied.
  10. The non-transitory computer-readable medium of claim 9, wherein the plurality of viewing configurations further include a viewing distance of the display object.
  11. The non-transitory computer-readable medium of claim 9, wherein the filtering operation is applied to generate rendered pixel values within a fixed width margin in a rendered pixel value space.
  12. The non-transitory computer-readable medium of claim 9, wherein the filtering operation is based at least in part on a derivative of distance in a display content space of the display content with respect to a distance in a pixel space.
  13. A method, comprising: obtaining display content to be displayed by a device for three-dimensional viewing in any of a plurality of viewing configurations, the display content including a display object having an edge, wherein the plurality of viewing configurations include a foveated rendering of the display content and a viewing angle of a three-dimensional perspective view of the display object; applying, by the device, a view-invariant filtering operation to the edge of the display object; and displaying, by the device in at least one of the plurality of viewing configurations, the display content with the view-invariant filtering operation applied.
  14. The method of claim 13, wherein the plurality of viewing configurations further include a viewing distance of the display object.
  15. The method of claim 13, wherein the view-invariant filtering operation is applied to generate rendered pixel values within a fixed width margin in a rendered pixel value space.
  16. The method of claim 15, wherein displaying the display content with the view-invariant filtering operation applied in the at least one of the plurality of viewing configurations comprises displaying the rendered pixel values that are within the fixed width margin in the rendered pixel value space, using a variable number of display pixels of a display of the device, the variable number of display pixels being based on the at least one of the plurality of viewing configurations.
  17. The method of claim 15, wherein the applying the view-invariant filtering operation comprises determining alpha values for the rendered pixel values using a predetermined filter function.
  18. The method of claim 17, wherein, for each rendered pixel value, the predetermined filter function is a function of a distance of the rendered pixel value to the edge of the display object.
  19. The method of claim 18, wherein applying the view-invariant filtering operation comprises: obtaining a distance, to the edge of the display object, of a location of a fragment of the display content in a display content space; obtaining a derivative of the distance in the display content space with respect to the distance in the rendered pixel value space; setting a filter width of the predetermined filter function in the display content space using the derivative and the fixed width margin in the rendered pixel value space; and determining one of the alpha values of a respective one of the rendered pixel values corresponding to the fragment of the display content, using the distance in the display content space as an input to the predetermined filter function with the filter width in the display content space.
  20. The method of claim 19, wherein obtaining the distance in the display content space comprises obtaining the distance using a signed distance field.
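Claims 8, 19, and 20 recite the core computation: sample a signed distance to the edge in display content space (e.g., from a signed distance field), take the derivative of that distance with respect to rendered-pixel-space distance, scale the fixed rendered-pixel margin by that derivative to get a content-space filter width, and feed the distance into a predetermined filter function of that width to produce an alpha value. The sketch below illustrates those steps in Python; the function name, the linear-ramp choice of filter function, and the 1.5-pixel margin are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def view_invariant_alpha(d_content, d_per_pixel, margin_px=1.5):
    """Alpha value for a fragment near a display-object edge.

    d_content   : signed distance of the fragment to the edge, in display
                  content space (e.g., sampled from a signed distance field).
    d_per_pixel : derivative of the content-space distance with respect to
                  distance in the rendered pixel value space (in a fragment
                  shader this would come from a screen-space derivative).
    margin_px   : fixed feathering margin, in rendered pixels (assumed value).
    """
    # Scale the fixed rendered-pixel margin by the derivative so the filter
    # width in content space always covers margin_px rendered pixels,
    # regardless of viewing angle, distance, or foveation level.
    filter_width = margin_px * np.abs(d_per_pixel)
    # Predetermined filter function (here a linear ramp): alpha goes from
    # 0 to 1 across the margin, centered on the edge (d_content == 0).
    t = d_content / np.maximum(filter_width, 1e-8) + 0.5
    return np.clip(t, 0.0, 1.0)
```

Because the filter width tracks the derivative, a fragment a fixed fraction of the margin from the edge gets the same alpha whether the object is viewed head-on or at a steep angle, which is the invariance the claims describe.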

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to U.S. Provisional Patent Application No. 63/454,942, entitled “View-Invariant Edge Filtering for Electronic Devices,” filed on Mar. 27, 2023, the disclosure of which is hereby incorporated herein in its entirety.

TECHNICAL FIELD

The present description relates generally to electronic devices including, for example, view-invariant edge filtering for electronic devices.

BACKGROUND

Electronic devices typically utilize arrays of display pixels to present display content. Displaying boundaries in the display content with the display pixels can result in a visual artifact, often referred to as aliasing, in which the boundaries that are intended to appear smooth instead appear jagged or stepped according to the physical features of the display pixels. If the boundary is moved while being displayed, this aliasing at the boundary can also cause a user to perceive an undesirable “crawling” artifact as pixels near the boundary are turned on or off with the motion of the boundary across the display pixels.

BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of the subject technology are set forth in the appended claims. However, for purposes of explanation, several implementations of the subject technology are set forth in the following figures.

FIG. 1 illustrates an example system architecture including various electronic devices that may implement the subject system in accordance with one or more implementations.

FIG. 2 illustrates an example of a foveated display frame in accordance with aspects of the subject technology.

FIG. 3 illustrates an example of a foveated display frame that includes display objects having edges in accordance with aspects of the subject technology.

FIG. 4 illustrates an example of a display frame that includes three-dimensional display content in accordance with aspects of the subject technology.

FIG. 5 illustrates a view of an edge of a display object before and after a three-dimensional rotation of the display object in accordance with aspects of the subject technology.

FIG. 6 illustrates a flow diagram of an example process for view-invariant edge filtering in accordance with aspects of the subject technology.

FIG. 7 illustrates an example computing device with which aspects of the subject technology may be implemented.

DETAILED DESCRIPTION

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, the subject technology is not limited to the specific details set forth herein and can be practiced using one or more other implementations. In one or more implementations, structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology.

A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic devices. The physical environment may include physical features such as a physical surface or a physical object. For example, the physical environment corresponds to a physical park that includes physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell. In contrast, an extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like. With an XR system, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics. As one example, the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. As another example, the XR system may detect movement of the electronic device presenting the XR environment (e.g., a mobile phone, a tablet, a laptop, or the like) and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), the XR system may adjust characteristic(s) of graphical content in the XR environment in response to represent