US-20260129308-A1 - Mitigating Flicker and Reducing Power Consumption in a Head-Mounted Device


Abstract

A method of operating an electronic device such as a head-mounted device to mitigate flicker-related issues is provided. The method can include capturing first images of a physical environment at a first frequency, determining a frequency of a light source, capturing second images of the physical environment at a second frequency different than the first frequency based on the frequency of the light source, and displaying warped images at a display frequency different than the second frequency. The warped images can be produced by warping a subset of the second images based on poses of the head-mounted device in the physical environment at times corresponding to when the subset of the second images are being captured at the second frequency and based on poses of the head-mounted device in the physical environment at times corresponding to when the warped images are being displayed at the display frequency.
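The abstract describes reconfiguring the image sensors to a second capture frequency derived from the light-source frequency, and claim 2 constrains that frequency to equal the light frequency or the light frequency divided by an integer. As an illustrative sketch only (the function name, divisor range, and tie-breaking rule are assumptions, not part of the disclosure), one way to satisfy that constraint is to pick the divisor whose result lands closest to the sensor's nominal frame rate:

```python
def choose_capture_frequency(light_freq_hz: float, nominal_fps: float,
                             max_divisor: int = 8) -> float:
    """Pick a capture rate equal to the light-source frequency divided
    by an integer, favoring the divisor whose result is closest to the
    sensor's nominal frame rate (illustrative sketch only)."""
    candidates = [light_freq_hz / n for n in range(1, max_divisor + 1)]
    return min(candidates, key=lambda f: abs(f - nominal_fps))

# A 100 Hz flicker source (50 Hz mains) with a 90 fps nominal rate:
# candidates are 100, 50, 33.3, ...; 100 Hz is closest to 90 fps.
print(choose_capture_frequency(100.0, 90.0))  # 100.0
```

Because every candidate divides the light frequency evenly, each capture period spans a whole number of flicker cycles, which is what suppresses frame-to-frame brightness variation.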

Inventors

  • Daniel A Glynn
  • Simon FORTIN-DESCHENES
  • Luke A Pillans
  • Joseph Cheung
  • Seyedkoosha MIRHOSSEINI

Assignees

  • APPLE INC.

Dates

Publication Date
May 7, 2026
Application Date
Mar. 21, 2025

Claims (20)

  1. A method of operating a head-mounted device, comprising: with one or more image sensors, capturing first images of a physical environment at a first frequency; determining a frequency of a light source in the physical environment; configuring the one or more image sensors to capture second images of the physical environment at a second frequency different than the first frequency based on the frequency of the light source; and with one or more displays, outputting warped images at a display frequency different than the second frequency, wherein the warped images are produced by warping a subset of the second images based on poses of the head-mounted device in the physical environment at times corresponding to when the subset of the second images are being captured at the second frequency and based on poses of the head-mounted device in the physical environment at times corresponding to when the warped images are being output on the one or more displays at the display frequency.
  2. The method of claim 1, wherein the second frequency at which the second images are being captured by the one or more image sensors is equal to the frequency of the light source or the frequency of the light source divided by an integer.
  3. The method of claim 1, wherein the display frequency is less than the second frequency at which the second images are being captured by the one or more image sensors.
  4. The method of claim 1, wherein the display frequency is equal to the first frequency at which the first images are being captured by the one or more image sensors prior to configuring the one or more image sensors to operate at the second frequency.
  5. The method of claim 1, further comprising: subsequent to determining the frequency of the light source, adjusting an exposure time for capturing the first images based on the frequency of the light source.
  6. The method of claim 5, further comprising: aligning capture time periods for at least some of the first images to respective peaks of the light source.
  7. The method of claim 1, wherein after configuring the one or more image sensors to capture second images of the physical environment at the second frequency, capture time periods of the second images are aligned to respective peaks of the light source.
  8. The method of claim 7, wherein warping the subset of the second images comprises: warping a first image in the subset of the second images using a first warp definition generated based on a pose of the head-mounted device in the physical environment at a first mid-capture time of the first image and based on a pose of the head-mounted device at a first mid-display time of the first image; warping a second image in the subset of the second images using a second warp definition generated based on a pose of the head-mounted device in the physical environment at a second mid-capture time of the second image and based on a pose of the head-mounted device at a second mid-display time of the second image; and warping a third image in the subset of the second images using a third warp definition generated based on a pose of the head-mounted device in the physical environment at a third mid-capture time of the third image and based on a pose of the head-mounted device at a third mid-display time of the third image.
  9. The method of claim 8, wherein: a difference between the first mid-display time and the first mid-capture time is equal to a base capture-to-display latency; and a difference between the second mid-display time and the second mid-capture time is equal to the base capture-to-display latency plus an offset that is a function of the display frequency and the second frequency.
  10. The method of claim 9, wherein a difference between the third mid-display time and the third mid-capture time is equal to the base capture-to-display latency plus at least two times the offset.
  11. The method of claim 7, wherein warping the subset of the second images comprises warping a given image by a first amount based on poses of the head-mounted device in the physical environment and warping a portion of the given image by a second amount different than the first amount to mitigate judder in the portion of the given image.
  12. The method of claim 1, further comprising: subsequent to configuring the one or more image sensors to capture second images of the physical environment at the second frequency, mitigating motion blur by reducing an exposure time for capturing at least the subset of the second images.
  13. The method of claim 1, further comprising: subsequent to configuring the one or more image sensors to capture second images of the physical environment at the second frequency, mitigating flicker by adjusting an exposure time for capturing at least the subset of the second images.
  14. The method of claim 1, further comprising: dropping another subset of the second images different than the subset of the second images, wherein the another subset of the second images are not being output on the one or more displays.
  15. The method of claim 1, further comprising: using another subset of the second images different than the subset of the second images for one or more of: exposure time evaluation, image sensor gain evaluation, clipping evaluation, high dynamic range (HDR) recovery, and two-dimensional brightness and color correction map generation.
  16. The method of claim 15, wherein the subset of the second images are captured using first exposure times or a first image sensor gain, and wherein the another subset of the second images are captured using second exposure times different than the first exposure times or a second image sensor gain different than the first image sensor gain.
  17. The method of claim 1, further comprising: with a recording pipeline, generating a recording by storing only a portion of the subset of the second images.
  18. A method of operating a head-mounted device, comprising: detecting a light source in a physical environment and determining a frequency of the light source; with one or more image sensors, capturing images of the physical environment while capture time periods used for capturing the images are aligned to peaks of the light source; and with one or more displays, outputting a first subset of the images at a display frequency different than the frequency of the light source, wherein the first subset of the images being output on the one or more displays at the display frequency are being captured using a first set of image sensor settings while a second subset of the images, different than the first subset of the images, are being captured using a second set of image sensor settings at least partially different than the first set of image sensor settings.
  19. The method of claim 18, wherein the second subset of the images captured using the second set of image sensor settings are not being output on the one or more displays.
  20. The method of claim 18, further comprising warping the first subset of images by: warping a first image in the first subset of the images using a first warp definition generated based on a pose of the head-mounted device in the physical environment at a first mid-capture time of the first image and based on a pose of the head-mounted device at a first mid-display time of the first image; warping a second image in the first subset of the images using a second warp definition generated based on a pose of the head-mounted device in the physical environment at a second mid-capture time of the second image and based on a pose of the head-mounted device at a second mid-display time of the second image; and warping a third image in the first subset of the images using a third warp definition generated based on a pose of the head-mounted device in the physical environment at a third mid-capture time of the third image and based on a pose of the head-mounted device at a third mid-display time of the third image.
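Claims 8 through 10 describe per-frame warp timing in which the capture-to-display latency of successive displayed frames grows by a fixed offset. A minimal sketch of that arithmetic, assuming the offset is the difference between the display period and the capture period (the claims say only that it is a function of the two frequencies, so this particular form is an assumption):

```python
def capture_to_display_latency_s(frame_index: int, base_latency_s: float,
                                 display_hz: float, capture_hz: float) -> float:
    """Mid-capture to mid-display latency of the k-th displayed frame:
    the base latency plus k times an offset derived from the display
    and capture frequencies (the offset form is an assumption)."""
    offset_s = (1.0 / display_hz) - (1.0 / capture_hz)
    return base_latency_s + frame_index * offset_s

# 120 Hz capture shown on a 90 Hz display with a 10 ms base latency:
# each successive displayed frame is ~2.78 ms "older" than the last,
# matching claim 10's "base latency plus at least two times the
# offset" for the third frame (frame_index = 2).
for k in range(3):
    print(round(capture_to_display_latency_s(k, 0.010, 90.0, 120.0) * 1000, 2))
```

The growing latency reflects the display consuming frames more slowly than the sensor produces them, which is why each warp definition must account for a different head pose at mid-display time.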

Description

This application claims the benefit of U.S. Provisional Patent Application No. 63/715,129, filed Nov. 1, 2024, which is hereby incorporated by reference herein in its entirety.

FIELD

This relates generally to electronic devices, and, more particularly, to electronic devices such as head-mounted devices.

BACKGROUND

Electronic devices such as head-mounted devices can have cameras for obtaining a live video feed of a physical environment and one or more displays for presenting the live video feed to a user. The physical environment can include one or more light sources. The cameras can acquire images for the live video feed at some frame rate, and the displays can output the live video feed at some frame rate. The light sources can be modulated at a frequency that is different from the frame rates of the cameras and the displays. If care is not taken, the light sources in the environment can result in noticeable flicker in the live video feed. It is within this context that the embodiments herein arise.

SUMMARY

An aspect of the disclosure provides a method for operating an electronic device such as a head-mounted device. The method can include: with one or more image sensors, capturing first images of a physical environment at a first frequency; determining a frequency of a light source in the physical environment; configuring the one or more image sensors to capture second images of the physical environment at a second frequency different than the first frequency based on the frequency of the light source; and with one or more displays, outputting warped images at a display frequency different than the second frequency.
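The method above turns on knowing the light source's modulation frequency, but the disclosure does not specify how that frequency is determined. As one hedged illustration (not the patent's method), it could be estimated from uniformly sampled ambient-brightness readings with a Fourier transform, picking the dominant non-DC bin:

```python
import numpy as np

def estimate_flicker_frequency(samples: np.ndarray, sample_rate_hz: float) -> float:
    """Estimate the dominant modulation frequency of an ambient-light
    signal from uniformly spaced brightness samples (illustrative
    approach; the disclosure does not specify a detection method)."""
    samples = samples - samples.mean()            # remove the DC component
    spectrum = np.abs(np.fft.rfft(samples))       # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    return float(freqs[np.argmax(spectrum)])

# Synthetic 120 Hz flicker (60 Hz mains) sampled at 2 kHz for 1 s.
t = np.arange(0, 1.0, 1.0 / 2000.0)
brightness = 1.0 + 0.2 * np.sin(2 * np.pi * 120.0 * t)
print(estimate_flicker_frequency(brightness, 2000.0))  # 120.0
```

One second of samples gives 1 Hz bin spacing, enough to distinguish the common 100 Hz and 120 Hz mains-driven flicker frequencies.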
The warped images can be produced by warping a subset of the second images based on poses of the head-mounted device in the physical environment at times corresponding to when the subset of the second images are being captured at the second frequency and based on poses of the head-mounted device in the physical environment at times corresponding to when the warped images are being output on the one or more displays at the display frequency. Another subset of the second images different than the subset of the second images can be used for one or more of: exposure time evaluation, image sensor gain evaluation, clipping evaluation, high dynamic range (HDR) recovery, and two-dimensional brightness and color correction map generation.

An aspect of the disclosure provides a method of operating a head-mounted device that includes: detecting a light source in a physical environment and determining a frequency of the light source; with one or more image sensors, capturing images of the physical environment while capture time periods used for capturing the images are aligned to peaks of the light source; and with one or more displays, outputting a first subset of the images at a display frequency different than the frequency of the light source. The first subset of the images being output on the one or more displays at the display frequency can be captured using a first set of image sensor settings while a second subset of the images, different than the first subset of the images, can be captured using a second set of image sensor settings at least partially different than the first set of image sensor settings. The second subset of the images captured using the second set of image sensor settings are not being output on the one or more displays.
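The summary distinguishes a displayed subset of the captured images from another subset that is dropped or reused for exposure and gain evaluation. A sketch of one possible selection cadence, assuming the display simply takes the most recently captured frame for each display slot (the disclosure does not specify the scheduler, and the function name is hypothetical):

```python
def partition_frames(capture_hz: float, display_hz: float,
                     duration_s: float) -> tuple[list[int], list[int]]:
    """Split captured frame indices into a displayed subset (one frame
    per display slot) and a remainder that can be dropped or reused
    for exposure/gain evaluation (illustrative cadence only)."""
    n_captured = round(capture_hz * duration_s)
    displayed = []
    for slot in range(round(display_hz * duration_s)):
        # Index of the last frame whose capture started by this slot's time.
        idx = min(int(slot * capture_hz / display_hz), n_captured - 1)
        displayed.append(idx)
    remainder = sorted(set(range(n_captured)) - set(displayed))
    return displayed, remainder

# 120 Hz capture against a 90 Hz display over one second: 90 frames
# are shown and every fourth captured frame (30 of 120) is left over.
shown, leftover = partition_frames(120.0, 90.0, 1.0)
print(len(shown), len(leftover))  # 90 30
```

The leftover frames are exactly the "another subset" of the claims: never displayed, but still available for exposure time evaluation, gain evaluation, or HDR recovery before being discarded.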
An aspect of the disclosure provides a method of operating a head-mounted device in a physical environment, including: with one or more cameras, capturing images at a first cadence; with one or more displays, outputting a first subset of the images at a second cadence different than the first cadence; selectively dropping a second subset of the images different than the first subset of the images; and warping the first subset of the images based on capture times of the first subset of the images and based on display times of the first subset of the images on the one or more displays prior to outputting the first subset of the images on the one or more displays.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a top view of an illustrative head-mounted device in accordance with some embodiments.

FIG. 2 is a schematic diagram of an illustrative electronic device in accordance with some embodiments.

FIG. 3 is a diagram of an illustrative electronic device having hardware and/or software subsystems configured to perform frequency and phase locking in accordance with some embodiments.

FIG. 4 is an overhead perspective view of an illustrative electronic device in a physical environment.

FIG. 5A illustrates a first view of the physical environment of FIG. 4 at a first time as would be seen by a user's left eye if the user were not wearing the electronic device.

FIG. 5B illustrates a first image of the physical environment of FIG. 4 captured by a left image sensor of the electronic device at the first time.

FIG. 5C illustrates a second view of the physical environment of FIG. 4 a