KR-20260066003-A - Synchronizing Image Signal Processing Across Multiple Image Sensors
Abstract
The device includes a first image sensor configured to capture a first image having a first field of view (FOV), a second image sensor configured to capture a second image having a second FOV, an image statistics collection subsystem configured to collect first image statistics associated with the first image and collect second image statistics associated with the second image, a mismatch detection subsystem configured to compare the first image statistics and the second image statistics, and image signal processing blocks configured to synchronize the processing of the first image and the second image based on the comparison of the first image statistics and the second image statistics. The processing of the first image and the second image may include using one or more combined image signal processing control parameters based on image statistics within an overlapping FOV and/or within the entire FOV.
Inventors
- Cao, Renbo
- Zhao, Yonghui
- Bai, Yingjun
- Save, Christophe
- Blazinski, Henrik K.
Assignees
- Apple Inc.
Dates
- Publication Date: 2026-05-12
- Application Date: 2026-03-30
- Priority Date: 2024-02-21
Claims (20)
- Circuitry comprising: one or more inputs configured to receive a first image of an environment from a first image sensor and a second image of the environment from a second image sensor; and one or more image signal processing circuits configured to: determine whether the first image sensor is obscured; and, in response to determining that the first image sensor is obscured, synchronize settings associated with the first and second images based on information associated with the second image.
- The circuitry of claim 1, wherein the synchronized settings include image adjustment settings for adjusting the first and second images.
- The circuitry of claim 2, wherein the image adjustment settings for adjusting the first and second images include one or more of exposure settings, color correction settings, and tone curve settings for the first and second images.
- The circuitry of claim 2, wherein the synchronized settings include image sensor settings for controlling the first and second image sensors.
- The circuitry of claim 4, wherein the image sensor settings for controlling the first and second image sensors include one or more of autoexposure (AE) settings, auto white balance (AWB) settings, and autofocus (AF) settings for the first and second image sensors.
- The circuitry of claim 1, wherein the one or more image signal processing circuits are further configured to: obtain first image statistics associated with the first image of the environment; obtain second image statistics associated with the second image of the environment; and determine whether the first image sensor is obscured by comparing the first image statistics with the second image statistics.
- The circuitry of claim 6, further configured to compare the first image statistics with the second image statistics by comparing one or more of brightness information, thumbnail information, color information, focus information, and a first exposure time associated with the first image with one or more of brightness information, thumbnail information, color information, focus information, and a second exposure time associated with the second image.
- The circuitry of claim 6, wherein the first image has a first field of view, the second image has a second field of view different from the first field of view, and the first field of view and the second field of view coincide in an overlapping field-of-view (FOV) region, the one or more image signal processing circuits being further configured to synchronize image adjustment settings for the first and second images based on statistics of the first and second images within the overlapping FOV region, in response to determining that the first image sensor is not obscured.
- The circuitry of claim 8, wherein the one or more image signal processing circuits are further configured to synchronize additional image adjustment settings for the first and second images based on statistics of the first and second images within an entire field-of-view (FOV) region equal to the combination of the first field of view and the second field of view, in response to determining that the first image sensor is not obscured.
- The circuitry of claim 6, wherein the first image has a first field of view and the second image has a second field of view different from the first field of view, the one or more image signal processing circuits being further configured to synchronize image adjustment settings for the first and second images based on statistics of the first and second images within an entire field-of-view (FOV) region equal to the combination of the first field of view and the second field of view, in response to determining that the first image sensor is not obscured.
- The circuitry of claim 1, wherein the one or more image signal processing circuits are further configured to synchronize the settings associated with the first and second images by excluding some information associated with at least a portion of the first image, or excluding some information associated with at least a portion of the second image, in response to determining that the first image sensor is not obscured.
- The circuitry of claim 1, wherein the one or more image signal processing circuits are further configured to fill a portion of the first image with a portion of the second image in response to determining that the first image sensor is obscured.
- A method comprising: receiving a first image of an environment from a first image sensor; receiving a second image of the environment from a second image sensor; determining whether the first image sensor is obscured; and synchronizing settings associated with the first and second images based on information associated with the second image in response to determining that the first image sensor is obscured.
- The method of claim 13, wherein synchronizing the settings associated with the first and second images comprises synchronizing image adjustment settings for adjusting the first and second images.
- The method of claim 14, wherein synchronizing the image adjustment settings for adjusting the first and second images comprises synchronizing one or more of exposure settings, color correction settings, and tone curve settings for the first and second images.
- The method of claim 14, wherein synchronizing the settings associated with the first and second images comprises synchronizing image sensor settings for controlling the first and second image sensors.
- The method of claim 16, wherein synchronizing the image sensor settings for controlling the first and second image sensors comprises synchronizing one or more of autoexposure (AE) settings, auto white balance (AWB) settings, and autofocus (AF) settings for the first and second image sensors.
- A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors, the one or more programs comprising instructions for: receiving a first image of an environment from a first image sensor; receiving a second image of the environment from a second image sensor; determining whether the first image sensor is obscured; and synchronizing settings associated with the first and second images based on information associated with the second image in response to determining that the first image sensor is obscured.
- The non-transitory computer-readable storage medium of claim 18, wherein synchronizing the settings associated with the first and second images comprises synchronizing image adjustment settings for adjusting the first and second images.
- The non-transitory computer-readable storage medium of claim 19, wherein synchronizing the settings associated with the first and second images comprises synchronizing image sensor settings for controlling the first and second image sensors.
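The fallback behavior recited in claims 1 and 13 can be sketched in a few lines: when the first sensor is judged obscured, both images are driven by the parameters derived from the second image so their exposure, color correction, and tone stay matched. This is an illustrative sketch only; the function name `synchronize_settings` and the per-image parameter dictionaries are assumptions, not taken from the patent.

```python
def synchronize_settings(sensor1_obscured, params1, params2):
    """Pick the image-adjustment parameters to apply to each image.

    If the first sensor is obscured, both images reuse the parameters
    computed from the second image (per claims 1 and 13); otherwise
    each image keeps its own independently computed parameters.
    """
    if sensor1_obscured:
        return dict(params2), dict(params2)
    return dict(params1), dict(params2)
```

In a real pipeline the returned dictionaries would feed both the image adjustment blocks (claims 2-3) and the sensor controls such as AE/AWB/AF (claims 4-5); here they are plain dictionaries for illustration.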
Description
Synchronizing Image Signal Processing Across Multiple Image Sensors

This application claims priority to U.S. Patent Application No. 18/583,729, filed on February 21, 2024, which claims the benefit of U.S. Provisional Patent Application No. 63/503,136, filed on May 18, 2023, the entire contents of which are incorporated herein by reference.

Technical Field

The present disclosure relates generally to electronic devices, and more specifically to electronic devices such as head-worn devices.

Background

An electronic device such as a head-worn device may have a camera for capturing a video feed of an external environment and one or more displays for providing the captured video feed to a user. The head-worn device may include a hardware or software subsystem for processing the video feed, such as a hardware/software subsystem for performing image quality adjustment on the captured video feed.

Designing a head-worn device equipped with multiple cameras can be difficult. Without careful attention, the settings of images captured using the multiple cameras may not match, and displaying images with mismatched settings to the user may cause visual discomfort. Embodiments of the present disclosure are proposed for such situations.

Summary

An electronic device such as a head-worn device may include one or more cameras for capturing a video feed of the real environment and one or more displays for presenting a passthrough video feed to the user. The electronic device may include processing circuitry that performs one or more processing functions on the captured video feed to generate the passthrough video feed.
One aspect of the present disclosure provides a method of operating an electronic device having at least a first image sensor and a second image sensor. The method includes: capturing a first image using the first image sensor having a first field of view; capturing a second image using the second image sensor having a second field of view different from the first field of view; determining whether the first image sensor is currently obscured; and, in response to determining that the first image sensor is currently obscured, processing the first image and the second image by synchronizing their processing using information associated with the second image. The method may further include displaying the processed first image using a first display of the electronic device and displaying the processed second image using a second display of the electronic device.

Determining whether the first image sensor is currently obscured may include comparing thumbnail information for the first image with thumbnail information for the second image, comparing brightness information of the first image with brightness information of the second image, comparing color information of the first image with color information of the second image, comparing focus information associated with the first image with focus information associated with the second image, and/or comparing a first exposure (integration) time associated with the first image with a second exposure time associated with the second image.
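A minimal sketch of such a statistics-based occlusion check follows, comparing only pooled brightness and mean color between the two frames. The statistic choices, the NumPy representation, and the tolerance values (`brightness_tol`, `color_tol`) are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def collect_stats(image):
    """Collect simple statistics for one H x W x 3 frame (values 0-255)."""
    return {
        "brightness": float(image.mean()),
        "mean_rgb": image.reshape(-1, 3).mean(axis=0),
    }

def first_sensor_obscured(stats1, stats2,
                          brightness_tol=0.3, color_tol=0.2):
    """Flag sensor 1 as obscured when its statistics diverge from
    sensor 2's by more than the (illustrative) tolerances."""
    brightness_gap = (abs(stats1["brightness"] - stats2["brightness"])
                      / max(stats2["brightness"], 1e-6))
    color_gap = float(np.abs(stats1["mean_rgb"]
                             - stats2["mean_rgb"]).max()) / 255.0
    return brightness_gap > brightness_tol or color_gap > color_tol
```

A production implementation would likely also compare thumbnail, focus, and exposure-time statistics as the paragraph above lists, and smooth the decision over several frames to avoid flicker.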
Synchronizing the processing of the first image and the second image may include synchronizing the autoexposure of the first image and the second image using brightness information associated with the second image, synchronizing the auto white balance of the first image and the second image using color information associated with the second image, synchronizing the tone mapping of the first image and the second image using histogram information associated with the second image, and/or synchronizing one or more additional image signal processing functions of the first image and the second image using pixel information associated with the second image.

Another aspect of the present disclosure provides a method of operating an electronic device having at least a first image sensor and a second image sensor. The method includes: capturing a first image using the first image sensor having a first field of view; capturing a second image using the second image sensor having a second field of view different from the first field of view, the first field of view and the second field of view coinciding in an overlapping field-of-view (FOV) region; and processing the first image and the second image using one or more combined image signal processing control parameters calculated based on information associated with the first image in the overlapping FOV region and information associated with the second image in the overlapping FOV region. The method may include facilitating color, tone mapping, brightness, and/or noise matching between the first image and the second image. The method may include spatially matching at the pixel level and/or globally matching over th
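The idea of combined control parameters derived from the overlapping FOV can be sketched as follows: crop each frame to the shared region, pool the statistics there, and derive one brightness target and one set of white-balance gains applied to both pipelines. The crop-box convention, the equal-weight pooling, and the gray-world-style gain formula are illustrative assumptions, not the patent's method.

```python
import numpy as np

def combined_control_params(img1, img2, box1, box2):
    """Derive one shared set of control parameters from the overlap.

    Each box = (y0, y1, x0, x1) locates the overlapping FOV region
    inside its image; both crops are assumed to see the same scene.
    """
    r1 = img1[box1[0]:box1[1], box1[2]:box1[3]]
    r2 = img2[box2[0]:box2[1], box2[2]:box2[3]]
    # One shared brightness target pooled from both sensors' overlap.
    brightness_target = 0.5 * (r1.mean() + r2.mean())
    # One shared white-balance gain per channel from the pooled color.
    mean_rgb = 0.5 * (r1.reshape(-1, 3).mean(axis=0)
                      + r2.reshape(-1, 3).mean(axis=0))
    wb_gains = mean_rgb.mean() / np.maximum(mean_rgb, 1e-6)
    return {"brightness_target": float(brightness_target),
            "wb_gains": wb_gains}
```

Driving both sensors' AE and AWB from one parameter set like this is what keeps the two passthrough images from drifting apart in brightness and color tint.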