US-20260129289-A1 - IMAGE ACQUISITION SYSTEMS AND METHODS
Abstract
A method and system for image acquisition may be provided. A first exposure duration of a first image sensor and a second exposure duration of each of at least one second image sensor may be obtained. For each second image sensor, a synchronization time difference between the first image sensor and the second image sensor may be determined based on an exposure duration difference between the first exposure duration and the second exposure duration of the second image sensor, and first image data and second image data from the first image sensor and the second image sensor may be acquired by sending synchronizing signals to the first image sensor and the second image sensor based on the synchronization time difference. A target image of the target scene may be generated based on the first image data and the second image data of each second image sensor.
Inventors
- Ming Liu
- Wei Wu
- Dingxi Liu
- Hongwu Chen
- Jianjun Yang
- Wei Xie
- Changyan GAO
- Zhiyong Xu
Assignees
- ZHEJIANG DAHUA TECHNOLOGY CO., LTD.
Dates
- Publication Date
- 2026-05-07
- Application Date
- 2025-12-29
- Priority Date
- 2023-07-07
Claims (20)
- 1 . A method for image acquisition implemented on a computing device having at least one processor and at least one storage device, the method comprising: obtaining a first exposure duration of a first image sensor and a second exposure duration of each of at least one second image sensor, the first image sensor and the at least one second image sensor being configured to shoot a target scene with different exposure durations; for each of the at least one second image sensor, determining, based on an exposure duration difference between the first exposure duration and the second exposure duration of the second image sensor, a synchronization time difference between the first image sensor and the second image sensor, wherein the synchronization time difference is smaller than the exposure duration difference; acquiring first image data and second image data from the first image sensor and the second image sensor, respectively, by sending synchronizing signals to the first image sensor and the second image sensor based on the synchronization time difference; and generating a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor.
- 2 . The method of claim 1 , wherein the synchronization time difference is equal to half of the exposure duration difference.
- 3 . The method of claim 1 , wherein for each of the at least one second image sensor, a difference between start exposure times of the first image sensor and the second image sensor is smaller than the exposure duration difference.
- 4 . The method of claim 1 , wherein the first image sensor is configured to sense one of infrared light and color light in the target scene, and the at least one second image sensor is configured to sense another of infrared light and color light in the target scene.
- 5 . The method of claim 1 , wherein the generating a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor comprises: performing a noise reduction operation on the first image data to generate denoised first image data; performing a noise reduction operation on the second image data of each second image sensor to generate denoised second image data; generating third image data by fusing the denoised first image data and the denoised second image data of each second image sensor; and generating the target image by enhancing the third image data.
- 6 . The method of claim 1 , wherein the first image sensor is determined from multiple image sensors with different exposure durations by: for each of the multiple image sensors, obtaining historical image data captured by the image sensor; determining relative position information of the image sensor to a region of interest (ROI) in the target scene based on the historical image data; and determining the first image sensor from the multiple image sensors based on the relative position information of each of the multiple image sensors to the ROI.
- 7 . The method of claim 1 , wherein the first image sensor is determined from multiple image sensors with different exposure durations by: for each of the multiple image sensors, determining an exposure duration difference between an exposure duration of the image sensor and an exposure duration of each of the remaining image sensors of the multiple image sensors; determining a total exposure duration difference corresponding to the image sensor by summing the exposure duration difference between the exposure duration of the image sensor and the exposure duration of each of the remaining image sensors of the multiple image sensors; and determining, based on the total exposure duration differences corresponding to the multiple image sensors, the first image sensor.
- 8 . The method of claim 1 , wherein the first image sensor has a first pixel array that includes N first rows; the at least one second image sensor includes a second image sensor having a second pixel array that includes M second rows, the synchronizing signals include N first synchronizing signals corresponding to the N first rows and M second synchronizing signals corresponding to the M second rows.
- 9 . The method of claim 8 , wherein N is equal to M, and a transmission time difference between the first synchronizing signal of the i-th first row and the second synchronizing signal of the i-th second row is equal to the synchronization time difference.
- 10 . The method of claim 8 , wherein N is different from M, and the sending synchronizing signals to the first image sensor and the second image sensor based on the synchronization time difference comprises: determining a proportionality coefficient based on N and M; determining a corresponding relationship between the first rows of the first pixel array and the second rows of the second pixel array based on the proportionality coefficient; and sending the N first synchronizing signals and the M second synchronizing signals based on the corresponding relationship and the synchronization time difference.
- 11 . The method of claim 10 , wherein when N is smaller than M, each first row of the first pixel array corresponds to multiple second rows of the second pixel array, the count of the multiple second rows is equal to the proportionality coefficient, when M is smaller than N, multiple first rows of the first pixel array correspond to one second row of the second pixel array, the count of the multiple first rows is equal to the proportionality coefficient, and a transmission time difference between the first synchronizing signal of the i-th first row and the second synchronizing signal of each second row corresponding to the i-th first row is equal to the synchronization time difference.
- 12 . A system for image acquisition, comprising: at least one storage device including a set of instructions for image acquisition; and at least one processor in communication with the at least one storage device, wherein when executing the set of instructions, the at least one processor is configured to direct the system to perform operations including: obtaining a first exposure duration of a first image sensor and a second exposure duration of each of at least one second image sensor, the first image sensor and the at least one second image sensor being configured to shoot a target scene with different exposure durations; for each of the at least one second image sensor, determining, based on an exposure duration difference between the first exposure duration and the second exposure duration of the second image sensor, a synchronization time difference between the first image sensor and the second image sensor, wherein the synchronization time difference is smaller than the exposure duration difference; acquiring first image data and second image data from the first image sensor and the second image sensor, respectively, by sending synchronizing signals to the first image sensor and the second image sensor based on the synchronization time difference; and generating a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor.
- 13 . The system of claim 12 , wherein the synchronization time difference is equal to half of the exposure duration difference.
- 14 . The system of claim 12 , wherein for each of the at least one second image sensor, a difference between start exposure times of the first image sensor and the second image sensor is smaller than the exposure duration difference.
- 15 . The system of claim 12 , wherein the first image sensor is configured to sense one of infrared light and color light in the target scene, and the at least one second image sensor is configured to sense another of infrared light and color light in the target scene.
- 16 . The system of claim 12 , wherein the generating a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor comprises: performing a noise reduction operation on the first image data to generate denoised first image data; performing a noise reduction operation on the second image data of each second image sensor to generate denoised second image data; generating third image data by fusing the denoised first image data and the denoised second image data of each second image sensor; and generating the target image by enhancing the third image data.
- 17 . The system of claim 12 , wherein the first image sensor is determined from multiple image sensors with different exposure durations by: for each of the multiple image sensors, obtaining historical image data captured by the image sensor; determining relative position information of the image sensor to a region of interest (ROI) in the target scene based on the historical image data; and determining the first image sensor from the multiple image sensors based on the relative position information of each of the multiple image sensors to the ROI.
- 18 . The system of claim 12 , wherein the first image sensor is determined from multiple image sensors with different exposure durations by: for each of the multiple image sensors, determining an exposure duration difference between an exposure duration of the image sensor and an exposure duration of each of the remaining image sensors of the multiple image sensors; determining a total exposure duration difference corresponding to the image sensor by summing the exposure duration difference between the exposure duration of the image sensor and the exposure duration of each of the remaining image sensors of the multiple image sensors; and determining, based on the total exposure duration differences corresponding to the multiple image sensors, the first image sensor.
- 19 . The system of claim 12 , wherein the first image sensor has a first pixel array that includes N first rows; the at least one second image sensor includes a second image sensor having a second pixel array that includes M second rows, the synchronizing signals include N first synchronizing signals corresponding to the N first rows and M second synchronizing signals corresponding to the M second rows.
- 20-23 . (canceled)
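The row-wise signal scheduling of claims 8-11 can be sketched as follows. This is a minimal illustration under assumed conventions only: a uniform per-row line time, a proportionality coefficient obtained by integer division (i.e., M divisible by N or N divisible by M), and hypothetical function and variable names; none of these details are fixed by the claims themselves.

```python
def row_signal_times(n_rows_first, n_rows_second, sync_time_diff, line_time):
    """Return (first_times, second_times): per-row synchronizing-signal
    transmission times, where each second-sensor signal lags the signal of
    its corresponding first-sensor row by sync_time_diff (claims 9 and 11).

    line_time is an assumed constant interval between consecutive row
    signals of the first sensor; the claims do not specify it.
    """
    first_times = [i * line_time for i in range(n_rows_first)]
    second_times = [0.0] * n_rows_second
    if n_rows_first == n_rows_second:
        # Claim 9: N == M, one-to-one row correspondence.
        for j in range(n_rows_second):
            second_times[j] = first_times[j] + sync_time_diff
    elif n_rows_first < n_rows_second:
        # Claim 11, N < M: each first row maps to k second rows,
        # k being the proportionality coefficient (here M // N).
        k = n_rows_second // n_rows_first
        for j in range(n_rows_second):
            second_times[j] = first_times[j // k] + sync_time_diff
    else:
        # Claim 11, M < N: k first rows map to one second row (k = N // M);
        # we assume the second row pairs with the first row of its group.
        k = n_rows_first // n_rows_second
        for j in range(n_rows_second):
            second_times[j] = first_times[j * k] + sync_time_diff
    return first_times, second_times
```

For example, with N = 4, M = 8, and a synchronization time difference of 5 time units, each pair of second rows shares the signal time of its corresponding first row plus the 5-unit offset.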
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International Application No. PCT/CN2024/089132, filed on Apr. 22, 2024, which claims priority to Chinese Patent Application No. 202310833723.9, filed on Jul. 7, 2023, the contents of each of which are incorporated herein by reference.

TECHNICAL FIELD

The present disclosure relates to systems and methods for image acquisition, and in particular, to systems and methods for image acquisition using multiple image sensors.

BACKGROUND

Currently, in order to obtain images with improved quality, a plurality of image sensors are usually used to collect image data simultaneously, and the image data from the image sensors is fused to generate the final image. For example, infrared light and visible light are detected via different image sensors, respectively, to obtain an infrared light image and a color image. Then, the color image and the infrared light image are fused to generate a high-quality image (e.g., an image with high brightness). In this way, a relatively high-quality image can be obtained even in low illumination.

SUMMARY

According to an aspect of the present disclosure, a method for image acquisition may be provided. The method may be implemented on a computing device having at least one processor and at least one storage device. The method may include obtaining a first exposure duration of a first image sensor and a second exposure duration of each of at least one second image sensor. The first image sensor and the at least one second image sensor may be configured to shoot a target scene with different exposure durations.
The method may also include, for each of the at least one second image sensor, determining a synchronization time difference between the first image sensor and the second image sensor based on an exposure duration difference between the first exposure duration and the second exposure duration of the second image sensor, and acquiring first image data and second image data from the first image sensor and the second image sensor, respectively, by sending synchronizing signals to the first image sensor and the second image sensor based on the synchronization time difference. The synchronization time difference may be smaller than the exposure duration difference. The method may further include generating a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor.

In some embodiments, the synchronization time difference may be equal to half of the exposure duration difference.

In some embodiments, for each of the at least one second image sensor, a difference between start exposure times of the first image sensor and the second image sensor may be smaller than the exposure duration difference.

In some embodiments, the first image sensor may be configured to sense one of infrared light and color light in the target scene, and the at least one second image sensor may be configured to sense another of infrared light and color light in the target scene.

In some embodiments, to generate a target image of the target scene based on the first image data of the first image sensor and the second image data of each second image sensor, the method may include performing a noise reduction operation on the first image data to generate denoised first image data. The method may also include performing a noise reduction operation on the second image data of each second image sensor to generate denoised second image data.
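A synchronization time difference equal to half of the exposure duration difference, as described above, makes the midpoints of the two exposure windows coincide. The sketch below illustrates this arithmetic under our own assumptions: the sensor with the longer exposure starts first, time is measured from its start of exposure, and all names are hypothetical rather than taken from the disclosure.

```python
def start_times(first_exposure, second_exposure):
    """Compute the start times of two exposures so their midpoints align.

    Assumes first_exposure >= second_exposure (the longer exposure starts
    first); the symmetric case simply swaps the two roles. Delaying the
    shorter exposure by half the exposure duration difference centers it
    inside the longer one.
    """
    sync_time_diff = (first_exposure - second_exposure) / 2.0
    first_start = 0.0
    second_start = first_start + sync_time_diff
    # The two exposure windows now share the same midpoint, so the two
    # sensors capture the scene at (approximately) the same instant.
    assert first_start + first_exposure / 2 == second_start + second_exposure / 2
    return first_start, second_start, sync_time_diff
```

For instance, with a 40 ms first exposure and a 10 ms second exposure, the second sensor starts 15 ms later, both windows are centered at 20 ms, and the start-time difference (15 ms) is smaller than the 30 ms exposure duration difference, consistent with the constraint stated above.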
The method may include generating third image data by fusing the denoised first image data and the denoised second image data of each second image sensor. The method may further include generating the target image by enhancing the third image data.

In some embodiments, the first image sensor may be determined from multiple image sensors with different exposure durations by performing the following operations. The method may include, for each of the multiple image sensors, obtaining historical image data captured by the image sensor, and determining relative position information of the image sensor to a region of interest (ROI) in the target scene based on the historical image data. The method may further include determining the first image sensor from the multiple image sensors based on the relative position information of each of the multiple image sensors to the ROI.

In some embodiments, the first image sensor may be determined from multiple image sensors with different exposure durations by performing the following operations. The method may include, for each of the multiple image sensors, determining an exposure duration difference between an exposure duration of the image sensor and an exposure duration of each of the remaining image sensors of the multiple image sensors, and determining a total exposure duration difference corresponding to the image sensor by summing the exposure duration difference between the exposure duration of the image sensor and the exposure duration of each of the remaining image sensors of the multiple image sensors. The method may further include determining, based on the total exposure duration differences corresponding to the multiple image sensors, the first image sensor.
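The total-exposure-duration-difference selection just described can be sketched as follows. Note that the disclosure leaves the selection criterion open; this sketch assumes the sensor with the smallest total difference is chosen (so that the reference sensor's exposure is closest, in aggregate, to all others), and the function name is ours.

```python
def select_first_sensor(exposure_durations):
    """Pick the first (reference) image sensor from a list of exposure
    durations, one per sensor.

    For each sensor, sum the absolute exposure duration differences
    against every other sensor, then return the index of the sensor with
    the smallest total (an assumption; the disclosure only says the
    selection is based on the totals).
    """
    totals = []
    for i, d_i in enumerate(exposure_durations):
        total = sum(abs(d_i - d_j)
                    for j, d_j in enumerate(exposure_durations) if j != i)
        totals.append(total)
    return min(range(len(totals)), key=totals.__getitem__)
```

For example, given exposure durations of 10 ms, 20 ms, and 30 ms, the middle sensor has the smallest total difference (10 + 10 = 20 ms) and would be selected as the first image sensor under this assumed criterion.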