
EP-4738854-A1 - METHOD FOR SYNCHRONIZATION OF FRAME-BASED AND EVENT-BASED SENSORS


Abstract

Method for synchronization of a frame-based sensor (3) and an event-based sensor (2), implemented by a processor (1) configured to receive a frame stream from the frame-based sensor (3) and an event data stream from the event-based sensor (2), the event data stream comprising events generated each time the event-based sensor (2) detects a change in light intensity, wherein the frame-based sensor (3) comprises a synchronizing pin (31) configured to emit a pulse at a start of acquisition of each frame of the frame stream, and wherein the event-based sensor (2) comprises a trigger input pin (21) connected to the synchronizing pin (31), wherein the event-based sensor (2) is configured to detect the pulse emitted by the synchronizing pin (31) on the trigger input pin (21), and to insert an associated trigger event in the event data stream in response to the detection of the pulse, the method comprising the steps of:
- storing frames from the frame-based sensor in a queue of frames (Qf), a frame being characterized by a frame indicator reflecting an order of the frame in the frame stream,
- storing event data slices (Eds) in a queue of event data slices (Qe), an event data slice (Eds) comprising events from the event data stream generated during a time interval between two successive trigger events and a slice indicator reflecting an order of the event data slice (Eds) in the event data stream,
- for each frame of the queue of frames (Qf), generating a synced data slice (Sds) comprising the frame and event information identifying all event data slices (Eds) comprising events generated during acquisition of the frame, by comparing the frame indicator and the respective slice indicators of event data slices (Eds).
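The queue-based matching described in the abstract can be sketched as a minimal, hypothetical Python model. The `Frame` and `EventDataSlice` classes, the one-trigger-pulse-per-frame simplification, and the equality test between indicators are illustrative assumptions, not the patent's implementation.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class Frame:
    frame_indicator: int          # order of the frame in the frame stream
    pixels: bytes = b""

@dataclass
class EventDataSlice:
    slice_indicator: int          # order of the slice in the event data stream
    events: list = field(default_factory=list)

def generate_synced_slices(qf, qe):
    """Pair each frame with the event data slice acquired during it.

    Hypothetical simplification: one trigger pulse per frame, so a frame
    matches exactly the slice whose indicator equals its frame indicator;
    unmatched queue heads are dropped, which lets the loop recover from
    data loss in either stream.
    """
    synced = []
    while qf and qe:
        frame, eds = qf[0], qe[0]
        if frame.frame_indicator == eds.slice_indicator:
            synced.append((frame, eds))   # a synced data slice (Sds)
            qf.popleft()
            qe.popleft()
        elif frame.frame_indicator > eds.slice_indicator:
            qe.popleft()                  # stale slice: its frame was lost
        else:
            qf.popleft()                  # orphan frame: its slice was lost
    return synced
```

With this dropping policy, a lost slice (here, the frame with indicator 1) simply causes the corresponding head to be discarded and matching resumes at the next aligned pair.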

Inventors

  • KHAMOUMA, Ayman
  • GOUAULT, Jonathan
  • CHOTARD, Ludovic
  • SCHWAMBACH, Vitor

Assignees

  • Prophesee

Dates

Publication Date
2026-05-06
Application Date
2024-10-31

Claims (20)

  1. A method for synchronization of a frame-based sensor (3) and an event-based sensor (2), implemented by a processor (1) configured to receive a frame stream from the frame-based sensor (3) and an event data stream from the event-based sensor (2), the event data stream comprising events generated each time the event-based sensor (2) detects a change in light intensity, wherein the frame-based sensor (3) comprises a synchronizing pin (31) configured to emit a pulse at a start of acquisition of each frame of the frame stream, and wherein the event-based sensor (2) comprises a trigger input pin (21) connected to the synchronizing pin (31), wherein the event-based sensor (2) is configured to detect the pulse emitted by the synchronizing pin (31) on the trigger input pin (21), and to insert an associated trigger event in the event data stream in response to the detection of the pulse, the method comprising the steps of:
     - storing frames from the frame-based sensor in a queue of frames (Qf), a frame being characterized by a frame indicator reflecting an order of the frame in the frame stream,
     - storing event data slices (Eds) in a queue of event data slices (Qe), an event data slice (Eds) comprising events from the event data stream generated during a time interval between two successive trigger events and a slice indicator reflecting an order of the event data slice (Eds) in the event data stream,
     - for each frame of the queue of frames (Qf), generating a synced data slice (Sds) comprising the frame and event information identifying all event data slices (Eds) comprising events generated during acquisition of the frame, by comparing the frame indicator and the respective slice indicators of event data slices (Eds).
  2. The method of claim 1, wherein the frame indicator is generated by the processor (1) and added to the frame as a metadata of the frame, before storing the frame in the queue of frames (Qf).
  3. The method of any one of claims 1 and 2, wherein the event information comprises pointers to a start event and an end event of the respective event data slices (Eds) comprising events generated during acquisition of the frame.
  4. The method of any one of claims 1 to 3, further comprising a step of decomposition of the event data stream into a plurality of event data slices (Eds), the decomposition being implemented by the processor (1).
  5. The method of any one of claims 1 to 4, wherein the frame-based sensor (3) comprises a frame counter, whose frame counter value is incremented at each frame generation by the frame-based sensor (3), the method further comprising a continuity check on the frame counter value upon storing a new frame in the queue of frames (Qf), based on a comparison between the frame indicator of the new frame and the frame indicator of a last frame from the queue of frames (Qf).
  6. The method of claim 5, wherein the processor (1) communicates with an application (10), the method further comprising a warning step of: - sending a notification to the application (10) if the continuity check identifies a frame loss.
  7. The method of any one of claims 1 to 6, wherein the generated synced data slice (Sds) is stored in a buffer (Bs).
  8. The method of any one of claims 1 to 7, wherein the frame-based sensor (3) is of rolling-shutter type.
  9. The method of claim 8, wherein all generated synced data slices (Sds) are stored in a buffer (Bs), each synced data slice (Sds) is associated with a timestamp that corresponds to a start of acquisition, the frame-based sensor (3) generating frames comprising a number h of lines, each line being acquired during a row exposure time (δrexp) and delayed by a row skew time (δrss) from the previous line, and wherein the method further comprises fetching a timed synced data slice (Sds) by:
     - computing, for the timed synced data slice (Sds), an expected end of frame acquisition from the start of acquisition, the row exposure time (δrexp), the number of lines and the row skew time (δrss);
     - searching in the buffer (Bs) a next synced data slice corresponding to the first synced data slice associated with a timestamp greater than the expected end of frame acquisition;
     - if the next synced data slice is found, creating a set of synced data slices comprising the timed synced data slice (Sds) and the synced data slices anterior to the next synced data slice.
  10. The method of any one of claims 1 to 9, wherein the frame-based sensor (3) comprises a frame counter whose frame counter value is incremented at each frame generation by the frame-based sensor (3) and the frame-based sensor (3) is configured to transmit a first frame associated with the frame counter value, the method further comprising an initialization step of: - removing a number of event data slices (Eds) from the event data slices queue (Qe) equal to the frame counter value.
  11. The method of any of claims 1 to 10, further comprising a step of comparison of the respective slice indicators of a first and a second event data slice (Eds) from the queue of event data slices (Qe), and removing, or not, the first event data slice from the queue of event data slices (Qe), based on the comparison.
  12. The method of any of claims 1 to 11, further comprising, upon generation of the synced data slice (Sds), a step of removing a frame from the queue of frames (Qf) or removing an event data slice (Eds) from the queue of event data slices (Qe), based on the comparison between the frame indicator of the frame and the slice indicator of the event data slice (Eds).
  13. The method of any one of claims 1 to 12, wherein the processor (1) communicates with an application (10), the method further comprising a warning step of:
     - sending a notification to the application (10), each time an event data slice (Eds) is removed from the queue of event data slices (Qe);
     - sending a notification to the application, each time a frame is removed from the queue of frames (Qf).
  14. The method of any one of claims 1 to 13, wherein the event data stream is decomposed by the event-based sensor (2) into event frames comprising events generated during an arbitrary frame period, each event being associated with a timestamp, the method further comprising a generation of event data slices (Eds) from at least a first event frame with a first event, by the processor (1), and wherein the slice indicator corresponds to an event timestamp corresponding to the timestamp of the first event of the first event frame.
  15. The method of claim 14, wherein the frame indicator corresponds to a timestamp generated by the frame-based sensor (3), the queue of frames (Qf) comprises a head frame and the queue of event data slices (Qe) comprises a head event data slice, the method comprising the steps of:
     - computing a time difference between the timestamp of the head frame and the event timestamp of the head event data slice to obtain a head offset value;
     - computing a difference value representative of a time difference between an offset reference value and the head offset value;
     - if the difference value is positive and greater than a time margin, removing the head event data slice from the queue of event data slices (Qe);
     - if the difference value is negative and less than a time margin, removing the head frame from the queue of frames (Qf);
     - else, generating a new synced data slice from the head frame and the head event data slice, and removing the said head frame and head event data slice from the frame queue (Qf) and queue of event data slices (Qe), respectively.
  16. The method of claim 15, comprising the step of: - updating the offset reference value to the head offset value upon generation of the new synced data slice.
  17. The method of any one of claims 1 to 16, wherein the frame-based sensor (3) comprises a frame counter whose frame counter value is incremented at each frame generation by the frame-based sensor, and wherein the event-based sensor (2) comprises a trigger counter whose value is incremented each time the pulse is detected on the trigger input pin, the slice indicator corresponding to an index computed from the trigger counter and the frame indicator corresponding to the frame counter value, the step of generating the synced data slice being based on a comparison between the index and the frame counter value.
  18. The method of claim 17, comprising the steps of:
     - pulling a head frame and a head event data slice from the respective queues (Qf, Qe);
     - comparing the index of the head event data slice and the frame counter value of the head frame;
     - if the index and the frame counter value are equal, generating a new synced data slice from the frame and the event data slice;
     - if the index is greater than the frame counter value, removing the head frame from the queue of frames;
     - if the frame counter value is greater than the index, removing the head event data slice from the queue of event data slices.
  19. The method of any one of claims 1 to 18, wherein the event-based sensor (2) is configured according to a MIPI CSI2 protocol, such that event data slices (Eds) are generated from the event data stream by the event-based sensor (2) before transmission to the processor (1).
  20. System comprising a frame-based sensor (3), an event-based sensor (2), and a processor (1), wherein the processor (1) is configured to implement the method from any one of claims 1 to 19 for synchronization of the frame-based sensor (3) and the event-based sensor (2).
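The rolling-shutter fetching of claim 9 can be illustrated with a short sketch. The closed-form expression for the expected end of frame acquisition (start, plus (h − 1) row skews, plus one row exposure) is an assumption inferred from the parameters the claim lists; the claim itself does not spell out the formula. Timestamps are in arbitrary microsecond-like units.

```python
def expected_end_of_frame(start_us, row_exposure_us, num_lines, row_skew_us):
    # Assumed rolling-shutter model: the last of h lines starts
    # (h - 1) * row_skew after the first line and is exposed for
    # row_exposure, giving the expected end of frame acquisition.
    return start_us + (num_lines - 1) * row_skew_us + row_exposure_us

def fetch_timed_set(buffer_ts, start_us, row_exposure_us, num_lines, row_skew_us):
    """buffer_ts: ascending timestamps of synced data slices (Sds) in Bs.

    Returns the timestamps of the timed synced data slice and all slices
    anterior to the first slice past the expected end of acquisition, or
    None when that "next" slice is not yet in the buffer.
    """
    end = expected_end_of_frame(start_us, row_exposure_us, num_lines, row_skew_us)
    for i, ts in enumerate(buffer_ts):
        if ts > end:                  # first slice after the expected end
            return buffer_ts[:i]      # the timed slice and anterior slices
    return None                       # frame's exposure window not complete yet
```

For a 4-line frame starting at 0 with a 1000-unit row exposure and 100-unit row skew, the expected end is 1300, so a buffer whose last slice is stamped 1400 yields the set of slices stamped before 1400.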

Description

TECHNICAL FIELD AND PRIOR ART

The invention concerns the field of data fusion, and in particular the synchronization between image data from event-based and frame-based sensors.

When realizing data fusion from different sensors, it is important to synchronize the data transmitted to a processor. For instance, a dual-camera system comprising at least one event-based sensor, one frame-based sensor and a downstream processor, such as a system-on-chip (SoC), can be used for data fusion and enhancement. Such systems find applications in snapshot deblurring, video deblurring and offline on-demand slow motion. Notably, the information from the event data stream generated by the event-based sensor can be used to correct the motion blur observed in frames from the frame-based sensor acquired during long-exposure snapshots, especially in low light. For these applications to succeed, the two data streams must be well synchronized, preferably down to microsecond precision.

Processing and fusing data streams coming from event-based sensors and frame-based sensors is a difficult task. Frame-based sensors used in multi-sensor systems, such as stereo systems, usually provide a mechanism for synchronization. Typically, one sensor acts as a master and emits a synchronization pulse on a dedicated pin at the start of exposure of each frame. A slave sensor then uses this synchronization signal as a trigger to start its own frame exposure period. This way, both master and slave sensors produce the same number of frames, each with an aligned frame counter that allows synchronization by the downstream processor. Alternatively, frame-based sensors are often equipped with a flash strobe signal used to trigger a flash at the start of acquisition of a frame; this flash strobe signal can also serve as a synchronization pulse.
The processor synchronizes the received frames from multiple cameras by aligning the frame counters in the metadata at the start of each frame. The frame counters can additionally be used to detect and correct any eventual data loss, thus maintaining synchronization. Frame-based sensors read out frames sequentially, so that frames are transmitted every operative period, or frame period. Synchronization is particularly complex in the case of data fusion with frame-based sensors of rolling-shutter type, where the acquisition of two frames can overlap.

Event-based sensors, on the other hand, do not capture images using a shutter as conventional frame-based sensors do. Instead, each pixel of the event-based sensor operates independently and asynchronously, responding to local changes in brightness or light intensity as they occur. This gives event-based sensors a very high temporal resolution, on the order of a microsecond. Event-based sensors can also monitor an input pin and generate time-stamped trigger events in the event data stream that convey the time and value of that input pin upon level changes.

Unlike the frame-based sensor, the event-based sensor does not transmit the data corresponding to one image frame sequentially, but provides data continuously. To match standard transmission protocols, the event data stream must be decomposed into a plurality of event frames at regular intervals. Typically, an arbitrary frame period is implemented: the event data stream transmitted by the event-based sensor is split into a plurality of event frames, each storing the data generated during the arbitrary frame period. However, the arbitrary frame period is not necessarily aligned with the operative period at which the frame-based sensor transmits frames.
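The decomposition of the continuous event data stream into event data slices delimited by the sensor's trigger events could look like the following sketch. The dict-based event representation and the `"trigger"` flag marking trigger events are illustrative assumptions; real event-based sensors emit binary event formats.

```python
def slice_by_triggers(event_stream):
    """Split a stream of events into event data slices (Eds).

    Each trigger event (inserted by the event-based sensor when a pulse is
    detected on its trigger input pin) closes the current slice and opens
    the next one; the slice indicator counts trigger events in order.
    """
    slices = []
    current = None        # events collected since the last trigger
    index = -1            # slice indicator of the slice being collected
    for ev in event_stream:
        if ev.get("trigger"):
            if current is not None:
                slices.append({"slice_indicator": index, "events": current})
            index += 1
            current = []
        elif current is not None:
            current.append(ev)   # events before the first trigger are discarded
    return slices                # a trailing, still-open slice is not emitted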
Even when the arbitrary frame period is specifically implemented to match the operative period, slight variations in the operative period can lead to desynchronization, for instance due to jitter and minor frame-period variations caused by exposure-time changes. A fixed implementation of the arbitrary frame period is therefore not sufficient to ensure proper synchronization. Moreover, data loss can occur during transmission and processing. It is then important for the processor to continue to monitor the synchronization and apply corrective action if needed, so that the streams remain synchronized even if data loss occurs in one or both data streams.

SUMMARY OF THE INVENTION

The invention provides a synchronization method that makes it possible to synchronize information from a frame-based sensor and an event-based sensor, and that is robust and capable of recovering from data losses. In particular, the processor can obtain, for each frame from the frame-based sensor, the corresponding events spanning the entire exposure window of the frame. The aforementioned problem is solved by a method for synchronization of a frame-based sensor and an event-based sensor, implemented by a processor configured to receive a frame stream from the frame-based sensor and an event data stream from the event-based