
EP-3779768-B1 - EVENT DATA STREAM PROCESSING METHOD AND COMPUTING DEVICE


Inventors

  • CHEN, Shoushun
  • WANG, Shizheng

Dates

Publication Date
2026-05-06
Application Date
2018-05-24

Claims (5)

  1. A method for processing an event data flow from a Dynamic Vision Sensor, DVS, for recording pixel information about a specific event, comprising: reading a plurality of pieces of event data sequentially from the event data flow, each piece of event data having a first duration, wherein the specific event is an event where brightness of a pixel in the DVS has changed, and the acquired event data comprises a coordinate position of each pixel whose brightness has changed within the first duration and a timestamp of each pixel whose brightness has changed; with respect to each piece of event data with the first duration, analyzing the event data to acquire time-difference information about each event within the first duration, comprising: with respect to each piece of event data with the first duration, recording a start time point and/or an end time point of the first duration; and calculating the time-difference information about each event within the event data piece with the first duration in accordance with the start time point and/or the end time point and the timestamp of each event in the event data; and characterized in that the method for processing the event data flow further comprises: generating an image frame presenting a change in movement within the first duration in accordance with the time-difference information about each event within the event data piece with the first duration, comprising: performing a pre-processing operation on the time-difference information about each event to acquire processed time-difference information about each event, wherein the pre-processing operation includes mapping the time-difference information of a floating-point type about each event within the event data piece with the first duration to a predetermined integer range; calculating a pixel value corresponding to each event in accordance with the processed time-difference information about the event, wherein a pixel value pixel_i corresponding to an event i is calculated through the following formula: pixel_i = A − Δt_i, or pixel_i = 1/Δt_i, where Δt_i represents the processed time-difference information about the event i and A is a constant; and generating the image frame in accordance with the pixel values and the coordinate positions corresponding to all the events within the event data piece with the first duration, wherein the image frame is inclined to an XY plane at an angle, so that a movement direction and a relative change in a movement speed of an object can be acquired from the image frame.
  2. The method according to claim 1, wherein the step of reading the plurality of pieces of event data with the first duration sequentially from the event data flow comprises reading the plurality of pieces of event data with the first duration sequentially from the event data flow, adjacent event data pieces with the first duration comprising a same event data piece with a second duration.
  3. The method according to claim 1 or 2, wherein the step of reading the plurality of pieces of event data with the first duration sequentially from the event data flow comprises: when the brightness of one pixel within the event data piece with the first duration has changed multiple times, providing a plurality of timestamps corresponding to the coordinate position of the pixel in the read event data; and selecting the timestamp of the most recent brightness change of the pixel from the plurality of timestamps as the timestamp corresponding to the coordinate position of the pixel.
  4. A computing device, comprising one or more processors, a memory, and one or more programs stored in the memory and configured to be executed by the one or more processors, wherein the one or more programs comprise instructions for executing the method according to any of claims 1 to 3.
  5. A computer-readable storage medium storing therein one or more programs comprising instructions that, when executed by a computing device, cause the computing device to implement the method according to any of claims 1 to 3.
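The claimed procedure can be illustrated with a short sketch: slice the event stream into pieces of the first duration, keep only the most recent timestamp per pixel (claim 3), map each event's floating-point time difference onto a predetermined integer range, and compute pixel values with the claim's first formula pixel_i = A − Δt_i. This is a minimal illustration, not the patented implementation; the function name, the choice A = value_range, and the linear mapping of Δt onto 0..value_range are all assumptions for the sake of the example.

```python
import numpy as np

def events_to_frame(events, t_start, t_end, width, height, value_range=255):
    """Turn one event data piece (one "first duration") into an image frame.

    events: iterable of (x, y, timestamp) tuples with t_start <= timestamp <= t_end.
    Pixel values encode how early each event occurred within the piece, so
    gradients across the frame reflect the movement direction and the
    relative change in movement speed of an object.
    """
    # Claim 3: if a pixel's brightness changed several times within the
    # piece, keep only the timestamp of its most recent change.
    latest = {}
    for x, y, ts in events:
        if (x, y) not in latest or ts > latest[(x, y)]:
            latest[(x, y)] = ts

    frame = np.zeros((height, width), dtype=np.int32)
    duration = float(t_end - t_start)
    for (x, y), ts in latest.items():
        # Time difference of the event relative to the recorded start
        # time point of the first duration.
        dt = ts - t_start
        # Pre-processing: map the floating-point time difference onto a
        # predetermined integer range (0..value_range here).
        dt_mapped = int(round(dt / duration * value_range))
        # pixel_i = A - dt_i, with the constant A chosen as value_range
        # (the claim's alternative formula is pixel_i = 1 / dt_i).
        frame[y, x] = value_range - dt_mapped
    return frame
```

For example, with a 2x1 sensor and a one-second piece, an event at the start of the piece yields pixel value 255, while a pixel that fired at 0.5 s and again at 1.0 s keeps only the later timestamp and yields 0; the resulting gradient across the frame indicates the direction of movement.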

Description

TECHNICAL FIELD

The present disclosure relates to the field of data processing technology, in particular to a method for processing an event data flow and a computing device.

BACKGROUND

Real-time optical flow calculation plays a very important role in the field of computer vision, e.g., optical flow-based segmentation, movement detection, and target tracking and obstacle avoidance for aircraft and vehicles. In actual use, there is an urgent need to increase the speed of the optical flow calculation while ensuring high accuracy, and since the optical flow is a basic element which plays a decisive role, many optical flow calculation methods have been presented.

In a conventional optical flow calculation method, a large quantity of data is captured by a conventional image sensor and then processed; as a result, redundant data is generated repeatedly from static backgrounds. The reading and processing of a large quantity of redundant data leads to significant computational cost as well as a restriction on the processing speed.

In contrast, an event-based motion sensor has exhibited great potential for real-time optical flow calculation. As compared with the conventional image sensor, the motion sensor is capable of responding asynchronously to an event representing a relative brightness change. Moreover, a flow of asynchronous digital events is outputted by the motion sensor without any restrictions with regard to exposure time or frame rate. The motion sensor may detect a rapidly moving object, whose movement used to be captured by an expensive high-speed camera at a frame rate of several thousand Hz, but with significantly less redundant output data. Hence, event-based optical flow calculation methods have been widely used with motion sensors. Generally, the event-based optical flow calculation methods include an event-based Lucas-Kanade method and a local plane fitting method.
In the event-based optical flow calculation method, as an important step, slope (or gradient) information is extracted on the basis of a pixel intensity in a local area. However, in a conventional Dynamic Vision Sensor (DVS) system, an event may merely report a pixel position, in the absence of illumination intensity information. Hence, the intensity of each pixel may be simulated in accordance with the quantity of events accumulated within a short time period. In this simulation method, a relative intensity change, rather than a real-time intensity, is represented, so the calculation is not accurate. In addition, as another problem, when a rapidly moving object is detected, the accuracy of the event-based optical flow calculation may be limited by event sparsity. During the operation of the conventional DVS, each pixel operates independently, and it is impossible for the event generated by an individual activated pixel to provide sufficient information for the optical flow calculation. Based on the above, there is an urgent need to provide a new scheme for extracting movement slope information in accordance with the pixel intensity, so as to improve the speed of the optical flow calculation.

Mueggler, Elias, et al.: "Continuous-Time Trajectory Estimation for Event-based Vision Sensors", Robotics: Science and Systems XI, 13 July 2015 (2015-07-13), pages 1-9, XP055784250, discloses an ego-motion estimation for an event-based vision sensor using a continuous-time framework to directly integrate the information conveyed by the sensor. The Dynamic Vision Sensor (DVS) pose trajectory is approximated by a smooth curve in the space of rigid-body motions using cubic splines and is optimized according to the observed events.

Ridwan, Iffatur, et al.: "An Event-Based Optical Flow Algorithm for Dynamic Vision Sensors", 2 June 2017 (2017-06-02), ICIAP: International Conference on Image Analysis and Processing, 17th International Conference, Naples, Italy, September 9-13, 2013, Proceedings; [Lecture Notes in Computer Science], Springer, Berlin, Heidelberg, pages 182-189, XP047417518, discloses an event-based optical flow algorithm for the DAVIS Dynamic Vision Sensor (DVS). The algorithm is based on the Reichardt motion detector inspired by the fly visual system, and has a very low computational requirement for each event received from the DVS.

SUMMARY

An object of the present disclosure is to provide a method for processing an event data flow and a computing device, configured for determining the movement direction and movement speed of an object according to the gradient and slope of the generated image frame, so as to solve or at least alleviate at least one of the above-mentioned problems. These problems are solved by a method for processing an event data flow as claimed in claim 1, by a computing device as claimed in claim 4, and by a computer-readable storage medium for such a computing device as claimed in claim 5. Further advantageous embodiments are the subject-matter of