
EP-4025934-B1 - PROCESSING OF LIDAR IMAGES


Inventors

  • PACALA, ANGUS

Dates

Publication Date
2026-05-06
Application Date
2020-09-08

Claims (15)

  1. A system comprising: a kernel-based coprocessor (1510) comprising a classifier circuit (1514); and a light ranging device (210) communicably coupled with the classifier circuit and comprising: a transmission circuit (240) comprising a plurality of light sources (242) that emit light pulses; a detection circuit (230) comprising: an array of photosensors (236) that detect reflected light pulses and output signals measured over time; and a signal processor (238) connected to the array of photosensors and configured to determine depth values from measurements using the array of photosensors; and an image reconstruction circuit (1325) communicably coupled with the detection circuit and configured to: assign a sensor ID to each of first depth values for a first scan of the light ranging device; construct a first lidar image using the first depth values by: mapping, using the sensor IDs, the first depth values to first lidar pixels in the first lidar image, the first lidar image being a rectilinear image, wherein the mapping uses a mapping table that specifies a lidar pixel based on a corresponding sensor ID; and store the first lidar pixels of the first lidar image in a local image buffer of the light ranging device; and send the first lidar pixels of a local frame of the first lidar image or of a complete frame of the first lidar image to the kernel-based coprocessor, wherein the classifier circuit of the kernel-based coprocessor is configured to generate classification information of a set of lidar pixels corresponding to a same object based on corresponding depth values of the set of lidar pixels.
  2. The system of claim 1, wherein the detection circuit is configured to provide the depth values in a specified order, and wherein the image reconstruction circuit is configured to assign the sensor ID for a particular depth value based on the specified order.
  3. The system of claim 1, wherein a row of the mapping table includes a column for the sensor ID and another column that specifies coordinates of the lidar pixel in a lidar image.
  4. The system of claim 1, wherein the image reconstruction circuit is configured to determine when a specified subset of the first lidar pixels have been stored for the first lidar image and send the specified subset of the first lidar pixels to the kernel-based coprocessor.
  5. The system of claim 1, wherein the classifier circuit is further configured to generate the classification information by: receiving lidar images output by the image reconstruction circuit; analyzing the depth values in lidar pixels of the lidar images; correlating the set of lidar pixels based on corresponding depth values of the set of lidar pixels; and outputting the classification information of the set of lidar pixels based on the correlating.
  6. The system of claim 1, wherein the kernel-based coprocessor includes a depth imaging circuit communicably coupled with the light ranging device and configured to: receive the first lidar pixels; and apply one or more filter kernels to subsets of the first lidar pixels to generate filtered images of lidar pixels.
  7. A method of performing ranging using a light ranging system installed on a mobile apparatus, the method comprising: transmitting, using a transmission circuit (240), pulses from one or more light sources (242) of the light ranging system, the pulses reflecting from one or more objects; for each of an array of photosensors (236), measuring a signal by detecting photons of the pulses using a detection circuit; assigning a sensor ID to each of the signals, the sensor ID corresponding to one of the array of photosensors; analyzing the signals to determine first depth values; constructing a first lidar image using the first depth values by: mapping, using the sensor IDs, the first depth values to first lidar pixels in the first lidar image, the first lidar image being a rectilinear image, wherein the mapping uses a mapping table that specifies a lidar pixel based on a corresponding sensor ID; and storing the first lidar pixels of the first lidar image in an image buffer of the light ranging system; sending the first lidar pixels of a local frame of the first lidar image or of a complete frame of the first lidar image to a kernel-based coprocessor of the light ranging system; and generating, by a classifier circuit of the kernel-based coprocessor (1510), classification information of a set of lidar pixels corresponding to a same object based on corresponding depth values of the set of lidar pixels.
  8. The method of claim 7, wherein the sensor ID is received from the detection circuit with the signal.
  9. The method of claim 7, wherein the sensor IDs are assigned based on a specified order in which the signals are provided by the detection circuit.
  10. The method of claim 7, wherein the sensor IDs are assigned to each of the signals by assigning the sensor IDs to the first depth values, and wherein the sensor IDs are assigned based on a specified order in which the first depth values are provided to an image reconstruction circuit.
  11. The method of claim 7, further comprising: constructing a second lidar image using second depth values; storing the second lidar image in the image buffer; and adjusting one or more values of the first lidar image based on an analysis of the first lidar image and the second lidar image, thereby obtaining one or more adjusted values.
  12. The method of claim 11, wherein the one or more adjusted values include color values of the first lidar image.
  13. The method of claim 7, further comprising: applying a filter kernel to a portion of the first lidar image, wherein the filter kernel is applied before the first lidar image is completely constructed.
  14. The method of claim 7, wherein the mapping table specifies a lidar pixel based on the corresponding sensor ID and a position of the light ranging system when the signal is measured.
  15. The method of claim 7, further comprising generating the classification information by: receiving, by the classifier circuit, lidar images output by the image reconstruction circuit; analyzing, by the classifier circuit, the depth values in the lidar pixels of the lidar images; correlating, by the classifier circuit, the set of lidar pixels based on corresponding depth values of the set of lidar pixels; and outputting, by the classifier circuit, the classification information of the set of lidar pixels based on the correlating.
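
The mapping-table construction recited in claims 1-3 and 7 can be illustrated with a minimal sketch. This is illustrative only and not part of the claimed subject matter; all names, the table contents, and the image dimensions are hypothetical, and the actual mapping is performed by the claimed image reconstruction circuit rather than software.

```python
# Illustrative sketch of the claimed mapping table: each entry pairs a sensor ID
# with the (row, column) coordinates of its lidar pixel in a rectilinear image
# (cf. claim 3). All names and values here are hypothetical.

WIDTH, HEIGHT = 4, 2  # tiny rectilinear image for illustration

# Mapping table: sensor ID -> (row, col) in the lidar image.
mapping_table = {sensor_id: divmod(sensor_id, WIDTH)
                 for sensor_id in range(WIDTH * HEIGHT)}

def construct_lidar_image(depth_values_with_ids):
    """Place each (sensor_id, depth) measurement into the rectilinear image."""
    image = [[None] * WIDTH for _ in range(HEIGHT)]
    for sensor_id, depth in depth_values_with_ids:
        row, col = mapping_table[sensor_id]
        image[row][col] = depth
    return image

# Depth values arrive in a specified order (cf. claim 2), so sensor IDs can be
# assigned by each value's position in the stream.
depths = [12.5, 12.6, 40.1, 39.8, 12.4, 12.7, 40.0, 39.9]
image = construct_lidar_image(list(enumerate(depths)))
```

In this sketch, the rectilinear layout means every depth value lands at fixed image coordinates regardless of arrival order, which is what allows the kernel-based operations of claims 6 and 13 to be applied to pixel neighborhoods.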

Description

BACKGROUND

Light Detection And Ranging (LIDAR) systems are used for object detection and ranging, e.g., for vehicles such as cars, trucks, boats, etc. LIDAR systems also have uses in mobile applications (e.g., for face recognition), home entertainment (e.g., to capture gestures for video game input), and augmented reality.

A LIDAR system measures the distance to an object by irradiating a landscape with pulses from a laser, and then measuring the time for photons to travel to an object and return after reflection, as measured by a receiver of the LIDAR system. A detected signal is analyzed to detect the presence of reflected signal pulses among background light. A distance to an object can be determined based on a time-of-flight from transmission of a pulse to reception of a corresponding reflected pulse.

It can be difficult to provide robust distance accuracy down to a few cm in all conditions, particularly at an economical cost for the LIDAR system. It can be further difficult to provide robust data that provides extensive information about the entirety of a surrounding environment, particularly distant objects. Obtaining advance knowledge of such distant objects can be important for vehicle navigation.

Additionally, in applications such as vehicle navigation, depth information (e.g., distance to objects in the environment) is extremely useful but not sufficient to avoid hazards and navigate safely. It is also necessary to identify specific objects, e.g., traffic signals, lane markings, moving objects that may intersect the vehicle's path of travel, and so on. However, the analysis of 3D point clouds can require extensive computational resources to be performed in real-time for these applications.

US 2018/329066 A1 describes methods and systems that augment 360 degree panoramic LIDAR results with color obtained from color cameras. WO 2018/221453 A1 describes an output device, control method, program, and storage medium.
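
The time-of-flight relationship described above can be sketched as a short calculation. This is illustrative only and not part of the claimed subject matter; the function name and the example timing value are hypothetical.

```python
# Illustrative sketch: one-way distance from the round-trip time of a
# reflected pulse, d = c * t / 2, where c is the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_time_of_flight(t_round_trip_s: float) -> float:
    """Return the one-way distance in meters for a measured round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * t_round_trip_s / 2.0

# Example: a pulse returning after ~66.7 ns corresponds to roughly 10 m.
d = distance_from_time_of_flight(66.7e-9)
```

The factor of two accounts for the pulse traversing the path to the object and back; this is also why cm-level distance accuracy, as discussed above, demands sub-nanosecond timing resolution in the receiver.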
US 2019/049986 A1 describes a sensor data evaluation system.

BRIEF SUMMARY

The invention provides systems and methods in accordance with the appended claims. The disclosure provides systems and methods for analyzing lidar data. For example, the lidar data can be obtained in a particular manner that allows reconstruction of rectilinear images for which image processing can be applied from image to image. According to the invention, kernel-based image processing techniques are used. Such processing techniques can use neighboring lidar and/or associated color pixels to adjust various values associated with the lidar signals. Such image processing of lidar and color pixels can be performed by dedicated circuitry, which may be on a same integrated circuit.

In some embodiments, lidar pixels can be correlated to each other. According to the invention, classification techniques identify lidar pixels and, optionally, associated color pixels as corresponding to the same object. The classification can be performed by an artificial intelligence (AI) coprocessor. Image processing techniques and classification techniques can be combined into a single system.

These and other embodiments of the disclosure are described in detail below. For example, other embodiments are directed to systems, devices, and computer readable media associated with methods described herein. A better understanding of the nature and advantages of embodiments of the present disclosure may be gained with reference to the following detailed description and the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIGS. 1A and 1B show automotive light ranging devices, also referred to herein as LIDAR systems, according to embodiments of the present disclosure.
FIG. 2 shows a block diagram of an exemplary lidar device for implementing various embodiments.
FIG. 3 illustrates the operation of a typical lidar system that may be improved by embodiments.
FIG. 4 shows an illustrative example of the light transmission and detection process for a light ranging system according to embodiments of the present disclosure.
FIG. 5 shows various stages of a sensor array and associated electronics according to embodiments of the present disclosure.
FIG. 6 shows a histogram according to embodiments of the present disclosure.
FIG. 7 shows the accumulation of a histogram over multiple pulse trains for a selected pixel according to embodiments of the present disclosure.
FIG. 8 shows a series of positions for applying a matched filter to a raw histogram according to embodiments of the present disclosure.
FIG. 9 illustrates a panoramic lidar image to which depth values from pixel sensors have been assigned according to embodiments of the present disclosure.
FIG. 10 shows a simplified front view of a sensor array according to an embodiment of the present invention.
FIGS. 11A and 11B are simplified conceptual illustrations showing the potential for pointing error in a scanning system using a sensor array.
FIG. 12 illustrates an example of an imaging system using an F