KR-102962732-B1 - Apparatus for LIDAR
Abstract
The present application relates to a lidar device. A lidar device according to one embodiment of the present invention comprises: a laser light source unit that emits a line laser into a target space; an image sensor unit that detects the line laser reflected within the target space and generates a detection image; and an image processing unit that generates a three-dimensional coordinate map of the target space using a laser pattern appearing in the detection image, wherein the laser pattern may include a reference pattern and a target pattern that is shortened relative to the reference pattern and vertically spaced apart from it.
Inventors
- 김은지
- 온백산
- 여태운
- 김대언
Assignees
- 주식회사 케이티 (KT Corporation)
Dates
- Publication Date: 2026-05-11
- Application Date: 2020-12-07
Claims (18)
- An image-based 3D LiDAR device comprising: a laser light source unit that emits a line laser into a target space; an image sensor unit that detects the line laser reflected within the target space and generates a detection image; an image processing unit that generates a three-dimensional coordinate map of the target space using a laser pattern appearing in the detection image; an output control unit that controls the emission timing and emission holding time for emitting the line laser into the target space; and an exposure control unit that controls the exposure timing and exposure holding time for exposing the image sensor unit to detect the line laser, wherein the laser pattern includes a reference pattern and a target pattern that is shortened relative to the reference pattern and vertically spaced apart from it, and wherein the output control unit uses a synchronization signal to match the exposure timing to the emission timing, and controls the line laser to be emitted for a first emission holding time when an odd-numbered synchronization signal is received and for a second emission holding time when an even-numbered synchronization signal is received.
- In paragraph 1, the image processing unit The z-axis coordinates for each point included in the target pattern are extracted using a height position value that is vertically spaced from the centerline of the detection image of the target pattern, and The y-axis coordinates for each point included in the target pattern are extracted using a vertical position value in which the target pattern is spaced vertically from the reference pattern, and The x-axis coordinates of each point are extracted using a horizontal position value in which each point included within the above target pattern is horizontally spaced from the reference point of the above detection image, and A LiDAR device characterized by generating three-dimensional coordinates for each point within the target pattern using the above x-axis coordinates, y-axis coordinates, and z-axis coordinates.
- The lidar device of claim 2, wherein the reference point is any one of the center point, the left end, and the right end of the detection image, and the center line is a horizontal line passing through the center point of the detection image.
- The LiDAR device of claim 2, wherein the image processing unit corrects the y-axis coordinate by further using the horizontal position value when extracting the y-axis coordinate.
- The lidar device of claim 4, wherein the image processing unit corrects the y-axis coordinate using a table that stores distance information to the point corresponding to the vertical position value and the horizontal position value.
- The LiDAR device of claim 1, wherein the image processing unit extracts pattern pixels corresponding to the laser pattern from the detection image, and divides the pattern pixels to add a plurality of virtual pixels.
- The LiDAR device of claim 1, wherein the image processing unit extracts pattern pixels corresponding to the laser pattern from the detection image, and inserts a plurality of virtual pixels into the pattern pixels so that the virtual pixels form a vertical or horizontal layer within the pattern pixels.
- The lidar device of claim 6 or claim 7, wherein the image processing unit generates a binarized image by binarizing the pattern pixels and the virtual pixels based on a threshold value, and extracts the laser pattern from the binarized image.
- The LiDAR device of claim 1, wherein the image sensor unit further includes a lens unit that receives the line laser, and the image processing unit generates three-dimensional coordinate information in which distortion caused by the lens unit is corrected by referring to a correction table.
- (canceled)
- (canceled)
- (canceled)
- The lidar device of claim 1, wherein the output control unit matches the first emission holding time to the exposure holding time, and maintains the second emission holding time for a shorter duration than the exposure holding time.
- The lidar device of claim 13, wherein the image sensor unit generates a first detection image corresponding to the odd-numbered synchronization signal and a second detection image corresponding to the even-numbered synchronization signal, and the first detection image is image-processed to have a higher brightness gain value than the second detection image.
- The lidar device of claim 14, wherein the image processing unit assigns line numbers in ascending order starting from the line laser located at the bottom of the first detection image when the number of line lasers detected in the first detection image is equal to the number of line lasers output from the laser light source unit.
- The lidar device of claim 15, wherein the image processing unit distinguishes between the reference pattern and the target pattern and assigns the line number to each.
- The lidar device of claim 16, wherein the image processing unit assigns a line number to each of the line lasers by referring to the second detection image when the number of line lasers detected in the first detection image is less than the number of line lasers output from the laser light source unit.
- The lidar device of claim 17, wherein, when the number of line lasers detected in the first detection image is less than the number of line lasers output from the laser light source unit, the image processing unit sets the candidate line numbers of a line laser included in the first detection image as x and (x-1), then assigns (x-1) as the line number of a line laser that is identically detected in the second detection image, and x as the line number of a line laser that is not detected in the second detection image.
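The per-point coordinate extraction described in claims 2 and 3 (z from the offset to the image center line, y from the target-to-reference disparity, x from the offset to the reference point) can be sketched as follows. This is a minimal illustrative sketch: the function name, parameter names, and the linear pixel-to-world `scale` factor are assumptions for illustration, not taken from the patent, which would use a calibrated conversion in practice.

```python
def extract_point_3d(px, py, ref_y, center_y, center_x, scale=1.0):
    """Map one detected pixel of the target pattern to (x, y, z).

    px, py   : pixel column/row of a point on the target pattern
    ref_y    : row of the reference pattern in the detection image
    center_y : row of the center line (horizontal line through the center)
    center_x : column of the reference point (here: the center point)
    scale    : illustrative pixel-to-world conversion factor
    """
    # z-axis: height position value, i.e. vertical offset of the point
    # from the center line of the detection image (claim 2)
    z = (center_y - py) * scale
    # y-axis (depth): vertical position value, i.e. how far the target
    # pattern is vertically spaced from the reference pattern (triangulation)
    y = (ref_y - py) * scale
    # x-axis: horizontal position value from the reference point
    x = (px - center_x) * scale
    return (x, y, z)
```

Per claim 3, the reference point could equally be the left or right image edge, which would only change `center_x`; claims 4 and 5 would further correct `y` via a lookup table indexed by the vertical and horizontal position values.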
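Claims 6 through 8 describe inserting virtual pixels among the extracted pattern pixels and then binarizing against a threshold. A minimal one-dimensional sketch of that idea, assuming linear interpolation for the virtual pixels and an 8-bit threshold (both assumptions; the patent does not specify the interpolation method):

```python
def upsample_and_binarize(pattern_pixels, threshold=128):
    """Insert a virtual pixel between each pair of neighboring pattern
    pixels (forming an extra layer, per claims 6-7), then binarize the
    result against a threshold (per claim 8)."""
    # virtual pixels: assumed linear interpolation between neighbors
    virtual = [(a + b) / 2 for a, b in zip(pattern_pixels, pattern_pixels[1:])]
    # interleave original and virtual pixels
    merged = []
    for i, p in enumerate(pattern_pixels):
        merged.append(p)
        if i < len(virtual):
            merged.append(virtual[i])
    # binarized image: 1 where intensity reaches the threshold, else 0
    return [1 if v >= threshold else 0 for v in merged]
```

In the claimed device this would run in two dimensions over the detection image, with the laser pattern then extracted from the binarized result.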
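The disambiguation rule of claim 18 — when a detected line's number is ambiguous between x and (x-1), a line re-detected in the second (shorter-exposure) image takes (x-1) while an absent line takes x — can be sketched per line as below. The function name, row-based matching, and pixel tolerance are illustrative assumptions; the patent only specifies the assignment rule, not how "identically detected" is tested.

```python
def resolve_line_number(row, candidates, second_image_rows, tol=2):
    """Resolve one ambiguous line number using the second detection image.

    row               : row of the line laser in the first detection image
    candidates        : (x, x_minus_1) candidate line numbers
    second_image_rows : rows of line lasers found in the second image
    tol               : assumed pixel tolerance for "identically detected"
    """
    x, x_minus_1 = candidates
    # a line that also appears (at nearly the same row) in the darker
    # second image is assigned (x - 1); otherwise it keeps x (claim 18)
    seen_again = any(abs(row - r) <= tol for r in second_image_rows)
    return x_minus_1 if seen_again else x
```

This would be applied to each ambiguous line after the ascending bottom-up numbering of claim 15 fails because fewer lines were detected than emitted.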
Description
LiDAR Apparatus {Apparatus for LIDAR}

The present application relates to a lidar device using a line laser, and more specifically, to a lidar device capable of generating a three-dimensional coordinate map of a target space by performing image processing on a detection image.

A laser (Light Amplification by Stimulated Emission of Radiation, LASER) is light amplified by stimulated emission and is used as a core technology in many fields, including electronics, optical communication, medicine, and defense. In addition, LiDAR (Light Detection and Ranging) devices, which measure the distance to an object using lasers, have recently been attracting attention as a core technology for autonomous vehicles, mobile robots, and drones. Conventional LiDAR devices use the Time of Flight (TOF) method, which calculates distance from the laser's flight time; while this offers high-precision, high-resolution measurement, it has disadvantages such as high cost and a large overall size.

FIG. 1 is a block diagram showing a lidar device according to one embodiment of the present invention. FIG. 2 is a schematic diagram showing the emission of a line laser into a target space using a lidar device according to one embodiment of the present invention. FIGS. 3 and 4 are schematic diagrams illustrating the generation of three-dimensional coordinates for a single-line line laser using a lidar device according to an embodiment of the present invention. FIGS. 5 and 6 are schematic diagrams illustrating the generation of three-dimensional coordinates for a multi-line line laser using a lidar device according to an embodiment of the present invention. FIGS. 7 and 8 are schematic diagrams showing distortion of the detection image caused by the lens. FIG. 9 is a schematic diagram illustrating the correction of detection image distortion of a lidar device according to one embodiment of the present invention. FIGS. 10 and 11 are schematic diagrams illustrating the generation of distance information using a lidar device according to an embodiment of the present invention. FIGS. 12 and 13 are exemplary diagrams illustrating the generation of distance information using a lidar device according to an embodiment of the present invention. FIG. 14 is a schematic diagram illustrating the generation of virtual pixels of a LiDAR device according to one embodiment of the present invention. FIG. 15 is a schematic diagram illustrating the detection of a laser pattern using virtual pixels of a lidar device according to one embodiment of the present invention. FIG. 16 is a schematic diagram illustrating the generation of distance information using a laser pattern of a lidar device according to one embodiment of the present invention. FIGS. 17 and 18 are schematic diagrams showing detection images according to the distance from an object within the target space. FIG. 19 is a block diagram showing the line laser emission timing and emission holding time of a lidar device according to one embodiment of the present invention. FIG. 20 is a block diagram illustrating the generation of a first detection image and a second detection image of a lidar device according to an embodiment of the present invention. FIG. 21 is an example diagram showing the line number setting for each line laser when the number of detected line lasers is equal to the number of emitted line lasers. FIG. 22 is an example diagram showing the line number setting for each line laser when the number of detected line lasers is different from the number of emitted line lasers.

Hereinafter, preferred embodiments are described in detail with reference to the attached drawings so that those skilled in the art can easily practice the present invention.
However, in describing the preferred embodiments of the present invention in detail, if it is determined that a detailed description of related known functions or configurations may unnecessarily obscure the essence of the present invention, such detailed description is omitted. The same reference numerals are used throughout the drawings for parts having similar functions and operations.

Throughout the specification, when a part is described as being 'connected' to another part, this includes not only cases where they are 'directly connected,' but also cases where they are 'indirectly connected' with other elements in between. Furthermore, stating that something 'includes' a component means that, unless specifically stated otherwise, other components are not excluded and additional components may be included. Also, terms such as 'part' or 'module' described in the specification refer to a unit that processes at least one function or operation, which may be implemented in hardware, in software, or as a combination of hardware and software.

FIG. 1 is a block diagram showing a lidar device according to one embodiment of the present invention, and FIG. 2 is a schematic diagram showing the e