US-12626380-B2 - Pattern-based depth mapping with extended reference image
Abstract
A method for depth mapping includes providing a depth mapping device comprising a projector, which is configured to project a pattern of optical radiation onto a target area over a first field of view about a projection axis, and a camera, which is configured to capture images of the target area within a second field of view, narrower than the first field of view, about a camera axis, which is offset transversely relative to the projection axis. The projector projects the pattern onto first and second planes at first and second distances from the camera, and the camera captures first and second reference images containing first and second parts of the pattern on the first and second planes, respectively. The first and second reference images are combined to produce an extended reference image including both the first and second parts of the pattern.
Inventors
- Shay Yosub
- Noam Badt
- Boris Morgenstein
- Yuval Vardi
- David Pawlowski
- Assaf Avraham
- Pieter Spinnewyn
- Tom Levy
- Yohai Zmora
Assignees
- APPLE INC.
Dates
- Publication Date: 20260512
- Application Date: 20220914
Claims (18)
- 1 . A method for depth mapping, comprising: providing a depth mapping device comprising a projector, which is configured to project a pattern of optical radiation onto a target area over a first field of view about a projection axis, and a camera, which is configured to capture images of the target area within a second field of view, narrower than the first field of view, about a camera axis, which is offset transversely relative to the projection axis; operating the projector to project the pattern onto a first plane at a first distance from the camera, and using the camera, capturing a first reference image containing a first part of the pattern on the first plane; operating the projector to project the pattern onto a second plane at a second distance from the camera, different from the first distance, and using the camera, capturing a second reference image containing a second part of the pattern on the second plane, wherein the first and second parts of the pattern both comprise a same central part of the pattern, which appears in both the first and second reference images, and wherein the second reference image comprises a peripheral part of the pattern, which is disjoint from and adjoins the central part of the pattern; combining the first and second reference images to produce an extended reference image comprising both the first and second parts of the pattern; and applying the extended reference image in processing a further image captured by the camera of an object within the target area to generate a depth map of the object.
- 2 . The method according to claim 1 , wherein the pattern comprises multiple spots extending across the first field of view.
- 3 . The method according to claim 1 , wherein the projection axis is angled toward the camera axis so as to increase an overlap between the first and second fields of view.
- 4 . The method according to claim 1 , and comprising operating the projector to project the pattern onto a third plane at a third distance from the camera, different from the first and second distances, and using the camera, capturing a third reference image containing a third part of the pattern on the third plane, wherein combining the first and second reference images comprises combining the first, second and third reference images to produce the extended reference image.
- 5 . The method according to claim 1 , wherein combining the first and second reference images comprises: computing a transformation over the central part of the pattern, to match the second reference image to the first reference image; applying the computed transformation to the peripheral part of the pattern in the second reference image to generate a transformed reference image; and appending the transformed reference image to the first reference image to produce the extended reference image.
- 6 . The method according to claim 5 , wherein computing the transformation comprises calculating a warping function over the central part of the pattern in the second reference image, and wherein applying the computed transformation comprises extrapolating the warping function over the peripheral part of the pattern.
- 7 . The method according to claim 6 , wherein calculating the warping function comprises compensating for a distortion of the pattern in the first and second reference images.
- 8 . The method according to claim 5 , wherein computing the transformation comprises finding local disparities between the first and second reference images over the central part of the pattern, and calculating the transformation so as to compensate for the local disparities.
- 9 . A depth mapping device, comprising: a projector, which is configured to project a pattern of optical radiation onto a target area over a first field of view about a projection axis; a camera, which is configured to capture images of the target area within a second field of view, narrower than the first field of view, about a camera axis, which is offset transversely relative to the projection axis; and a processor, which is configured to operate the projector to project the pattern onto a first plane at a first distance from the camera, to capture, using the camera, a first reference image containing a first part of the pattern on the first plane, to operate the projector to project the pattern onto a second plane at a second distance from the camera, different from the first distance, to capture, using the camera, a second reference image containing a second part of the pattern on the second plane, to combine the first and second reference images to produce an extended reference image comprising both the first and second parts of the pattern, and to apply the extended reference image in processing a further image captured by the camera of an object within the target area to generate a depth map of the object, wherein the first and second parts of the pattern both comprise a same central part of the pattern, which appears in both the first and second reference images, and wherein the second reference image comprises a peripheral part of the pattern, which is disjoint from and adjoins the central part of the pattern.
- 10 . The device according to claim 9 , wherein the pattern comprises multiple spots extending across the first field of view.
- 11 . The device according to claim 9 , wherein the projection axis is angled toward the camera axis so as to increase an overlap between the first and second fields of view.
- 12 . The device according to claim 9 , wherein the projector is configured to project the pattern onto a third plane at a third distance from the camera, different from the first and second distances, whereby the camera captures a third reference image containing a third part of the pattern on the third plane, and wherein the processor is configured to combine the first, second and third reference images to produce the extended reference image.
- 13 . The device according to claim 9 , wherein the processor is configured to compute a transformation over the central part of the pattern, to match the second reference image to the first reference image, to apply the computed transformation to the peripheral part of the pattern in the second reference image to generate a transformed reference image, and to append the transformed reference image to the first reference image to produce the extended reference image.
- 14 . The device according to claim 13 , wherein the transformation comprises a warping function calculated by the processor over the central part of the pattern in the second reference image, and wherein the processor is configured to extrapolate the warping function over the peripheral part of the pattern.
- 15 . The device according to claim 14 , wherein the warping function compensates for a distortion of the pattern in the first and second reference images.
- 16 . The device according to claim 13 , wherein the processor is configured to find local disparities between the first and second reference images over the central part of the pattern, and to calculate the transformation so as to compensate for the local disparities.
- 17 . A computer software product for use with a depth mapping device including a projector, which is configured to project a pattern of optical radiation onto a target area over a first field of view about a projection axis, and a camera, which is configured to capture images of the target area within a second field of view, narrower than the first field of view, about a camera axis, which is offset transversely relative to the projection axis, the product comprising a tangible, non-transitory computer-readable medium in which program instructions are stored, which instructions, when read by a processor, cause the processor to operate the projector to project the pattern onto a first plane at a first distance from the camera, to capture, using the camera, a first reference image containing a first part of the pattern on the first plane, to operate the projector to project the pattern onto a second plane at a second distance from the camera, different from the first distance, to capture, using the camera, a second reference image containing a second part of the pattern on the second plane, to combine the first and second reference images to produce an extended reference image comprising both the first and second parts of the pattern, and to apply the extended reference image in processing a further image captured by the camera of an object within the target area to generate a depth map of the object, wherein the first and second parts of the pattern both comprise a same central part of the pattern, which appears in both the first and second reference images, and wherein the second reference image comprises a peripheral part of the pattern, which is disjoint from and adjoins the central part of the pattern.
- 18 . The product according to claim 17 , wherein the instructions cause the processor to compute a transformation over the central part of the pattern, to match the second reference image to the first reference image, to apply the computed transformation to the peripheral part of the pattern in the second reference image to generate a transformed reference image, and to append the transformed reference image to the first reference image to produce the extended reference image.
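The transformation of claims 5 through 8, fitted over the central part of the pattern and then extrapolated to the peripheral part, could be modeled for illustration by a global affine warp. This is a hedged sketch under assumptions not stated in the patent: the claims specify only a "warping function" computed from local disparities, not an affine model, and the function names here are invented.

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    # Least-squares affine map dst ~ src @ A + t, fitted on pattern
    # features (e.g. spot centroids) in the central part of the pattern,
    # which appears in both reference images.
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    X = np.hstack([src, np.ones((len(src), 1))])  # n x 3 design matrix
    params, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return params  # 3 x 2: linear part stacked over translation

def apply_affine(params, pts):
    # Because an affine map is global, evaluating it on peripheral
    # points amounts to extrapolating the fitted warp beyond the
    # central overlap region, in the spirit of claims 6 and 14.
    pts = np.asarray(pts, float)
    X = np.hstack([pts, np.ones((len(pts), 1))])
    return X @ params
```

In practice a low-order polynomial or spline warp would likewise extrapolate smoothly over the peripheral part while also absorbing lens distortion (claim 7); the affine case is chosen here only because it is the simplest fit that can be extrapolated.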
Description
FIELD OF THE INVENTION

The present invention relates generally to methods and systems for three-dimensional (3D) mapping, and specifically to pattern-based depth mapping.

BACKGROUND OF THE INVENTION

Various methods are known in the art for optical 3D mapping, i.e., generating a 3D profile of the surface of an object by processing an optical image of the object. This sort of 3D profile is also referred to as a depth map or depth image, and 3D mapping is also referred to as depth mapping.

Some methods are based on projecting a pattern of structured light onto an object or scene that is to be mapped, for example a pattern of spots. A camera captures an image of the projected pattern. A processor finds local disparities between the pattern in the captured image and a reference pattern captured at a known distance from the camera. Based on the local disparities, the processor computes a depth map of the object or scene.

The terms “light” and “optical radiation” are used interchangeably in the present description and in the claims to refer to electromagnetic radiation in any of the visible, infrared, and ultraviolet ranges of the spectrum.

SUMMARY

Embodiments of the present invention that are described hereinbelow provide improved methods and systems for pattern-based depth mapping.

There is therefore provided, in accordance with an embodiment of the invention, a method for depth mapping, which includes providing a depth mapping device including a projector, which is configured to project a pattern of optical radiation onto a target area over a first field of view about a projection axis, and a camera, which is configured to capture images of the target area within a second field of view, narrower than the first field of view, about a camera axis, which is offset transversely relative to the projection axis.
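The disparity-to-depth relation underlying this kind of pattern-based mapping can be sketched as follows. The parameter names, numeric values, and the inverse-depth sign convention are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# Hypothetical device parameters for illustration only.
FOCAL_PX = 600.0    # camera focal length, in pixels
BASELINE_M = 0.05   # transverse projector-camera offset, in meters
Z_REF_M = 1.0       # distance at which the reference image was captured

def depth_from_disparity(disparity_px):
    # Triangulation for a pattern projector with a transversely offset
    # camera: disparity is proportional to the difference in inverse
    # depth between the object and the reference plane,
    #     d = f * B * (1/Z - 1/Z_ref)
    # so inverting gives 1/Z = 1/Z_ref + d / (f * B). Under this sign
    # convention, positive disparity means closer than the reference.
    inv_z = 1.0 / Z_REF_M + np.asarray(disparity_px, float) / (FOCAL_PX * BASELINE_M)
    return 1.0 / inv_z
```

With these assumed parameters, zero disparity maps back to the reference distance, and a shift of f·B·(1/0.5 − 1/1.0) = 30 pixels maps to half the reference distance.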
The projector is operated to project the pattern onto a first plane at a first distance from the camera, and using the camera, a first reference image is captured containing a first part of the pattern on the first plane. The projector is operated to project the pattern onto a second plane at a second distance from the camera, different from the first distance, and using the camera, a second reference image is captured containing a second part of the pattern on the second plane. The first and second reference images are combined to produce an extended reference image including both the first and second parts of the pattern. The extended reference image is applied in processing a further image captured by the camera of an object within the target area to generate a depth map of the object.

In a disclosed embodiment, the pattern includes multiple spots extending across the first field of view. Additionally or alternatively, the projection axis is angled toward the camera axis so as to increase an overlap between the first and second fields of view. Further additionally or alternatively, the method includes operating the projector to project the pattern onto a third plane at a third distance from the camera, different from the first and second distances, and using the camera, capturing a third reference image containing a third part of the pattern on the third plane, wherein combining the first and second reference images includes combining the first, second and third reference images to produce the extended reference image.

In some embodiments, the first and second parts of the pattern both include a central part of the pattern, which appears in both the first and second reference images, and the second reference image includes a peripheral part of the pattern, which is disjoint from and adjoins the central part of the pattern.
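The combining step above can be sketched as a mask-based fusion of the two reference captures. This is a minimal sketch under assumptions: the function and array names are invented, and a real implementation would first warp the second image into the first image's geometry (as described for the transformation embodiments) before filling in the peripheral part.

```python
import numpy as np

def build_extended_reference(ref_near, ref_far, central_mask):
    # ref_near: reference image captured with the plane at the first
    #           (nearer) distance; defines the geometry of the output.
    # ref_far:  reference image captured at the second distance, in
    #           which the camera sees more of the wide projected
    #           pattern, including its peripheral part.
    # central_mask: boolean array marking the central part of the
    #           pattern, which appears in both reference images.
    extended = np.array(ref_near, copy=True)
    peripheral = ~np.asarray(central_mask, bool)
    # Fill in pixels that only the far-plane capture recorded.
    extended[peripheral] = np.asarray(ref_far)[peripheral]
    return extended
```

A third reference image at a third distance (as in claims 4 and 12) would be folded in the same way, each capture contributing the part of the pattern it alone recorded.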
In some of these embodiments, combining the first and second reference images includes computing a transformation over the central part of the pattern, to match the second reference image to the first reference image, applying the computed transformation to the peripheral part of the pattern in the second reference image to generate a transformed reference image, and appending the transformed reference image to the first reference image to produce the extended reference image.

In some embodiments, computing the transformation includes calculating a warping function over the central part of the pattern in the second reference image, and applying the computed transformation includes extrapolating the warping function over the peripheral part of the pattern. In a disclosed embodiment, calculating the warping function includes compensating for a distortion of the pattern in the first and second reference images. Additionally or alternatively, computing the transformation includes finding local disparities between the first and second reference images over the central part of the pattern, and calculating the transformation so as to compensate for the local disparities.

There is also provided, in accordance with an embodiment of the invention, a depth mapping device, including a projector, which is c