EP-4158381-B1 - TIME-OF-FLIGHT PIXEL WITH VERTICAL PHOTOGATES
Inventors
- OH, MINSEOK
Dates
- Publication Date
- 20260513
- Application Date
- 20210322
Claims (15)
- A time-of-flight, ToF, camera (100) comprising: an image sensor comprising a plurality of addressable pixels (106) configured for backside illumination, each addressable pixel (106) comprising a first vertical photogate (202a, 308a) and a second vertical photogate (202b, 308b) spaced from the first vertical photogate (202a, 308a), wherein a vertical photogate extends lengthwise along a thickness of an epitaxial layer of an addressable pixel (106); a processor; and a storage device storing instructions executable by the processor to, for each addressable pixel (106) during an integration period, apply a first relative bias between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b) to collect charge at a first pixel tap (201a), during the integration period, apply a second relative bias between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b) to collect charge at a second pixel tap (201b), and determine a distance value for the addressable pixel (106) based at least upon the charge collected at the first pixel tap (201a) and charge collected at the second pixel tap (201b).
- The ToF camera (100) of claim 1, wherein the first pixel tap (201a) and the second pixel tap (201b) each comprises a storage gate (204a, 312a, 204b, 312b).
- The ToF camera (100) of claim 1, wherein the first pixel tap (201a) and the second pixel tap (201b) each comprises a storage diode.
- The ToF camera (100) of claim 1, wherein the vertical photogates (202a, 308a, 202b, 308b) have a length that is between 50% and 95% of a thickness of a photoelectron generation region (304) of the addressable pixel (106) arranged between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b).
- The ToF camera (100) of claim 1, wherein the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b) are separated by a distance less than a thickness of a photoelectron generation region (304) of the addressable pixel (106) arranged between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b).
- The ToF camera (100) of claim 1, wherein each addressable pixel (106) comprises a region of germanium.
- The ToF camera (100) of claim 1, wherein applying a first relative bias comprises applying a bias to the first vertical photogate (202a, 308a) while maintaining the second vertical photogate (202b, 308b) at ground.
- The ToF camera (100) of claim 1, wherein applying a first relative bias comprises applying a positive bias to the first vertical photogate (202a, 308a) and a negative bias to the second vertical photogate (202b, 308b).
- The ToF camera (100) of claim 1, wherein the addressable pixels (106) comprise a pitch of 1 µm to 3 µm.
- The ToF camera (100) of claim 1, wherein each addressable pixel (106) comprises a region of epitaxial silicon.
- A method (800) for determining a distance value for an addressable pixel (106) of an image sensor of a ToF camera (100), the addressable pixel (106) comprising a first vertical photogate (202a, 308a) and a second vertical photogate (202b, 308b) spaced from the first vertical photogate (202a, 308a), wherein a vertical photogate extends lengthwise along a thickness of an epitaxial layer of an addressable pixel (106), the method comprising: illuminating (802) a scene with amplitude modulated light; during an integration period, applying (808) a first relative bias between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b) to collect charge at a first pixel tap (201a); during the integration period, applying (818) a second relative bias between the first vertical photogate (202a, 308a) and the second vertical photogate (202b, 308b) to collect charge at a second pixel tap (201b); and determining a distance value for the addressable pixel (106) based at least upon the charge collected at the first pixel tap (201a) and the charge collected at the second pixel tap (201b).
- The method (800) of claim 11, wherein collecting charge at the first pixel tap (201a) and collecting charge at the second pixel tap (201b) comprises collecting charge from a germanium region.
- The method (800) of claim 11, wherein collecting charge at the first pixel tap (201a) and collecting charge at the second pixel tap (201b) comprises collecting charge from an epitaxial silicon photoelectron generation region (304).
- The method (800) of claim 11, wherein collecting charge at a first pixel tap (201a) comprises collecting charge at a first storage gate (204a, 312a) and wherein collecting charge at a second pixel tap (201b) comprises collecting charge at a second storage gate (204b, 312b).
- The method (800) of claim 11, wherein collecting charge at a first pixel tap (201a) comprises collecting charge at a first storage diode and wherein collecting charge at a second pixel tap (201b) comprises collecting charge at a second storage diode.
Description
BACKGROUND
Many computing applications use controllers, remotes, keyboards, mice, or other input devices to allow a user to interact with the application. More recently, some computing applications such as computer games and multimedia applications increasingly employ depth cameras to capture motion and body movement of a user, enabling the user to interact with the application via natural gestures. Some such depth cameras are time-of-flight (ToF) cameras, which determine depth by measuring the round-trip travel time of light between the camera and an object. For example, a temporally modulated light signal may illuminate the object while the ToF camera captures the reflected, phase-shifted signal from which depth is calculated.

US 2019/339392 A1 discloses an image sensor including a photodiode, a first doped region, a second doped region, a first storage node, a second storage node, a first vertical transfer gate, and a second vertical transfer gate. The photodiode is disposed in a semiconductor material to convert image light to an electric signal. The first doped region and the second doped region are disposed in the semiconductor material between a first side of the semiconductor material and the photodiode. The first doped region is positioned between the first storage node and the second storage node, while the second doped region is positioned between the second storage node and the first doped region. The vertical transfer gates are coupled to the photodiode to transfer the electric signal from the photodiode to a respective one of the storage nodes in response to a signal.

US 2019/214428 A1 provides an imaging sensor array comprising an epitaxial germanium layer disposed on a silicon layer, and an electrically biased photoelectron collector arranged on the silicon layer, on a side opposite the germanium layer.
US 2009/244514 A1 describes a distance measuring sensor that may include: a photoelectric conversion region; first and second charge storage regions; first and second trenches; and/or first and second vertical photogates. The photoelectric conversion region may be in a substrate and/or may be doped with a first impurity in order to generate charges in response to received light. The first and second charge storage regions may be in the substrate and/or may be doped with a second impurity in order to collect charges. The first and second trenches may be formed to have depths in the substrate that correspond to the first and second charge storage regions, respectively. The first and second vertical photogates may be respectively in the first and second trenches. A three-dimensional color image sensor may include a plurality of unit pixels. Each unit pixel may include a plurality of color pixels and the distance measuring sensor.

SUMMARY
The invention provides a time-of-flight camera according to claim 1 and a method according to claim 11. Advantageous embodiments are provided in the dependent claims.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an exploded, schematic view showing aspects of an example time-of-flight (ToF) camera.
FIG. 2 shows an electrical schematic of an example time-of-flight pixel comprising vertical photogates.
FIG. 3 shows a sectional view of an example ToF pixel comprising a storage gate.
FIG. 4 shows a sectional view of an example ToF pixel comprising a storage diode.
FIG. 5 shows a sectional view of an example ToF pixel comprising a germanium region for far infrared imaging.
FIG. 6 shows an example timing diagram for operating a ToF camera pixel.
FIG. 7 shows an example ToF pixel configured for front-side illumination.
FIG. 8 is a flow diagram depicting an example method for determining a distance value for a time-of-flight pixel.
FIG. 9 is a block diagram of an example computing system.
DETAILED DESCRIPTION
A time-of-flight (ToF) camera may determine, for each addressable pixel of an image sensor of the camera, a depth of a subject (a distance from the subject to the pixel) based on the phase of a received light signal that is temporally modulated by a time-of-flight illuminator. The depth values determined for each addressable pixel of the camera image sensor are used to create a depth image, which may be used, for example, to identify motion (e.g. gestures) of a subject. The received light signal generates photoelectrons in a region of the pixel, thereby producing an electric charge signal. A ToF sensor may modulate the pixel response in synchronization with a modulated illumination source to direct the charge to different taps of the pixel during an integration period. A global shutter mechanism may be used to modulate the entire pixel array simultaneously. Data is sampled at a plurality of different phases of the temporally modulated light signal, and a depth value for a pixel is determined using the signals acquired for each pixel tap at each illumination phase that is sampled. Current ToF pixels may employ planar photogates to collect charge d
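The phase-to-depth determination described above can be illustrated with a short sketch. The patent does not prescribe a particular demodulation scheme; the following assumes a conventional four-phase continuous-wave approach, where each input is the differential charge (first tap minus second tap) acquired at illumination phase offsets of 0°, 90°, 180°, and 270°. The function name and sampling scheme are illustrative assumptions, not taken from the claims.

```python
import math

C = 299_792_458.0  # speed of light, m/s


def tof_depth(q0, q90, q180, q270, mod_freq_hz):
    """Estimate distance from differential tap charges sampled at four
    illumination phase offsets (0, 90, 180, 270 degrees).

    Each q* is the charge collected at the first pixel tap minus the
    charge collected at the second pixel tap for that phase offset
    (hypothetical quantities for this sketch).
    """
    # Phase shift of the reflected, amplitude-modulated light
    phase = math.atan2(q270 - q90, q0 - q180)
    if phase < 0:
        phase += 2 * math.pi  # wrap into [0, 2*pi)
    # Round-trip phase maps to distance within one unambiguous range
    return C * phase / (4 * math.pi * mod_freq_hz)
```

For a 100 MHz modulation frequency, the unambiguous range is c / (2·f) ≈ 1.5 m; objects beyond it alias back into this range, which is why practical ToF systems often sample at more than one modulation frequency.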