EP-4535668-B1 - DELAY CALIBRATION CIRCUIT AND METHOD, ANALOG-TO-DIGITAL CONVERTER, RADAR SENSOR AND DEVICE
Inventors
- ZHANG, Xinlong
- ZHOU, Wenting
Dates
- Publication Date: 2026-05-13
- Application Date: 2023-04-07
Claims (15)
- A delay calibration circuit (550) arranged within a successive approximation register, SAR, analog-to-digital converter, ADC, wherein the SAR ADC is configured to convert an analog signal sampled in a present sampling cycle into a digital signal of N bits, N > 1; wherein the delay calibration circuit comprises: a monitoring circuit unit (651, 10), configured to monitor a completion time of the analog signal being converted in the present sampling cycle and to output a detection signal in response to a last bit of the N bits having been generated in the present sampling cycle; and an adjustable delay calibration circuit unit (652) coupled with the monitoring circuit unit (651, 10) and configured to adjust at least one delay duration according to timing information between the detection signal and a sampling clock signal of a next sampling cycle, wherein each delay duration of the at least one delay duration corresponds to a respective length of time spent by the SAR ADC to generate a respective bit of the N bits in the next sampling cycle; wherein the adjustable delay calibration circuit unit (652) comprises: a plurality of adjustable delay modules (30), wherein each delay duration for the respective bit is adjusted using at least one respective adjustable delay module (30) of the plurality of adjustable delay modules (30); and a digital calibration circuit (20) coupled with each adjustable delay module (30) of the plurality of adjustable delay modules (30) and the monitoring circuit unit (651, 10), and configured to receive the detection signal and the sampling clock signal, to generate an M-bit digital control code for the respective bit of the N bits according to the timing information, and to output the M-bit digital control code to the at least one respective adjustable delay module (30), and wherein M is greater than or equal to 1 and is equal to a number of the at least one respective adjustable delay module (30).
- The delay calibration circuit of claim 1, wherein the timing information is a timing order between a triggering edge of a delay signal of a ready signal of the last bit and a triggering edge of the sampling clock signal corresponding to the next sampling cycle.
- The delay calibration circuit of claim 1, wherein each adjustable delay module (30) comprises a delay device (31) and a control switch (32) coupled with the delay device (31), and wherein the control switch (32) is configured to be turned on or off under the control of a codeword signal in a corresponding digital control code.
- The delay calibration circuit of claim 1, wherein the plurality of adjustable delay modules (30) provide unit delays, and wherein at least some of the unit delays are different from each other.
- The delay calibration circuit of claim 1, wherein the plurality of adjustable delay modules (30) are grouped into a plurality of groups of adjustable delay modules (30) according to the number of bits of the N bits, and at least one respective adjustable delay module (30) in each group of adjustable delay modules (30) is configured to receive a corresponding M-bit digital control code.
- The delay calibration circuit of claim 5, wherein the plurality of groups of adjustable delay modules (30) each include the same number of adjustable delay modules (30) as one another or different numbers of adjustable delay modules (30) from one another.
- The delay calibration circuit of claim 1, wherein the adjustable delay calibration circuit unit is configured to extend the at least one delay duration in response to a triggering edge of a delay signal of a ready signal of the last bit being prior to a triggering edge of the sampling clock signal of the next sampling cycle.
- The delay calibration circuit of claim 1, wherein the adjustable delay calibration circuit unit (652) is configured to shorten each delay duration of the at least one delay duration in response to a rising edge of the detection signal being after a rising edge of the sampling clock signal of the next sampling cycle, and to extend each delay duration of the at least one delay duration in response to a triggering edge of the detection signal being prior to a triggering edge of the sampling clock signal of the next sampling cycle.
- The delay calibration circuit of claim 1, wherein the plurality of adjustable delay modules (30) provide unit delays, and wherein at least some of the unit delays are equal.
- An analog-to-digital converter (820), ADC, configured to convert an analog signal received in a present sampling cycle into a digital signal, wherein the ADC (820) comprises: the delay calibration circuit (550) of any of claims 1 to 9, configured to adjust a corresponding duration spent for generating each bit of a digital signal in a next sampling cycle.
- The ADC of claim 10, further comprising: a comparator (520) configured to receive the analog signal and a reference signal, and to compare potential values of the analog signal and the reference signal in the present sampling cycle to output a comparison result; a SAR logic circuit (530) coupled with an output terminal of the comparator (520), and configured to perform logic processing on the comparison result within a corresponding delay duration set by the delay calibration circuit (550) to output each bit of the digital signal corresponding to the analog signal; and a feedback circuit (540) coupled with an output terminal of the SAR logic circuit (530) and a reference signal input terminal of the comparator (520), and configured to convert each bit of the digital signal into the reference signal.
- A signal receiving device (700, 810), comprising: a mixer (701) configured to mix a radio frequency receiving signal with a radio frequency reference signal to output an intermediate frequency signal; and an analog-to-digital converter (702), ADC, configured to convert the intermediate frequency signal into a digital signal and to output the digital signal, wherein the ADC (702) comprises the delay calibration circuit (550) of any of claims 1 to 9 for generating a digital signal in a corresponding sampling cycle based on each delay duration adjusted by the delay calibration circuit (550) within at least one sampling cycle.
- A radar sensor (800, 900), comprising: an antenna device (930) comprising a transmitting antenna and a receiving antenna; a signal transmitting device (830, 910) configured to process a reference signal generated by a signal source into a radio frequency transmitting signal and to transmit the radio frequency transmitting signal through the transmitting antenna; and the signal receiving device (700, 810, 920) of claim 12, wherein the radio frequency receiving signal received by the signal receiving device (700, 810, 920) is obtained after an electromagnetic wave emitted by the transmitting antenna is reflected by an object and converted by the receiving antenna.
- An electronic device, comprising: the radar sensor of claim 13; and a signal processing device (940) coupled to the radar sensor, and configured to perform signal processing on a digital signal output from the radar sensor, to detect targets in a surrounding environment.
- A delay calibration method, comprising: monitoring (S10) a completion time of an analog signal sampled in a present sampling cycle and being converted into a digital signal of N bits, and outputting a detection signal in response to a last bit of the N bits having been generated in the present sampling cycle, and wherein N > 1; and adjusting (S20) at least one delay duration according to timing information between the detection signal and a sampling clock signal of a next sampling cycle, wherein each delay duration of the at least one delay duration corresponds to a respective length of time spent by a successive approximation register, SAR, analog-to-digital converter, ADC, to generate a respective bit of the N bits in the next sampling cycle; wherein adjusting the at least one delay duration according to the timing information between the detection signal and the sampling clock signal of the next sampling cycle comprises: generating (S21) an M-bit digital control code for the respective bit of the N bits in the next sampling cycle according to a detection result of the timing information; outputting the M-bit digital control code to at least one corresponding adjustable delay module (30) of a plurality of adjustable delay modules (30), wherein the at least one corresponding adjustable delay module (30) is configured to adjust a delay duration of the at least one delay duration for the respective bit of the N bits; and adjusting (S22) the delay duration of the at least one delay duration using the at least one corresponding adjustable delay module and the M-bit digital control code, and wherein M is greater than or equal to 1 and is equal to a number of the at least one corresponding adjustable delay module.
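For readers outside claim language, steps S10 and S20 to S22 amount to a bang-bang calibration loop: the circuit observes whether the last bit's detection signal fires before or after the next sampling clock edge, then nudges an M-bit control code that sets the per-bit delay. The following is an illustrative software sketch under assumed unit-delay and clock-period values; the function names, the single-step code update, and the thermometer-style delay model are assumptions for illustration, not part of the claimed circuit.

```python
# Illustrative model of the claimed delay calibration method (S10, S20-S22).
# Assumption: the M-bit control code acts as a count of enabled unit delays;
# all timing values (picoseconds) are made up for the sketch.

def calibrate_delay(code, detection_time, next_clock_edge, m_bits):
    """Adjust the M-bit control code from the timing order (S21) between the
    last-bit detection signal (S10) and the next sampling clock edge."""
    if detection_time < next_clock_edge:
        # Conversion finished early: extend the per-bit delay durations.
        code = min(code + 1, (1 << m_bits) - 1)
    else:
        # Conversion finished late (or exactly on time): shorten them.
        code = max(code - 1, 0)
    return code

def bit_delay(code, unit_delay_ps=5.0, base_delay_ps=50.0):
    """Delay duration for one bit (S22): fixed part plus code-enabled units."""
    return base_delay_ps + code * unit_delay_ps

# Toy loop: the per-bit delay grows until the 8-bit conversion just fills
# the 1000 ps sampling cycle, then dithers around that operating point.
code, n_bits, period = 0, 8, 1000.0
for _ in range(40):
    conversion_time = sum(bit_delay(code) for _ in range(n_bits))
    code = calibrate_delay(code, conversion_time, period, m_bits=4)
```

With these assumed values the loop settles so that the total conversion time hugs the sampling period from below, which mirrors the stated goal of maximizing the available conversion time in each cycle.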
Description
TECHNICAL FIELD

The various embodiments described in this document relate in general to the technical field of integrated circuits, and more specifically to a delay calibration circuit and method, an analog-to-digital converter, a radar sensor, and a device.

BACKGROUND

A successive approximation register (SAR) analog-to-digital converter (ADC) is disposed inside a semiconductor chip and configured to convert an input analog voltage value into a digital code. SAR ADCs are widely used in low-power electronic devices because they offer a relatively short sampling delay time, good conversion rate and accuracy, a simple structure, low power consumption, and easy compatibility with digital circuits. To achieve an ultra-high-speed ADC while maintaining relatively low energy consumption, a relatively high sampling rate needs to be achieved in a single channel, to reduce the number of interleaved channels and the wiring complexity.

US20220149858A1 discloses an analog-to-digital converter (ADC) including a conversion circuit with multiple bit-conversion circuits. The bit-conversion circuits asynchronously and sequentially perform the SAR analog-to-digital conversion to determine different bits in the quantized representation of the input signal.

US20160094239A1 discloses a semiconductor device capable of accurately controlling the cycle of an internal clock signal. By using a signal that is output from a sequence register of an asynchronous successive-approximation-type ADC when N comparisons are completed, this semiconductor device detects whether or not the signal and its delay signal are output when the period transitions from a comparison period to a sampling period, and generates, on the basis of the detection result, a delay control signal for controlling the cycle of the internal clock signal by controlling the delay times of the delay circuits.
US11271577B1 discloses a successive-approximation-register analog-to-digital converter circuit, i.e., an ADC circuit, which includes an array of bit capacitors; a comparator electrically connected to the bit capacitors; a NOR gate electrically connected to the comparator; an AND gate; a delay control circuit; and a SAR control circuit.

SUMMARY

The invention is set out in the appended set of claims. In view of the above, it is necessary to provide a delay calibration circuit and method, an analog-to-digital converter, a radar sensor, and a device, to solve the problems in the background technology, to monitor the timing relationship between the signals in the chip in real time to extract PVT information, to dynamically adjust the conversion timing of the asynchronous SAR ADC, to maximize the available conversion time in each cycle of the asynchronous SAR ADC, and to improve the robustness of the asynchronous SAR ADC without affecting its normal operation.

To solve the above technical problems, according to a first aspect of embodiments of the present disclosure, a delay calibration circuit is provided and is arranged within a SAR analog-to-digital converter (ADC), where the SAR ADC is configured to convert an analog signal sampled in a present sampling cycle into a digital signal of N bits, N > 1; where the delay calibration circuit includes: a monitoring circuit unit, configured to monitor a completion time of a last bit of the N bits converted in the present sampling cycle and to output a detection signal; and an adjustable delay calibration circuit unit coupled with the monitoring circuit unit and configured to adjust at least one delay duration according to timing information between the detection signal and a sampling clock signal of a next sampling cycle, where each delay duration of the at least one delay duration is a length of time spent by the SAR ADC to generate a corresponding bit in the next sampling cycle.
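The first aspect's adjustable delay mechanism can be pictured as M delay devices, each gated by a control switch driven by one codeword bit of the M-bit digital control code (cf. claims 1 and 3). The sketch below is an illustrative software model of one such group of adjustable delay modules; the function name and the unit-delay values are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative model of one group of adjustable delay modules (30): each
# module pairs a delay device (31) with a control switch (32), and codeword
# bit i of the M-bit digital control code closes switch i, inserting that
# module's unit delay into the signal path. Unit-delay values are assumed.

def group_delay(control_code, unit_delays_ps):
    """Total delay of one group: sum of unit delays whose switch bit is 1."""
    total = 0.0
    for i, unit in enumerate(unit_delays_ps):
        if (control_code >> i) & 1:  # codeword bit i closes switch i
            total += unit
        # otherwise the module is bypassed and contributes no delay
    return total

# Binary-weighted units (at least some unit delays differ, as in claim 4):
units = [5.0, 10.0, 20.0, 40.0]   # M = 4 modules, values in picoseconds
delay = group_delay(0b1011, units)  # enables the 5, 10 and 40 ps units
```

Equal unit delays (as in claim 9) make the code thermometer-like, trading resolution for monotonicity; binary-weighted units cover a wider range with the same M.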
According to a second aspect of embodiments of the disclosure, a signal receiving device is provided and includes: a mixer configured to mix a radio frequency receiving signal with a radio frequency reference signal to output an intermediate frequency signal; and an analog-to-digital converter (ADC) configured to convert the intermediate frequency signal into a digital signal and to output the digital signal, where the ADC includes the delay calibration circuit described in the first aspect for generating a digital signal in a corresponding sampling cycle based on each delay duration adjusted by the delay calibration circuit within at least one sampling cycle.

According to a third aspect of embodiments of the disclosure, a radar sensor is provided and includes: an antenna device including a transmitting antenna and a receiving antenna; a signal transmitting device configured to process a reference signal generated by a signal source into a radio frequency transmitting signal and to transmit the radio frequency transmitting signal through the transmitting antenna; and the signal receiving device described i