We present a field-programmable gate array (FPGA) implementation of a time-to-digital converter (TDC) on a low-cost, low-area Spartan-6 device. The converter is based on a tapped delay line architecture. Several implementation details are discussed, with particular focus on critical blocks such as the input stage and the thermometer-to-binary decoding techniques. We implemented a tap filtering technique that improves the differential nonlinearity (DNL) of the single delay line while preserving a fine LSB of 25.57 ps, with a single-shot precision (SSP) between 0.69 and 1.46 LSB. Measured DNL and integral nonlinearity (INL) lie in the ranges -0.90 to +1.23 LSB and -0.43 to +2.96 LSB, respectively. We then implemented an interpolating TDC to overcome the limitations of a single delay line in terms of linearity and measurement range. The interpolating TDC uses the sliding scale technique, in which the time interval to be measured is asynchronous with respect to the FPGA clock, achieving DNL and INL in the ranges -0.072 to +0.070 LSB and -0.755 to +0.872 LSB; the SSP lies between 1.096 and 2.815. Moreover, we present a novel comparison between the DNLs obtained with two different methods: a statistical code density test and a finely controlled delay source. Finally, we present the results of a Monte Carlo simulation used to investigate the effects of nonlinear signal propagation through the delay line.
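The statistical code density test mentioned above can be sketched as follows: with hit times uniformly distributed over the measurement range, each TDC bin's hit count is proportional to its true width, so the deviation of each count from the mean gives the per-bin DNL in LSB, and the running sum of the DNL gives the INL. The snippet below is a minimal illustration of this principle, not the paper's implementation; the 64-tap line, the 20% width spread, and the hit count are hypothetical values chosen only to emulate delay-line nonuniformity around the reported 25.57 ps LSB.

```python
import numpy as np

def code_density_test(bin_edges_ps, hits_ps):
    """Estimate DNL/INL of a TDC via a statistical code density test.

    With hits uniform in time, count_i is proportional to the actual
    width of bin i; DNL_i = count_i / mean(count) - 1 (in LSB), and
    INL is the cumulative sum of the DNL.
    """
    counts, _ = np.histogram(hits_ps, bins=bin_edges_ps)
    dnl = counts / counts.mean() - 1.0   # per-bin DNL in LSB
    inl = np.cumsum(dnl)                 # INL in LSB
    return dnl, inl

# Hypothetical 64-tap delay line with a nominal 25.57 ps LSB and an
# artificial spread of tap widths to mimic delay-line nonuniformity.
rng = np.random.default_rng(0)
lsb = 25.57
widths = (lsb * (1.0 + 0.2 * rng.standard_normal(64))).clip(0.2)
edges = np.concatenate(([0.0], np.cumsum(widths)))
hits = rng.uniform(0.0, edges[-1], 200_000)  # asynchronous uniform hits
dnl, inl = code_density_test(edges, hits)
```

Because every hit falls inside the histogram range, the DNL sums to zero by construction, and bins wider than the mean show positive DNL, exactly the behavior the code density test exploits.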
|Title:||Design and Characterization of a Low-Cost FPGA-Based TDC|
|Publication date:||2018|
|Appears in collections:||1.1 Journal article|