High-Speed Digitizers


time resolution vs bit resolution

I am in the process of evaluating several digitizers. My question is somewhat open-ended, so any help would be appreciated. I am looking to set up an experiment that will determine how the bit resolution of a digitizer affects time resolution. For example, if I have a 12-bit 200 MS/s digitizer and an 8-bit 1 GS/s digitizer, which would provide better time-of-flight information, and how would I go about testing this in the lab?
Mark Mutton
Electrical Engineer
Message 1 of 4
Bit resolution determines how finely you can distinguish differences in voltage on a signal. Time resolution determines the time from one sample to the next: the higher the sample rate used to acquire a signal, the shorter the duration between samples, and thus the better the time resolution. There is usually a trade-off between speed and accuracy in digitizers, so the faster the signal is acquired, the less accurately it is measured. Some devices, such as the PXI-5922, offer a flexible resolution that depends on the acquisition rate; however, most instruments have a fixed bit resolution regardless of the sample rate used.
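As a quick back-of-the-envelope comparison of the two candidates you mentioned, the sketch below computes the voltage step (LSB) and sample period for each. The ±1 V input range is an assumption for illustration, not something from your post:

```python
# Rough comparison of the two candidate digitizers.
# The +/-1 V (2 V peak-to-peak) input range is an assumption.
for name, bits, rate in [("12-bit @ 200 MS/s", 12, 200e6),
                         ("8-bit @ 1 GS/s", 8, 1e9)]:
    lsb = 2.0 / (2 ** bits)   # smallest resolvable voltage step (volts)
    dt = 1.0 / rate           # time between samples (seconds)
    print(f"{name}: LSB = {lsb * 1e3:.3f} mV, sample period = {dt * 1e9:.1f} ns")
```

Under that assumed range, the 12-bit device resolves roughly 0.5 mV steps every 5 ns, while the 8-bit device resolves roughly 7.8 mV steps every 1 ns.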
 
To select the proper device you need to consider two questions:
- What is the smallest duration of time that you want to pass between samples? (time resolution)
- What is the smallest difference in amplitude that you need to detect on the signal? (bit resolution)
 
Depending on your answers to the questions above, both digitizers may be able to meet your requirements. If you are measuring time-of-flight information, you may need to synchronize with other devices using triggers, etc., so the digitizer's synchronization capabilities would also be important to consider.
Message 2 of 4
One other thing to consider for your application is that some digitizers resolve the trigger position more finely than the sample period. For example, the NI-5122 has a time-to-digital converter on the trigger with a resolution of 40 ps, while the minimum real-time sample period of the digitizer is 10 ns. So, if you trigger on a specific voltage level, you get much better time resolution than the sample period alone would indicate. This information is available from the reference position output when you fetch the data.
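As a sketch of how that helps, the snippet below builds a trigger-referenced time axis from a hypothetical fetched record. NI-SCOPE reports the trigger-to-first-sample offset as the waveform info's relative initial X; the numeric values here are made up for illustration:

```python
import numpy as np

# The digitizer samples at 100 MS/s (10 ns period), but the fetch also
# reports where the trigger fell relative to the first sample, with the
# ~40 ps resolution of the trigger TDC.
dt = 10e-9                 # sample period: 10 ns
rel_initial_x = -3.16e-9   # trigger-to-first-sample offset (hypothetical value)

n = np.arange(1000)            # sample indices of a fetched record
t = rel_initial_x + n * dt     # time of each sample, referenced to the trigger

# Any event time read off this axis (for example, a threshold crossing
# interpolated between samples) is tied to the trigger far more tightly
# than the 10 ns sample period alone would allow.
```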
Message 3 of 4
Choosing the best trade-off between resolution and speed is difficult. The first question you need to answer is: what is the highest frequency content you need to measure? If you are interested in measuring 100 MHz signals, make sure your digitizer has at least 100 MHz of bandwidth (which is different from sample rate). You may even want something with a little extra bandwidth, because devices usually don't have great flatness at the edge of their bandwidth spec (signals start to get attenuated).
 
Once you know how fast a signal you need to measure, try to get the highest-resolution, highest-dynamic-range digitizer you can find. Resolution helps you resolve differences in voltage (and hence time) accurately, and dynamic range determines how "clean" the measurement device is. There is no point in buying a 14-bit digitizer if you only get 50 dB of dynamic range. Many manufacturers "trick" people by advertising digitizers whose resolutions aren't useful given the dynamic range of the device.
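To put numbers on that: the standard relation between dynamic range and effective bits is SINAD = 6.02 × ENOB + 1.76 dB, so 50 dB of dynamic range corresponds to roughly 8 effective bits no matter what resolution the datasheet advertises. A minimal check:

```python
# ENOB from the standard formula SINAD (dB) = 6.02 * ENOB + 1.76.
def enob(sinad_db: float) -> float:
    return (sinad_db - 1.76) / 6.02

print(enob(50.0))  # ~8.0  -> a "14-bit" part with 50 dB behaves like 8 bits
print(enob(74.0))  # ~12.0 -> what a genuinely 12-bit-class part delivers
```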
 
Resolution tends to be more important than sample rate for accurate measurements. You can use a 10 MS/s digitizer to resolve 100 ps time differences if you have the right algorithms in place and the digitizer has enough resolution and dynamic range.
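One common approach (certainly not the only one) is cross-correlation with a parabolic fit on the correlation peak. The sketch below is purely illustrative, with made-up waveform parameters: it delays an ideal band-limited pulse by 237 ps, quantizes both copies at different bit depths, and estimates the delay from 10 MS/s samples. With enough effective bits the estimate lands near the true 237 ps even though the sample period is 100 ns; at low resolution it degrades, which is exactly the point:

```python
import numpy as np

# Sub-sample timing from a slow digitizer: cross-correlate two quantized
# records and refine the correlation peak with a three-point parabolic fit.
fs = 10e6                      # 10 MS/s -> 100 ns sample period
dt = 1 / fs
t = np.arange(1000) * dt       # 100 us record
true_delay = 237e-12           # true time difference: 237 ps

def pulse(t0):
    # Band-limited Gaussian pulse centered at t0 (2 us width, illustrative).
    return np.exp(-((t - t0) / 2e-6) ** 2)

def quantize(x, bits, full_scale=2.0):
    # Ideal quantizer over an assumed +/-1 V range.
    lsb = full_scale / 2 ** bits
    return np.round(x / lsb) * lsb

for bits in (8, 12, 16):
    a = quantize(pulse(50e-6), bits)
    b = quantize(pulse(50e-6 + true_delay), bits)
    xc = np.correlate(b, a, mode="full")
    k = int(np.argmax(xc))
    # Parabolic interpolation around the peak gives a fractional-sample lag.
    y0, y1, y2 = xc[k - 1], xc[k], xc[k + 1]
    frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    est = (k + frac - (len(a) - 1)) * dt
    print(f"{bits}-bit: estimated delay = {est * 1e12:.0f} ps")
```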
 
Also, remember to choose a sample rate that satisfies Nyquist, i.e., more than twice the highest frequency content of your signal (things are usually easier if you do).
 
Good luck,
 
Kunal
Message 4 of 4