05-04-2017 10:48 AM
Hello,
I am using a PCI-5105 to acquire AC signals that correspond to currents. The acquisition uses the ±15 V (30 V) range at 60 MHz. The 5105 has 12 bits of resolution, so the step I am working with is 30 V / 4096 = 7.32 mV. Searching the forum, I found that the actual code width is a little larger than this value, around 9 mV, which is not really a problem.
The problem is that this value corresponds to a significant amount of current, so what I need to know is what happens between two LSBs. What I mean is: when the signal is, for example, 6 mV above the last level at which a bit was set to 1, does the output hold the previous value, or does it have a chance of setting the next bit to 1? Also, if my signal is decreasing and it drops by 4 or 5 mV, is the least significant bit set to 0, or does it stay at 1 until the signal has decreased by the full 9 mV step?
In short, if an input signal increases by less than the resolution step, what happens at the output? And what happens when it is decreasing?
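To make the question concrete, here is a rough sketch of the two cases I mean. It is only a toy model of an ideal, noiseless 12-bit quantizer over the ±15 V range that I wrote for illustration (the names and numbers are mine, not anything from the NI driver), and a real digitizer also has noise:

# Toy model of an ideal 12-bit quantizer over a -15 V .. +15 V range.
# A real ADC adds noise, so readings near a code boundary can toggle
# between adjacent codes; this sketch ignores that.
FULL_SCALE = 30.0                   # volts, -15 V to +15 V
BITS = 12
STEP = FULL_SCALE / 2**BITS         # ideal code width, ~7.32 mV

def ideal_code(voltage):
    """Output code of an ideal, noise-free quantizer."""
    code = int((voltage + FULL_SCALE / 2) / STEP)
    return max(0, min(2**BITS - 1, code))  # clamp to the 12-bit code range

v0 = 0.997
print(ideal_code(v0))               # 2184
print(ideal_code(v0 + 0.006))       # still 2184: +6 mV stays inside the same code bin
print(ideal_code(v0 - 0.005))       # 2183: -5 mV happens to cross a code boundary

So in this ideal picture it seems to depend on whether the change crosses a code boundary, not only on how large the change is. Is that the right way to think about the 5105, or does noise change the picture?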
PS: I can't work with a smaller voltage range or a lower sampling frequency, so decreasing the step is not an option.
Thanks in advance,
Miguel.
05-11-2017 01:30 PM
Hi,
As you have calculated yourself, the code width (range / 2^resolution) is the smallest voltage change that the device can detect. For example, if a device has a range from -10 V to +10 V (a 20 V range) and 12 bits of resolution, the code width is 20 V / 4096 ≈ 4.88 mV. Changes smaller than this value are not detected by the ADC.
Calculating the Smallest Detectable Change—Code Width
https://www.ni.com/docs/en-US/bundle/ni-daqmx/page/codewidth.html
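As a quick illustration (plain Python that only reproduces the arithmetic, not a call into the NI-SCOPE driver), you can compare the code width for different ranges and resolutions:

def code_width(v_min, v_max, bits):
    """Smallest detectable change: (device range) / 2**resolution."""
    return (v_max - v_min) / 2**bits

print(code_width(-10, 10, 12))   # ~4.88 mV for a ±10 V, 12-bit input
print(code_width(-15, 15, 12))   # ~7.32 mV for your ±15 V range on the 5105
print(code_width(-15, 15, 16))   # ~0.46 mV with a 16-bit digitizer instead

The last line is the point of the suggestion below: for the same ±15 V range, only more bits of resolution make the step smaller.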
If you need finer steps, you should use a higher-resolution device. The link below lists other PXI/PCI scope options:
https://www.ni.com/en-us/shop/category/oscilloscopes-and-digitizers.html
Regards,