02-11-2009 02:24 PM
Hello,
I am trying to read an analog signal that varies from -11 to 0 V using a PCI-6111 DAQ card. The signal comes from a PhotoMultiplier Tube (PMT) that is part of a microscope setup, so it is very important that the resolution of the analog input signal be as large as possible to generate quality images. According to the spec sheet for the PCI-6111, the analog input resolution is 12 bits, which should correspond to a sensitivity of ~2.686 mV for my voltage range.
To test this, I configured an analog input task with a voltage range of -11 to 0 V to read samples from an analog output, to which I wrote a simple waveform. Since the analog output has 16-bit resolution, I assumed it would not limit the accuracy of this measurement. I've attached the VI I used for this measurement below. The analog input data is saved un-truncated to a text file.
Analyzing this data, I found that the actual input sensitivity is ~9.766 mV, corresponding to ~1126 voltage levels across my range, i.e. roughly 10 bits.
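For reference, here is the arithmetic behind the numbers above as a small Python sketch: the LSB size of an ideal ADC is the full-scale span divided by 2^bits, and working backwards from the ~9.766 mV step size (assuming the converter really is 12-bit) gives the span the hardware appears to be using. The `code_width` helper is just for illustration, not part of any NI API.

```python
def code_width(span_volts, bits):
    """LSB (smallest voltage step) of an ideal ADC over a full-scale span."""
    return span_volts / 2**bits

# Requested range: -11 to 0 V (11 V span) at the 12-bit spec resolution
requested_lsb = code_width(11.0, 12)
print(f"expected LSB: {requested_lsb * 1e3:.3f} mV")  # ~2.686 mV

# Working backwards: what span does the measured ~9.766 mV step imply
# if the converter is still using all 12 bits?
measured_lsb = 9.766e-3
implied_span = measured_lsb * 2**12
print(f"implied full-scale span: {implied_span:.1f} V")  # ~40 V
```

The measured step being an exact power-of-two multiple of the expected one suggests the hardware is digitizing over a wider fixed range than the one requested in the task, rather than losing converter bits.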
Is there any reason why the analog input resolution is so much lower than indicated on the spec sheet? What are some ways that I could improve the sensitivity of this measurement?
Best,
Keith
02-11-2009 03:20 PM
02-11-2009 03:27 PM - edited 02-11-2009 03:28 PM
Okay, that explains a lot. Just a quick follow-up question: what are the possible voltage ranges that the board will choose from?
Thanks so much for your help.
02-11-2009 08:51 PM
Sorry, when you referred to the specs, I thought you already had them. Didn't this come with your board?