06-29-2009 02:12 PM
Hello everyone, I have a LabVIEW program that takes current readings from a 3458A DMM and displays them in an array. The DMM is commanded to take 200 readings total at a rate of 10 readings per second, so the acquisition should take 20 seconds, but the measured time always comes up short: the program only runs for about 18 seconds. I have a timer that displays how long the readings take (and I have also timed it outside of LabVIEW). Is there something wrong with the program that is causing this? I know that 2 seconds is not a lot, but this program needs to be very accurate.
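In case it helps, here is roughly what the acquisition does, sketched in text form (Python/PyVISA rather than LabVIEW, since I can't paste the block diagram as text; the GPIB address and the exact 3458A command strings are approximations of what my VI sends, not a copy of it):

import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::22::INSTR")  # assumed GPIB address
dmm.timeout = 30000                         # ms; the full run takes ~20 s

dmm.write("PRESET NORM")       # reset the 3458A to a known state
dmm.write("FUNC DCI")          # DC current measurement
dmm.write("TRIG AUTO")         # trigger as soon as armed
dmm.write("TIMER 0.1")         # 0.1 s between readings -> 10 readings/s
dmm.write("NRDGS 200,TIMER")   # 200 readings paced by the timer event

start = time.time()
dmm.write("TARM SGL")          # arm once; the DMM takes all 200 readings
readings = [float(dmm.read()) for _ in range(200)]
elapsed = time.time() - start

# 200 readings at 10 per second should take about 20 seconds
print("%d readings in %.2f s" % (len(readings), elapsed))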
I appreciate any help you have to offer!
06-30-2009 02:41 PM
06-30-2009 03:31 PM
Thanks for your response. I have attached a picture of the block diagram and front panel, simplified a lot (I cut out everything that didn't have to do with acquiring the readings, so it is easier to read). I am getting the right number of readings even though the timing is off.
Thanks again for your help!
07-01-2009 05:46 PM
It is hard to see what you changed in the program. I do not see the command that initiates the 200 reads. Can you "unstack" that sequence structure?
My guess is that it is an issue with the hardware or with how you are timing it. Whether you read 20 samples or 200 samples, is it always 2 seconds off, or is the error related to how many readings you are taking?
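One quick way to narrow it down: time several different reading counts and see whether the shortfall stays at about 2 seconds or grows with the count. Something along these lines (a Python sketch; acquire is a stand-in for whatever routine takes n timer-paced readings in your program):

import time

def check_timing(acquire, counts=(20, 50, 100, 200), rate_hz=10.0):
    # acquire(n) is assumed to take n timer-paced readings from the DMM
    for n in counts:
        start = time.time()
        acquire(n)
        elapsed = time.time() - start
        expected = n / rate_hz
        print("n=%4d  expected=%6.2f s  measured=%6.2f s  offset=%+.2f s"
              % (n, expected, elapsed, elapsed - expected))

# A roughly constant offset points at setup/arming overhead or at where
# your timer starts and stops; an offset that scales with n points at
# the reading interval itself.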
Regards