
Slowest Sample Rate Possible for Analog Inputs?

What is the slowest sample rate possible for analog inputs? Is there a limitation of 1 sample/second, or can you sample at 0.1 samples/second? Do I have to use a Wait (ms) VI to slow the sampling down? I need to record data points to file, but the data changes slowly. Thanks for any input.
Message 1 of 3
I'm not sure which hardware you're using, but I don't think it should matter. Even if the sample rate is fast, you don't need to read all the samples. Like you said, just use a wait function and read only once in a while. If your hardware has a buffer that can't be emptied automatically, then read it continuously and write to the file only when a certain condition occurs (using a case structure, for example).
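
In text form, that condition-gated logging pattern might look something like this (a rough Python sketch using the nidaqmx package rather than LabVIEW G code; the "Dev1/ai0" channel, the file name, and the 0.01 V change threshold are all assumptions):

import time
import nidaqmx

# Poll the input once per second, but append to the log file only
# when the reading has moved by more than a threshold.
THRESHOLD_V = 0.01   # assumed change threshold, in volts
last_logged = None

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed device/channel
    with open("slow_data.log", "a") as log:
        while True:
            value = task.read()   # one on-demand, software-timed sample
            if last_logged is None or abs(value - last_logged) > THRESHOLD_V:
                log.write("%f\t%f\n" % (time.time(), value))
                log.flush()
                last_logged = value
            time.sleep(1.0)   # the "wait" between reads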

___________________
Try to take over the world!
Message 2 of 3
Hi,

This greatly depends on whether or not you need accurately timed sampling. If you do not require accurate timing, you can have a while loop with a single-point analog read and use a wait statement to slow the loop down; that way you can have several minutes between samples. However, since this is software timed, it will not be perfectly accurate (it may be several milliseconds off).
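
In text form, the loop-plus-wait approach, with a check on how far the software timing actually drifts, might look like this (again a Python sketch using the nidaqmx package; the channel name and the 60-second interval are assumptions):

import time
import nidaqmx

INTERVAL_S = 60.0   # assumed time between samples

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed channel
    next_due = time.time()
    while True:
        value = task.read()   # on-demand, software-timed read
        now = time.time()
        # Jitter: how far the OS scheduler let this read drift from its ideal tick.
        print("%.3f  %.6f V  (jitter %+.1f ms)" % (now, value, (now - next_due) * 1e3))
        next_due += INTERVAL_S
        time.sleep(max(0.0, next_due - time.time()))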

If you need accurate timing between samples, then you will need to do buffered analog input, which uses a hardware clock to time the acquisition. The slowest onboard clock is the 100 kHz timebase, and the sample-interval counter on an E-series board is 24 bits wide, so at most 2^24 timebase ticks can pass between samples. That works out to roughly 0.006 Hz, or one sample about every 168 seconds.
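
The arithmetic, and a hardware-timed version of the read, might look like this in text form (a Python sketch using the nidaqmx package; "Dev1/ai0" is an assumption, and whether the driver accepts a rate this low depends on your board):

import nidaqmx
from nidaqmx.constants import AcquisitionType

# Slowest hardware-timed rate on an E-series board:
# the 100 kHz timebase divided by the largest 24-bit divisor.
MIN_RATE_HZ = 100e3 / 2**24   # ~0.00596 Hz, one sample every ~167.8 s

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # assumed channel
    task.timing.cfg_samp_clk_timing(rate=MIN_RATE_HZ,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    while True:
        # Blocks until the hardware clock delivers the next sample.
        print(task.read(number_of_samples_per_channel=1))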

-Sal
Message 3 of 3