03-23-2010 05:42 PM
I'm trying to sample some voltage signals at around 100-1000 Hz (using a PCI-6229) and pair each sample with an absolute timestamp.
So, for example:
0.001 s, +1.0 V
0.002 s, +1.1 V
0.003 s, +1.2 V
0.004 s, +1.3 V
0.005 s, +1.4 V
...etc
However, when using the "Get Date/Time in Seconds" function in LabVIEW, the timestamps come in "bursts" like this:
0.047 s
0.063 s
0.063 s
0.062 s
0.078 s
0.078 s
0.078 s
0.078 s
0.094 s
0.094 s
This happens regardless of whether I use the aforementioned function or read the data as a 1D Waveform and pull out the t0 values.
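A quick way to see the clock's update granularity in isolation would be to poll it in a tight loop and record how much it jumps each time it changes. A rough sketch in Python rather than LabVIEW, purely to illustrate the kind of test I mean:

import time

# Poll the wall-clock time and record how much it advances each time it
# changes. If the OS only updates this clock in coarse steps (historically
# around 15.6 ms on Windows), the recorded increments will cluster around
# that step size, matching the repeated/bunched timestamps above.
increments = []
last = time.time()
while len(increments) < 10:
    now = time.time()
    if now != last:
        increments.append(now - last)
        last = now

print("observed clock increments (s):", [round(d, 4) for d in increments])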
03-23-2010 05:56 PM
Hmmm, never mind. I got around the problem by switching to the "Tick Count (ms)" function.
But just out of curiosity, if someone could explain why the "Get Date/Time in Seconds" function behaves like that, I'd appreciate it.
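The idea behind the workaround is roughly this: take one millisecond-resolution reference at the start of the acquisition and pair each reading with the elapsed time since that reference, instead of asking the date/time clock on every sample. Sketched in Python rather than LabVIEW (read_voltage() is a placeholder, not the real DAQmx call):

import time

# Rough analogue of the "Tick Count (ms)" workaround: take one
# millisecond-resolution reference at the start and timestamp each reading
# with the elapsed time since that reference.

def read_voltage(i):
    # Stand-in for a single DAQ read; returns a made-up ramp like the example.
    return 1.0 + 0.1 * i

start = time.perf_counter()          # monotonic, ms-or-better resolution

for i in range(5):
    elapsed_s = time.perf_counter() - start
    print(f"{elapsed_s:.3f} s, {read_voltage(i):+.1f} V")
    time.sleep(0.001)                # ~1 kHz software-timed loop, illustrative

For a hardware-timed acquisition, the spacing given by the board's sample clock (the waveform dt) is more trustworthy than any software timer, so the tick count really only has to pin down the start time.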
03-23-2010 10:27 PM