File Timing Wrong in Triggered Data Acquisition

Hi everyone,

I am having an issue with some saved data in LabVIEW.

I have set up a data acquisition to trigger off the digital edge of a function generator pulse. It writes 2000 samples at 100 kS/s to an .lvm file and then loops back round to wait for the next function generator pulse. The only problem is that the time column in the saved file contains strange values. For example, if I set the function generator to 5 Hz (this is a good Agilent FG, so the input really is 5 Hz), the first record should have a time column of 0.00000, 0.00001, etc., the next should start at 0.20000 (0.20000, 0.20001, ...), then 0.40000, 0.60000, and so on. What I actually get is that the first record starts at 0.00000, but the next one starts at, say, 0.19495, and the one after that at 0.40154.
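To spell out the timing I expect, here is the arithmetic as a quick Python sketch (purely illustrative, not my actual code):

    trigger_period = 1 / 5   # 0.2 s between function generator pulses at 5 Hz
    dt = 1 / 100_000         # 10 us between samples at 100 kS/s
    for record in range(3):
        t0 = record * trigger_period
        print(f"record {record}: {t0:.5f}, {t0 + dt:.5f}, ...")
    # record 0: 0.00000, 0.00001, ...
    # record 1: 0.20000, 0.20001, ...
    # record 2: 0.40000, 0.40001, ...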

Does anyone know where the time stamp comes from? I would like it to be the trigger, but is it generated when the file is saved, or elsewhere, such as at the DAQmx Start Task?


Regards

Richard




Message 1 of 4
The value for the timing can come from several places, depending on how you are taking the data. From your values, it appears you are doing a waveform data acquisition with DAQmx and then saving it with Write LVM. In that case, the timestamp comes from the DAQmx driver, which gets it from the operating system. You don't say, but I suspect you are not using RT. That can lead to timestamp jitter on the order of 20 ms; you only have about 5 ms of jitter (roughly the default time slice on Windows systems), so that's not too bad. Windows, Linux, and OS X are not real-time operating systems, so all of them will show this type of problem. How do you fix it?
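To put numbers on it (a quick Python check; the "expected" values assume an exactly 5 Hz trigger):

    expected = [0.20000, 0.40000]   # ideal record start times at 5 Hz
    actual   = [0.19495, 0.40154]   # the t0 values from your .lvm file
    for e, a in zip(expected, actual):
        print(f"offset: {(a - e) * 1000:+.2f} ms")
    # offset: -5.05 ms
    # offset: +1.54 ms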

The easiest fix is to change the timestamps to the values you know they should be before you save the data.
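LabVIEW is graphical, so here is the idea as a text-language sketch (plain Python; the 5 Hz period and the t0 values come from your post, everything else is illustrative):

    # Snap each record's t0 to the nearest multiple of the known trigger
    # period before writing the file.
    trigger_period = 0.2   # 5 Hz function generator (assumed known)

    def corrected_t0(raw_t0: float) -> float:
        """Round the OS-supplied timestamp to the nearest trigger edge."""
        return round(raw_t0 / trigger_period) * trigger_period

    print(corrected_t0(0.19495))  # 0.2
    print(corrected_t0(0.40154))  # 0.4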

The second-best fix is to get your timestamps from hardware. NI-SCOPE boards make this easy, since they have a relative time output that gives you hardware time. With a DAQ board, you can measure the period of your start pulses and use it to calculate a new timestamp from the old one.

The most expensive fix is to use LabVIEW RT to run your system, but the hardware solution will outperform it.

If you need more help, please give us more information:
  1. Data acquisition board used
  2. Operating system
  3. LabVIEW version
  4. Your code (or at least the part that is giving you problems)
Message 2 of 4
Hi,

I am using exactly that - please find attached. FYI, I am using a low-cost M-Series DAQ card on Windows XP with LabVIEW 8.2.1.

The post makes sense. Thank you for your input. I think I will take the starting time (t0) from the waveform and round it to the nearest 0.1 second when working at 10 Hz.


Message 3 of 4
If you want to pursue a hardware solution, you can use one of the counter inputs on your M-Series board to measure the period between trigger pulses. Cache your first timestamp and increment it by the measured period at each acquisition. If you have a computer interface to the trigger generator, you can also query it directly for the period. Either method eliminates having to know the period beforehand.
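In pseudocode terms, the bookkeeping is just this (plain Python; the measured period value is made up for illustration):

    # Rebuild timestamps from a hardware-measured trigger period.
    measured_period = 0.2000013   # e.g. from a counter-input period measurement
    first_t0 = 0.0                # cache the t0 of the first triggered record

    def record_t0(n: int) -> float:
        """Timestamp of the nth record: first t0 plus n measured periods."""
        return first_t0 + n * measured_period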

There are a couple of other things you can do to improve your code. You can probably remove the sequence structure from your diagram; the only thing it does is force the read of the filename to occur after the first section of code. Unless you change settings during a run, you can also move the setup code outside the loop (and the clear-task code as well). At the low frequencies you are running, it will work, but creating a new task on each iteration is inefficient and will cause you problems if you try to run much faster.
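A diagram won't paste into text, but the shape of "configure once, read many" looks like this sketched with the modern nidaqmx Python package (an analogy only, not a drop-in for your LabVIEW 8.2.1 VI; the device, channel, and trigger-line names are placeholders):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Edge

    # Create and configure the task ONCE, outside the acquisition loop.
    with nidaqmx.Task() as task:                        # cleared once, on exit
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        task.timing.cfg_samp_clk_timing(
            100_000, sample_mode=AcquisitionType.FINITE, samps_per_chan=2000)
        task.triggers.start_trigger.cfg_dig_edge_start_trig(
            "/Dev1/PFI0", trigger_edge=Edge.RISING)

        for _ in range(10):                             # one record per trigger
            # read() auto-starts the finite task and blocks until 2000 samples
            # arrive after the next trigger edge.
            data = task.read(number_of_samples_per_channel=2000)
            task.stop()                                 # stop so the next read re-arms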
Message 4 of 4