Multifunction DAQ

Synchronized AI and AO

Hello,

I’d like to measure the time delay between three analog signals. The delay may be fairly large (say a few hundred milliseconds), and I’d like to measure it as accurately as possible, ideally with a resolution down to a few microseconds. For that, I have an NI USB-6259 (M series), LabVIEW 8.2.1, NI-DAQmx 8.7 drivers, and a Core 2 Duo PC.

Before using real signals, I’d like to test the VI using AO on the DAQ device. This led me to the attached VI, acquisitionContinue.vi, which performs synchronized AI and AO: the AO task uses /Dev1/ai/SampleClock and /Dev1/ai/StartTrigger, heavily inspired by LabVIEW’s shipped example (Multi-Function-Synch AI-AO.vi). I use three analog outputs, generating the signals with a custom VI, genereSignal.vi; AO0, AO1, and AO2 are wired to AI0, AI1, and AI2 respectively on the 6259.

The analysis is simple enough that it can be done “in place” (inside the acquisition while loop) without preventing continuous acquisition, and the VI can run for hours without any problem.

However, I’m a little disappointed: the AI signals appear time-shifted relative to the signals I send to AO. Moreover, this phase shift differs for each signal and is not constant over time. I suspect something hardware-related, but I can’t pin it down. Stranger still, the AI signals either lead by a very few samples or lag by a long interval (around one period of a typical 10 kHz signal).
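
Incidentally, with a periodic test signal, a shift of a few samples is indistinguishable from that same shift plus a whole period, which may explain why the two cases look so different. A quick NumPy check of this ambiguity (illustration only; the rates are hypothetical values matching the 10 kHz case):

```python
import numpy as np

fs = 1_000_000          # sample rate, 1 MS/s (hypothetical)
f = 10_000              # 10 kHz test tone; period = fs/f = 100 samples
t = np.arange(1_000) / fs

a = np.sin(2 * np.pi * f * (t - 3 / fs))     # shifted by 3 samples
b = np.sin(2 * np.pi * f * (t - 103 / fs))   # shifted by 3 samples + one period
print(np.allclose(a, b))                     # → True
```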

I would therefore like your opinions on this. Am I missing something? Is the synchronization done correctly? Where does this phase shift come from, and can it be avoided or predicted?

Thank you for your attention and time.
