08-04-2009 08:29 PM
Hi All,
I am writing some LabVIEW code to be used with a cDAQ-9172.
I want to repeatedly output an analog waveform after receiving a digital trigger. I understand there are a couple of ways to do this, but the most promising for our purpose uses "DAQmx wait until done", stopping the task, and restarting it. Attached is a simplified version of what I am doing.
It works great, except that at certain frequencies with certain timeouts, generations are missed or, even worse for our application, cut short (leaving a voltage offset until the next generation). For example, if the timeout is 0.03 s and I have a digital trigger at a frequency of 13.3 Hz, the analog output is often missed or cut short. Frequencies above and below this are no problem, though. Does anyone know what is going on and how to minimize this issue (especially the unintended offset)?
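The attached VI isn't visible here, but the race can be illustrated with a hypothetical timing model (a Python sketch, not DAQmx code). The stop/restart overhead and waveform duration below are assumed numbers for illustration; only the 0.03 s timeout and 13.3 Hz trigger rate come from the post. The model arms the task, waits up to the timeout, then stops and restarts: a trigger that lands in the restart gap is missed, and one that lands near the end of the wait window gets cut short when the stop truncates the waveform.

```python
# Hypothetical model of the wait-until-done / stop / restart loop.
# RESTART and WAVE_LEN are assumptions, not values from the post.
TIMEOUT = 0.030            # Wait Until Done timeout from the post (s)
RESTART = 0.002            # assumed stop/restart overhead per iteration (s)
TRIG_PERIOD = 1 / 13.3     # digital trigger period (~75.2 ms)
WAVE_LEN = 0.010           # assumed duration of one output waveform (s)

def simulate(n_triggers=200):
    """Classify each trigger as generated, missed, or cut short."""
    results = {"ok": 0, "missed": 0, "cut short": 0}
    armed_at = 0.0  # time the current wait window opened (task armed)
    for k in range(n_triggers):
        t = k * TRIG_PERIOD
        # advance past wait windows that expired (empty) before this trigger
        while armed_at + TIMEOUT <= t:
            armed_at += TIMEOUT + RESTART
        if t < armed_at:
            results["missed"] += 1          # trigger fell in a restart gap
        elif t + WAVE_LEN <= armed_at + TIMEOUT:
            results["ok"] += 1              # generation finished in time
            armed_at = t + WAVE_LEN + RESTART
        else:
            results["cut short"] += 1       # stop truncates the waveform,
            armed_at += TIMEOUT + RESTART   # leaving a DC offset on the AO
    return results

print(simulate())
```

Under these assumed numbers the 32 ms loop period beats against the ~75.2 ms trigger period, so triggers periodically drift into the restart gap or the tail of the wait window, which matches the "only at certain frequencies" symptom.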
Also, what is the allowed range of the timeout input in "DAQmx wait until done"? I could not find it in the documentation and I assume it is not the full range of a double.
Thanks in advance
08-06-2009 09:45 PM
I'm assuming you're using a 9265 for your AO, but can you identify your DIO? As for the range of DAQmx Wait Until Done, the two extremes are: if you set timeout (sec) to -1, the VI waits indefinitely; if you set timeout (sec) to 0, the VI checks once and returns an error if the measurement or generation is not done.
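In plain-code terms, the documented timeout semantics look roughly like the following Python analogue (a sketch of the behavior, not the DAQmx implementation; `is_done` is a hypothetical callable standing in for the task state):

```python
import time

def wait_until_done(is_done, timeout, poll_interval=0.001):
    """Python analogue of the documented Wait Until Done semantics:
    timeout = -1 waits indefinitely; timeout = 0 checks once and fails
    immediately if not done; otherwise wait up to `timeout` seconds.
    `is_done` is a hypothetical stand-in for the task's done state."""
    if timeout == 0:
        if not is_done():
            # DAQmx returns an error code; an exception stands in here
            raise TimeoutError("operation not done")
        return
    deadline = None if timeout == -1 else time.monotonic() + timeout
    while not is_done():
        if deadline is not None and time.monotonic() >= deadline:
            raise TimeoutError("operation not done within timeout")
        time.sleep(poll_interval)
```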
Regards,
Glenn
08-07-2009 01:51 PM
Thanks for the response.
For AO we use a NI 9263 and DIO a NI 9401.
What's the granularity of the timeout for DAQmx Wait Until Done? That is, how accurate is it, and to what precision? Is it accurate down to milliseconds, microseconds, etc.?
08-10-2009 05:22 PM
Using Wait Until Done: as documented, "...the Wait Until Done function/VI is used to ensure that the specified operation is complete before you stop the task." With that in mind, the timeout input as documented is set in seconds, so I don't believe it reaches millisecond measurement within the two extremes I mentioned above (wait indefinitely or check once).
Regards,
Glenn
08-10-2009 06:02 PM
In the documentation I see it only says "timeout (sec)." I assume that is simply specifying the units (as opposed to ms). It seems that if seconds were the smallest increment it could be adjusted by, the parameter would be an integer rather than a double. Do you have any reason to believe otherwise? A quick empirical test shows that it does have accuracy down to at least 100 ms granularity. Do you know anywhere I can confirm this and get a feel for finer adjustments?
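One way to firm up that empirical test is to time a wait that is guaranteed to expire and see whether the elapsed time tracks progressively finer requested timeouts. The sketch below substitutes `time.sleep()` for the timed-out wait so it runs anywhere; on the real hardware you would instead time DAQmx Wait Until Done called on a task that never completes:

```python
import time

def measured_timeout(requested):
    """Time a wait that is guaranteed to expire.  On hardware, replace the
    sleep with DAQmx Wait Until Done on a task that never finishes; the
    sleep is a stand-in so this sketch runs without a DAQ device."""
    start = time.perf_counter()
    time.sleep(requested)           # stand-in for the timed-out wait
    return time.perf_counter() - start

# Probe progressively finer timeouts; if elapsed tracks `requested`,
# the parameter honours sub-second (double-precision) values.
for requested in (0.5, 0.1, 0.01):
    elapsed = measured_timeout(requested)
    print(f"{requested:.3f} s requested -> {elapsed:.4f} s elapsed")
```

Note the elapsed time will always overshoot slightly (OS scheduling jitter), so the probe bounds the granularity from above rather than measuring it exactly.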
The major issue, though, is why do waveforms sometimes get cut short and leave a voltage offset in my above program?
I would really appreciate help with that.