08-16-2011 04:42 AM - edited 08-16-2011 04:42 AM
Hello!
I am working with the VI below (it's a modified example, so sorry if it's a little ugly). Its purpose is simple: read the analog input and output one period of a sine wave with the same amplitude (much like amplitude modulation).
The problem I am encountering is that, slowly but surely, the output becomes more and more delayed: when the VI starts, the delay is 50 ms (one period of the sine wave, the minimum possible), but after a few minutes it grows well above 200 ms. The output still follows the input very well; it just arrives later and later.
I have tried experimenting with a few parameters, like the DataXferReqCond property, with no success.
Any idea why this happens?
08-16-2011 08:17 AM
One thing I forgot to mention: I'm using a USB-6341 board.
08-19-2011 04:03 PM
Hi BogdanB,
I don't think you're getting any errors, but there are some red flags in how the code is implemented. Without going into details, I would instead do it this way:
The AI Task is now software-timed and will just return the most recent point whenever the read is called, so there is no possibility of reading samples that were buffered earlier. The loop rate is regulated by the analog output task, and the frequency is now adjustable on the fly.
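In text form, the same structure would look roughly like this in Python with the nidaqmx package (the device name "Dev1", the 10 kS/s AO rate, and the 20 Hz sine are assumptions for illustration, not values from your VI):

```python
# A minimal sketch of the suggested structure, since the original is a LabVIEW VI.
# Device name, AO rate, and sine period below are assumed, not from the posted code.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

FS = 10_000                     # AO sample clock rate (S/s), assumed
SAMPLES_PER_PERIOD = FS // 20   # one 50 ms sine period written per loop iteration
one_period = np.sin(2 * np.pi * np.arange(SAMPLES_PER_PERIOD) / SAMPLES_PER_PERIOD)

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    # AI stays software-timed (on-demand): each read returns the latest value,
    # so no backlog of previously buffered samples can build up.
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")

    # AO is hardware-timed and continuously fed; regeneration is disabled so the
    # write call blocks until the buffer has room, which paces the loop.
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(FS, sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=4 * SAMPLES_PER_PERIOD)
    ao.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    # Prime the buffer so generation can start without an immediate underflow.
    ao.write(np.zeros(2 * SAMPLES_PER_PERIOD), auto_start=False)
    ao.start()

    while True:
        amplitude = ai.read()             # most recent input sample
        ao.write(amplitude * one_period)  # blocks until the AO buffer has space
```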
Best Regards,
08-23-2011 03:45 AM
Thank you, that explains it.
It might have been good if you had explained in a bit more detail, since I'm not sure why changing the details of the signal generation increases or decreases the delay. But your variant does work as needed.
08-23-2011 11:12 AM
Hi BogdanB,
In the original code, the sampling info given to the generated waveform was based on the wrong sample clock. The Analog Output was actually updating at the specified sample clock rate, but the waveform was defined based on the frequency of the Analog Input. This mismatch explains the change of delay over time.
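As a rough, purely illustrative calculation (the numbers below are invented, not taken from your VI), even a small rate mismatch is enough to produce that slow drift:

```python
# Hypothetical rates chosen only to show the mechanism: if each 50 ms period is
# built using one clock's rate but played out by the AO clock, slightly more data
# is queued each iteration than is consumed, and the backlog (delay) grows.
ao_rate = 10_000.0     # actual AO update rate (S/s), assumed
wrong_rate = 10_010.0  # rate the waveform was (incorrectly) defined with, assumed
period_s = 0.050       # one sine period written per loop iteration

extra_samples = (wrong_rate - ao_rate) * period_s   # 0.5 extra samples queued per loop
drift_per_minute_ms = extra_samples / ao_rate * (60 / period_s) * 1000
print(drift_per_minute_ms)  # ~60 ms of added delay per minute with these numbers
```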
The initial delay is also non-deterministic in the original code. The Analog Input is buffered, and begins before the Analog Output. There is an indeterminate amount of time that elapses between the start of Analog Input and the start of Analog Output, so you could have several samples queued up at the start.
The new code drops the AI timing altogether and just reads the most recent value whenever you are ready to generate new data for the AO buffer. The delay is a fixed amount of time that depends on the AO buffer size. I think this is a more desirable way to implement the code (and it also lets you change the frequency on the fly). Just make sure to configure a large enough buffer to prevent underflows.
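To put a rough number on that fixed delay (again with assumed values, since the actual AO rate and buffer size aren't shown here):

```python
# The steady-state delay is roughly the data queued ahead of the AO sample clock
# divided by the AO rate, so the buffer size you configure sets the latency.
ao_rate = 10_000            # AO sample clock rate (S/s), assumed
queued_samples = 2 * 500    # e.g. two 50 ms periods kept in the buffer, assumed
print(queued_samples / ao_rate)  # -> 0.1 s of fixed, non-growing delay
```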
Best Regards,