

Additive delay


Hello!

I am working with the VI below (it's a modified example, so apologies if it's a little ugly). Its purpose is simple: read the analog input and output one period of a sine wave with the same amplitude (much like amplitude modulation).

The problem I am encountering is that, slowly but surely, the output becomes more and more delayed: when the VI starts, the delay is 50 ms (one period of the sine wave, the minimum possible), but after a few minutes it grows to well above 200 ms. The output still follows the input very well; it just arrives later and later.

I have tried experimenting with a few parameters, such as the DataXferReqCond property, with no success.

Any idea why this happens?

Message 1 of 7

One thing I forgot to mention: I'm using a USB-6341 board.

Message 2 of 7

Hi BogdanB!

 

Sorry, my earlier response was not correct. I'll have another look over it and get back to you!


Regards,

Peter D

Message 3 of 7

Hi BogdanB

 

I have had a look over your code and have not been able to replicate your error; it seems to run correctly.

 

I have added error handling to one of the analog tasks, so any error will now be reported with an error message.


Regards,

Peter D

Message 4 of 7
Solution
Accepted by BogdanB

Hi BogdanB,

 

I don't think you're getting any errors, but there are some red flags in how the code is implemented.  Without going into details, I would instead do it this way:

 

[Attached image: AI_Modulation.png]

 

 

The AI Task is now software-timed and will simply return the most recent point whenever the read is called. This way there is no possibility of reading entries that were buffered earlier. The loop rate is regulated by the analog output task, and the frequency is now adjustable on the fly.
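For reference, here is a minimal sketch of the same pattern using NI's nidaqmx Python package instead of LabVIEW. The device name ("Dev1"), channels, rates, and scaling are illustrative assumptions, not values from the original VI:

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

F_OUT = 20.0                  # sine frequency in Hz (one 50 ms period)
AO_RATE = 10000.0             # AO sample clock rate in S/s
N = int(AO_RATE / F_OUT)      # samples in one output period
period = np.sin(2 * np.pi * np.arange(N) / N)

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    # Software-timed AI: no sample clock, so read() returns the value "now"
    # rather than the oldest entry in an acquisition buffer.
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")

    # Hardware-timed continuous AO; regeneration is disallowed so every
    # period must be written explicitly and write() paces the loop.
    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(AO_RATE,
                                  sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=2 * N)   # buffer = 2 periods
    ao.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    # Pre-fill the buffer before starting, then stream period by period.
    ao.write(np.tile(ai.read() * period, 2))
    ao.start()
    while True:
        amplitude = ai.read()          # most recent input sample
        ao.write(amplitude * period)   # blocks until buffer space frees up
```

The blocking write is what regulates the loop here, just as the AO task does in the diagram above.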

 


Best Regards,

John Passiak
Message 5 of 7

Thank you, that explains it.

It might have been good if you had explained in more detail, since I'm not sure why changing the details of the signal generation increases or decreases the delay. But your variant does work as needed.

Message 6 of 7

Hi BogdanB,

 

In the original code, the sampling info given to the generated waveform was based on the wrong sample clock. The Analog Output was actually updating at the specified Sample Clock Rate, but the waveform was defined based on the frequency of the Analog Input. This would explain the change in delay over time.
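In other words, each iteration queued a block whose playback time at the real AO clock did not match the time the iteration itself took, so the backlog, and with it the delay, grew by the difference on every pass. A back-of-the-envelope illustration, with made-up rates since the thread doesn't give the actual ones:

```python
# Hypothetical numbers purely for illustration.
loop_period = 0.05000      # s, how often the loop iterates (paced by the AI side)
block_duration = 0.05005   # s, playback time of one written block at the AO clock
growth = block_duration - loop_period         # 0.05 ms of extra backlog per pass
per_minute = growth * (60.0 / loop_period)    # ~60 ms of added delay per minute
print(f"delay grows by {growth * 1e3:.2f} ms per pass, "
      f"{per_minute * 1e3:.0f} ms per minute")
```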

 

The initial delay is also non-deterministic in the original code. The Analog Input is buffered and begins before the Analog Output. An indeterminate amount of time elapses between the start of the Analog Input and the start of the Analog Output, so several samples could already be queued up at the start.
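As an aside, even a buffered AI task can be told to skip the queued history: DAQmx lets you move the read position relative to the most recent sample. A sketch with the nidaqmx Python bindings (device name and rate are assumptions):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, ReadRelativeTo

ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # "Dev1" is a placeholder
ai.timing.cfg_samp_clk_timing(10000.0, sample_mode=AcquisitionType.CONTINUOUS)

# Read relative to the newest sample instead of the oldest unread one,
# so stale, previously buffered entries are never returned.
ai.in_stream.relative_to = ReadRelativeTo.MOST_RECENT_SAMPLE
ai.in_stream.offset = -1        # step back one sample from "most recent"

ai.start()
latest = ai.read()              # newest value only, ignoring the backlog
ai.close()
```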

 

 

The new code drops the AI timing altogether and just reads the most current value whenever you are ready to generate new data to be put into the AO buffer. The delay is a fixed amount of time that depends on the AO buffer size. I think this is a more desirable way to implement the code (and it also allows you to change frequency on the fly). Just make sure to configure a large enough buffer to prevent underflows.
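To put a rough number on that fixed delay: it is about the amount of data kept queued in the AO buffer divided by the AO sample rate. Illustrative figures, not values from this thread:

```python
ao_rate = 10000.0            # AO sample clock in S/s (assumed)
queued = 2 * 500             # samples held in the buffer, e.g. two 50 ms periods
latency = queued / ao_rate   # 0.1 s of fixed input-to-output delay
# A larger buffer gives more margin against underflow, but proportionally more delay.
```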

 

 

Best Regards,

John Passiak
Message 7 of 7