Multifunction DAQ


Controlled delay between analog input and output

Hello,

 

I'm using a PCI-6120 and LabVIEW 2009. I'm measuring two analog inputs and using one of them as a scaled analog output. I need a controlled delay between the measured analog input and the output. Currently I'm using a solution very similar to this one: https://decibel.ni.com/content/docs/DOC-6460

 

Related to this topic: http://forums.ni.com/t5/Multifunction-DAQ/analog-input-with-delayed-analog-output/m-p/954534/highlig...

 

The problem is that when I use small delay times, around 40 ms or less, I need to set a high sampling rate and a low number of samples, and I usually get either error 200016 (onboard device memory underflow) or error 200290 (the generation has stopped to prevent the regeneration of old samples). If I set the number of samples to -1, LabVIEW has stopped responding a few times.

 

Is there any other way to generate delays that would allow smaller delay times? I'm aiming for a delay range from 1 ms to 1 s with an accuracy of 1 ms or less. A higher minimum delay could be acceptable, but the smaller the better. It would also be very good if the delay time could be changed during program execution, but that's not absolutely necessary.

 

I'm also open to suggestions for different hardware if some other system could handle this kind of application better. Even a simple (but accurate) analog delay generator might be sufficient, but pretty much all of the millisecond-scale ones I could find were intended for audio applications, and I'm not sure whether those would be suitable for my application (piezo position control).

 

Any help would be greatly appreciated. My VI is attached.

 

- Ville

Message 1 of 3

Hello Ville,

 

One brainstorm, which I believe should be ideal if your DAQ board can do it (I believe it can, but I haven't tested it).

 

If you need to create a delay between AI and AO with fine resolution, you could consider using a counter to generate a delayed pulse train and then use that pulse train as the sample clock for the delayed (AO) signal. You can start the counter output generation on the AI start trigger and specify an initial delay as small as a few ticks of the counter timebase (80 MHz).
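To illustrate the resolution this approach gives, here is a minimal Python sketch (not LabVIEW, just arithmetic) converting a desired AI-to-AO delay into ticks of the counter timebase. The 80 MHz figure is the timebase mentioned above; the actual timebase available on a given board should be checked against its specifications.

```python
# Sketch: express a desired AI->AO delay as counter timebase ticks.
# Assumes an 80 MHz counter timebase, per the suggestion above.

TIMEBASE_HZ = 80_000_000  # counter timebase frequency (assumption)

def delay_to_ticks(delay_s: float) -> int:
    """Number of timebase ticks to program as the counter's initial delay."""
    return round(delay_s * TIMEBASE_HZ)

def tick_resolution_s() -> float:
    """Smallest delay step one timebase tick can resolve."""
    return 1.0 / TIMEBASE_HZ

if __name__ == "__main__":
    for delay in (0.001, 0.040, 1.0):  # 1 ms, 40 ms, 1 s
        print(f"{delay * 1e3:7.1f} ms -> {delay_to_ticks(delay):>9} ticks")
    print(f"resolution: {tick_resolution_s() * 1e9:.1f} ns per tick")
```

At 80 MHz, one tick is 12.5 ns, so even the 1 ms lower end of the requested delay range is tens of thousands of ticks, far finer than the 1 ms accuracy target.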

 

I believe this solution should work best if it is acceptable to set the delay just once.

 

 

With kind regards,

stefo


Message 2 of 3

Thank you for the quick reply,

 

I'm not sure I understood your solution correctly, but if the trigger delay between AI and AO is set very low, wouldn't that cause error 200290 (the generation has stopped to prevent the regeneration of old samples)? Since I'm collecting data from AI and then, after scaling, writing it to AO, a small delay between AI and AO would not give DAQmx Read enough time to collect all the samples. For example, at a sample rate of 5 kHz and 100 samples, it takes 20 ms to gather the data, so using a delay shorter than that would cause an error, or am I missing something? Or are you suggesting that the trigger would fire a certain amount of time after AI has collected all the samples? That could be somewhat helpful too, because right now I have to set the delay greater than (number of samples / sample rate) to avoid errors (by about 10-20 ms).
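The constraint described above can be written out as a small Python sketch (again just arithmetic, not LabVIEW): the read takes samples/rate seconds, so that is a floor on any read-then-write delay. The default margin below stands in for the empirical 10-20 ms overhead mentioned in the post; it is an assumption, not a fixed rule.

```python
# Sketch of the read-then-write timing floor: DAQmx Read needs
# (n_samples / rate) seconds, so the AI->AO delay cannot be shorter.

def acquisition_time_s(n_samples: int, rate_hz: float) -> float:
    """Time for the read to collect n_samples at rate_hz."""
    return n_samples / rate_hz

def min_safe_delay_s(n_samples: int, rate_hz: float,
                     margin_s: float = 0.015) -> float:
    """Smallest delay that avoids errors, with an assumed overhead margin."""
    return acquisition_time_s(n_samples, rate_hz) + margin_s

print(acquisition_time_s(100, 5000))  # prints 0.02 (the 20 ms in the example)
```

This also shows why pushing toward 1 ms delays forces high rates and tiny reads: at 5 kHz, a 1 ms floor would allow only 5 samples per read.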

 

- Ville 

Message 3 of 3