LabVIEW


time delay feedback

I am using LabVIEW software with a DAQ card. Is it possible to send a waveform from the DAQ card, read in a sinusoidal waveform from a piece of equipment, and then output that same waveform shifted in time by an amount delta t (which can be controlled), all while still generating the original waveform?

 

Cheers

 

Steve

Message 1 of 24

By the way - the card is a PCI-6259 and I am using LabVIEW 8.

 

Cheers

 

Steve

Message 2 of 24

bobdone wrote:

I am using LabVIEW software with a DAQ card. Is it possible to send a waveform from the DAQ card, read in a sinusoidal waveform from a piece of equipment, and then output that same waveform shifted in time by an amount delta t (which can be controlled), all while still generating the original waveform?

 

Cheers

 

Steve


Yes, but the extent of that capability is limited by the OS you run under as well as the CPU,  channel count, and sample rate.

 

In Windows you can set up a continuous double-buffered acquisition from the AI and just write that buffer to the output. In this scenario you are going to get your data in big piles, so the size of the pile will limit how quickly you can apply a change to the phase delay.
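
As a rough text-only sketch of that read-a-pile-then-write-it loop, here is what it could look like with the NI-DAQmx Python API (nidaqmx) rather than the DAQmx VIs. The device name Dev1, the channels, the rate, and the delay value are all assumptions, and the second AO channel that keeps generating the original stimulus waveform is left out to keep it short:

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

RATE = 100_000        # sample clock rate in S/s (assumed value)
CHUNK = 1_000         # samples read and written per loop iteration
DELAY_SAMPLES = 50    # delta t in samples: delta t = 50 / RATE = 0.5 ms here

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=10 * CHUNK)

    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    ao.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=10 * CHUNK)
    # Stream fresh data every iteration instead of regenerating the buffer.
    ao.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    # Prime the output buffer so there is headroom against Windows jitter.
    ao.write(np.zeros(2 * CHUNK), auto_start=False)
    ao.start()
    ai.start()

    history = np.zeros(DELAY_SAMPLES)   # tail of the previous pile of data
    while True:
        chunk = np.asarray(ai.read(number_of_samples_per_channel=CHUNK))
        # Shift the output by DELAY_SAMPLES without dropping samples at
        # the boundary between piles.
        delayed = np.concatenate((history, chunk))[:CHUNK]
        history = chunk[-DELAY_SAMPLES:]
        ao.write(delayed)               # blocks until buffer space is free
```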

 

If you are running in RT, your update rate is less than 30 kHz, you have a single channel, and you are using a top-of-the-line CPU, then the time delay in applying a requested phase change is limited by the comm method you use to submit the change. If you use another AI channel to control the phase, then the change can be realized within one update.

 

If you switch to an FPGA solution, you could apply the same idea and run as fast as 40 MHz.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 3 of 24

OK, I don't quite follow. I have 2 available AOs to generate the waveforms and 8 AIs, and the DAQ card gives a maximum sampling rate of 1,250,000 samples/second. My computer runs Windows with a CPU of ~2 GHz. Also, I need to be able to generate and read in waveforms up to ~100 kHz. Can I set up the code to run in real time, or would I have to use buffers to read in and write the data?

 

Steve

Message 4 of 24

I haven't really started coding yet either; is there any example code that I could look at to get an idea?

 

Steve

Message 5 of 24

Hi Steve,


Good afternoon, and I hope you're well today.

 

Thanks for the post - NI UK Support have notified me of your problem and I'd like to help you out.

 

Here is some example code which I've written - it's a start towards what I think you're aiming for - saved for LV8.

 

The code features a DAQmx AI task, an AO task, and a queue. The design pattern is producer/consumer, which allows the Analog Input (AI) task to acquire data at a constant rate and allows the code to analyse the data at a later time, via the queue. In this code, updating the Test Phase control simulates the analysis code.
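
If it helps to see the pattern outside of a block diagram, here is a minimal Python sketch of the same producer/consumer idea, with queue.Queue standing in for the LabVIEW queue and simple stand-in functions (acquire_chunk and analyse, both invented here) replacing the DAQmx read and the analysis:

```python
import queue
import random
import threading
import time

data_q = queue.Queue()

def acquire_chunk(n=1000):
    # Stand-in for a DAQmx read of n samples at an assumed 100 kS/s.
    time.sleep(n / 100_000)
    return [random.random() for _ in range(n)]

def analyse(chunk):
    # Stand-in for the analysis that eventually updates the Test Phase value.
    return sum(chunk) / len(chunk)

def producer():
    """AI loop: acquire at a constant rate and enqueue every chunk."""
    while True:
        data_q.put(acquire_chunk())

def consumer():
    """Analysis loop: dequeue and process chunks whenever CPU time allows."""
    while True:
        analyse(data_q.get())

threading.Thread(target=producer, daemon=True).start()
threading.Thread(target=consumer, daemon=True).start()
time.sleep(1)   # let both loops run briefly for this sketch
```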

 

The AI code is a standard continuous waveform acquisition.

 

Using synchronization methods such as a common start trigger, merging of error clusters, and a shared sample clock, these tasks will stay in sync.
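
For reference, the same start-trigger plus shared-sample-clock arrangement looks roughly like this in the nidaqmx Python API (Dev1, the rates, and the sine frequencies are assumed values, not settings taken from the attached VI):

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 100_000   # shared sample clock rate (assumed)
N = 1_000        # samples per channel in the AO buffer

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=10 * N)

    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0:1")
    # Share the AI sample clock and start on the AI start trigger, so both
    # tasks run from one timebase and begin on the same edge.
    ao.timing.cfg_samp_clk_timing(RATE, source="/Dev1/ai/SampleClock",
                                  sample_mode=AcquisitionType.CONTINUOUS,
                                  samps_per_chan=N)
    ao.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/ai/StartTrigger")

    t = np.arange(N) / RATE
    ao.write(np.vstack([np.sin(2 * np.pi * 1_000 * t),          # fixed waveform
                        np.sin(2 * np.pi * 1_000 * t + 0.5)]))  # shifted copy

    ao.start()   # AO arms and waits for the AI start trigger
    ai.start()   # starting AI fires the trigger; both tasks begin together
    ai.read(number_of_samples_per_channel=N)
```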

 

The AO (Analog Output) task outputs two waveforms: the first one you mentioned, which never changes, and a second waveform (iteration 2 of the for loop) which is updated - in this case by the case structure in the middle loop, which alters the phase value fed to the Generate Waveform VI. This loop is an example of how to update the user buffer in Regeneration Mode: it simply updates the whole buffer with different data.
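
As a rough sketch of what that buffer rebuild amounts to, using NumPy in place of the Generate Waveform VI (the rate, buffer size, and frequency below are assumed values):

```python
import numpy as np

RATE = 100_000   # AO sample clock rate (assumed)
N = 1_000        # samples in the user buffer (1 kS, as in the example VI)
FREQ = 1_000     # sine frequency in Hz (assumed)

def build_buffer(test_phase_rad):
    """Rebuild the whole two-channel user buffer: channel 0 never changes,
    channel 1 is regenerated with the new phase value."""
    t = np.arange(N) / RATE
    fixed = np.sin(2 * np.pi * FREQ * t)
    shifted = np.sin(2 * np.pi * FREQ * t + test_phase_rad)
    return np.vstack([fixed, shifted])

# Whenever the Test Phase control changes, the whole buffer is rewritten,
# e.g. ao.write(build_buffer(new_phase)) on the running AO task.
print(build_buffer(np.pi / 4).shape)   # (2, 1000)
```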

 

As Ben mentioned, the lag between changing the Test Phase control and the output updating is set by the number of samples in the waveform (1 kS) and the rate.
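
For example, with the 1 kS buffer in this VI and an assumed rate of 100 kS/s, the worst case is one full buffer period:

```python
buffer_samples = 1_000    # 1 kS user buffer, as in the example VI
rate_hz = 100_000         # AO sample clock rate (assumed figure)
print(f"worst-case lag = {buffer_samples / rate_hz * 1e3:.1f} ms")   # 10.0 ms
```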

 

As for examples, LabVIEW ships with loads of great examples in the NI Example Finder. You can find this by going to Help >> Find Examples..., then in your case Hardware Input and Output >> DAQmx.

 

There are also great examples and information on DAQ programming on ni.com. Here is one for starters,

 

Getting Started with NI-DAQmx: Main Page

http://zone.ni.com/devzone/cda/tut/p/id/5434

Contains an FAQ, basic to advanced programming techniques, and documentation.

 

Please let me know how you get on, and if you have any questions about the code example or the links/resources.

Hope this helps! 

 

 

 

Kind Regards
James Hillman
Applications Engineer 2008 to 2009 National Instruments UK & Ireland
Loughborough University UK - 2006 to 2011
Remember Kudos those who help! 😉
Message 6 of 24

Sorry to hijack this thread, but I can't help myself.

 

Thank you very much for stepping in with that wonderful reply James. Please tell your boss that "Ben was very happy with one of my posts!".

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 7 of 24

Hi Ben,

 

Thanks for the reply - I'm just doing my job... but I have got a bit of a reputation at NI UK (and the odd USA team)  for trying to support the forums during my work day. Thanks for the kind words 🙂 

 

How did the information find you, Steve?

Kind Regards
James Hillman
Applications Engineer 2008 to 2009 National Instruments UK & Ireland
Loughborough University UK - 2006 to 2011
Remember Kudos those who help! 😉
Message 8 of 24

Hi James,

 

Thanks for the help. I seem to be having a problem getting the "Get Terminal Name with Device Prefix.vi" to work - it doesn't seem to be in the system library. Is this from an older/newer version of LabVIEW than my LV8?

 

Cheers,

 

Steve

Message 9 of 24

Hi Steve,


Thanks for your reply. I wrote the code in LabVIEW 8.6.1 and then saved it for a previous version.

 

The VI you mentioned is in the LabVIEW _Utility library. 

 

It's just generating this I/O string,

 

/Dev1/ai/StartTrigger

 

based on the AI physical channel name,

 

Dev1/ai0

 

So you can just create your own constant/control input for the start digital edge trigger source input.
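
If it helps, what that utility VI does is only simple string handling: take the device part of the physical channel name and prefix it onto the terminal. A rough Python equivalent (the VI itself is graphical, so this is just to show the idea):

```python
def terminal_with_device_prefix(physical_channel, terminal):
    """Build a fully qualified terminal name, e.g.
    ('Dev1/ai0', 'ai/StartTrigger') -> '/Dev1/ai/StartTrigger'."""
    device = physical_channel.split("/")[0]   # 'Dev1'
    return f"/{device}/{terminal}"

print(terminal_with_device_prefix("Dev1/ai0", "ai/StartTrigger"))
# /Dev1/ai/StartTrigger
```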

 

Hope this helps, 

 

 

Kind Regards
James Hillman
Applications Engineer 2008 to 2009 National Instruments UK & Ireland
Loughborough University UK - 2006 to 2011
Remember Kudos those who help! 😉
Message 10 of 24