LabVIEW


Time Delay between Input and Output

Hello all!

I am posting a question in hopes that someone can guide me or teach me some tricks to address an issue I am having. I have attached some waveforms as well as my VI for your reference.

 

So here's the deal: I wish to minimize the time delay between my input and output signals as much as possible. Every time I run the VI (continuously), the waveform appears in a new position, and I am not really sure why. I've attached oscilloscope images for your reference. One output waveform is just the input waveform passed through, and the other is just its harmonics. I want to know if there are some techniques I can use to minimize the time delay and stabilize the output in one location.

 

Thanks in Advance!

 

Attachments: 1.png, 2.png, 3.png, 4.png, 5.png

Message 1 of 11

On mobile, so I can't look at your VI right now.  But if you are using the "Run Continuously" button, stop.  You should be using a loop.  Any initialization should be before the loop.  Any cleanup should be after the loop.

 

If I remember, I will give your VI a look in the morning.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 11

I gave your VI a try, and after adding a stop button, pulling my head out of my -, and plugging the AI into the right socket (it took me a while to work out why I couldn't reproduce your waveforms; harmonics of noise will do that, I suppose), I can see the same results.

 

Beyond outputting your original waveform after the processing, in sync with the processed results (so the oscilloscope displays the delayed data rather than the current data), I'm not sure how much faster you'll be able to make this. You can move the buffer setting outside of the loop, and a number of the controls probably don't need to be inside it either, but I'm guessing that's just for convenience while troubleshooting...

 

I guess a hardware solution might be what you're looking for if you need to display the incoming waveform as it happens, with the processed output following as closely as possible. That would at least cut out the communication time to the computer and perhaps speed up your processing (I don't know...)


Message 3 of 11

There's a fundamental conceptual problem here.  You're doing analysis on chunks of AI data.  All of it comes from the past.  And it needs to be far enough in the past to provide the needed frequency resolution for your spectral decomposition.  delta f = 1/T.
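As an illustration of that tradeoff (the sample rate and chunk duration below are made-up numbers, not taken from the posted VI): the FFT bin spacing is 1/T, so the finer the frequency resolution you need, the longer the chunk of past data you have to wait for.

```python
import numpy as np

fs = 1000            # assumed sample rate, Hz
T = 0.5              # assumed analysis-chunk duration, s
n = int(fs * T)      # samples per chunk

# Frequency axis of the spectral decomposition for one chunk.
bins = np.fft.rfftfreq(n, d=1 / fs)

# Bin spacing (the frequency resolution) comes out to 1/T: here 2 Hz.
print(bins[1] - bins[0])   # 2.0
```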

So you collect data from the past, then you do some processing on it, which also takes finite time.  Next you're going to take the calculated results and push them into a buffer to "get in line" to be generated as AO.

 

It sounds like you're hoping to have the AO signal overlay cleanly on the AI signal.  The only way that will *appear* to happen is if the delay from all those pieces just happens to add up to an integer # of cycles of the incoming waveform.  But the harmonics being displayed in one pulse are really describing what was present in some other pulse in the past.

 

There may be ways to get the delay smaller or more consistent, but there is fundamentally *not* a way to drive it to 0.

 

So what amount of delay would be both acceptable and possible?

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 4 of 11

Hello Kevin!

Thanks for your input. It is very important that the output waveform overlays the original. Do you know of any VIs I could incorporate to delay the output so that it overlays the input? Or is there some kind of trigger I can use to have the program overlay them? I understand a zero time delay is pretty much impossible.

 

Thanks in advance.

Message 5 of 11

Hello cbutcher!

Thank you for your input! I will give your suggestions a try.

Message 6 of 11

Nothing personal here, just a little additional prodding b/c past experience has shown me that quite often on the forums, the problem that's asked is not quite exactly the one that needs to be solved.

 

So we've established that 0 delay is impossible.  Yet you still want the signals to "overlay."  What exactly do you mean by that?

 

To make them overlay on the GUI graph would be trivial.  Just bundle the waveform from AI Read into an array with the 'exported time signal' waveform containing only harmonics.  The output is calculated from the input, and they have true sample-to-sample correspondence in time.

 

To make them overlay out in the real world would require you to do the same bundling of raw AI data and harmonics and feeding both together to your AO task, each signal on a different AO channel.  Both will be delayed by the same amount, thus they will overlay in the real world on your scope.
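A NumPy sketch of that idea (the two signals below are stand-ins for the real AI chunk and its computed harmonics): stack them as one two-channel block so both pass through the same output buffer and pick up identical delay.

```python
import numpy as np

fs = 1000
t = np.arange(1000) / fs
raw = np.sin(2 * np.pi * 10 * t)              # stand-in for the raw AI chunk
harmonics = 0.3 * np.sin(2 * np.pi * 30 * t)  # stand-in for the processed result

# One 2-D block, one row per AO channel.  Writing both rows in the same
# call means both channels sit in the same buffer and emerge with the
# same total delay, so they stay overlaid on the scope.
ao_block = np.vstack([raw, harmonics])        # shape (2, 1000)
```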

 

Further things to look into:

  • perhaps you'll help yourself by syncing the AI and AO tasks via shared sample clock.  Possibly a shared trigger, but a sample clock is usually sufficient by itself.
  • it'll help if your # AI samples to read corresponds to an integer # cycles of the fundamental frequency of the input waveform.  Otherwise you get spectral leakage.  If you really can't know the incoming frequency, you can reduce the leakage effect by collecting more full cycles.  I see from the VI Hierarchy window that there's a Hanning window involved in the calculation which should be helping you some too.  But when I look in the analysis code to see where it's used, well close your eyes, bar the doors, and hide the children!  My goodness what a mess!   I will *not* be trying to decipher *that*!
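The integer-cycles point can be demonstrated numerically (the 1 kHz rate and 1 s chunks are made-up values): with an exact number of cycles per chunk, all of a tone's energy lands in one FFT bin; half a cycle off, and it leaks across neighboring bins, which is the effect windowing (e.g. the Hanning) mitigates.

```python
import numpy as np

fs, n = 1000, 1000                 # assumed: 1 kHz rate, 1 s chunks -> 1 Hz bins
t = np.arange(n) / fs

def peak_fraction(f):
    """Fraction of a tone's spectral energy captured by its strongest bin."""
    spec = np.abs(np.fft.rfft(np.sin(2 * np.pi * f * t))) ** 2
    return spec.max() / spec.sum()

print(peak_fraction(10.0))   # integer cycles per chunk: essentially all in one bin
print(peak_fraction(10.5))   # half a cycle off: heavy leakage into neighbors
```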

 

 

- Kevin P

Message 7 of 11

@Kevin_Price wrote:

There's a fundamental conceptual problem here.  You're doing analysis on chunks of AI data.  All of it comes from the past.  

... 

There may be ways to get the delay smaller or more consistent, but there is fundamentally *not* a way to drive it to 0.

 

So what amount of delay would be both acceptable and possible?


This analysis is SO last year... Fortunately, NI has recently released the perfect hardware module to allow a happy solution to this problem.

Introducing the NI-1985 cRIO!

 

Unfortunately, there is no price listed on that page, so you might need to contact NI directly for a quotation...

 

(please read to the bottom of that post before doing so...)


Message 8 of 11

Thanks Kevin! Really appreciate your input

Message 9 of 11

What I meant by overlay was to have the output waveform be in phase with the input waveform.
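If the output is simply the input delayed by a whole number of samples, that lag can be estimated with a circular cross-correlation and compensated by shifting. A minimal NumPy sketch (the two-tone test signal and the 37-sample delay are invented for illustration; compensating in a live system would additionally require the delay to be stable from chunk to chunk):

```python
import numpy as np

fs = 1000
t = np.arange(1000) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 13 * t)  # "input"
y = np.roll(x, 37)                     # "output": the input delayed 37 samples

# Circular cross-correlation via the FFT; the peak index is the lag in samples.
corr = np.fft.irfft(np.fft.rfft(y) * np.conj(np.fft.rfft(x)))
lag = int(np.argmax(corr))             # 37

aligned = np.roll(y, -lag)             # shift the output back into phase
```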

Message 10 of 11