04-11-2017 07:55 PM
Hello all!
I am posting a question in hopes that someone may be able to guide me or teach me some tricks as to how to address an issue I am having. I have attached some waveforms as well as my VI for your reference.
So here's the deal: I wish to minimize the time delay between my input and output signal as much as possible. Every time I run the VI (continuously), the waveform is in a new position, and I am not really sure why. I've attached oscilloscope images for your reference. One output waveform is just the same input waveform and the other is just the harmonics. I want to know if there are some techniques I can use to minimize the time delay and stabilize the output in one location.
Thanks in Advance!
04-11-2017 08:05 PM
On mobile, so can't look at your VI right now. But if you are using the "run continuous" button, stop. You should be using a loop. Any initializations should be before the loop. Any closing should be after the loop.
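In text form, that init / loop / close pattern looks roughly like this (a sketch using the Python nidaqmx package purely to illustrate the structure; "Dev1", the channel names, and the rates are placeholders, not taken from your VI):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE = 10000          # Hz (example value)
CHUNK = 1000                 # samples read per loop iteration

# --- initialization: configure the task once, before the loop ---
ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                              sample_mode=AcquisitionType.CONTINUOUS,
                              samps_per_chan=CHUNK)
ai.start()

try:
    # --- main loop: only acquisition and processing live in here ---
    while True:
        data = ai.read(number_of_samples_per_channel=CHUNK)
        # ... process `data` and hand it to the output task here ...
except KeyboardInterrupt:
    pass
finally:
    # --- cleanup: stop and close the task once, after the loop ---
    ai.stop()
    ai.close()
```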
If I remember, I will give your VI a look in the morning.
04-12-2017 08:03 AM
I gave your VI a try, and after adding a stop button and also pulling my head out of my - and plugging the AI into the right socket (took me a while to work out why I couldn't reproduce your waveforms - harmonics of noise will do that I suppose), I can see the same results.
Beyond outputting your original waveform in sync with the processed results after you've done the processing (i.e., displaying the delayed data on the oscilloscope rather than the current data), I'm not sure how much faster you'll be able to make this. You can move the buffer setting outside of the loop, and a bunch of the controls probably don't need to be inside either, but I'm guessing that's just for convenience when troubleshooting...
I guess a hardware solution might be what you're looking for if you need to display the incoming waveform as it happens, and then the processed output as close as possible. That would at least cut out communication time to the computer and perhaps speed up your processing (I don't know...)
04-12-2017 12:03 PM - edited 04-12-2017 12:04 PM
There's a fundamental conceptual problem here. You're doing analysis on chunks of AI data. All of it comes from the past. And it needs to be far enough in the past to provide the needed frequency resolution for your spectral decomposition. delta f = 1/T.
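A quick worked example of that relation (the numbers are made up just to show the trade-off):

```python
# Frequency resolution of an analysis chunk: delta_f = 1 / T = fs / N
fs = 10000           # sample rate in Hz (example value)
N = 1000             # samples per analysis chunk
T = N / fs           # chunk duration = 0.1 s
delta_f = 1 / T      # = 10 Hz resolution

# To resolve 1 Hz you would need a full second of past data,
# so the analysis is always at least that far behind the live input.
print(T, delta_f)    # 0.1  10.0
```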
So you collect data from the past, then you do some processing on it which also takes finite time. Next you're going to take the calculated results and push them into a buffer to "get in line" to be generated as AO.
It sounds like you're hoping to have the AO signal overlay cleanly on the AI signal. The only way that will *appear* to happen is if the delay from all those pieces just happens to add up to an integer # of cycles of the incoming waveform. But the harmonics being displayed in one pulse are really describing what was present from some other pulse in the past.
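A toy calculation of how those delays stack up (all numbers are illustrative, not measured from your VI):

```python
# Total latency = AI chunk time + processing time + AO buffering, and the
# output only *appears* to overlay the input when that total is close to
# an integer number of periods of the incoming waveform.
f_in = 50.0            # input frequency in Hz (example)
t_acquire = 0.100      # time to fill one AI chunk (s)
t_process = 0.012      # FFT / harmonic extraction (s)
t_ao_buffer = 0.050    # samples queued ahead of the DAC (s)

total_delay = t_acquire + t_process + t_ao_buffer   # 0.162 s
cycles = total_delay * f_in                          # 8.1 periods

print(f"delay = {total_delay*1e3:.1f} ms = {cycles:.2f} cycles")
# Only when `cycles` happens to land near a whole number does the output
# look like it lines up with the live input on the scope.
```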
There may be ways to get the delay smaller or more consistent, but there is fundamentally *not* a way to drive it to 0.
So what amount of delay would be both acceptable and possible?
-Kevin P
04-12-2017 04:18 PM
Hello Kevin!
Thanks for your input. It is very important that I have the output waveform overlay the original. Do you know of any VIs I could incorporate to delay the output so that it overlays the input? Or is there some kind of trigger I can use to have the program overlay them? I understand a zero time delay is pretty much impossible.
Thanks in advance.
04-12-2017 05:24 PM
Hello cbutcher!
Thank you for your input! I will give your suggestions a try.
04-12-2017 06:06 PM
Nothing personal here, just a little additional prodding b/c past experience has shown me that quite often on the forums, the problem that's asked is not quite exactly the one that needs to be solved.
So we've established that 0 delay is impossible. Yet you still want the signals to "overlay." What exactly do you mean by that?
To make them overlay on the GUI graph would be trivial. Just bundle the waveform from AI Read into an array with the 'exported time signal' waveform containing only harmonics. The output is calculated from the input, and they have true sample-to-sample correspondence in time.
To make them overlay out in the real world, you would do the same bundling of raw AI data and harmonics and feed both together to your AO task, each signal on a different AO channel. Both will be delayed by the same amount, thus they will overlay in the real world on your scope.
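A rough text-form sketch of that second idea (Python nidaqmx used only to mirror the DAQmx calls; device/channel names, rates, and the `extract_harmonics` helper are placeholders for whatever your VI actually does, and buffer/regeneration settings are omitted for brevity):

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

FS, CHUNK = 10000, 1000    # example rate and chunk size

def extract_harmonics(x):
    # Stand-in for the VI's processing: drop the largest (fundamental)
    # FFT bin and inverse-transform, leaving only the harmonics.
    X = np.fft.rfft(x)
    X[np.argmax(np.abs(X[1:])) + 1] = 0
    return np.fft.irfft(X, len(x))

ai = nidaqmx.Task()
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
ai.timing.cfg_samp_clk_timing(FS, sample_mode=AcquisitionType.CONTINUOUS,
                              samps_per_chan=CHUNK)

ao = nidaqmx.Task()
# Two AO channels: ao0 carries the delayed copy of the raw input,
# ao1 carries the harmonics computed from that same chunk.
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0:1")

ai.start()
while True:
    raw = np.array(ai.read(number_of_samples_per_channel=CHUNK))
    harmonics = extract_harmonics(raw)
    # Write both signals in the same call so they share the same AO delay
    # and therefore stay overlaid on the scope.
    ao.write(np.vstack([raw, harmonics]).tolist(), auto_start=True)
```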
Further things to look into:
- Kevin P
04-12-2017 07:36 PM - edited 04-12-2017 07:39 PM
@Kevin_Price wrote:
There's a fundamental conceptual problem here. You're doing analysis on chunks of AI data. All of it comes from the past.
...
There may be ways to get the delay smaller or more consistent, but there is fundamentally *not* a way to drive it to 0.
So what amount of delay would be both acceptable and possible?
This analysis is SO last year... Fortunately, NI has recently released the perfect hardware module to allow a happy solution to this problem.
Introducing the NI-1985 cRIO!
Unfortunately, there is no price listed on that page, so you might need to contact NI directly for a quotation...
(please read to the bottom of that post before doing so...)
04-13-2017 01:28 PM
Thanks, Kevin! Really appreciate your input.
04-13-2017 01:29 PM
What I meant by overlay was to have the output waveform be in phase with the input waveform.