06-11-2012 07:22 AM
I had posted this query on another board (LabVIEW) but couldn't get a reply to it.
I'd be grateful if someone could shed some light on this.
My hypothesis is that since I am using at least 2 complete cycles to extract tone (phase) information, the routine is able to reconstruct the wave well and extract real phase values, and hence the fixed inter-channel sampling delay is not observed. Another possibility is that the interval sampling is quasi-simultaneous.
To be honest, it makes me wonder why I didn't get any phase offset due to non-simultaneous sampling!
It would be good to understand this.
Anyone?
PS: Sorry if this repeat post is not appropriate; I thought maybe the DAQ folks could provide some insight.
06-12-2012 07:31 AM
There *are* some things that could throw you off:
1. The external signal may be changing too slowly for the phase difference between input channels to be detected. Is it a truly analog source? Or is it a stair-stepping approximation of a sine wave?
2. I haven't had occasion to use "extract tone" much and generally do acquisition on arrays rather than waveforms. So I haven't really done a lot of poking around at the timing parameters (t0, dt) in waveforms, exactly how they vary under different acquisition conditions, and exactly how signal processing routines use them. I'm not in a place where I can do a quick check now either.
I say all that before wondering whether DAQmx might be out-clevering you. Could it be that somewhere under the hood in the DAQmx driver, t0 is assigned a different value for each channel you acquire to compensate for the known phase delay between channels? And then does the "extract tone" function pay attention to the compensated t0 value?
For example, what if your real phase delay between analog input channels was 20 microseconds. Further suppose that when you read your array of waveforms, DAQmx sets t0 for the 2nd channel to be 20 microsec *earlier* than t0 for the 1st channel. And even further suppose that the "extract tone" routine pays careful attention to the t0 timestamp. If all that is true, then both channels would "look" like they have their zero crossings at the exact same absolute time.
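That scenario can be sketched numerically (in Python rather than LabVIEW, and using made-up numbers: a 1 kHz tone, 50 kS/s per channel, and the hypothetical 20 microsecond mux delay). A tone-phase estimate that honors t0 would cancel the skew; one that assumes both channels start at t0 = 0 would expose it:

```python
import numpy as np

f = 1000.0      # test tone frequency, Hz (hypothetical)
fs = 50000.0    # per-channel sample rate, Hz (hypothetical)
delay = 20e-6   # assumed inter-channel (mux) delay, s
n = 500         # samples per channel = 10 full cycles of the tone

t = np.arange(n) / fs
ch1 = np.sin(2*np.pi*f*t)             # channel 1, sampled at times t
ch2 = np.sin(2*np.pi*f*(t + delay))   # channel 2, actually sampled 20 us later

def tone_phase(x, fs, f, t0=0.0):
    """Phase of a known tone, referencing the samples to absolute time t0."""
    ref = np.exp(-2j*np.pi*f*(t0 + np.arange(len(x))/fs))
    return np.angle(np.sum(x * ref))

# Naive analysis: both channels assumed to share t0 = 0
raw_diff = np.degrees(tone_phase(ch2, fs, f) - tone_phase(ch1, fs, f))

# t0-aware analysis: channel 2's t0 carries the known mux delay
comp_diff = np.degrees(tone_phase(ch2, fs, f, t0=delay) - tone_phase(ch1, fs, f))

print(round(raw_diff, 3))        # ~7.2 deg: the real skew, 360 * f * delay
print(round(abs(comp_diff), 3))  # ~0 deg: skew absorbed into the timestamp
```

If DAQmx really does stamp each channel's waveform with a compensated t0, and "extract tone" honors that timestamp, you'd see the second, near-zero number.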
Keep investigating because there *will* be some phase delay between channels of any multiplexing board. There will turn out to be some reasonable explanation for why you haven't been seeing it thus far. You may need to change the external signal, the acquisition parameters, the post-processing, or maybe all of the above. But the phase delay is there, and should be quantifiable when all the signals and processing are done properly and consistently.
-Kevin P
06-12-2012 05:55 PM
Thanks Kevin for your reply.
Regarding your second point, that is the only explanation I can think of. If the signals are saved to a text file and then loaded back to determine phase differences, a phase difference between channels should definitely show up in post-processing, because the file stores only the raw samples in a consistent manner, without per-channel timing information.
But what happens when signals are analyzed right after the DAQmx acquisition? Maybe you are right that t0 is maintained for each signal, thus maintaining the integrity of the phase delay calculations.
That's the best reason I can think of. I have tested with various resistor networks, direct inputs, and an RC network. But I haven't checked whether the delay appears when the data is analyzed after being saved to a text file (where the waveform timing information is lost).
Maybe someone can provide info on this.