LabVIEW


Urgent help! How to measure the time difference between the signal input and output?

During data processing, the DAQ does the A/D conversion first, then reads the digital data, processes it, and finally outputs an analog signal. This whole process consumes some time. My question is: how do I measure it?
 
My idea is to compare the original signal and the processed signal (both have the same frequency) on an oscilloscope and then measure the time delay. Would that work?
 
Can anybody help me find another method?
 
Thanks a lot!
Message 1 of 3

Your oscilloscope will probably do the most accurate job. However, you can arrive at a reasonable approximation by running a test. First use the Tick Count (ms) or Get Date/Time In Seconds VI to capture the start time, then run your DAQ loop a known number of times, the larger the better, say 1000. Then use the same VI again to capture the "done" time. Subtract the first time from the second to get the total time required for all the loops, and divide by the number of iterations (1000 here) to get the average time for a single loop.
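For what it's worth, here is a minimal sketch of that timing approach written in Python, since a LabVIEW block diagram can't be posted as plain text. The process_one_sample() function is a hypothetical stand-in for your acquire/process/output step; in LabVIEW the equivalent is wiring Tick Count (ms) before and after the loop.

# Sketch of the "time N iterations and average" approach described above.
# process_one_sample() is a hypothetical placeholder for one
# acquire -> process -> output pass of the DAQ loop.
import time

def process_one_sample():
    # Placeholder for the real read/process/write work.
    pass

N = 1000                      # number of iterations; more iterations give a better average
start = time.perf_counter()   # "start" timestamp (first Tick Count (ms) in LabVIEW)
for _ in range(N):
    process_one_sample()
end = time.perf_counter()     # "done" timestamp (second Tick Count (ms) in LabVIEW)

avg_loop_time = (end - start) / N
print(f"Average time per loop: {avg_loop_time * 1000:.3f} ms")

Note that this gives the average software loop period, which covers everything done inside the loop but not any latency added by hardware buffering outside of it, which is why the oscilloscope comparison is still the more direct measurement.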

Message 2 of 3
Thanks a lot! I will try it.
All the best!
Message 3 of 3