Dynamic Signal Acquisition

How to measure the delay of my soundcard

Hello, I have posted a question similar to this before, but I guess I hadn't figured it out after all.
 
I am trying to measure the latency of my PC soundcard for calibration purposes.
I have connected the input channel to the output channel and am generating a sine waveform on the output channel. I am using the "Gain and Phase" VI from the Sound and Vibration Toolkit 4.0 to measure the phase shift through the soundcard, but it is not working very well. Every time I run my program a different phase is measured. The gain is measured correctly, so the "Gain and Phase" VI works.
I guess I need some kind of synchronization of the two channels to get a correct measurement of the phase.
I'm using LabVIEW 8.0 and my soundcard is a SigmaTel C-Major Audio.
Does anyone have an idea how to do this?
 
 
Thanks
Message 1 of 10

Dear Jacob,

It would be helpful if you could post a clip of the phase-versus-frequency graph (i.e., some data), along with the expected result and the frequency of the sine wave.

Thank you,

K.Narayanan

Message 2 of 10
Yes, you will need to synchronize the output and input channel to measure the delay. Do you care about the total delay through both AO and AI, just the AO, or just the AI? You can modify SVL Measure Propagation Delay (DAQmx).vi from the Sound and Vibration Toolkit to use your soundcard and then measure the total propagation delay through both AI and AO.
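In text form, a rough sketch of that kind of loopback measurement could look like the following Python snippet. This is only an analogue of the idea, not the toolkit VI itself; it assumes the third-party sounddevice package plus NumPy and SciPy, and the pulse shape and durations are illustrative:

```python
# Sketch of a loopback delay measurement: play a known pulse on the output,
# record the input, and locate the pulse in the recording by cross-correlation.
# Assumes the third-party "sounddevice" package; values are illustrative.
import numpy as np
import sounddevice as sd
from scipy.signal import correlate

fs = 44100                       # sample rate in Hz
pulse = np.zeros(fs // 2)        # half a second of silence...
pulse[100:200] = 1.0             # ...with a short rectangular pulse near the start

# Play the stimulus and record the loopback input in one synchronized call.
recording = sd.playrec(pulse, samplerate=fs, channels=1)
sd.wait()                        # block until playback and recording finish
recording = recording.ravel()

# The lag of the cross-correlation peak estimates the total AO + AI delay.
corr = correlate(recording, pulse, mode="full", method="fft")
lag = int(np.argmax(np.abs(corr))) - (len(pulse) - 1)
print(f"Estimated loopback delay: {lag} samples ({1000 * lag / fs:.2f} ms)")
```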
Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 3 of 10

Hello

I would like to measure the delay through both AI and AO, as well as the delay from AO to AI.

I have looked at the Measure Propagation Delay (DAQmx).vi, but I couldn't get it to work properly. How can it be modified to work with my soundcard?

Jacob

Message 4 of 10

Hello NaruF1

I don't have a phase-versus-frequency graph so far. I have just tried to generate a 500 Hz sine on the output channel and then measure the phase shift at the input channel.

I have attached the file I am using to measure the phase. Every time I run the program a different phase is measured.

Regards

Message 5 of 10
Is the sticking point the configuration of the AI and AO, or is it the measurement of the delay between the acquired signal and the reference? If you already have a configuration scheme that you have settled on, post it so we can start from there. If you have already evaluated various techniques for measuring the delay (difference between a trigger point, phase measurements, cross spectrum phase, autocorrelation, etc.), let us know which technique you selected and why, so we can work together better.
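For example, the cross spectrum phase technique boils down to fitting a line to the unwrapped phase of the cross spectrum. A minimal self-contained sketch (assuming NumPy, with a synthetic signal and an illustrative 137-sample delay) might look like:

```python
# Cross-spectrum phase delay estimation: for a pure delay, the phase of the
# cross spectrum conj(X) * Y is -2*pi*f*tau, a straight line in frequency.
# Assumes NumPy; the signals and the delay value are synthetic.
import numpy as np

fs = 44100
delay_samples = 137                        # the "unknown" delay to recover
rng = np.random.default_rng(0)
x = rng.standard_normal(fs)                # broadband reference signal
y = np.roll(x, delay_samples)              # delayed copy (circular, for simplicity)

X, Y = np.fft.rfft(x), np.fft.rfft(y)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
phase = np.unwrap(np.angle(np.conj(X) * Y))

# Fit phase vs. frequency; the slope of the line is -2*pi*tau.
slope = np.polyfit(freqs, phase, 1)[0]
tau = -slope / (2 * np.pi)
print(f"Estimated delay: {tau * fs:.1f} samples ({1000 * tau:.3f} ms)")
```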
Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 6 of 10
Thanks for your VI attached to a previous reply. Sorry I missed it previously. The Gain and Phase VI will not work for an arbitrary delay unless the cumulative delay is less than one period of the test tone. The Gain and Phase VI always returns a wrapped (from -180 to 180 deg) phase lag, so a test tone with a frequency of 500 Hz can only be used to measure a delay of 2 ms or less. Also, I would recommend that you include at least 20 cycles of the test tone for more accurate phase measurements. Test signals that have a clear trigger point, such as a pulse, or test signals that don't repeat, such as MLS, may be better suited to making longer delay measurements; neither of these test signals should be used with the Gain and Phase VI.
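To see the ambiguity numerically (a quick Python sketch; the delay values are illustrative):

```python
# A 500 Hz tone has a 2 ms period, so delays that differ by a multiple of
# 2 ms produce exactly the same wrapped phase reading.
f_tone = 500.0                                       # test-tone frequency in Hz
for delay_ms in (0.5, 2.5, 4.5):                     # each differs by one 2 ms period
    phase_deg = -360.0 * f_tone * delay_ms / 1000.0  # true phase lag in degrees
    wrapped = (phase_deg + 180.0) % 360.0 - 180.0    # wrapped to [-180, 180)
    print(f"delay = {delay_ms} ms -> true {phase_deg:.0f} deg, wrapped {wrapped:.0f} deg")
# All three delays report the same wrapped phase (-90 deg), so a single
# 500 Hz tone cannot tell them apart.
```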
Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 7 of 10
When I run the attached VI to measure the total delay, the results appear to be correct, but they are not consistent from run to run. Is there a better synchronization scheme (currently we just put AO and AI in the same loop, but we don't explicitly pass a trigger from AO to AI or vice versa) that we can use to always start the tasks at the same time? Also, I had to add a significant delay to get decently consistent results. Try out the attached VI and let us know how it works for you.
Doug
Enthusiast for LabVIEW, DAQmx, and Sound and Vibration
Message 8 of 10

We have used the soundcard delay VI that was attached in this thread for our application. We get different (approximately doubled) latency values when the sampling rate is different:

If the Sound Input function's sampling rate is 44100 Hz, the observed latency is in the range 2-4 ms.
If the sampling rate is 22050 Hz, the observed latency is in the range 4-6 ms for the same microphone.

We are playing the pulse at 44100 Hz.

What is the relationship between the sampling rate and the latency?

Since the microphone itself should have a constant latency, can anyone explain why the latency value doubles when the sampling rate is halved?

Message 9 of 10

Since the digital filters have a certain depth (a fixed number of samples), the slower you sample, the longer it takes for the filter to fill up. So decreasing the sample rate for a given digital filter causes more delay, and increasing the sample rate for the same filter decreases the delay.

 

However, changing the sample rate on some devices may put the device's ADCs into a different mode with a different filter, or may change the post-processing filters. That is why I said that as long as the same digital filter is in place, decreasing the sample rate increases the delay through that filter.
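As a back-of-the-envelope check in Python (the 128-sample filter depth here is purely illustrative, not your device's actual filter):

```python
# The delay of a digital filter with a fixed depth in samples is
# depth / sample_rate seconds, so halving the sample rate doubles the delay.
filter_depth_samples = 128                 # illustrative filter depth

for fs in (44100, 22050):
    delay_ms = 1000.0 * filter_depth_samples / fs
    print(f"fs = {fs:>5} Hz -> filter delay = {delay_ms:.2f} ms")
# fs = 44100 Hz -> filter delay = 2.90 ms
# fs = 22050 Hz -> filter delay = 5.80 ms  (doubled, matching your observation)
```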

 

Hope that helps on that question.

 

DanO

Conditioned Measurements Hardware 

Message 10 of 10