
What is the accuracy of the analog signal time stamp increment?

Hello,

I've been reading 4 analog channels at a 62.5 kHz rate with an NI 6221 device.

I want to use one of them to calculate the rate of voltage change and to determine the accuracy of that rate.

From the absolute accuracy table and various information on the NI web/forums I have already worked out the voltage accuracy.

 

But I ran into trouble determining the time accuracy, which I need in order to find the accuracy of every single dV/dt.

From the instrument specification I have:

 

Timing accuracy ......................... 50 ppm of sample rate

Timing resolution ....................... 50 ns

 

My application wrote the sampling time increment as wf_increment = 0.000016 s.

 

The easiest way would be to do the calculation:

Timing accuracy × wf_increment = 50 ppm × 0.000016 s = 0.8 ns

But it is hard to believe the accuracy is so fine.

How do I find the standard uncertainty from all these pieces?
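
For what it's worth, once the timing uncertainty is settled I plan to combine the contributions by ordinary propagation of uncertainty, along these lines (a minimal C sketch only; the voltage and uncertainty numbers are made-up placeholders, not my real accuracy figures):

/* Sketch: propagating voltage and timing uncertainties into dV/dt.
   All numeric values below are illustrative placeholders.           */
#include <stdio.h>
#include <math.h>

int main(void)
{
    double dt  = 16e-6;   /* wf_increment: 1 / 62.5 kHz                */
    double dV  = 0.010;   /* example voltage step between two samples  */
    double uV  = 1e-3;    /* per-sample voltage standard uncertainty   */
    double uDt = 0.8e-9;  /* timing standard uncertainty (to be found) */

    double rate = dV / dt;
    /* independent sample errors: u(dV) = sqrt(2) * uV                 */
    double uRate = sqrt(pow(sqrt(2.0) * uV / dt, 2.0)
                      + pow(dV * uDt / (dt * dt), 2.0));

    printf("dV/dt = %g V/s, u(dV/dt) = %g V/s\n", rate, uRate);
    return 0;
}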

 

Regards,

Przemyslaw

Message 1 of 6

Dear Mr. Przemysław,

 

we resolved this topic in a direct discussion, but I am pasting the information here for posterity. To sum up, the document at http://digital.ni.com/public.nsf/allkb/251AF75D1578F957862576A5007810FE should be taken as the model, and the clock inaccuracy in ppm multiplied by the sampling frequency should be taken as the time measurement uncertainty. In engineering terms it works out as follows:

50 ppm of sample rate, taken from the specification, multiplied by 10 kHz:
50 × 10,000 / 1,000,000 = 0.5 Hz
Going further:
1/10,000.5 Hz = 9.99950002499875e-5 s; subtracting 1/10,000 Hz leaves an error of about 0.005 µs.
To sum up, this error is rather negligible compared with the voltage measurement error.

 

Best regards,
Tomasz Kachnic
Certified LabVIEW Architect (CLA)

Message 2 of 6

@n_dakota: This question could be of interest to me and other users: would you mind translating your answer into English?



Message 3 of 6

Dear @RobertoBozzolo,

 

You are right, sorry about that:

Przemysław and I discussed how the time uncertainty of the measurement should be calculated. He got different results, based on different assumptions. The final answer is:

1. As a reference point we should take this document: http://digital.ni.com/public.nsf/allkb/251AF75D1578F957862576A5007810FE

2. We should take the clock uncertainty in ppm and multiply it by the sampling frequency.

3. In numbers it looks like this:

50 ppm of sample rate, as specified for the module, multiplied by 10 kHz:
50 × 10,000 / 1,000,000 = 0.5 Hz
Going further:
1/10,000.5 Hz = 9.99950002499875e-5 s; at the end we deduct 1/10,000 Hz, and the final result is about 0.005 µs.

 

This result is small enough that it can be neglected when compared with the voltage measurement uncertainty.
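
The same arithmetic as a small C check, in case anyone wants to play with the numbers (10 kHz is just the example rate above; at Przemysław's 62.5 kHz the per-sample error comes out to about 0.8 ns, matching his original estimate):

/* Clock-accuracy arithmetic from the steps above (example values). */
#include <stdio.h>

int main(void)
{
    double fs   = 10e3;            /* nominal sample rate [Hz]       */
    double ppm  = 50.0;            /* timing accuracy from the spec  */
    double fErr = fs * ppm / 1e6;  /* 0.5 Hz at 10 kHz               */
    double tErr = 1.0 / fs - 1.0 / (fs + fErr);  /* per-sample error */

    printf("frequency error: %g Hz\n", fErr);  /* 0.5 Hz             */
    printf("period error:    %g s\n",  tErr);  /* ~5e-9 s = 0.005 us */
    return 0;
}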

 

Hope it helps and have a great day!

 

Best regards,
Tomasz Kachnic
Certified LabVIEW Architect (CLA)

Message 4 of 6

OK thanks for that.

It means, if I'm not wrong, that the actual period between samples can be 100 µs ± 0.005 µs, right?

Now I understand how to compute it and how to explain to my customers another aspect of the measuring system's accuracy.

 

Now, this applies to the accuracy of the AI sampling rate, and I have two more questions:

  1. does it apply to AO? That is, is the clock for analog generation the same one used for measurement?
  2. will a similar method apply to the accuracy of the counters in general (i.e. when used to measure frequency, set pulse width, and so on)?


Message 5 of 6

Dear @RobertoBozzolo,

 

Below please find my answers to your questions:

It means, if I'm not wrong, that the actual period between samples can be 100 µs ± 0.005 µs, right?

Yes.

1. does it apply to AO? That is, is the clock for analog generation the same one used for measurement?

AO has a different clock, with a different uncertainty. Looking into it, you can see that we should also be aware of channel latency here: http://digital.ni.com/public.nsf/allkb/FA4C741619B95082862568F10072768E

2. will a similar method apply to the accuracy of the counters in general (i.e. when used to measure frequency, set pulse width, and so on)?

For frequency measurements, you can find a little more about it here: http://www.ni.com/white-paper/3619/en/
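
As a rough illustration only (my assumptions here: the 80 MHz counter timebase of the M-series devices and a simple one-period measurement; the white paper has the full treatment):

/* Quantization error of a one-period frequency measurement.
   Assumes the 80 MHz M-series counter timebase.                 */
#include <stdio.h>

int main(void)
{
    double fTb  = 80e6;  /* counter timebase [Hz] (assumed)       */
    double fSig = 1e3;   /* frequency of the measured signal [Hz] */

    /* one period spans fTb/fSig timebase ticks, quantized to
       +/- 1 tick, so the relative error is roughly fSig/fTb      */
    double relErr = fSig / fTb;

    printf("relative quantization error: %g (%g ppm)\n",
           relErr, relErr * 1e6);
    /* the 50 ppm timebase stability adds on top of this          */
    return 0;
}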

 

Have a great weekend!

Best regards,
Tomasz Kachnic
Certified LabVIEW Architect (CLA)

Message 6 of 6