12-03-2010 08:17 AM
hello,
I know this is not a LabVIEW question, but I think many of you have to work with this reference.
The trouble I have is finding up-to-date documentation on Pt100 calibration.
I have the formulae; now I want to check that I haven't made any mistakes while manipulating all the coefficients.
I'm looking for a DIN 60751/A2 table with 0.1 °C resolution covering at least -60 °C to +60 °C,
and the same for an ITS-90 calibration with 3 reference points, i.e. water, gallium ....
Thank you
Best regards
Tinnitus
12-03-2010 09:54 AM
hello,
I found a table for the 3-reference-point calibration; I'm just missing the DIN 60751/A2 table.
regards
Tinnitus
12-06-2010 03:29 AM
Hello Tinnitus,
After a bit of research, I came across 2 documents, one of which looks especially relevant to your calibration issue. It seems a Swiss team achieved an accuracy as good as 0.02 K with DIN 60751:
I'm pretty sure you're French, so I guess the PDF's language won't be a problem for you.
Regards~
Eric M. - Senior Software Engineer
Certified LabVIEW Architect - Certified LabVIEW Embedded Systems Developer - Certified LabWindows™/CVI Developer
Neosoft Technologies inc.
12-06-2010 04:29 AM
Hello,
Thanks for your reply, but I already have that table. The accuracy of the method is better than 0.02 K, but the table is only in 1 °C steps.
I see differences of about 0.1 °C for some values: for example, with 157.33 ohm my code returns 150.01305206 °C
while the table indicates 150 °C.
So I would like to confirm against an external document.
Regards
Tinnitus
12-06-2010 07:25 AM - edited 12-06-2010 07:33 AM
for example, with 157.33 ohm my code returns 150.01305206 °C
while the table indicates 150 °C.
Well, take 150 °C and you get 157.32513... ohm, which rounds up to 157.33 😉
The standards (as far as I remember) only define the T->R function, and that is how you get whatever resolution you need.
Do you have a reference standard better than 20 mK? Then you will get R_0, A, B (,C) for that standard, and the only truth you have is the T->R function to check against.
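Just to make the arithmetic visible, here is a minimal sketch of that T->R check (in Python rather than LabVIEW, purely for illustration). The R_0, A, B, C values below are the generic IEC/DIN 60751 ones for a Pt100; treat them as an assumption and take the authoritative numbers from the standard or your calibration certificate:

# Callendar-Van Dusen / IEC 60751 forward function T -> R (sketch).
# Generic alpha = 0.00385 coefficient set; verify against the standard before use.
R0 = 100.0        # ohm at 0 degC (Pt100)
A = 3.9083e-3     # 1/degC
B = -5.775e-7     # 1/degC^2
C = -4.183e-12    # 1/degC^4, only used below 0 degC

def r_of_t(t_degc, r0=R0):
    """Resistance in ohm at temperature t_degc (degC)."""
    if t_degc >= 0.0:
        return r0 * (1.0 + A * t_degc + B * t_degc**2)
    return r0 * (1.0 + A * t_degc + B * t_degc**2
                 + C * (t_degc - 100.0) * t_degc**3)

print(r_of_t(150.0))   # about 157.3251 ohm, i.e. 157.33 in a table rounded to 0.01 ohm

So a table entry of 157.33 ohm at 150 °C and a computed 150.013 °C from exactly 157.33 ohm are the same thing, just limited by the table's 0.01 ohm rounding.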
I remember programming a Newton zero-finding iteration to 0.1 mK resolution for the R->T conversion back in LV 3.1 (and Excel) for my final thesis in a calibration lab.....
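And a sketch of what such a Newton inversion can look like (the T->R function is repeated so the snippet stands on its own; the 0.1 mK tolerance and the generic coefficients are again just assumptions):

# Newton iteration R -> T for the Callendar-Van Dusen function (sketch).
R0, A, B, C = 100.0, 3.9083e-3, -5.775e-7, -4.183e-12   # generic Pt100 values

def r_of_t(t):
    """T -> R in ohm."""
    if t >= 0.0:
        return R0 * (1.0 + A * t + B * t**2)
    return R0 * (1.0 + A * t + B * t**2 + C * (t - 100.0) * t**3)

def dr_dt(t):
    """Derivative dR/dT in ohm per degC."""
    if t >= 0.0:
        return R0 * (A + 2.0 * B * t)
    return R0 * (A + 2.0 * B * t + C * (4.0 * t**3 - 300.0 * t**2))

def t_of_r(r_meas, t_start=20.0, tol=1e-4):
    """R -> T by Newton's method; tol = 0.1 mK stopping step in degC."""
    t = t_start
    for _ in range(100):
        step = (r_of_t(t) - r_meas) / dr_dt(t)
        t -= step
        if abs(step) < tol:
            break
    return t

print(t_of_r(157.33))   # about 150.01305 degC, matching the value quoted above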
I only had a draft of the 'new' standard, including the brand-new ITS-90, at that time. One of the draft translations had a minor error in the Type K EMF->T formulas, so I got in contact with the PTB ....
(guess where I work now 😉, though not in the field of temperature)
But I would encourage you to contact your NMI (LNE?) and ask them 🙂
12-06-2010 08:02 AM
Thanks for the reply.
The standard used is a 5187SA; its accuracy is ±0.001 °C.
Yes, I saw that; I also use the Newton method to calculate the temperature.
I just want to be sure I'm not introducing errors in the R->T and T->R conversions simply
because of the code. I already have enough sources of error that keep adding up.......
Regards
Tinnitus
12-06-2010 08:58 AM
The standard used is a 5187SA; its accuracy is ±0.001 °C.
1 mK is the increase in your sensor reading from radiation when the sensor is hanging in the corner of a room (20 °C) and you walk in 😉
I remember having problems getting u < 5 mK in a stirred liquid bath.....
What do you use for the readout? A Paar?
12-06-2010 11:10 AM
Not a Paar,
an ASL 17A or an Isotech microK 800 bridge.
Tinnitus
12-09-2010 04:25 AM
hello,
I'm still working on platinum probes.
I can't find information about Pt25 and Pt1000 probes.
Can I use the same coefficients as for the Pt100, or are there distinct ones?
Best regards
Tinnitus
12-09-2010 04:48 AM
If not individually calibrated, use standard A,B and C and change R_o to 25, 1000 or whatever value you find at 0°C (a linear approximation to 0°C while using a water tripplepoint cell (0.01°C) seems to be valid 😉 ) That's the nice thing about the (simplified) Callendar van Dusen Formular.