Willard,
You can actually change the digits of precision in LabVIEW to display more than 2 decimal places: right-click on the scan rate control, select 'Format and Precision', and set the digits of precision to 5 (or however many decimal places you need).
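(For what it's worth, the digits-of-precision setting is equivalent to a C-style %.5f format specifier. As a quick illustration of what that formatting does, sketched here in Python with a made-up value:

    value = 72.123456789
    print(f"{value:.5f}")   # displays 72.12346 -- 5 decimal places instead of 2
)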
By setting the buffer size to 1, you allocate memory to hold only 1 scan at a time. By setting the number of scans to read to 1, each call to the AI Read VI inside the while loop reads only 1 scan from the buffer. The hardware is set up to acquire 1 scan every 50-60 seconds, so it places each new scan in the buffer, and each call to AI Read pulls 1 scan out. If AI Read has already consumed the scan in the buffer, it waits until the hardware delivers the next one (50-60 seconds later) and then plots it on the chart. So with the setup you have, you should not see any lag in the temperature data: once the hardware acquires a scan, the software reads it almost immediately.
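Since LabVIEW is graphical I can't paste a block diagram here, but if it helps, here is a rough Python sketch of the producer/consumer behavior described above. Everything in it (the read_temperature stand-in, the 55-second period) is made up for illustration and is not your actual VI:

    import queue
    import threading
    import time

    SCAN_PERIOD_S = 55                    # hardware acquires 1 scan every 50-60 seconds
    scan_buffer = queue.Queue(maxsize=1)  # buffer size = 1: holds only 1 scan at a time

    def read_temperature():
        return 25.0  # placeholder for an actual A/D conversion

    def hardware_scan_loop():
        # Stand-in for the DAQ hardware filling the acquisition buffer.
        while True:
            time.sleep(SCAN_PERIOD_S)
            scan_buffer.put(read_temperature())  # place the new scan in the buffer

    threading.Thread(target=hardware_scan_loop, daemon=True).start()

    while True:
        # Like calling the AI Read VI with number of scans to read = 1: if the
        # buffer is empty, this blocks until the hardware delivers the next scan.
        scan = scan_buffer.get()
        print(f"new temperature: {scan:.5f}")  # plot/display the scan immediately

I hope this helps.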
Regards,
Todd D.
Applications Engineer
National Instruments