Here is the real issue. I have a LabVIEW application that reads 17 RTD temperature points from an Agilent 34970A GPIB instrument once per minute, and I want to both store and plot those 17 points versus time with reasonable time accuracy over a data acquisition that will last days, if not weeks.

Using either "Wait (ms)" or the more often recommended "Wait Until Next ms Multiple" to time the saving and plotting of points, with an initial time and a fixed time delta per point to set up the chart, I have had nothing but trouble: the timing error grows huge after only several hours or a day of running this VI. Something seems to compete with LabVIEW's interval timers, so that by the time the ms counter has counted, say, 60,000 ms (one minute of counter time), perhaps one minute and one second of real time has elapsed. The error always accumulates in the direction where my chart's data clock runs slow compared to the system clock. With simple strip charts, where I capture actual time only once at the beginning of the acquisition, this drift shows up directly on the time axis.

Because of this very appreciable error buildup over time, I switched to an XY Graph that "acts" like a chart, using a "history buffer" approach based on the LabVIEW example VI "XY Chart Buffer.vi". Into this subVI I feed each data point together with the real time (actually measured with the get time function at each one-minute interval). I still let the ms timer count out a minute, but then I measure and store real time, so the time at which I log the data is accurate irrespective of whether the ms counter has truly counted out a minute or something else.
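To make the timing issue concrete, here is a rough Python sketch of the two scheduling styles (LabVIEW is graphical, so the function names here are mine, purely illustrative). The key difference is that scheduling each read against an absolute target time, instead of sleeping a fixed interval each loop, keeps per-iteration timer error from compounding:

```python
import time

def acquire_with_absolute_schedule(read_instrument, interval_s=60.0, n_samples=5):
    """Schedule each read against an absolute wall-clock target so that
    per-iteration timer error does not accumulate (unlike a loop that
    simply waits a fixed interval every pass)."""
    samples = []
    next_t = time.time()                  # absolute target for the first read
    for _ in range(n_samples):
        # Sleep only the remaining time to the target, not a fixed interval,
        # so a late wake-up on one pass is absorbed on the next.
        delay = next_t - time.time()
        if delay > 0:
            time.sleep(delay)
        t = time.time()                   # timestamp each sample from the system clock
        samples.append((t, read_instrument()))
        next_t += interval_s              # advance the target; errors don't compound
    return samples
```

Timestamping each sample from the system clock (the last step above) is essentially what my XY-buffer approach does; the absolute-target scheduling is the part a fixed "wait one interval" loop is missing.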
Bottom-line question: how do I ensure an accurate time axis over days, if not weeks, without going out and measuring the time every time I read my GPIB instrument? Measuring time only once at the beginning of the process is not working. Are there other ways someone might recommend? Can I adjust the X-Scale Multiplier periodically to account for errors in the ms counters? Thoughts?
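In case it helps clarify the X-Scale Multiplier idea, here is a small sketch of the correction I have in mind (again in Python for illustration; none of these names are actual LabVIEW properties). The notion is to compare real elapsed time against the counter's nominal elapsed time and rescale the X axis by that ratio:

```python
def corrected_x_multiplier(nominal_interval_s, t_start, t_now, n_points):
    """Estimate the actual average sample spacing from real elapsed time.
    The result could periodically be written to the chart's X-scale
    multiplier to compensate for a slow ms counter."""
    actual_elapsed = t_now - t_start
    nominal_elapsed = nominal_interval_s * n_points
    # Ratio of real time to counter time; > 1 means the counter ran slow
    drift_ratio = actual_elapsed / nominal_elapsed
    return nominal_interval_s * drift_ratio
```

For example, if 100 nominal one-minute intervals actually took 6100 s of real time, the corrected spacing would be 61 s per point rather than 60. This only fixes the average spacing, though, not the jitter of individual points, which is why per-sample timestamps still seem more robust to me.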