10-13-2022 03:01 PM
I'm trying to sample data from an analytical balance that is being read with the NI-VISA functions. I want to write this data to a TDMS file at exactly a 10 second interval. I have tried doing this with the Elapsed Time function, measuring the delta between the computer time stamps, and also the High Resolution Relative Seconds function. Each time, the measured interval varies inconsistently between 10.02 and 10.07 seconds. Does anyone have thoughts on a better way to do this?
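Since LabVIEW block diagrams can't be pasted into a text post, here is a minimal Python sketch of the usual fix for this kind of drift: compute each deadline from the original start time (start + n * interval) rather than from "last wake-up + interval", so a late iteration never pushes the later deadlines back. The `read_fn`/`log_fn` callbacks are hypothetical placeholders, not the actual VISA read or TDMS write.

```python
import time

def log_at_interval(interval_s, n_samples, read_fn, log_fn):
    """Log one reading per interval without compounding drift.

    Deadlines are absolute (start + n * interval), so each iteration's
    lateness does not accumulate into the next one.
    """
    start = time.monotonic()
    for n in range(1, n_samples + 1):
        deadline = start + n * interval_s
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)   # wake near the absolute deadline
        log_fn(read_fn())           # placeholder for VISA read + TDMS write
```

Individual wake-ups still jitter by a few tens of milliseconds under Windows, but the error no longer accumulates over days.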
Thanks!
10-13-2022 07:45 PM
So you have a timing variance that is significantly less than 1%, involving processes where you have no control over timing, such as the communication with the instrument. How quickly is the signal really changing and how much error is in the measurement?
I also assume that you are not running this on an RT system, but under Windows. Do you think you really have a significant error, or do you just want cleaner-looking timestamps?
10-14-2022 06:54 AM
@iv1622 wrote:
I'm trying to sample data from an analytical balance that is being read with the NI VISA function.
Tell us more about this interaction. Does the balance constantly send out data or do you have to request it?
10-14-2022 07:04 AM
The way it is set up, the balance is constantly sending data that is read by the computer; there is no communication from the computer back to the balance. The loop then writes this to a TDMS file every 10 seconds (or another interval).
10-14-2022 07:22 AM
This is being run on a Windows machine. The problem is that over the time I am collecting data (up to 4 days), this variance compounds.
I have a pump that cycles every 5 minutes and pumps water onto the balance. I am trying to verify that the pump is accurate over a 10 minute window, and if that window drifts from capturing 2 cycles to 2.1 cycles, it can make a significant difference.
10-14-2022 08:21 AM
@iv1622 wrote:
The way it is set up, the balance is constantly sending data and being read and has no communication from the computer back to the balance. The loop is then writing this to a TDMS file after 10 seconds or another interval.
Does the balance send the data at a constant rate (i.e., a message every 100 ms)? If so, you can just use that for your log timing. Using my example of a message every 100 ms, you would read 100 samples and log the last one. No Windows timing required, other than keeping up with the messaging.
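If the balance really does stream at a fixed rate, the suggestion above amounts to counting messages instead of watching the clock. A hedged Python sketch of the idea (the 100 ms rate is just the example figure from the post above):

```python
def keep_every_nth(samples, n):
    """Keep every n-th sample from a fixed-rate stream.

    With one reading every 100 ms, n = 100 yields one logged value per
    10 s, timed by the instrument's own send rate rather than by Windows.
    """
    return [s for i, s in enumerate(samples, start=1) if i % n == 0]

keep_every_nth(range(1, 11), 5)   # → [5, 10]
```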
10-14-2022 10:42 AM
@iv1622 wrote:
and if this window goes from measuring 2 cycles vs 2.1 cycles, it can make a significant difference.
Significance depends on the accuracy of all parts. What is the precision of the balance? What is the precision of the pump? What difference does it make if anything is off by a few percent?
From 10.00 to 10.03 (or 10.07) seconds is a difference of 0.3% (0.7%). From 2 to 2.1 cycles is 5%, roughly ten times larger, so the timing jitter alone cannot explain it.
You have the time between measurements (and I really don't believe that the weight at 10s differs significantly from the weight at 10.07s) and you also have the total elapsed time. You can use both to guide your code logic.
Are you saying that the clock is incorrect? If you want interpolated data at exact times, just measure frequently and calculate the value by interpolation from the two flanking measurements.
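The interpolation suggested above is a one-liner. A small Python sketch, assuming you have timestamped samples straddling the target time:

```python
def value_at(t, t0, v0, t1, v1):
    """Linear interpolation between the two samples flanking time t.

    (t0, v0) is the sample just before t and (t1, v1) the sample just
    after; the result estimates the reading at exactly time t.
    """
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

value_at(10.0, 9.0, 1.0, 11.0, 3.0)   # → 2.0
```

Sampling the balance at its native rate and interpolating to exact 10 s marks removes the scheduling jitter from the logged timestamps entirely.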
10-14-2022 04:07 PM
@iv1622 wrote:
I'm trying to sample data from an analytical balance that is being read with the NI VISA function. I am trying to write this data to a TDMS file at exactly a 10 second interval. I have tried doing this with the time elapsed function, measuring the delta between the computer time stamps, and also the high resolution relative seconds function. Each time, it gives me a measured interval between 10.02 and 10.07 seconds that are not consistent. Anyone have any thoughts on a better way to do this?
Thanks!
Expecting jitter free timing from Windows or any other non-realtime OS is an exercise in frustration.
10-24-2022 09:23 AM
The precision of the balance is 0.01 mg. The problem was not so much the inaccuracy as the fact that the delta was always positive; it compounded at each sample and added up to minutes of error over a few days. I was able to solve the problem by logging the data in a loop separate from the one reading it.
This lets the logging run consistently at +/- 1 ms without compounding.
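Separating the reading from the logging is the classic producer/consumer pattern (in LabVIEW, two loops linked by a queue). A minimal Python sketch of the structure, with hypothetical `read_fn`/`log_fn` placeholders standing in for the VISA read and TDMS write:

```python
import queue
import threading

def run_producer_consumer(read_fn, log_fn, stop):
    """Decouple instrument reads from file logging with a queue.

    The producer reads as fast as the balance sends data; the consumer
    logs on its own schedule, so slow file I/O never delays a read.
    Returns the two started threads; set `stop` to shut both down.
    """
    q = queue.Queue()

    def producer():
        while not stop.is_set():
            q.put(read_fn())                  # placeholder for the VISA read

    def consumer():
        # Keep draining until stopped AND the queue is empty.
        while not stop.is_set() or not q.empty():
            try:
                log_fn(q.get(timeout=0.05))   # placeholder for the TDMS write
            except queue.Empty:
                pass

    threads = [threading.Thread(target=producer),
               threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    return threads
```

The queue preserves ordering, so the logged samples come out in exactly the sequence they were read.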