04-27-2009 06:41 PM
hi
I am using the USB-9221 DAQ and I made a simple datalogger which stores the signal into a .lvm file. The signal is a linear slope with a negative gradient.
I have also included the Elapsed Time Express VI: when the voltage goes below 9 V the time is marked, and when the voltage goes below 1 V it is marked again. This gives me two points in time so I can calculate the "delta" time (t).
But when the sampling rate is increased the timing is often incorrect. Can anyone help?
Also, when the signal is displayed on a chart it shows the signal delayed by up to several seconds. I know this because when I know the signal is zero, the chart is still showing a voltage above zero.
thanks
04-27-2009 06:50 PM - edited 04-27-2009 06:50 PM
more info
the signal starts at 10 V and moves linearly to 0 V
04-27-2009 07:54 PM
Labviewuser-
No, you can't calculate dT. You are dependent on the system msec timer (the 9221 is software timed) and it is not traceable to the unit "second"!
You can approximate dT, but you have no measurement of time, just a guess that means nothing. Use a counter!
04-27-2009 08:07 PM
So I can't even use the Elapsed Time Express VI?
04-27-2009 08:17 PM
Not if you want a measurement traceable to the unit "second".
The system clock is a really bad "watch" and keeps "civil time" or "time of day" - no direct relationship to the second. (Anyone got a degree I can borrow? Fahrenheit, Celsius, arc, college - it makes no difference.)
04-27-2009 09:28 PM
Sorry, but I'm a bit confused.
Isn't the Express VI displaying the elapsed time, and therefore the time it happened? Or is the while loop holding it back, giving it a delay?
04-28-2009 11:08 AM - last edited on 04-23-2025 01:46 PM by Content Cleaner
Hi Labviewuser3,
The 9221 is actually hardware timed, offering a sample rate of up to 800 kS/s (multiplexed). Thus, if you have configured a sample clock for the 9221, the amount of time between samples will be a constant value determined by the sample clock (e.g. a 1 kHz rate would result in a 1 ms dt between samples).
Thus, you can determine the exact time elapsed (based on the sample clock, not the OS clock) by counting the samples between your two thresholds. The elapsed time is equal to 1/Fs * N, where Fs is your sample rate and N is the number of samples that occur between the thresholds.
EDIT: The timestamp is determined in software using t0 from the OS clock, and dt based on the sample clock rate. Since your application is measuring the difference between two points, the t0 value should not be important.
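To make the arithmetic concrete, here is a minimal sketch of the dT = 1/Fs * N calculation. It's written in Python rather than LabVIEW (a block diagram can't be shown inline), and the sample rate and count are made-up illustration values, not from the thread:

```python
# Elapsed time between two samples acquired with a hardware sample clock.
# Only the sample rate (Fs) and the number of samples between the two
# threshold crossings (N) matter; the OS-clock t0 cancels in the difference.

def elapsed_time(fs_hz, n_samples_between):
    """dT = N / Fs (equivalently 1/Fs * N)."""
    return n_samples_between / fs_hz

# Example: 1 kHz sample clock, 800 samples between the 9 V and the 1 V
# crossing -> 0.8 s elapsed.
print(elapsed_time(1000.0, 800))  # 0.8
```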
-John
04-28-2009 11:24 AM
04-28-2009 06:34 PM
thanks guys
What would be the best way to measure the delta time?
Currently I have logic which logs both the start and finish times from the Elapsed Time Express VI. Is this sufficient to give the correct time?
Alternatively, how can I develop logic to count the number of cycles from the 9221 as you have explained?
Can you also explain the difference between the acquisition mode settings:
1 sample (on demand)
1 sample (HW Timed)
n samples
Continuous
thanking you
04-29-2009 11:18 AM
Hi Labviewuser3,
I would look at the number of samples that occur between the two voltage levels. Using continuous sampling, all you really need to do is count the samples that occur between your thresholds. If you are logging the data to a file, this would be the number of rows between the data points. In LabVIEW, you could use the Threshold 1D Array function to determine the index where your threshold is crossed if the data is in an array. Just take the difference between the indices to determine how many samples have elapsed.
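As a rough textual stand-in for that LabVIEW approach (Python here, since the diagram can't be shown; the ramp data and 100 S/s rate are invented for illustration), the idea is: find the first sample index below each threshold on the falling signal, then convert the index difference to seconds:

```python
# Count samples between the 9 V and 1 V crossings of a falling signal,
# then convert the sample count to time using the sample rate.

def first_index_below(samples, threshold):
    """Index of the first sample below `threshold`, or None if never crossed."""
    for i, v in enumerate(samples):
        if v < threshold:
            return i
    return None

def delta_t(samples, fs_hz, upper=9.0, lower=1.0):
    """Elapsed time between crossing `upper` and `lower` on a falling signal."""
    i_hi = first_index_below(samples, upper)
    i_lo = first_index_below(samples, lower)
    if i_hi is None or i_lo is None:
        return None  # one of the thresholds was never crossed
    return (i_lo - i_hi) / fs_hz

# Example: a 10 V -> 0 V linear ramp sampled at 100 S/s over 1 s.
ramp = [10.0 - 10.0 * i / 100 for i in range(101)]
print(delta_t(ramp, 100.0))  # 0.8
```

One caveat if you use Threshold 1D Array directly: it expects the input array to be in ascending order, so for a falling ramp like this you may need to negate the data and thresholds (or search from the end) rather than feed it the raw signal.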
The difference between the timing types is as follows: