04-09-2013 07:33 AM
Hello,
I'm trying to measure the time between two data points.
When the data acquisition begins, the start time should be saved, and again when the signal reaches 90% of its maximum.
Subtracting those two times gives the elapsed time.
But I'm not quite sure how to do this... I was thinking of using flat sequence structures.
04-09-2013 08:08 AM
Are you constantly capturing a signal? What is your sample rate?
The first thing you need to do is capture the signal. You can then find the maximum value with the Array Max & Min function and calculate your 90% threshold (0.9 * Max). Search the data until you find a value greater than or equal to the 90% mark and get that sample index. Your elapsed time is then the number of samples to the 90% mark divided by the sample rate (in Samples/Second).
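For reference, here is a minimal sketch of that same approach in Python/NumPy (assuming the captured signal is already in an array and the sample rate is known); in LabVIEW this maps to Array Max & Min, a multiply, and a search/comparison over the array:

```python
import numpy as np

def time_to_90_percent(samples, sample_rate_hz):
    """Return the elapsed time (s) from the first sample until the
    signal first reaches 90% of its maximum value."""
    samples = np.asarray(samples, dtype=float)
    threshold = 0.9 * samples.max()        # 90% of the maximum
    # Index of the first sample at or above the threshold
    idx = np.argmax(samples >= threshold)
    # Elapsed time = sample count to threshold / sample rate
    return idx / sample_rate_hz
```

A quick usage example with a made-up rising signal sampled at 1 kHz:

```python
t = np.linspace(0, 1, 1000)
signal = 1 - np.exp(-5 * t)                # rises toward 1.0
print(time_to_90_percent(signal, 1000.0))  # ~0.45 s
```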