LabVIEW


How long does it take to write 10,000 fractional numbers to a file in LabVIEW?

I am using an oscilloscope to monitor a voltage signal continuously for a long time (an hour). At some point there will be an interesting signal that lasts only a few hundred microseconds, but we don't know when it will happen. So in practice we save all the data fetched from the scope first, then go back and look at it after the experiment has finished. That is why we have to monitor continuously and for a long time. So what is the minimum time needed to write 10,000 data points to a file? We don't want to miss the interesting signal.
Message 1 of 6
tz0003;

It will take just a couple of milliseconds to save that amount of data to a text file.

Use the attached VI to test it yourself.
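
A rough text-based equivalent of that benchmark, sketched in Python as a stand-in for the attached VI (the file name and value count are illustrative):

    import random
    import time

    data = [random.random() for _ in range(10_000)]   # 10,000 fractional numbers

    start = time.perf_counter()
    with open("scope_data.txt", "w") as f:
        # One formatted value per line, similar to Write To Spreadsheet File
        f.write("\n".join(f"{x:.6f}" for x in data))
    elapsed = time.perf_counter() - start

    print(f"Wrote {len(data)} values in {elapsed * 1000:.2f} ms")

On modern hardware this typically reports a few milliseconds, in line with the estimate above.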

Regards;
Enrique Vargas
www.visecurity.com
www.vartortech.com
Message 2 of 6
As Enrique pointed out, it won't take very long to save the data. You could also save the data as binary (DBL) or transfer it from the scope as raw binary (e.g., I16) and save it that way. In any case, I suspect that the file write will take much less time than the transfer of the data from the scope. One problem that you might have is file size: saving 10,000 records every second or so for an hour as text would create a HUGE file, far too big to be loaded into Excel. You'll need to think about how, and with what, you're going to do the analysis.
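
To put rough numbers on it: at 10,000 values per second for an hour, text at about 9 bytes per value comes to roughly 300 MB, while raw DBLs are exactly 8 bytes each with no formatting cost. A minimal sketch of the text-versus-binary comparison (Python as a stand-in for the LabVIEW file functions; file names are arbitrary):

    import os
    import random
    import struct

    data = [random.random() for _ in range(10_000)]

    # Text: ~8 characters plus a separator per value
    with open("data.txt", "w") as f:
        f.write("\n".join(f"{x:.6f}" for x in data))

    # Binary DBL: exactly 8 bytes per value
    with open("data.bin", "wb") as f:
        f.write(struct.pack(f"<{len(data)}d", *data))

    print(os.path.getsize("data.txt"), os.path.getsize("data.bin"))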
Message 3 of 6
Hello Enrique,

Thanks for the quick answer. I compared your code to mine and noticed one difference that makes mine slow: I put an index in front of each data point before writing it. What I actually did was use "Format Into String" plus "Write Characters To File", so the 10,000 data points are processed one by one, which really drags the whole thing down compared to "Write To Spreadsheet File".

The reason I added the index was to ease the later data analysis. However, I can do that after the experiment has finished: read the data back from the file and add the index then.
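
For illustration, the difference looks roughly like this in Python (a text stand-in for the LabVIEW VIs; reopening the file for every value mimics calling the write VI once per point, and exact timings will vary by machine):

    import random
    import time

    data = [random.random() for _ in range(10_000)]

    # Indexed, reopening the file for every point (one write call per value)
    start = time.perf_counter()
    open("indexed.txt", "w").close()          # start with an empty file
    for i, x in enumerate(data):
        with open("indexed.txt", "a") as f:
            f.write(f"{i}\t{x:.6f}\n")
    per_point = time.perf_counter() - start

    # Everything formatted first, then written in one call
    start = time.perf_counter()
    with open("bulk.txt", "w") as f:
        f.write("\n".join(f"{x:.6f}" for x in data))
    bulk = time.perf_counter() - start

    print(f"per-point: {per_point * 1000:.1f} ms, bulk: {bulk * 1000:.1f} ms")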
Thanks again, and glad to have a chance to talk with you.

Regards,
tao
Message 4 of 6
You are right. Actually, the last time we did this we were recording 50 data points per second for 2 hours, and we ended up with close to a million data points. Excel definitely would not hold it, so we used SigmaPlot, and it works: since most of the data is background noise and there is a very good signal-to-noise ratio, we can easily find the part we are interested in and then zoom in. Now, because we will have billions of data points for one day, I am thinking maybe we can set up the scope better.
Message 5 of 6
Since your signal-to-noise ratio is high, only save data that is above a threshold. You should be able to run a comparison on each waveform from the scope and decide whether or not to save it. Perhaps keep the previous scan in memory until the current one is tested, so that you have the baseline data from immediately before the event you are looking for. Add a timestamp so you know when the anomaly occurred.

This way you will only have thousands of data points, and they will all be relevant. It makes the analysis much easier.
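
A minimal sketch of that scheme, with Python standing in for the LabVIEW comparison and file VIs (fetch_waveform, THRESHOLD, and the file name are assumptions for illustration):

    import random
    import time

    THRESHOLD = 0.5          # assumed trigger level; tune for your signal

    def fetch_waveform(n=10_000):
        # Stand-in for the real scope read; replace with your instrument driver call
        return [random.gauss(0.0, 0.05) for _ in range(n)]

    previous = None
    while True:
        scan = fetch_waveform()
        if max(abs(v) for v in scan) > THRESHOLD:
            stamp = time.strftime("%Y-%m-%d %H:%M:%S")
            with open("events.txt", "a") as f:
                if previous is not None:
                    # Baseline from immediately before the event
                    f.write(f"# baseline before event at {stamp}\n")
                    f.write("\n".join(f"{v:.6f}" for v in previous) + "\n")
                f.write(f"# event at {stamp}\n")
                f.write("\n".join(f"{v:.6f}" for v in scan) + "\n")
        previous = scan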

Lynn
Message 6 of 6