LabVIEW

best way to write waveforms to file

I'm reading 7 waveforms from a DAQ device at a 20 kHz sample rate and wish to write the data to a binary file (e.g. timeval ch1val ch2val ... etc., i.e. one timestamp followed by seven SGL values per sample).

What is the easiest way to do this (without using the easy VIs)? The format of the file must be readable by other software - this is why I can't use the easy waveform VIs. The SGL and timestamp formats should be no trouble to read using, e.g., MATLAB code.

The other consideration is that this writing to file must not interfere with the data acquisition.
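For what it's worth, a record layout like the one described could be read outside LabVIEW with a few lines of code. The sketch below is an assumption about the layout (one float64 timestamp followed by seven big-endian SGLs per record, since LabVIEW flattens numerics big-endian by default), not a description of what LabVIEW actually emits:

```python
import struct
import io

# Hypothetical record layout: one float64 timestamp followed by 7 float32
# (SGL) channel values, big-endian. This matches LabVIEW's default flatten
# byte order, but the exact layout depends on how the file is written.
RECORD = struct.Struct('>d7f')

def write_record(f, t, chans):
    f.write(RECORD.pack(t, *chans))

def read_records(f):
    while True:
        buf = f.read(RECORD.size)
        if len(buf) < RECORD.size:
            return
        t, *chans = RECORD.unpack(buf)
        yield t, chans

# Round-trip one record through an in-memory file.
buf = io.BytesIO()
write_record(buf, 0.0, [0.1] * 7)
buf.seek(0)
records = list(read_records(buf))
```

The same fixed-size record can be read in MATLAB with `fread` using matching types and a `'b'` (big-endian) machine format.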
Message 1 of 4
If you are going to be reading this in a different environment, like MATLAB, it would be good to strip the array values off the waveforms (use the Get Waveform Components VI) and build a time array. The waveform is made up of three components (t0, dt, Y[]). If t0 and dt are going to be the same for all waveforms, then you could build the time array on the fly given the dt of your signal. A binary file wouldn't be the easiest file type to use if you were going to be reading this outside of LabVIEW. I would advise writing the data to a tab-delimited ASCII file. This would be far easier to read back into MATLAB, Excel, or any other third-party software.
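Building the time array on the fly from t0 and dt, as suggested above, is trivial on the reading side too. A minimal Python sketch (the t0 and dt values here are placeholders for whatever the acquisition actually used):

```python
# Rebuild the waveform time axis from t0 and dt, assuming they are
# constant across all channels. At 20 kHz, dt = 1/20000 s.
def time_axis(t0, dt, n):
    return [t0 + i * dt for i in range(n)]

t = time_axis(0.0, 1.0 / 20000.0, 4)
```

Storing only t0 and dt instead of a full timestamp per sample also shrinks the file considerably.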
Message 2 of 4
Cheers for the response, Blimpie Boy - I ended up doing almost that: splitting the waveforms into their components. I wrote it all to a binary file though, as there are going to be hours of this data, so space efficiency is important.

Of course, this is the kind of function that is a given in any other dev environment - a standardised output format with a publicly available spec, so you can write utilities to read the data in other environments.

But LabVIEW seems to have a paranoid need to make it extra hard for you to get large amounts of data out. I had to pretty much write each individual type (timestamp, SGL, etc.) to file on its own and try various formats in MATLAB before I was able to write and read a file reliably.

LabVIEW's docs helped only marginally.

LabVIEW's phone support did not really help at all; apparently their network was down.

have fun,
Reginald.
Message 3 of 4
To convert a floating-point value to its binary equivalent, all you have to do is use the Flatten To String function. The tricky part is getting the application on the other end to recognize the format. As soon as you start writing binary data you have to start worrying about what processor the end user's machine is using, because it makes a difference in how the bitstream is interpreted (big-endian vs. little-endian).
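Concretely: Flatten To String writes numerics big-endian by default, so a reader on a little-endian machine (most PCs) must unpack with an explicit byte order rather than the host's native order. A small Python illustration:

```python
import struct

# Simulate the bytes LabVIEW's Flatten To String would produce for an
# SGL (big-endian float32), then read them back two ways.
flattened = struct.pack('>f', 3.5)            # stand-in for flattened LabVIEW data

value = struct.unpack('>f', flattened)[0]     # explicit big-endian: correct
wrong = struct.unpack('<f', flattened)[0]     # little-endian misread: garbage
```

The same caution applies in MATLAB: pass `'b'` as the machine-format argument to `fopen`/`fread` instead of relying on the default.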

Remember also that to represent LV timestamps (which in the wider world are very nonstandard) you have to use either a double-precision float or a U32 number. In the Microsoft world a slightly more standard representation of time is the number of days since Midnight Jan 1 1900--a value which can be represented in a single-precision float.
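For reference, LabVIEW's timestamp epoch is 1904-01-01 00:00:00 UTC, so raw LabVIEW seconds can be converted to a standard time in the reader by subtracting a fixed offset. A Python sketch:

```python
from datetime import datetime, timezone

# LabVIEW timestamps count seconds from 1904-01-01 00:00:00 UTC.
# Offset from the LabVIEW epoch to the Unix epoch (1970-01-01):
LV_TO_UNIX = 2082844800  # = 24107 days * 86400 s

def lv_seconds_to_datetime(lv_seconds):
    return datetime.fromtimestamp(lv_seconds - LV_TO_UNIX, tz=timezone.utc)

when = lv_seconds_to_datetime(2082844800.0)
# 1970-01-01 00:00:00 UTC
```

Writing the timestamp as a double of LabVIEW seconds and doing this conversion on the analysis side keeps the file format simple.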

There are also datalogging examples that demonstrate writing binary files--though those tend to use very low-level binary representations. They aren't even scaled to floats. This would provide the most compact storage, since no number would be more than 2 bytes (16 bits), and the fastest too, since you aren't taking the time to scale the data in LabVIEW.
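Reading such unscaled data back is straightforward if the reader knows the scaling. The sketch below assumes raw big-endian I16 codes and a simple linear gain/offset; the gain value shown is a placeholder for a +/-10 V range, not a real calibration constant:

```python
import struct

# Hypothetical reader for unscaled 16-bit DAQ samples: the file holds raw
# I16 codes; scaling to engineering units happens on the analysis side.
# gain/offset here are placeholder values, not real calibration data.
def scale_raw(raw_bytes, gain=10.0 / 32768, offset=0.0):
    n = len(raw_bytes) // 2
    codes = struct.unpack('>%dh' % n, raw_bytes)
    return [c * gain + offset for c in codes]

# Three sample codes: 0, half of full scale, minus half of full scale.
volts = scale_raw(struct.pack('>3h', 0, 16384, -16384))
```

At 2 bytes per sample instead of 4 for SGL, hours of 7-channel 20 kHz data take half the disk space.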

Mike...

Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

Message 4 of 4