
LabVIEW - "Write to Binary File Questions"

Solved!

Hello,

 

I'm developing a LabVIEW program for data acquisition, and am writing the data (collected via a digitizer, which is now correctly sending the data to RAM) to a binary file using the "Write to Binary File" function from the File I/O palette.

 

The file I will generate will be quite big (10-20 GB).  However, I only have 2 GB of RAM on this computer (it's an old lab computer; I have to work around it).  As a result, I'm currently planning to write the software so that it sends the data from RAM to my binary file ~100 kB at a time.  The data collection runs over a long period, so I don't expect any trouble with the data transfer rate.

 

Here are the questions:

 

1) The "Write to Binary File" Function states that it will truncate arrays greater than 4 GB.  Is this in reference to the length of data being appended during that specific call of the function, or is it in reference to the total length of the binary file at that point in time?  If it's the latter, I'm in trouble...

 

2) Simple question, but I want to double-check my understanding that the "Write to Binary File" function does in fact send the data to the hard drive (with nothing left over in RAM), even if the binary file remains open during the entire data collection session.

 

Thanks,

 

3MRach2S

 


Message 1 of 9
Solution
Accepted by topic author 3MRach2S

1. The file size can be larger. I just made a file that is about 18 GB by opening a file, calling Write to Binary File in a loop, and closing the file (see the sketch below).

2. Write to Binary File does not handle the memory management. What determines whether your array is held in memory is whether or not it is used elsewhere. I would just try out your current idea and then monitor the memory as you run it.
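
In text form (a Python sketch standing in for LabVIEW's graphical code), the open-once, write-in-a-loop, close-once test from point 1 looks roughly like this; the file name, chunk size, and dummy data are placeholders:

```python
# Python sketch of the pattern, not actual LabVIEW code: open the file
# once, call the write function in a loop, close once at the end.
CHUNK_BYTES = 100 * 1024        # ~100 kB per write, as in the original plan
TOTAL_BYTES = 18 * 1024**3      # ~18 GB target, as in the test above

chunk = bytes(CHUNK_BYTES)      # stand-in for one chunk of digitizer data
written = 0
with open("capture.bin", "wb") as f:    # open once (hypothetical file name)
    while written < TOTAL_BYTES:
        f.write(chunk)          # each call appends ~100 kB; the file itself
        written += len(chunk)   # grows well past 4 GB, as in the 18 GB test
```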

Message 2 of 9

The data will go to the drive, while LabVIEW will re-use the same memory (RAM) for each chunk of data if programmed correctly.

Just write a small test program to see how things work out. What's the datatype?

Can you show us a simplified version of your code?

Message 3 of 9

"It's All in the Wrist Timing".  There are several timing issues that may come into play:

  • How fast are the data coming into your PC (or "How busy is the CPU just getting the data").
  • How fast can you write the data to disk (or "How busy is the CPU getting rid of the data").
  • How long do you want to keep writing (or "How worried are you about losing your data in the event of an error").

Because LabVIEW is a Data Flow Language, it can handle simultaneous "Acquire" and "Save" operations, but from your description, it doesn't sound like either of the first two "timing" considerations will be a problem for you.  LabVIEW can also "save you" from losing all your data if an Error condition stops your program -- depending on how you are writing to disk, you can often use a "Write some data, Close the File, Reopen the File, go to the End, and Repeat as needed until everything is written" loop.  Periodically Closing and then Reopening the file means that if the system crashes, you've got (unless you are very unlucky) the data saved thus far.
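
In pseudocode form (Python here rather than LabVIEW G; acquire_chunk(), the file name, and the counts are invented for illustration), that periodic close-and-reopen loop might look like:

```python
# Sketch of the periodic close/reopen pattern (Python, not LabVIEW G).
def acquire_chunk() -> bytes:
    return bytes(100 * 1024)             # placeholder for real digitizer data

CHUNKS_PER_REOPEN = 100                  # how often to commit data to disk

f = open("capture.bin", "ab")            # append mode: reopening continues at the end
try:
    for i in range(10_000):              # placeholder for "until done"
        f.write(acquire_chunk())
        if (i + 1) % CHUNKS_PER_REOPEN == 0:
            f.close()                    # everything written so far survives a crash
            f = open("capture.bin", "ab")  # reopen and go to the end of the file
finally:
    f.close()
```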

 

Bob Schor

Message 4 of 9

Also be sure to have a look at the Producer/Consumer architecture.  The idea is that you read from your digitizer in one loop and send the data to another loop via a queue.  This second loop then logs the data.  This way, the writing to the file runs in parallel with the data acquisition.
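
LabVIEW's queues and parallel loops are graphical, but the shape of the pattern translates directly; here is a rough Python-threads analogue, with acquire sizes, loop counts, and the file name made up:

```python
# Producer/consumer sketch: Python threads and a queue standing in for
# LabVIEW's two parallel loops and its Queue functions.
import queue
import threading

q = queue.Queue()

def producer():                          # acquisition loop
    for _ in range(1000):                # placeholder for "until stopped"
        q.put(bytes(100 * 1024))         # stand-in for one digitizer read
    q.put(None)                          # sentinel telling the consumer to stop

def consumer():                          # logging loop
    with open("capture.bin", "wb") as f:
        while (chunk := q.get()) is not None:
            f.write(chunk)               # slow disk writes never stall acquisition

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```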


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 9

Why save your data in RAM and write it later in such huge chunks?

 

I always write the data to disk immediately after taking each measurement, even when I am using a high-speed DAQ to capture waveforms like a real-time oscilloscope.

 

Okay, actually I write the data to a queue, and it is written to disk as fast as it can be...


@3MRach2S
========================
=== Engineer Ambiguously ===
========================
Message 6 of 9

@RTSLVU wrote:

Why save your data in RAM and write it later in such huge chunks?


@3MRach2S

I don't consider 100 kB a "huge chunk". 🙂

Message 7 of 9

Thanks for all the input.  It was very helpful.  My code is now working without any of the above-mentioned concerns becoming a problem.

 

I also incorporated opening and closing the file so the data isn't lost in the case of a crash, as suggested by some.  This computer has crashed once or twice, and the samples I'm looking at degrade somewhat during collection, so this is quite useful!

 

The producer/consumer architecture looks very useful.  I'll try to implement it in a later version.

 

As for the data transfer rate - I collect data very quickly (200 MS/s), but for very short periods of time (tens of microseconds).  Data is collected at 10 Hz, so the time-averaged data rate is relatively slow, and the computer handles the transfer just fine.
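
As a rough sanity check of that rate (the 20 µs burst length and 2 bytes per sample below are assumptions; only the 200 MS/s and 10 Hz figures come from the post):

```python
# Back-of-the-envelope average data rate. Burst length (20 µs) and sample
# width (2 bytes) are assumed; 200 MS/s and 10 Hz are from the post.
sample_rate = 200e6                      # samples/s during a burst
burst_len = 20e-6                        # s per burst (assumed)
trigger_rate = 10                        # bursts/s
bytes_per_sample = 2                     # assumed 16-bit samples

avg_rate = sample_rate * burst_len * trigger_rate * bytes_per_sample
print(f"{avg_rate / 1024:.0f} kB/s average")   # ~78 kB/s -- easily handled
```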

 

Thanks again everyone for the help! 

Message 8 of 9

@3MRach2S wrote:

 

I also incorporated opening and closing the file so the data isn't lost in the case of a crash, as suggested by some.


Even if the file is open when the computer crashes, everything already committed to disk will be there. If you are worried about data still in the buffer, a "flush file" can be done. No need for constant closing and opening.
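
In LabVIEW that's the Flush File function the post refers to; as a generic sketch of the same idea (Python, with the file name a placeholder):

```python
# Flushing instead of constantly closing/reopening (Python sketch; in
# LabVIEW this is the Flush File function).
import os

with open("capture.bin", "ab") as f:     # hypothetical file name
    f.write(bytes(100 * 1024))           # stand-in for one chunk of data
    f.flush()                            # push the application buffer to the OS
    os.fsync(f.fileno())                 # ask the OS to commit it to disk
```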

Message 9 of 9