LabVIEW


Write/Read binary files >2GB

Hey!

Within a data acquisition system I am trying to write a 2D array to a binary file so that it can later be opened across different platforms.

Everything was working fine until I had to deal with some larger arrays (I've noticed that 2GB seems to be the breaking point).

 

Upon writing there is no complaint from LabVIEW.

Trying to read it back in LabVIEW I get an out-of-memory error. If I try to read one of these problematic files in MATLAB, all I get is zeros.

 

I'm running LabVIEW 2016 64-bit on Windows 7 with 32 GB of RAM, and I'm writing to an NTFS disk, so I don't see what the problem could be. Has anyone run into something like this? Or am I just missing something?

 

I've attached below the VIs I'm using to test writing and reading.

Message 1 of 6

What exactly are you trying to do with this data?

 

You typically do not want 2 GB of data in memory. What is normally done is to write the data in chunks as it is being captured, and then to read and process it in chunks as well.
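In case it helps with the MATLAB side, here is a rough sketch of reading and processing such a file in chunks rather than loading it whole. This assumes raw double-precision samples written big-endian (the LabVIEW default) with no size header; 'bigfile.bin' is just a placeholder name.

% Read and process a large flat binary file in fixed-size chunks so the
% whole array never has to fit in memory at once.
% Assumes raw DBL samples, big-endian, no size header; 'bigfile.bin' is a placeholder.
fid = fopen('bigfile.bin', 'r', 'ieee-be');
chunkSize = 1e6;                          % samples per chunk
while ~feof(fid)
    chunk = fread(fid, chunkSize, 'double');
    if isempty(chunk)
        break;
    end
    % ... process this chunk here (accumulate statistics, write results, ...)
end
fclose(fid);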


Message 2 of 6

All I need is to read it in MATLAB later for post-processing.

I'm acquiring a lot of data and splitting it into files that have a physical meaning. It turns out that some of those files are over 2 GB. Maybe the easier way out is indeed to redefine how the data is split into files, or to split each file into chunks as you suggest.

 

Nevertheless, what is the limitation here? Having the >2GB array in memory before writing is no problem...

Message 3 of 6

Have you tried looking at the file with a binary file editor that can handle large files? I personally like UltraEdit, which has a binary view. You could also read the file in chunks, as suggested before, to verify its contents. The goal is to figure out where the problem lies: LabVIEW writing the file or MATLAB reading it. If the file checks out, then the problem is with MATLAB.
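A quick spot-check from MATLAB, without loading the whole file, is to seek near the end and confirm the last samples are not all zero. A sketch; 'bigfile.bin' is a placeholder and the precision should be adjusted to match your data:

% Spot-check the tail of the file without loading it all: seek near the end
% and confirm the last samples are not all zero.
fid = fopen('bigfile.bin', 'r', 'ieee-be');   % LabVIEW writes big-endian by default
fseek(fid, -8*1024, 'eof');                   % back up 1024 doubles (8 bytes each)
tail = fread(fid, 1024, 'double');
fclose(fid);
fprintf('non-zero samples in the last chunk: %d of %d\n', nnz(tail), numel(tail));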


Message 4 of 6

You have to use only file operations that accept an I64 as the offset. Once you go past what an I32 can hold (2^31 - 1 = 2,147,483,647 bytes, i.e. about 2 GB, which is exactly where you are seeing the breaking point), you cannot access the data because the index is too large.

 

And as Tim mentioned, you really do not want to load the entire file into memory all at once. Read it in chunks and process as you go.

 

BTW:

Writing will often be OK as you pass the I32 limit, because LV uses an I64 to track the byte offset into the file. It is when you are reading that the I64 becomes important.

 

Been there, done that.

 

Ben

 

Message 5 of 6

I assume you are using fopen in MATLAB to read the files.

A couple of things that you may or may not be aware of:

  1. You are prepending the array size to the data stream (the default behavior of Write to Binary File). Do you take this into account when reading the file in MATLAB?
  2. You are encoding the numbers in the file in big-endian byte order (also the LabVIEW default). I assume you have an x86 processor, and the default machine format for fopen is 'native', which is little-endian on x86. (See the sketch after this list.)
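For reference, a minimal MATLAB sketch of reading a 2D DBL array written with "prepend array or string size?" left TRUE, so two big-endian I32 dimension sizes come first. 'bigfile.bin' is a placeholder name, and this version loads the whole array, so for the >2 GB files you would still read the data portion in chunks:

% Read a LabVIEW 2D DBL array: two big-endian int32 dimension sizes,
% followed by the data in row-major order.
fid  = fopen('bigfile.bin', 'r', 'ieee-be');  % match LabVIEW's big-endian default
dims = fread(fid, 2, 'int32');                % [rows; cols] written by LabVIEW
data = fread(fid, prod(dims), 'double');
fclose(fid);
data = reshape(data, dims(2), dims(1)).';     % LabVIEW is row-major, MATLAB column-major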

Reading the file in chunks would still be best for files this size, as suggested earlier.

 

mcduff

Message 6 of 6