
please fix the out of memory file corruption bug

When LabVIEW runs out of memory while saving a VI, it corrupts the VI and makes it unusable. Then you have to rewrite the VI from scratch. Folks, please fix this bug; there is so much productivity loss because of it.

 

Any application can run out of memory and even crash, but corrupting the file it is saving is unacceptable. It's like going to an ATM, making the mistake of attempting to withdraw more than your account balance, and as a penalty having the money in the account forfeited and the account closed.

 

This is a recurring problem; it happens with some frequency.

 

Neil

Message 1 of 9

Saving changes before you run a VI goes a long way toward avoiding that. It's not a fix but a workaround, and it has become so ingrained for me that I have had virtually no corrupted VIs in almost 20 years of LabVIEW use.

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 2 of 9

Hi nbf,

 

- this sounds like a typical use case for revision control software (like Subversion): you will never lose previous versions of your VI again...

- this sounds like a typical use case for doing backups: you will never lose previous versions of your VI again...

- this sounds like a typical use case for "save often and early": you will never lose previous versions of your VI again (if done correctly)...
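To illustrate the backup suggestion above (a minimal sketch, not an NI tool — the file names and folder layout are hypothetical), a small script can snapshot a VI before each risky editing session, so an older copy survives even if a later save corrupts the file:

```python
import shutil
import time
from pathlib import Path

def backup_vi(vi_path: str, backup_dir: str = "vi_backups") -> Path:
    """Copy a VI into a backup folder under a timestamped name,
    so a previous version survives a corrupting save."""
    src = Path(vi_path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.stem}-{stamp}{src.suffix}"
    shutil.copy2(src, dest)  # copy2 also preserves file timestamps
    return dest
```

Under version control the same idea is simply a commit before each session; the point in both cases is that the last known-good copy lives outside the file LabVIEW is about to rewrite.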

 

I don't experience those problems often. There are circumstances where LV can get "a bit unstable"; those often involve heavy ActiveX/.NET/DLL usage within the VI. When you know your VI is prone to crashing the compiler, you have to be prepared - as noted above...

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 3 of 9

The file corruption happened when I was trying to save the VI. LabVIEW spent a long time trying to save, finally gave up with an out-of-memory message, and corrupted the VI in the process.

 

Thanks,

Neil

 

Message 4 of 9

Then the VI was already corrupted. Saving a VI before having run it should not be able to produce out-of-memory situations unless you run on a VERY resource-constrained machine, or the VI got corrupted. If this corruption happens repeatedly on your machine, it's probably time to run chkdsk regularly and monitor the health of your hard disk. Or you have some other nasties in your system.

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 5 of 9

I was running the VI on a MacBook. I was able to run the VI; then I saved the initialization data; only when I tried to save the VI itself, did things fall apart. Yes, I was running other applications, so the memory available was low; I might even have been thrashing. But does that justify the corruption of a file during a save?

 

One of your colleagues previously gave me useful tips on how to improve the memory footprint of my VIs. But sometimes we throw together a VI without having the time to optimize its memory use. The burden shouldn't rest entirely on the user's shoulders. LabVIEW could make a backup of the VI prior to saving it, and if the save failed, it could at least restore the original file instead of both crashing and corrupting it.
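The safe-save behavior Neil is asking for is a standard pattern: write to a temporary file in the same directory, then atomically swap it in, so a failed write never touches the original. A minimal sketch (in Python, with a hypothetical `safe_save` helper — not how LabVIEW actually saves):

```python
import os
import tempfile

def safe_save(path: str, data: bytes) -> None:
    """Write data to a temp file beside the target, then atomically
    replace the target. If writing fails (out of memory, out of disk),
    the original file is left untouched."""
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())     # make sure the bytes hit the disk
        os.replace(tmp_path, path)   # atomic rename on POSIX and Windows
    except BaseException:
        os.unlink(tmp_path)          # on failure: discard temp, keep original
        raise
```

The key design choice is that the temp file lives in the same directory as the target, so the final `os.replace` is a rename within one volume and therefore atomic.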

 

Neil

 

Message 6 of 9

"Usually I don't experience those problems a lot."

 

I have a feeling crashes followed by file corruption happen on Macs more often.

Message 7 of 9

nbf wrote:

The burden shouldn't rest entirely on the user's shoulders. LabVIEW could make a backup of the VI prior to saving it, and if the save failed, it could at least restore the original file instead of both crashing and corrupting it.

 

Neil

 


It could, and I was under the impression that it does, but maybe it doesn't, since that would slow down saves for larger libraries quite a bit, especially if the target volume is a different physical device from the one holding the temporary directory.

 

What OS X version is that? I would suppose that other applications shouldn't limit LabVIEW too much in what memory it has available - or does Mac OS X not virtualize each process like Windows does, where each process gets to see as much memory as the machine has, minus what the OS needs, even if numerous other applications are using lots of memory too? "Out of memory" in LabVIEW really means that the OS refused a memory allocation request from LabVIEW, and at that point absolutely no operation is safely possible anymore. Unless you happen to have 100 MB or more of default data in your VI front panel, I can't really imagine a situation that would cause out-of-memory situations during saving of the VI itself.
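Rolf's point - that "out of memory" means the OS refused an allocation request - can be illustrated outside LabVIEW (a hedged Python sketch, not LabVIEW behavior): an application that catches the refusal at the allocation site can decline the operation and keep its existing data intact, instead of crashing mid-save.

```python
def try_allocate(n_bytes: int):
    """Attempt one large allocation. If the OS refuses the request,
    Python raises MemoryError; catching it lets the application back
    out gracefully rather than corrupt whatever it was writing."""
    try:
        return bytearray(n_bytes)  # zero-filled buffer of n_bytes
    except MemoryError:
        return None                # refused: caller keeps old data intact
```

An absurdly large request (an exabyte, beyond any 64-bit address space) is refused immediately, while a small one succeeds.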

 

If you do use large default data in your VI front panel, you should definitely reconsider your approach. It's a bad habit and should be replaced by reading that data at initialization time from a file or similar.
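The textual equivalent of that advice (a hypothetical sketch, since the real fix is done on the LabVIEW block diagram): keep the saved program's defaults empty and load the large dataset from a data file at startup, so the big data never travels through the save path.

```python
import json
from pathlib import Path

def load_init_data(path: str = "init_data.json"):
    """Load large initialization data from disk at startup instead of
    embedding it as default values, keeping the saved program small."""
    p = Path(path)
    if not p.exists():
        return {}  # empty defaults when no data file is present
    with p.open("r", encoding="utf-8") as f:
        return json.load(f)
```

The saved source then contains only this small loader, while the bulky data lives in a separate file that is never rewritten on save.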

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 8 of 9

> Unless you happen to have 100 MB or more of default data in your VI front panel, I can't really imagine a situation that would cause out-of-memory situations during saving of the VI itself.

 

Hi Rolf,

 

   I think you hit the nail on the head. The VI front panel was indeed displaying more than 100 MB of default data when the out-of-memory situation arose; I probably shouldn't have saved the default data while several large images were being displayed. (I have since switched from LabVIEW to ImageJ for processing large images, so this situation shouldn't arise again.)

 

Thanks,

Neil

 

Message 9 of 9