Memory problems

Hi everyone,
we are currently remodeling a LabVIEW 5.0 program for image acquisition, but we are encountering some memory problems. Let me first describe the steps the program goes through:
 
- multiple image acquisitions
- appending images one after another to get one large image (this was necessary to get the images as quickly as possible, due to a part of the program written in C)
- binning the pixels after acquisition to get more compact files at an acceptable resolution
- visualizing the final picture (in fact an array of images) on the screen and saving it to the HD.
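For concreteness, here is roughly what our binning step does, as a Python sketch (the real code is C plus LabVIEW, so the details may differ — this just shows the idea of summing each n×n block into one output pixel, with the output preallocated instead of grown pixel by pixel):

```python
def bin_image(pixels, width, height, n=2):
    """Sum each n x n block of a row-major grayscale image into one pixel.

    Preallocating `out` once avoids repeatedly growing a buffer,
    which would create extra copies in memory.
    """
    out_w, out_h = width // n, height // n
    out = [0] * (out_w * out_h)
    for y in range(out_h):
        for x in range(out_w):
            s = 0
            for dy in range(n):
                for dx in range(n):
                    s += pixels[(y * n + dy) * width + (x * n + dx)]
            out[y * out_w + x] = s
    return out

# A 4x4 image of all-ones, binned 2x2, gives a 2x2 image of fours.
print(bin_image([1] * 16, 4, 4))  # → [4, 4, 4, 4]
```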
 
Now that the program is up and running, we get some errors we don't understand. First, let me point out that we are working at the limit of memory, so we often get "Not enough memory to complete work". But for some sets of parameters we get a memory error even though the picture is smaller than others that worked out fine?!!
 
Next, there are some other errors with codes that we don't understand at all:
 
1. "Failure: "E:\Iv45\mgsource\image.c" in line 10891, contact NI, etc." — and best of all, there is no drive E:?!
2. "Error -1074395988 occurred at IMAQ Writefile. Unable to write Data GFI." Part of the image file has been saved, something like a third...
 
Does anyone know how to handle large image files in IMAQ and LabVIEW? We don't seem to understand how the images are processed inside the memory.
 
Thanks for your time!
0 Kudos
Message 1 of 6
(3,106 Views)
It is hard to tell without seeing the code, but are you still using LabVIEW 5.0 or is it just an old 5.0 program that you are updating? Consider upgrading if you're still using 5.0.
 
Carefully inspect the code to make sure that you don't create extra data copies in memory.
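To illustrate what I mean by extra copies (a generic Python sketch, not your actual code — LabVIEW wires behave differently, but the principle is the same): appending to a buffer inside a loop briefly keeps the old and new buffers alive at the same time, while preallocating once and writing in place does not.

```python
FRAME = 1024        # bytes per (hypothetical) image frame
N_FRAMES = 100

# Copy-heavy: each += on an immutable bytes object builds a brand-new
# buffer, so old and new versions coexist in memory during every append.
big = b""
for _ in range(N_FRAMES):
    big += bytes(FRAME)

# Copy-free: preallocate the full buffer once, then write each frame
# into its slice in place.
buf = bytearray(FRAME * N_FRAMES)
for i in range(N_FRAMES):
    buf[i * FRAME:(i + 1) * FRAME] = bytes(FRAME)

assert len(big) == len(buf) == FRAME * N_FRAMES
```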
0 Kudos
Message 2 of 6
(3,105 Views)

Hello again,

we are still using LabVIEW 5.0, but only for updating this specific program; otherwise we are on LabVIEW 6.0. For this one, though, due to hardware compatibility problems no one dares to change anything: some hardware components are very old, the OS is something like Win98 or W2k, and moving any of the cards into a new computer seems unnecessary until this one breaks down.

The main problem is that the C code was written by another student who did his postdoc in the lab (a few years ago), and the program has been updated and extended over the years by different students who, just like me, are no LV pros.

A small part has been written in C to be as fast as possible. The code is meant to do rapid mirror scans, acquire the signal (using a hardware device), put the data, one acquisition after another, into a large array (this is faster than writing each single image to disk one at a time), perform some binning, and finally create the final image and save it to the disk.

What we are wondering about is that sometimes the program crashes for images no larger than ~56 MB, whereas 5 minutes later we are capable of acquiring 200 MB of data (knowing that our PC has 260 MB of RAM).

Has anyone experienced similar problems, or does anyone know where the memory mess-up could come from?
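Our own (possibly wrong) guess is memory fragmentation: an image needs one contiguous block, and after many allocate/free cycles the largest free hole in the address space can be much smaller than the total free memory. A toy Python model of the idea (not LabVIEW's actual allocator):

```python
# Toy heap: 260 "MB" of address space; an allocation needs a contiguous run.
heap = [False] * 260          # False = free, True = in use

# A few long-lived 1 MB blocks pinned 60 MB apart fragment the space.
for addr in range(0, 260, 60):
    heap[addr] = True

def largest_free_run(h):
    """Length of the longest contiguous run of free cells."""
    best = run = 0
    for used in h:
        run = 0 if used else run + 1
        best = max(best, run)
    return best

free_total = heap.count(False)
print(free_total)              # → 255: plenty of free "MB" overall...
print(largest_free_run(heap))  # → 59: ...but a 60 MB request would fail
```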

 

0 Kudos
Message 3 of 6
(3,083 Views)
Boy, talk about a "target rich environment"! Take your pick:
  1. Old operating system/computer
  2. Old version of LV
  3. Old version of IMAQ
  4. Heavily patched code

It might be time to pull the plug on this puppy and start over...

Good news: you probably won't need any C code.

Mike...


Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
0 Kudos
Message 4 of 6
(3,079 Views)
We got an error message telling us to contact National Instruments.
 
Failure : "E:\Iv45\mgsource\image.c" - line 10891 LabView 5.0
 
Did anyone get the same error, or does anyone know what this could mean?
 
Thanks
 
0 Kudos
Message 5 of 6
(3,066 Views)
Kryer,  I haven't seen that particular error message.  Keep in mind, you may get a better response to that question in the Machine Vision forum.

Without more detail, it's hard to tell, but I'll guess...  On the memory problem, I'm going to suggest looking at the binning algorithm.  If you take a large array of data and start splitting it up, it's pretty easy to generate unnecessary copies.  When the array gets larger, the problem usually gets worse though.  Perhaps on the smaller images you had a larger number of bins generated?

I know it doesn't solve the problems, but at least it's a place to look.
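To make the "unnecessary copies" point concrete in a language that can be pasted as text (Python here, since LabVIEW diagrams can't), taking a slice of a large buffer duplicates it, while a view reads the same bytes in place — the analogue of avoiding extra array copies on the diagram:

```python
data = bytearray(1_000_000)   # stand-in for a large image buffer

# Slicing copies: this allocates a second ~0.5 MB buffer just to read half.
half_copy = data[:500_000]

# A memoryview exposes the same bytes without duplicating them.
half_view = memoryview(data)[:500_000]

assert len(half_copy) == len(half_view) == 500_000

# Writing through the view changes the original; the copy stays independent.
half_view[0] = 255
assert data[0] == 255 and half_copy[0] == 0
```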

Of course, recommended reading:

http://zone.ni.com/devzone/devzone.nsf/webcategories/D0A775F79852010F862567AC00583FA7
0 Kudos
Message 6 of 6
(3,054 Views)