02-19-2020 11:33 AM
Hello! I am using the Elvis III board and I am facing a problem. The board is only compatible with 32-bit software and therefore allocates memory accordingly. This allocated memory is not enough for me. Is there a way to force LabVIEW to allocate more memory without using third-party software?
02-19-2020 11:56 AM - edited 02-19-2020 11:59 AM
Third-party software won't help you with memory, but maybe your program is written very inefficiently. We can help with that.
Also, give more details (LabVIEW version, OS, etc.).
02-19-2020 12:40 PM - edited 02-19-2020 12:45 PM
@asf17 wrote:
Hello! I am using the Elvis III board and I am facing a problem. The board is only compatible with 32-bit software and therefore allocates memory accordingly. This allocated memory is not enough for me. Is there a way to force LabVIEW to allocate more memory without using third-party software?
Can you post your code?
I really want to see what kind of program it takes to actually need more than 3.5 gigabytes of physical RAM.
02-19-2020 06:57 PM - edited 02-20-2020 01:05 AM
On a 64-bit OS, 32-bit LabVIEW can use up to 4 GB of RAM.
What does your program do, and what are the main data structures? Is the code intelligently designed for maximum in-placeness, or is it littered with Express VIs and dynamic data?
Does it run out of memory immediately or after a while of running?
02-20-2020 12:23 AM
The program continuously monitors a trigger signal through the analog input pins. When the trigger fires, it saves a signal acquired from the oscilloscope into an array. The expected outcome is a 9,000 × 17,000 array of doubles.
It doesn't run out of memory immediately but rather at the 24th trigger. First it slows down and skips a few triggers, and then it crashes with a memory error.
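A quick back-of-the-envelope check of the footprint described above (sketched in Python, since a LabVIEW block diagram can't be pasted as text). A single 9,000 × 17,000 array of doubles already eats well over a gigabyte, and LabVIEW needs that allocation to be contiguous; growing the array trigger by trigger also forces temporary copies, so a 32-bit process's ~4 GB ceiling is reached long before the full array exists:

```python
# Size of a 9,000 x 17,000 array of DBL (8 bytes per element)
cells = 9_000 * 17_000
bytes_per_double = 8
total_bytes = cells * bytes_per_double
gib = total_bytes / 2**30
print(f"{gib:.2f} GiB")  # ~1.14 GiB for a single contiguous copy

# Growing an array in a loop can transiently need old + new buffers,
# i.e. roughly double that, before the old buffer is released.
peak_during_growth_gib = 2 * gib
print(f"~{peak_during_growth_gib:.2f} GiB transient peak while resizing")
```

The exact peak depends on how LabVIEW's memory manager reuses buffers, but the order of magnitude explains running out of memory partway through the acquisition.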
02-20-2020 01:19 AM
@asf17 wrote:
The program continuously monitors a trigger signal through the analog input pins. When the trigger fires, it saves a signal acquired from the oscilloscope into an array. The expected outcome is a 9,000 × 17,000 array of doubles.
It doesn't run out of memory immediately but rather at the 24th trigger. First it slows down and skips a few triggers, and then it crashes with a memory error.
As has been said, this is well over 1 GB and will not fit, no matter how you slice it. Are you also displaying it (graph, array, etc.)? Again, what is the datatype (plain array, waveform data, or dynamic data)?
What is the data rate? Is there time between triggers to stream the data to disk? How fast does the signal change?
How much information is in these points and why do you need that many?
How many points do you get per trigger and are you growing the data structure with each trigger? What's the data size after 24 triggers?
Your AI has a resolution of 16 bits, so why are you using DBL? (That's 4x the bits, i.e. 48 bits/point completely wasted. That's a lot of hot air!)
How are you planning to process it later?
Why can't you show us some code, so we get a better idea of the problem and can possibly suggest alternatives?