08-11-2025 09:53 AM
I am trying to allocate an array of bytes with more than 2 GB in LabVIEW 2021 x64, but it seems that it still caps the max array size at 2^31-1.
Do any of the newer LabVIEW versions lift this limitation?
Joe
08-11-2025 10:19 AM - edited 08-11-2025 10:20 AM
You can only have 2^31 elements, not bytes.
If you need more, you can technically implement a file-based approach where you can write to and read from disk instead of memory.
Or you could make every element of the array significantly larger than a byte, e.g. U64, and then 16 GB should be doable. Bear in mind that memory fragmentation will also seriously limit the largest array you can allocate, as arrays must be in contiguous memory.
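To make the file-based suggestion concrete, here is a minimal sketch in Python (LabVIEW block diagrams can't be shown as text; the filename and sizes are invented for illustration) using a memory-mapped sparse file as a byte store that can be indexed past the 2^31-1 mark:

```python
import mmap
import os

PATH = "big_array.bin"        # illustrative filename
SIZE = 2**31 + 16             # just past the 2^31-1 element cap

with open(PATH, "wb") as f:
    f.truncate(SIZE)          # sparse file: nothing actually written yet

f = open(PATH, "r+b")
mm = mmap.mmap(f.fileno(), 0)  # map the whole file
mm[2**31 + 1] = 0xAB           # "element" index beyond I32 range
value = mm[2**31 + 1]

mm.close()
f.close()
os.remove(PATH)
```

The analogous LabVIEW approach would be binary file I/O with 64-bit file offsets, at the cost of disk-speed access instead of RAM-speed.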
08-11-2025 01:38 PM
@Intaris wrote:
You can only have 2^31 elements, not bytes.
Exactly, you can have, for example, a CDB array of 2^31-1 elements and the array would occupy ~32GB. What size do you need and how much memory does your computer have?
Maybe you could use a map where the key is a 64bit integer and the value has the type of whatever you want the array element to be. This will be useful if the arrays are sparse.
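The map idea can be sketched like this (Python, with a plain dict standing in for a LabVIEW map; the class name and default value are invented): a sparse "array" keyed by a 64-bit index, where unset elements implicitly read as the default and consume no memory.

```python
class SparseArray:
    """Sketch of a sparse 'array' backed by a map (dict).

    Keys are 64-bit indices; only non-default elements consume
    memory, so the logical length can far exceed 2^31-1.
    """
    def __init__(self, default=0.0):
        self._data = {}
        self._default = default

    def __getitem__(self, i):
        return self._data.get(i, self._default)

    def __setitem__(self, i, value):
        if value == self._default:
            self._data.pop(i, None)   # keep the map sparse
        else:
            self._data[i] = value

sa = SparseArray()
sa[2**40] = 3.14   # index far beyond 2^31-1
```

This only pays off when the data really is sparse; a densely filled map costs far more per element than a flat array.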
Can you explain what you need to do with that gigantic array? With any "workaround", you would basically need to rewrite every single array and signal-processing operation you want to use. In any case, efficiently operating on such large data structures will be expensive. Of course, you can forget about having indicators for it.
I am sure the problem can be solved by rethinking the approach.
08-11-2025 03:49 PM - edited 08-11-2025 03:54 PM
@Intaris wrote:
You can only have 2^31 elements, not bytes.
@altenbach wrote:
Exactly, you can have, for example, a CDB array of 2^31-1 elements and the array would occupy ~32GB.
The max number of elements in an array in LV 64bit is 2^31-1 (like the OP initially said).
The array element could be a cluster of several integers, which would increase the total amount of bytes... until you reach the physical limit of your available memory.
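The totals quoted in this thread follow directly from element count × element size; a quick sanity check (Python, element sizes as in LabVIEW's numeric types):

```python
MAX_ELEMS = 2**31 - 1                    # I32 index limit

# Bytes per element for a few LabVIEW numeric types
sizes = {"U8": 1, "U64": 8, "CDB": 16}   # CDB = complex double

limits = {name: MAX_ELEMS * n / 1024**3 for name, n in sizes.items()}
for name, gib in limits.items():
    print(f"{name:>3}: ~{gib:.0f} GiB max array")
```

This reproduces the figures above: ~2 GiB for a byte array, ~16 GiB for U64, ~32 GiB for CDB.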
Without knowing more about the context, I guess loading the data piece by piece from a file for processing seems more reasonable.
Regards,
Raphaël.
08-12-2025 01:53 AM - edited 08-12-2025 01:55 AM
Hi Joe,
@Joe_Guo wrote:
I am trying to allocate an array of bytes with more than 2 GB
My 2ct:
Why do you need to hold more than 2GB of data in memory? (There might be good reasons…)
08-12-2025 11:01 AM
Is 2^31-1 the limit of total number of elements, or number of elements per dimension?
Would this work to emulate a very large 1D array?
08-12-2025 11:16 AM
Thanks for all the suggestions. I understand there are ways to deal with large arrays, but I am just wondering why LabVIEW x64 has a limit on the array size, and whether this limit still exists in newer LabVIEW versions (2021+).
08-12-2025 11:48 AM - edited 08-12-2025 11:53 AM
LabVIEW 2024 64bit has the same behavior.
Indexes and dimension sizes for LabVIEW arrays are I32 by design (both on the array primitives and in their low-level representation in memory), hence the 2^31-1 limit (per dimension, as @paul_a_cardinale asked).
08-12-2025 12:19 PM
@paul_a_cardinale wrote:
Is 2^31-1 the limit of total number of elements, or number of elements per dimension?
Would this work to emulate a very large 1D array?
I am pretty sure the total number of elements is limited, not elements per dimension.
A lot of things would break, e.g. reshaping to 1D with a length equal to the product of the dimension sizes.
One could do a 1D array of clusters, each containing a 1D array of DBL.
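That array-of-clusters layout is essentially a chunked container; a rough Python sketch (the chunk size and class name are invented for illustration) of mapping a 64-bit logical index onto a (block, offset) pair:

```python
CHUNK = 2**20   # elements per inner array; illustrative size

class ChunkedArray:
    """Sketch of a 'very large 1D array' built from smaller blocks,
    mirroring the array-of-clusters-of-arrays idea. Each access maps
    a 64-bit logical index to (block, offset)."""
    def __init__(self, length, fill=0.0):
        nblocks = (length + CHUNK - 1) // CHUNK
        self._blocks = [[fill] * min(CHUNK, length - i * CHUNK)
                        for i in range(nblocks)]
        self.length = length

    def __getitem__(self, i):
        return self._blocks[i // CHUNK][i % CHUNK]

    def __setitem__(self, i, v):
        self._blocks[i // CHUNK][i % CHUNK] = v

big = ChunkedArray(2**21 + 5)   # length spanning several blocks
big[2**21 + 1] = 7.5
```

Each inner block stays under the per-array limit and need not be contiguous with the others, which also sidesteps the fragmentation problem mentioned earlier, at the cost of wrapping every access.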