LabVIEW


Any way to allocate an array with more than 2^31-1 elements in LabVIEW x64?

I am trying to allocate an array of bytes larger than 2 GB in LabVIEW 2021 x64, but it seems it still caps the maximum array size at 2^31-1 elements.

 

Do any of the newer LabVIEW versions lift this limitation?

 

Joe

 

Message 1 of 9

You can only have 2^31-1 elements, not bytes.

 

If you need more, you can technically implement a file-based approach, writing to and reading from disk instead of memory.
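Since a LabVIEW block diagram can't be pasted as text, here is a rough Python sketch of that file-based idea (the file name and offset are invented for illustration; in LabVIEW you would use the file I/O primitives with a 64-bit file position):

```python
import os
import tempfile

# Hypothetical file standing in for the "array on disk".
path = os.path.join(tempfile.mkdtemp(), "bigarray.bin")

offset = 3_000_000_000  # a logical index well past the 2^31-1 element cap

# Writing at a large offset creates a sparse file on most filesystems,
# so this does not actually consume 3 GB of disk or memory.
with open(path, "wb") as f:
    f.seek(offset)
    f.write(b"\x01\x02\x03")

# Read the same bytes back by seeking to the logical index.
with open(path, "rb") as f:
    f.seek(offset)
    data = f.read(3)

print(data)  # b'\x01\x02\x03'
```

The trade-off is obvious: every access becomes a seek-and-read instead of an in-memory index, so this only pays off when you process the data in large sequential pieces.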

 

Or you could make every element of the array significantly larger than a byte, such as U64, and then 16 GB should be doable. Bear in mind that memory fragmentation will also seriously limit the largest array you can allocate, as arrays must be in contiguous memory.

Message 2 of 9

@Intaris wrote:

You can only have 2^31-1 elements, not bytes.


Exactly. You can have, for example, a CDB array of 2^31-1 elements, and the array would occupy ~32 GB. What size do you need, and how much memory does your computer have?

 

Maybe you could use a map where the key is a 64-bit integer and the value has the type of whatever you want the array element to be. This is useful if the arrays are sparse.
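For illustration only, the sparse-map idea looks like this in Python, with an ordinary dict standing in for a LabVIEW Map keyed by a 64-bit integer (the helper names are invented):

```python
# Sparse "array": only the elements actually written consume memory.
sparse = {}

def write_element(index, value):
    sparse[index] = value

def read_element(index, default=0.0):
    # Unwritten indices read back as a default, like zeroed memory.
    return sparse.get(index, default)

write_element(5_000_000_000, 3.14)   # index far beyond 2^31-1
print(read_element(5_000_000_000))   # 3.14
print(read_element(7))               # 0.0
print(len(sparse))                   # 1 -> only one element is stored
```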

 

Can you explain what you need to do with that gigantic array? With any "workaround", you would basically need to rewrite every single array and signal-processing operation you want to use. In any case, efficiently operating on such large data structures will be expensive. Of course, you can forget about having indicators for it.

 

I am sure the problem can be solved by rethinking the approach.

Message 3 of 9

@Intaris wrote:

You can only have 2^31-1 elements, not bytes.


wrote:

Exactly. You can have, for example, a CDB array of 2^31-1 elements, and the array would occupy ~32 GB.


The maximum number of elements in an array in LabVIEW 64-bit is 2^31-1 (as the OP initially said).

The array element could be a cluster of several integers, which would increase the total number of bytes... until you reach the physical limit of your available memory.

Without knowing more about the context, I guess loading the data piece by piece from a file for processing seems more reasonable.

 

Regards,

Raphaël.

Message 4 of 9

Hi Joe,

 


@Joe_Guo wrote:

I am trying to allocate an array of bytes with more than 2 GB


My 2ct:

  • you could use an array of strings, with each string holding kiB to MiB of the whole "array of bytes"…
  • you could use an array of cluster of array[U8], with each "sub-array" holding kiB to MiB…
  • or you can simply read parts of the file (as suggested before): reading larger consecutive parts of files usually is quite fast nowadays…
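As a text-only sketch of the chunking ideas above (the chunk size and helper names are invented for illustration), the bookkeeping in Python would look like this:

```python
CHUNK = 1 << 20  # 1 MiB per sub-array; an arbitrary illustrative size

# chunk number -> bytearray; chunks are allocated lazily on first write,
# so no single allocation ever approaches the 2^31-1 element limit.
chunks = {}

def locate(i):
    # Map a 64-bit logical index to (chunk number, offset within chunk).
    return divmod(i, CHUNK)

def set_byte(i, value):
    c, off = locate(i)
    chunks.setdefault(c, bytearray(CHUNK))[off] = value

def get_byte(i):
    c, off = locate(i)
    return chunks[c][off] if c in chunks else 0

set_byte(3_000_000_000, 42)     # logical index well past 2^31-1
print(get_byte(3_000_000_000))  # 42
print(locate(3_000_000_000))    # (2861, 24064)
```

In LabVIEW the outer dict would be an array (or map) of clusters, each holding one U8 sub-array, with the same divide-and-remainder index math.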

Why do you need to hold more than 2GB of data in memory? (There might be good reasons…)

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 5 of 9

Is 2^31-1 the limit of total number of elements, or number of elements per dimension?

Would this work to emulate a very large 1D array?

[attachment: paul_a_cardinale_1-1755014465550.png]
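Since the attached snippet is an image, here is a guess at the index arithmetic such a 2D emulation would use, sketched in Python with a tiny row length just for demonstration:

```python
ROW_LEN = 4  # elements per row; tiny illustrative value

# A 2D array (list of rows) standing in for the big LabVIEW array.
rows = [[0.0] * ROW_LEN for _ in range(3)]

def set_1d(i, value):
    # A logical 1D index maps to a (row, column) pair.
    r, c = divmod(i, ROW_LEN)
    rows[r][c] = value

def get_1d(i):
    r, c = divmod(i, ROW_LEN)
    return rows[r][c]

set_1d(10, 99.0)   # logical index 10 -> row 2, column 2
print(get_1d(10))  # 99.0
```

Whether this works in LabVIEW hinges on the question above: if 2^31-1 caps the *total* element count rather than each dimension, a 2D array buys nothing.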

Message 6 of 9

Thanks for all the suggestions. I understand there are ways to deal with large arrays, but I am just wondering why LabVIEW x64 has a limit on the array size, and whether this limit still exists in newer LabVIEW versions (2021+).

Message 7 of 9

LabVIEW 2024 64-bit has the same behavior.

Indexes and dimension sizes for LabVIEW arrays are I32 by design (both on the array primitives and in their low-level representation in memory), hence the 2^31-1 limit (per dimension, as @paul_a_cardinale commented).
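In other words, the cap follows directly from signed 32-bit arithmetic:

```python
# The largest value a signed 32-bit integer (I32) can hold.
I32_MAX = (1 << 31) - 1
print(I32_MAX)  # 2147483647

# For a U8 (byte) array, that index range covers just under 2 GiB,
# which matches the ~2 GB ceiling the OP ran into.
print(I32_MAX / 1024**3)  # just under 2.0
```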

Message 8 of 9

@paul_a_cardinale wrote:

Is 2^31-1 the limit of total number of elements, or number of elements per dimension?

Would this work to emulate a very large 1D array?

[attachment: paul_a_cardinale_1-1755014465550.png]

I am pretty sure the total number of elements is limited, not elements per dimension.

 

A lot of things would break otherwise, e.g. reshaping to 1D with a length equal to the product of the dimension sizes.

 

One could do a 1D array of clusters, each containing a 1D array of DBL.

 

Message 9 of 9