09-08-2020 06:30 AM
Like Bert, I get this error as soon as the number of elements is greater than 2147483647 (0x7FFFFFFF).
It's curious that Add and Subtract are bugged while Increment and Decrement work.
This is on LabVIEW 2019 SP1 (19.0.1f3) with an i7-6700HQ.
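For reference, that threshold is the largest signed 32-bit integer, and one element past it a 32-bit count wraps negative in two's-complement arithmetic. A minimal Python illustration (my own sketch, not LabVIEW code):

```python
# The threshold in the report is the largest signed 32-bit integer:
# 2147483647 == 0x7FFFFFFF.
INT32_MAX = 2_147_483_647
assert INT32_MAX == 0x7FFFFFFF

def as_int32(n: int) -> int:
    """Interpret n modulo 2**32 as a signed 32-bit integer."""
    n &= 0xFFFFFFFF
    return n - 0x1_0000_0000 if n >= 0x8000_0000 else n

print(as_int32(INT32_MAX))      # 2147483647 (still fine)
print(as_int32(INT32_MAX + 1))  # -2147483648 (wrapped negative)
```

If some internal element count is held in a signed 32-bit value, anything past that boundary goes negative, which would fit the symptom.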
09-08-2020 06:59 AM - edited 09-08-2020 07:00 AM
@ngene wrote:
@FireFist-Redhawk wrote:
I ran your VI in 2017 and got the correct result. 2017 64 bit Professional, Windows 10 Enterprise, 64GB onboard RAM.
Interesting and more confusing.
These are the results we get for different array dimensions in LV17.
(dimension size 1) x (dimension size 2) >> Output
1k x 3M >> 1 (incorrect)
2k x 3M >> 11 (correct)
3k x 3M >> 11 (correct)
4k x 3M >> 1 (incorrect)
5k x 3M >> 11 (correct)
5.1k x 3M >> 1 (incorrect)
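Interestingly, the correct/incorrect pattern above lines up with the sign of the total element count when truncated to a signed 32-bit integer. This is my own hypothesis, not anything confirmed by NI; here is a minimal Python sketch checking it against the table:

```python
def as_int32(n: int) -> int:
    """Interpret n modulo 2**32 as a signed 32-bit integer."""
    n &= 0xFFFFFFFF
    return n - 0x1_0000_0000 if n >= 0x8000_0000 else n

# Observed results copied from the table above.
cases = {
    (1_000, 3_000_000): "incorrect",
    (2_000, 3_000_000): "correct",
    (3_000, 3_000_000): "correct",
    (4_000, 3_000_000): "incorrect",
    (5_000, 3_000_000): "correct",
    (5_100, 3_000_000): "incorrect",
}

for (d1, d2), observed in cases.items():
    # Hypothesis: the result is wrong exactly when the element count,
    # truncated to a signed 32-bit integer, comes out negative.
    predicted = "incorrect" if as_int32(d1 * d2) < 0 else "correct"
    print(f"{d1} x {d2}: predicted {predicted}, observed {observed}")
```

All six rows match the prediction, which at least makes a signed 32-bit element count a plausible suspect.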
Well first off, let's all agree that you can initialize an array with a total number of elements greater than max(I32). I initialized a 10k by 3M array (30 billion elements) just fine. So that settles that.
The Add function, though, is indeed behaving a little weird on large arrays. After I initialize a 2k x 3M array, the add stops taking effect starting at index1 = 568, index2 around 1,303,000 (I didn't want to crash LV by adding code to figure out exactly where):
Initializing a 3k x 3M array, the same thing happens at index1 = 136, index2 around 2,060,000:
But, BUT, initializing a 1k x 3M array, the add fully completes:
Not sure exactly what is happening, but seeing as I'm getting different results than the OP, I'm guessing computer hardware is making some sort of difference. What is your processor and onboard RAM?
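For what it's worth, the stopping points reported above land close to the total element count modulo 2**32, assuming row-major flattening of the indices. This is a back-of-envelope check of my own, not a confirmed explanation:

```python
MASK32 = 0xFFFFFFFF

# (rows, cols, index1, index2) as reported above; the index2 values
# were stated as approximate.
reports = [
    (2_000, 3_000_000, 568, 1_303_000),
    (3_000, 3_000_000, 136, 2_060_000),
]

for rows, cols, i1, i2 in reports:
    flat_stop = i1 * cols + i2           # flat index where Add stopped working
    truncated = (rows * cols) & MASK32   # element count modulo 2**32
    print(flat_stop, truncated, flat_stop - truncated)
```

Both stopping points agree with the truncated count to within a fraction of a percent, consistent with the operation processing only (total elements mod 2**32) elements. For the 1k x 3M case the truncated count would come out negative as a signed value, which might be why that case behaves differently, but that part is pure speculation.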
Saying "Thanks that fixed it" or "Thanks that answers my question" and not giving a Kudo or Marked Solution, is like telling your waiter they did a great job and not leaving a tip. Please, tip your waiters.
09-08-2020 07:26 AM - edited 09-08-2020 07:27 AM
@FireFist-Redhawk wrote:
@ngene wrote:
@FireFist-Redhawk wrote:
I ran your VI in 2017 and got the correct result. 2017 64 bit Professional, Windows 10 Enterprise, 64GB onboard RAM.
Interesting and more confusing.
These are the results we get for different array dimensions in LV17.
(dimension size 1) x (dimension size 2) >> Output
1k x 3M >> 1 (incorrect)
2k x 3M >> 11 (correct)
3k x 3M >> 11 (correct)
4k x 3M >> 1 (incorrect)
5k x 3M >> 11 (correct)
5.1k x 3M >> 1 (incorrect)
Not sure exactly what is happening, but seeing as I'm getting different results than the OP, I'm guessing computer hardware is making some sort of difference. What is your processor and onboard RAM?
Initially we got the feedback from a customer running LV20, but we're not sure about their HW.
Then, on our side, we identified and reproduced the bug on the following 2 PCs: