04-19-2010 05:44 PM
Hello,
I am trying to solve an Ax = b system, and have set up my algorithm to use the LU decomposition followed by the appropriate use of the "Solve Linear Equations" VI. I then check the error output from this VI, and if there is an error I fall back to the "PseudoInverse Matrix" VI.
I am doing this in a loop with a progressively larger matrix for each iteration.
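In text form, the logic is roughly the following (sketched here in Python/NumPy since the VIs are graphical; the function names are NumPy's, not the LabVIEW VIs):

```python
import numpy as np

def solve_with_fallback(A, b):
    """Direct solve with a pseudoinverse fallback, mirroring the
    VI chain described above."""
    try:
        # np.linalg.solve uses an LU factorization internally,
        # analogous to the LU + "Solve Linear Equations" step
        return np.linalg.solve(A, b)
    except np.linalg.LinAlgError:
        # Singular / ill-conditioned: fall back to the pseudoinverse,
        # analogous to the "PseudoInverse Matrix" step
        return np.linalg.pinv(A) @ b
```
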
This works great EXCEPT when my algorithm gets close to the maximum allowed memory (i.e., the iteration BEFORE I get an error message stating that the VI has run out of memory). Then, although I do NOT get an error code from the "PseudoInverse Matrix" VI, the values output from this VI are complete garbage (extremely large values).
Has anyone come across a similar issue before? Is there a way around this? In other words, I would like to know when/if the "PseudoInverse Matrix" VI will not provide valid output, so I can stop my algorithm one iteration beforehand.
Thanks!
04-20-2010 10:17 AM
Zamjir,
Can you post the VI and the matrix needed to replicate the issue? I didn't find any other instances of this happening, but that doesn't mean it isn't expected behavior. I understand that memory usage ramps up and, just before it tops out, the VI outputs large (incorrect) values. I don't know whether this is expected (perhaps it's just LabVIEW's way of letting us know that it's about to error out, in which case you could base a shutdown routine on those large values) or whether it is a bug. If you post the code so I can take a look at it, along with a screenshot of the "out of memory" error you see, I will see what I can find out for you.
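In the meantime, one way to catch the garbage output regardless of error codes is a residual check: accept the pseudoinverse result only if it actually satisfies Ax ≈ b. Here is a minimal sketch in Python/NumPy (the function name and tolerance are my own illustration, not anything built into LabVIEW):

```python
import numpy as np

def solution_is_valid(A, x, b, tol=1e-6):
    """Accept x only if the relative residual ||Ax - b|| / ||b||
    is small. This catches garbage output even when the solver
    itself reports no error. tol is a guess; tune it to your data."""
    residual = np.linalg.norm(A @ x - b)
    scale = np.linalg.norm(b)
    return residual <= tol * max(scale, 1.0)
```

If the check fails on iteration n, you could stop the loop and keep the result from iteration n - 1.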