Name of data in unwrapped variant input of malleable VI changes on second call

I'm seeing some pretty strange behavior in the course of my usual chicanery and I could use some more eyes on it if there are any here patient enough.

 

Firstly, I'm writing an XML wrapper/unwrapper.  The native LabVIEW one is fine, but I wanted to write my own that, while less robust, provides more readable XML and can tolerate changes in datatype more gracefully.  My problem is with the XML unwrapper, or rather the malleable VI that wraps it (I'll try not to overload this word too much).

 

Image 1

1.png

In image 1 (above), there are 3 VIs: the sandbox for unit testing my XML wrapper and unwrapper, the XML unwrapper malleable VI, and the XML unwrapper core VI.  I'll go through what's happening here.

 

First, the sandbox: my test data is the cluster constant named "cluster".  My XML wrapper VI takes this cluster as input and outputs the XML text seen in the string indicator named "XML".  Then the XML unwrapper malleable VI takes this XML text and the original cluster as inputs, and tries to output a cluster of the same structure with all the values populated from the XML.

 

Within the XML unwrapper malleable VI, the XML is input as a string (unsurprisingly) and the cluster is input as a variant (though this could conceivably be any datatype).  This malleable VI is mostly just a wrapper for the core VI, where all the real magic happens.  The core VI takes the same cluster (or whatever other datatype) and tries to create a binary string of that datatype using values from the XML.  The unwrapper malleable VI then unflattens that string into the input datatype and outputs it.
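
If it helps to picture the data flow in text form, here's a minimal Python sketch of the same pipeline (the function names and the use of struct as a stand-in for LabVIEW's flattened data are just my illustration, not real LabVIEW calls):

```python
import struct
import xml.etree.ElementTree as ET

def wrap_anything(name, cluster):
    """Like my XML wrapper: produce readable XML named after the cluster."""
    root = ET.Element(name)
    for key, value in cluster.items():
        ET.SubElement(root, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

def unwrap_core(xml_text, cluster, name):
    """Like the core VI: look up values by element name and build a binary string."""
    root = ET.fromstring(xml_text)
    fmt, values = ">", []
    for key, default in cluster.items():
        node = root.find(key) if root.tag == name else None
        if isinstance(default, float):
            fmt += "d"
            values.append(float(node.text) if node is not None else 0.0)
        else:
            fmt += "i"
            values.append(int(node.text) if node is not None else 0)
    return struct.pack(fmt, *values)

def unwrap_anything(xml_text, cluster, name):
    """Like the malleable VI: call the core, then unflatten back into the input type."""
    flat = unwrap_core(xml_text, cluster, name)
    fmt = ">" + "".join("d" if isinstance(v, float) else "i" for v in cluster.values())
    return dict(zip(cluster.keys(), struct.unpack(fmt, flat)))

cluster = {"voltage": 1.5, "count": 7}                 # stand-in for the "cluster" constant
xml_text = wrap_anything("cluster", cluster)
print(xml_text)                                        # <cluster><voltage>1.5</voltage>...
print(unwrap_anything(xml_text, cluster, "cluster"))   # {'voltage': 1.5, 'count': 7}
```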

 

With me so far?

 

Image 1 also shows the results of executing the sandbox as so configured.  Looking at the sandbox front panel, we can see that the cluster indicator "Anything Out" contains only default (zeroed) values.  This is bad; it should be populated with the values from "XML".  There is a simple reason for this, which I will explain.

 

We can't see anything from the malleable VI, because it is reentrant, but we can see the control values of the core VI, because it isn't reentrant.  Here we see the cluster name is "Anything In", the name of the control in the wrapper VI.  However, the cluster element in the XML is named "cluster", after the original cluster.  Since there are no elements named "Anything In" in the XML, the core VI doesn't find any appropriate values and outputs a binary string full of nulls.
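
To spell the mismatch out in text (element names here are just illustrative, not my actual data):

```python
import xml.etree.ElementTree as ET

# The wrapper tagged the XML after the original cluster's name...
root = ET.fromstring("<cluster><voltage>1.5</voltage><count>7</count></cluster>")

# ...but inside the malleable VI the data has been renamed to "Anything In",
# so the core VI searches for the wrong top-level tag and every lookup misses.
print(root.tag)                    # cluster
print(root.tag == "Anything In")   # False -> the core VI falls back to nulls
```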

 

The malleable VI surprise-renaming the data is the first issue, but I thought I had a workaround for that.  I remember seeing something in the OpenG toolkit years ago about unwrapping variants.  In some circumstances, the datatype in a variant can be LV variant, but if you "unwrap" the variant (this word is getting really overloaded now), the type in the variant becomes the original type.  That's not exactly what's happening here, as we can see the type in the core VI is cluster, but something similar is going on.

 

Image 2

2.png

Image 2 shows the workaround and the results of running the sandbox with it.  In the malleable VI, we "unwrap" the "Anything In" input so the type, and hopefully the name, turns back into the original (a cluster named "cluster").  This is done by converting the variant to another variant of type "void".  I don't know how it works either.  We run it and... Huzzah!  The values of the sandbox indicator "Anything Out" are populated, and they're the same values from the XML.  The name of the cluster in the core VI control is "cluster".  The variant unwrapper worked and now the XML unwrapper works!  Or does it...

 

Image 3

3.png

In image 3, I've changed the number of calls to the unwrapper malleable VI from 1 to 2.  It's the same code as before, with the same inputs both times, but now I'm getting different output.  We can see the "Anything Out" indicator is zeroed out again, and in the core VI control, we can see the name of the cluster has changed back to "Anything In".

 

So what's going on here?  Is "unwrapping" the variant not the right fix?  Why is the malleable VI renaming the data anyway?  Why does it work for a single call, but not for multiple calls?  There is something going on under the hood here, and I think only an NI software engineer can answer.

 

Anyway, thanks for reading about my silly problem.  Any comments or suggestions are appreciated.

 

Edit: also, I see the error wires and realize people might think an error is causing the VI to not actually execute a second time.  This is not the case.  I've inspected the output since taking the screenshot and there is no error.  I only added the error wire shift register to eliminate parallelism.

Message 1 of 9

I decided to strip this problem down to the fewest possible parts where the behavior is still observed.  Attached are three pictures demonstrating the same behavior as before.

 

Picture 1 shows the "Get Type Information" VI getting the name of the VIM's control instead of the name of the string constant in the sandbox.

1.png

 

Picture 2 shows what happens if you "unwrap" the variant within the VIM.  One call returns the name of the string constant, the other call returns the name of the VIM's control.

2.png

 

Picture 3 shows what happens if you increase the loop count from 1 to 2.  Now all 4 calls return the name of the VIM's control.

3.png

 

Here's a bonus.  If we replace the loop count constant with a control of value 1, both calls return the VIM's control name.

4.PNG

 

I've also attached these 2 files for anyone else to try to make sense of this.  The LabVIEW version is 2019 64-bit.  Any insight is appreciated.

 

Message 2 of 9

VIMs are probably not a good choice for this.

 

"Variant" is actually the expected output. If it sometimes outputs "String", than that's a bug. VIMs adapt there input, but the input should not change name (believe me, I tried). So you should always be retrieving the label of the VIM input, not what is connected to it.

 

If you make a normal VI with a variant input, it will work.  The variant gets its label from the labeled source.  But the VIM will adapt and keep the label of its input.  Theoretically.  Some inlining optimization magic might cause this to work in some situations, but it shouldn't.

Message 3 of 9

I did actually see that thread but wasn't entirely satisfied with the explanation given (also it sort of veered off into an NXG discussion).  I thought maybe I had found a workaround that was being held up by some bug, or, as you observed, a bug that was resulting in inconsistent behavior, either of which someone at NI might want to know about.

 

Making a normal VI with a variant input does indeed work, and in fact, that's exactly what my "core" VI is.  All the VIM is supposed to do (which only it can do) is take the core's binary string output and unflatten it to the target datatype.

 

The current solution is to do what you suggest: scrap the VIM and unflatten the core's output everywhere a call is made, which is not the worst, just some duplication.  I think really I'm just bothered that the VIM framework is tantalizingly close to acting like real primitives (such as the native unflatten from XML, which I was hoping to replace in many situations) and is the best new feature in LabVIEW in years, in my opinion, but its utility is limited by this one behavior.

Message 4 of 9

Note that a big problem with flattening clusters\classes (both to XML and with Flatten To String) is that the data changes (disappears) if the cluster\class is default.  An intentional 'feature', but it sure makes a programmer's life a living hell...

 

What are you actually trying to do here? I mean, what is the function of the VIM?

Message 5 of 9

There's going to be a lot of overloaded use of the word "unflatten" here.  Sometimes I mean unflatten from an XML string, sometimes unflatten from a binary string.  So try to keep the context in mind and hopefully this is intelligible.

 

In my first message, there are two functions: a Flatten To XML-like function that I call "WrapAnything", which doesn't need to be a VIM (the input is a variant and the output is a string), and an Unflatten From XML-like function that I call "UnWrapAnything", which does need to be a VIM, since the structured data input/output could be anything (cluster, array, string, numeric, enum).

 

(I'm not using the native XML functions because I want XML that is more compact/readable and more tolerant of differences in datatype.  For example, this is to be used to save/load configuration files.  However, if we add a new parameter to the configuration (a new element to the cluster), the old files become unreadable by the native XML unflattener.  This can be frustrating for the user if they have to reprogram an entire complicated configuration.  In my function, the new element is populated with a null value (since it is not found in the XML), but at least all the other elements will load, saving the operator time.)
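
As a minimal Python sketch of the tolerance I'm after (hypothetical field names; the real VI works on clusters, not dicts):

```python
import xml.etree.ElementTree as ET

def load_config(xml_text, prototype):
    """Fill every field found in the XML; a field missing from an old file
    (e.g. a newly added parameter) gets a null/default value instead of
    failing the whole load."""
    root = ET.fromstring(xml_text)
    loaded = {}
    for key, default in prototype.items():
        node = root.find(key)
        loaded[key] = type(default)(node.text) if node is not None else type(default)()
    return loaded

old_file = "<config><gain>2.0</gain><offset>0.1</offset></config>"
prototype = {"gain": 1.0, "offset": 0.0, "filter_hz": 50.0}   # filter_hz added later
print(load_config(old_file, prototype))
# {'gain': 2.0, 'offset': 0.1, 'filter_hz': 0.0} -- old values load, new field is null
```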

 

The unflatten "core" VI (not a VIM) takes in an XML string and a target data structure (as a variant) and generates an appropriate binary string.  The function of the unflatten VIM is to take that binary string and unflatten it to the target data type.  The "core" VI "knows" the structure of the data, but can't output it as that type.  Only the VIM can do that.

 

So the variant input is needed for two distinct purposes here: (1) to instruct the core VI what element names to look for in the XML and to describe the structure of the binary string to be generated from those values, and (2) to inform the VIM how to unflatten that binary string.

 

This is why the name of the target data type is important.  It is, presumably, the wrapping tag around the entire XML string, and because of the recursive nature of the core VI, this name is the first thing it looks for.  I could conceivably make modifications to the unflattener core to ignore this wrapping tag and the "top-level" data name, but that is not my preference at the moment.  For now I would rather scrap the VIM and couple the UnFlattenFromBinary primitive with every call to the UnWrapAnythingCore VI.

Message 6 of 9

But why? What is the purpose of it all?

 

I am assuming you want to replace or retrieve some values from a cluster?

 

I can of course study it to find out (it might be obvious), but I really have to get some work done.

Message 7 of 9

Remake the Unwrap part.

Send in the Cluster and use Variant To Data to convert it to an array of Variants.

Loop through this array and check the datatype, convert each tag's value to that datatype with the right bitness, and build up a binary string.

If a cluster is found, do a recursive call.

When all tags are read you should have a binary string that can be converted through Variant To Data back to the original Cluster.
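
In rough textual form (a Python sketch with hypothetical tag names; struct stands in for the flattened-data format), the idea is something like this:

```python
import struct

def build_binary(prototype, tags, prefix=""):
    """Walk the target structure, read each tag with the right bitness,
    recurse into sub-clusters, and accumulate one flat binary string."""
    flat = b""
    for name, default in prototype.items():
        path = prefix + name
        if isinstance(default, dict):                               # nested cluster -> recurse
            flat += build_binary(default, tags, path + ".")
        elif isinstance(default, float):
            flat += struct.pack(">d", float(tags.get(path, 0.0)))   # DBL
        elif isinstance(default, int):
            flat += struct.pack(">i", int(tags.get(path, 0)))       # I32
        else:
            raise TypeError("unhandled type for " + path)
    return flat

prototype = {"gain": 0.0, "limits": {"low": 0, "high": 0}}
tags = {"gain": "2.5", "limits.low": "1", "limits.high": "9"}
flat = build_binary(prototype, tags)
print(struct.unpack(">dii", flat))   # (2.5, 1, 9) -- ready for Variant To Data in LabVIEW
```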

 

I've made a general ini-file reader that should be easily modified for this. (It's probably fairly similar to OpenG's version)

 

This is a good reference. Get-Cluster-Names

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 8 of 9

@Yamaeda wrote:

I've made a general ini-file reader that should be easily modified for this.


Me too. It's here.

Message 9 of 9