LabVIEW Idea Exchange

philipbear

Asynchronous Call on Malleable VIs

Status: Declined

National Instruments will not be implementing this idea. See the idea discussion thread for more information.

Bottom Line Up Front: The Asynchronous Call functions should preserve the type-adaptation behavior of Malleable VIs.

 

Background: LabVIEW 2017 added Malleable VIs (https://www.ni.com/docs/en-US/bundle/labview/page/malleable-vis.html), which allow a VI to adapt to its inputs at edit time as long as those inputs don't break the subVI's block diagram. When attempting to start an Asynchronous Call on a Malleable VI, the input terminals are forced to the types the VI was saved with. This is most likely due to the Strictly Typed VI Reference (https://www.ni.com/docs/en-US/bundle/labview/page/creating-strictly-typed-refnums.html) required to start the call. This type of reference loads the connector pane of the selected VI, which seems to lock it to the saved terminal data types instead of allowing the Malleable VI functionality through.
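For readers more at home in text-based languages, a Malleable VI behaves much like a C++ function template: the body is written once, and a concrete, fully typed copy is generated for each caller at compile/edit time. A rough, hypothetical analogy (this is not LabVIEW code, and the function name is invented for illustration):

```cpp
#include <string>

// A malleable VI is loosely analogous to a C++ function template:
// the body is written once, and the compiler stamps out a concrete
// instance for each data type wired to it -- at compile time only.
template <typename T>
T pick_larger(T a, T b) {
    return (a > b) ? a : b;  // works for any T that defines operator>
}

// Each differently typed call site forces the compiler to generate a
// separate, fully typed instantiation, much as wiring a new type into
// a .vim's caller generates a new concrete instance at edit time:
//   pick_larger<int>, pick_larger<double>, pick_larger<std::string>, ...
```

The key point of the analogy is that the instantiation happens during compilation, which is exactly where the Asynchronous Call proposal runs into trouble later in this thread.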

 

Proposal: Malleable VIs are a huge leap toward highly reusable code while maintaining the strict-type paradigm standard to LabVIEW. By extending that functionality one step further, to the VI reference/Asynchronous Call functions, the possibilities for highly parallelized, "type independent" code frameworks would be greatly improved.

 

Example: Suppose a framework is desired that builds, deploys, and maintains many connections to a separate application, for example over TCP/IP or Network Streams. By handling each connection in its own asynchronous VI, setup and maintenance time can be highly parallelized. These connections could support many different types of data transfer, but each case would need to be coded and maintained separately. With asynchronous calls to Malleable VIs, a single handling VI able to adapt to arbitrary data types internally would be sufficient for each communication method, greatly reducing VI maintenance time and risk.

 

Conclusion: The Asynchronous Call functions should preserve the type-adaptation behavior of Malleable VIs.

Philip Bear
Certified LabVIEW Developer
8 Comments
AristosQueue (NI)
NI Employee (retired)

TL;DR: This is a fun dream but impossible to make real... the feature you need instead is LabVIEW interfaces/traits.

 

Details: As much as I wish this could happen, I literally have no idea how it could ever possibly be done in any sort of useful way. It's easy to state what we wish would happen, but there are some logical contradictions in the request.

 

If you wire a VI reference to a Call By Ref or an Asynch Call By Ref node, you get a connector pane. First of all, there's no way to know whether that reference is for a malleable VI or not. But let's say we made that part of the strict typing. There's still no way to know at compile time what VI will be passed in at runtime. It could be any VI. Which means there's no way for the (A)CBR node to know whether a given set of wired types will be usable by the malleable VI that is eventually passed in at runtime.

 

Let's say we moved that evaluation to runtime -- in other words, allow the (A)CBR to accept ANY wires if the VI refnum type encodes "I guarantee this will be a malleable VI at runtime." That would seriously blow away type safety, but we could do that. But then we would face the next barrier: there's no compiler in the runtime engine. VIs have to be compiled to run. So even if the malleable VI could adapt to the provided types in the editor, there would be no way to do that adaptation and do the code generation in the runtime environment.

 

In short, the only way to make this happen is to abandon wire type safety, move a bunch of compile errors to be runtime errors, and bloat the runtime engine with the complete LabVIEW compiler. Or LabVIEW would have to become an interpreted language, at least for those VIs. Those are such substantial downsides that I would not encourage LabVIEW to ever do this.

 

The long-requested feature known as "interfaces" or "traits" allows for this kind of run-time polymorphism in a way that doesn't require recompilation at runtime AND preserves type safety. It would not allow a single VI to adapt on primitive types (numerics, strings, Bools, arrays, clusters), but would allow a single VI to adapt to different unrelated LabVIEW classes, each of which implements a common interface in a way that makes unified execution possible. If you want interfaces or traits in LabVIEW, please send that feedback through as many communication channels as you have to National Instruments. It will only happen if enough users say, "This would be useful for me."
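As a rough text-language sketch of what AQ describes (the class and method names here are invented for illustration, not an actual LabVIEW API), interface-based run-time polymorphism in C++ looks like this: one already-compiled caller dispatches to whichever implementation arrives at runtime, with no recompilation and no loss of type safety.

```cpp
#include <string>

// The "interface": a contract that unrelated classes can implement.
struct Transport {
    virtual std::string send(const std::string& msg) = 0;
    virtual ~Transport() = default;
};

// Two unrelated implementations, analogous to the TCP/IP and
// Network Streams connection handlers from the original example.
struct TcpLink : Transport {
    std::string send(const std::string& msg) override {
        return "tcp:" + msg;
    }
};

struct NetworkStreamLink : Transport {
    std::string send(const std::string& msg) override {
        return "stream:" + msg;
    }
};

// Compiled exactly once, yet works for every current and future
// Transport implementation -- run-time polymorphism with full
// type safety and no runtime code generation.
std::string broadcast(Transport& t, const std::string& msg) {
    return t.send(msg);
}
```

Note the trade-off AQ describes: unlike the template analogy for malleable VIs, this adapts only across classes that implement the common interface, not across primitive types, but it needs no compiler at runtime.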

AristosQueue (NI)
NI Employee (retired)

One more point: It might be possible to have a "generate specific malleable VI instance" node that is a strict VI reference that also is wired with types to fill in the types of the VI terminals. It would be a very strange and complex node, kind of like the halo that you get on a subVI node if you right click on it and choose "Enable Database", but it would handle the case you're asking for here. Fairly surreal, and something I would likely only embark on if I saw a fairly large swath of users needing the functionality.

Darren
Proven Zealot
Status changed to: Declined

National Instruments will not be implementing this idea. See the idea discussion thread for more information.

philipbear
Member

Unfortunately there is a fine line between "can be implemented in 5 minutes" and "impossible to implement ever". I figured it was either easy or impossible. I'm not sure I completely follow your description of "interfaces" and "traits". Is it kind of like Java interfaces, where a class can only inherit from one parent but can implement the methods of others through an interface? I think that would be quite useful. I'll see if I can't find the threads around on it.

 

Referring to your last point, I had thought it might be something that could be handled within a property node, but that still has the potential for the runtime issues you mentioned. I can't say I picture large swaths of users needing this particular functionality. I appreciate you taking the time to explain, by the way. Understanding better how the strict reference works, I get why it can't be done.

Philip Bear
Certified LabVIEW Developer
AristosQueue (NI)
NI Employee (retired)

That hypothetical node I mentioned might look something like this.

[Attached image: Untitled.png]

AristosQueue (NI)
NI Employee (retired)

I've got various posts around on ni.com and lavag.org discussing interfaces and traits. It's a long-held dream of mine.

crossrulz
Knight of NI

Here is the Idea Exchange for LVOOP Interfaces.


thols
Active Participant

That link to the Idea Exchange should be spread to everyone that wants interfaces, and everyone that does not yet know they want them. Please help AQ fulfil his dream.

 

Until then, there are implementations of interfaces in G# and GOOP to play with and give feedback on. There was also a presentation (by Andrei Zagorodny) at the European CLA Summit, with a follow-up discussion and code-review session on how to improve them, and AQ used malleable VIs in his follow-up example. All of these templates have drawbacks that you will need to evaluate and either accept or not, and they won't come close to a native interface implementation in LabVIEW.

 

 

Certified LabVIEW Architect