LabVIEW, ActiveX, one EXE to work with two OLBs(?)



@rolfk wrote:


@tbd wrote:
 
Hi Rolf,
      Not sure what you meant by "in this way you can only support one Office version in a specific build application" - the example supports two OLBs (the call to 11.0 is in a different case), but the method could support an unlimited number of different Office versions.
 
      My [likely] mistake re: "late binding" comes from this post, which discusses LabVIEW polymorphism (see the post by Greg McKaskle). A search on this subject shows other (VB) programmers encounter the same problem, which they solve through explicit late-binding syntax.
Cheers!


I was referring to the Report Generation Toolkit. The different Office version libraries contain the same VIs compiled for a specific Office version. You include the one you want to support on the target in the build, but you cannot include multiple of them in the same application.

Late binding means a lot of things to a lot of different people. Basically it means that the linking (binding) is not done at load time but at the first time the function is called, and in that sense LabVIEW does do this in ActiveX, if I'm not mistaken. That doesn't solve the problem of an ActiveX method changing its calling interface, since LabVIEW defines the calling interface at compile time (except probably for Variant-defined parameters, which LabVIEW can adapt to at runtime). For non-Variant parameters, or when there is suddenly an extra parameter or one less, LabVIEW has to throw in the towel and refuse to run that method, since the call interface was compiled and prepared at build time and cannot change at runtime. The same is true for C(++), where the compiler defines the method interface, and as it seems even Visual Basic does the same.

Of course you can prepare your code to run with different versions by implementing all the method call variations and detecting which one is necessary at runtime. That is what you do now, and while this is late binding too, it is not the only form of late binding. Even Windows DLLs have delay-load imports (which is also a form of late binding), meaning Windows will only initialize the function import stub the first time the function is used. This is still strictly linked, but it was introduced to allow for circular references between DLL imports. Without delay loading, Windows would get into trouble loading a DLL that references another DLL that references back to our first one: the load would fail, since all direct imports need to be fully satisfied before the load of a DLL can succeed.
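For illustration, the same idea done by hand in C(++) looks roughly like the sketch below. It is manual late binding, not the real delay-load mechanism (that lives in the linker), and the DLL and function names are made up:

    #include <windows.h>

    // Function pointer type for the (made-up) export we want to bind lazily.
    typedef int (WINAPI *SomeFunctionPtr)(int);

    int CallSomeFunction(int arg)
    {
        static SomeFunctionPtr fn = nullptr;

        if (fn == nullptr)                        // bind on the first call only
        {
            HMODULE mod = LoadLibraryW(L"somelib.dll");
            if (mod != nullptr)
                fn = reinterpret_cast<SomeFunctionPtr>(
                         GetProcAddress(mod, "SomeFunction"));
        }

        if (fn == nullptr)
            return -1;                            // library or export not found

        return fn(arg);                           // the late-bound call
    }

The binding still has to match the function's actual signature; being late only changes when the binding happens, not what it binds to.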

Rolf Kalbermatter



Very much appreciate this insight. :)
Call it "source-grapes", but, if I can resolve the parameter-list discrepancy by merely re-selecting a method in the editor, perhaps the run-time engine could be smarter and resolve "calling interface" differences at run-time, assuming the selected method and supplied parameter names are supported on the target. (Even if possible, I wouldn't expect this method to be efficient ).
 
Thanks/Cheers!
"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
Message 11 of 15


@tbd wrote:

Call it "source-grapes", but, if I can resolve the parameter-list discrepancy by merely re-selecting a method in the editor, perhaps the run-time engine could be smarter and resolve "calling interface" differences at run-time, assuming the selected method and supplied parameter names are supported on the target. (Even if possible, I wouldn't expect this method to be efficient ).
 
Thanks/Cheers!



Ahhhhhh, but LabVIEW is a compiler!!!! And such changes require changes to the compiled code, so no, I do not think that is something LabVIEW could really support.

Rolf Kalbermatter
Message 12 of 15


@rolfk wrote:


@tbd wrote:

Call it "source-grapes", but, if I can resolve the parameter-list discrepancy by merely re-selecting a method in the editor, perhaps the run-time engine could be smarter and resolve "calling interface" differences at run-time, assuming the selected method and supplied parameter names are supported on the target. (Even if possible, I wouldn't expect this method to be efficient ).
 
Thanks/Cheers!



Ahhhhhh, but LabVIEW is a compiler!!!! And such changes require changes to the compiled code, so no, I do not think that is something LabVIEW could really support.

Rolf Kalbermatter


Of course the edit step requires a compile.  What I'm postulating is that an optional way of invoking the method could be implemented at run-time (when necessary), using an invoke-by-name strategy instead of a statically-linked one(?)
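In text form (C++ against the raw COM interfaces), the kind of invoke-by-name call I'm imagining looks roughly like the sketch below; the CallByName helper and the IDispatch pointer are hypothetical, not anything LabVIEW actually exposes:

    #include <windows.h>

    // Invoke a parameterless Automation method on `disp` purely by name.
    HRESULT CallByName(IDispatch *disp, LPOLESTR methodName)
    {
        DISPID dispId = DISPID_UNKNOWN;

        // 1) Resolve the name to a DISPID at run time.
        HRESULT hr = disp->GetIDsOfNames(IID_NULL, &methodName, 1,
                                         LOCALE_USER_DEFAULT, &dispId);
        if (FAILED(hr))
            return hr;            // this particular object doesn't know that name

        // 2) Call through the generic dispatch interface.
        DISPPARAMS noArgs = { nullptr, nullptr, 0, 0 };
        return disp->Invoke(dispId, IID_NULL, LOCALE_USER_DEFAULT,
                            DISPATCH_METHOD, &noArgs, nullptr, nullptr, nullptr);
    }

The price is that every call goes through GetIDsOfNames plus Invoke instead of a direct, compiled-in call.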
"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
Message 13 of 15
Follow-up:
This link: http://support.microsoft.com/kb/245115 is titled "Using early binding and late binding in Automation".
 
LabVIEW behaves as if it uses a hybrid form of binding called "DispID" binding.  In the article "How Does LabVIEW Expose Its ActiveX Methods and Properties?" the term "DispID" appears, though "DispID binding" is not explicitly stated.
 
According to the MS article, with DispID binding: "If the COM object is known at design time, the dispids for the functions that are called can be cached and passed directly to IDispatch::Invoke without the need to call GetIDsOfNames at run time. This can greatly increase performance, because instead of making two COM calls per function, you only need to make one."
 
This would explain why LabVIEW EXEs behave as if statically bound to a specific version of the COM interface.  The article goes on to say:
 
"Late binding is still useful in situations where the exact interface of an object is not known at design-time. If your application seeks to talk with multiple unknown servers or needs to invoke functions by name (using the Visual Basic 6.0 CallByName function for example) then you need to use late binding. Late binding is also useful to work around compatibility problems between multiple versions of a component that has improperly modified or adapted its interface between versions."
 
So while (it appears) LabVIEW is binding to the specific COM interface at compile-time, it's possible for other languages (including the language LabVIEW was written in) to avoid this and invoke functions by name at run-time.  If only LabVIEW allowed Objects to be declared/instantiated that way! ;)
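As a rough C++ sketch (the object and the "Save" method are just placeholders), DispID binding simply hoists the name lookup out of the call and caches the result:

    #include <windows.h>

    static DISPID g_saveId = DISPID_UNKNOWN;   // looked up once, then reused

    // Resolve the (hypothetical) "Save" method once, e.g. right after the
    // object is created; this is the only GetIDsOfNames call ever made.
    HRESULT CacheSaveId(IDispatch *disp)
    {
        LPOLESTR name = const_cast<LPOLESTR>(OLESTR("Save"));
        return disp->GetIDsOfNames(IID_NULL, &name, 1,
                                   LOCALE_USER_DEFAULT, &g_saveId);
    }

    // Every later call is a single Invoke with the cached DISPID.
    HRESULT CallSave(IDispatch *disp)
    {
        DISPPARAMS noArgs = { nullptr, nullptr, 0, 0 };
        return disp->Invoke(g_saveId, IID_NULL, LOCALE_USER_DEFAULT,
                            DISPATCH_METHOD, &noArgs, nullptr, nullptr, nullptr);
    }

If the DISPIDs are captured when the VI is built rather than when the object is created, no name lookup is left at run time at all, which would explain the statically-bound behavior of the EXE.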
 
Cheers.
"Inside every large program is a small program struggling to get out." (attributed to Tony Hoare)
Message 14 of 15


@tbd wrote:

So while (it appears) LabVIEW is binding to the specific COM interface at compile-time, it's possible for other languages (including the language LabVIEW was written in) to avoid this and invoke functions by name at run-time.  If only LabVIEW allowed Objects to be declared/instantiated that way! ;)
 
Cheers.



Unfortunately it is not that easy, I think. Late binding may be a technique that allows an environment to adapt to incompatible versions of ActiveX interfaces (provided the method name doesn't change, which in the case of Office isn't true either). As they say themselves, that is a rather big performance degradation, so it's definitely not what one wants LabVIEW to do by default.

That still requires the calling environment to adapt to the new interface, and here lies the real problem. If an additional parameter was added to a method, or a datatype was modified, that would result in a modification of the Method node on the diagram: a very clear edit-time action that LabVIEW is not designed to allow in a compiled application without recompiling.

Adding an interface here that could translate between what the LabVIEW program was compiled to use and what late binding has discovered is necessary would be a very complicated piece of software, with lots of limitations and an even bigger performance degradation. Where should it stop?

Only adapting between different datatypes? That would probably be doable by treating all parameters as Variants, since Variants can deal with datatype differences at runtime. The result might be rather unexpected, however, since datatype translation (for instance string <-> non-string) could be anything but unambiguous.
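Just to illustrate the kind of runtime coercion I mean, and why it is ambiguous, here is a rough C(++) sketch using the standard Variant conversion call; the values are of course made up:

    #include <windows.h>
    #include <oleauto.h>

    // Try to reinterpret a string value as a 32-bit integer at run time.
    HRESULT CoerceToLong(const wchar_t *text, long *out)
    {
        VARIANT src, dst;
        VariantInit(&src);
        VariantInit(&dst);

        src.vt = VT_BSTR;
        src.bstrVal = SysAllocString(text);

        // "42" converts cleanly to 42, "hello" fails with DISP_E_TYPEMISMATCH,
        // and "42.7" silently becomes 43: exactly the kind of ambiguity I mean.
        HRESULT hr = VariantChangeType(&dst, &src, 0, VT_I4);
        if (SUCCEEDED(hr))
            *out = dst.lVal;

        VariantClear(&src);
        VariantClear(&dst);
        return hr;
    }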

Allowing additional or removed parameters? Hmm, a removed one would also not be so difficult: just drop it, unless of course it is an output parameter! Additional ones? Oh well, give them the LabVIEW default value, which in most cases is not what the method would expect, so the call would fail anyhow with a runtime error or something.

All in all, there are lots of problems here that can't be solved well for a lot of situations, so it would be one of those features that creates more problems than it solves.

Rolf Kalbermatter
Message 15 of 15