
Linking to .Net nodes - "An error occurred trying to load this assembly"


So it works fine if the assembly is in the same directory as the executable?  


Yep, it turns out it does work fine when the assembly is in the same directory as the executable.

 


I wouldn't be too surprised, since LabVIEW needs to validate everything at edit time as well as run time; the VI is compiled as you edit it.


Here's why that explanation doesn't quite make sense to me...

 

  1. The assembly and the assembly's dependencies have already been compiled.  Compiling the VI doesn't cause the assemblies to recompile.
  2. If AssemblyA has a dependency on AssemblyB and AssemblyB is missing, that will generate a run-time error, not an edit-time error.  In Visual Studio I can still have a reference to AssemblyA in my source code and compile that into an executable.  In other words, the Fusion engine (the .Net component responsible for finding the correct assembly) doesn't execute the code of the assembly it is looking for or trace through all of its dependencies; it just finds it and analyzes it using Reflection.
  3. LV documentation indicates that when it is looking for an assembly, that task is delegated to the Fusion engine.

Since LV's .Net functionality is broken if the dependency is missing, and since Reflection does not require dependencies to be present, that implies LV is actually executing the assembly's code at edit time.  I haven't been able to find any documentation indicating that is true.  I really hope it isn't, as that would have major repercussions on our ability to develop software using licensed third-party run-time components, and a huge influence on how I write assemblies designed for LV developers.

 

For example, we have a robot we use that requires a set of licensed libraries at runtime.  It is not practical for us to do all our dev work on the computer that controls the robot, so the plan has been to write a thin proxy assembly that developers can install on their computers.  By programming against the proxy instead of the actual libraries we avoid requiring every developer to install the robot application suite just to develop a robotic application.

 

Message 11 of 18

Hi Daklu,

 

I'm not quite sure whether LabVIEW executes anything in the assembly at edit time. I'll have to investigate this a little further and see if I can find an answer.

Jared S.
Applications Engineering
National Instruments
Message 12 of 18

 


@Daklu wrote:

I don't know which version of the framework was used for development, but framework versions 2.0 - 3.5 all compile to the CLR version 2.0.  I have the following versions of the framework installed:

 

I can add a reference to the dlls from Visual Studio even if they aren't in the executable's directory.  I get all the intellisense and namespace information in the dev environment.  Visual Studio appears to reflect remote.dll for the necessary information and stop there.  Perhaps LabVIEW loads remote.dll AND tries to load all of the dll's dependencies, even though they're not needed until run time?


I have a hunch that LabVIEW is not, or not only, using Reflection to get at the .Net component during edit time. Most likely it attempts to really load the component into memory. LabVIEW is most likely not attempting to load any dependencies of the component itself, since Windows will do that when an application requests a component to be loaded.

 

 

The question of why LabVIEW loads the component rather than only using Reflection to get at its information and build the necessary link data might be interesting, but only in an academic way. One reason might be that LabVIEW's .Net functionality was developed for .Net 1.1, and Reflection alone might not have been a (workable) option then. It is obvious that .Net, as it is now implemented in LabVIEW, requires a fully loadable assembly with all its dependencies properly resolved, even at edit time of the VI. With the current tight coupling of LabVIEW source and compiled binary code in one VI, this seems to me a fairly logical, albeit not always desirable, requirement.

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 13 of 18

 


Daklu wrote:

 

  1. If AssemblyA has a dependency on AssemblyB and AssemblyB is missing, that will generate a run-time error, not an edit-time error.  In Visual Studio I can still have a reference to AssemblyA in my source code and compile that into an executable.  In other words, the Fusion engine (the .Net component responsible for finding the correct assembly) doesn't execute the code of the assembly it is looking for or trace through all of its dependencies; it just finds it and analyzes it using Reflection.
  2. LV documentation indicates that when it is looking for an assembly, that task is delegated to the Fusion engine.

Finding an assembly and loading it can be quite different things. Most likely LabVIEW lets the Fusion engine find the assembly, and may also get most of the information necessary to build the correct link records to the assembly that way. But when the VI is pre-compiled (which happens regularly during edit time), it will most likely load the assembly into memory, and unresolved dependencies can make that loading fail, especially if those dependencies are unmanaged code that is not marked to be delay loaded (and the only person who can change that is the developer of the assembly that uses the unmanaged code).

 

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 14 of 18

@rolfk wrote:

 

...it will most likely load the assembly into memory, and unresolved dependencies can make that loading fail, especially if those dependencies are unmanaged code that is not marked to be delay loaded


 

Thanks for the reply Rolf.  I have almost no experience developing unmanaged Windows code.  Whereas .Net assemblies are delay loaded by default, it sounds like unmanaged code has to be told specifically to delay load its dependencies.  Is changing a library to delay load its dependencies as simple as using a different compiler option, or does it require source code changes?  (Other than inserting run-time error handling if the load fails.)

 

I'm assuming the calling code gets to decide whether the dependency is delay loaded, not the called library.  Is that right?

Message 16 of 18
Solution
Accepted by topic author Daklu

 


@Daklu wrote:
Is changing a library to delay load its dependencies as simple as using a different compiler option, or does it require source code changes?  (Other than inserting run-time error handling if the load fails.)  

I'm assuming the calling code gets to decide whether the dependency is delay loaded, not the called library.  Is that right?


I have no experience with .Net at all, but in C/C++ there are basically two forms of delay loading. One is explicit: you write code that does the LoadLibrary() and GetProcAddress() calls for all the imported functions. I have done that in various projects to fully control when specific libraries are loaded, as it often doesn't make sense to load a whole hierarchy of libraries for the one-in-a-million case where a function might first be needed two years after the process started. By creating my own dynamic import interface I have complete control and can fail gracefully if the dependency can't be loaded, for one reason or another, at the moment I need it.
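A minimal sketch of that explicit approach; robot.dll and RobotMove are hypothetical stand-ins for a licensed library and one of its exports:

// Explicit (manual) delay loading: nothing touches robot.dll until this
// wrapper is first called, so the process starts fine even on machines
// where the library (or one of its own dependencies) is missing.
#include <windows.h>
#include <stdio.h>

typedef int (__cdecl *RobotMoveFn)(double x, double y, double z);

int CallRobotMove(double x, double y, double z)
{
    HMODULE lib = LoadLibraryA("robot.dll");
    if (lib == NULL)
    {
        // Fail gracefully instead of taking down the whole process.
        printf("robot.dll could not be loaded (error %lu)\n", GetLastError());
        return -1;
    }

    RobotMoveFn move = (RobotMoveFn)GetProcAddress(lib, "RobotMove");
    if (move == NULL)
    {
        printf("RobotMove not found in robot.dll\n");
        FreeLibrary(lib);
        return -1;
    }

    int result = move(x, y, z);
    FreeLibrary(lib);
    return result;
}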
The other option, in newer Visual C++ versions, is a specific linker switch that replaces the addition of the static import library to the project. Simply linking to the import library unconditionally imports the library into the current process and fails the loading and instantiation of the entire process if even one dependency can't be satisfied. Instead, you specify the shared library name with a special switch, /delayload:<dllname>, and the linker creates the necessary code to load the library and link the individual imported functions at runtime, as soon as they are first called. The Visual C++ linker documentation for /delayload is a good starting point. I'm not sure how that applies to the whole unmanaged side of .Net, though.
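A sketch of the linker-based variant; file and library names are again hypothetical, and delayimp.lib must be linked in to supply the delay-load helper:

// consumer.cpp -- written exactly as with ordinary static linking;
// robot.dll is only mapped into the process when RobotMove() is first
// called, so merely loading this module no longer fails if the DLL
// is absent.
#include "robot.h"   // hypothetical vendor header declaring RobotMove()

int main(void)
{
    return RobotMove(1.0, 2.0, 3.0);  // first call triggers the delayed load
}

// Build (Visual C++, names hypothetical):
//   cl /c consumer.cpp
//   link consumer.obj robot.lib delayimp.lib /DELAYLOAD:robot.dll

Note that if the delayed load fails at that first call, the helper raises a structured exception, so a caller that wants to degrade gracefully still has to handle that case.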

 

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 17 of 18