NI TestStand

Minimizing deployment time with multiple deployments & LabVIEW

I'm in the process of architecting a new deployment scenario using TestStand 2010 and LabVIEW 2010.  Our scenario includes a large number of tests for many products, so we're worried about the amount of time it will take to deploy.

 

One of the things we'd like to implement is multiple deployments: We'd like to only deploy the code for, say, one product family, instead of the entire body of tests.  I can see how this would work if there were no code shared between the deployments, but how would this scenario handle a library of common code?

 

I envisioned the following scenario:

Product A has some sequences and LabVIEW VIs (in a Project library) associated with it

Product B has some different sequences and LabVIEW VIs (also in a Project library) associated with it

A common Project library contains VIs called by the VIs above

Three different deployments are done separately (Product A, Product B, Common)

 

In the above scenario, it seems like all the links between the Product A code and the common code would be broken.  How does one go about updating a subset of the LabVIEW code in a TestStand deployment?  Or am I forced to re-deploy the entire body of code?

Message 1 of 8

Hi JoeDG,

 

In your scenario, when you build the Product A deployment, it will contain all of the dependencies associated with the project unless you specifically tell it not to include some files.  The Product B deployment would do the same.  It would then be unnecessary to have a Common deployment, because all dependencies would already be installed.

 

If you would like, you could specify files for the deployments to ignore and then create another deployment to install those files, but that sounds like unnecessary work.  The most efficient approach will likely be to use as few deployments as possible.  You can then use options like "Remove Block Diagrams" and "Consolidate Files Shared by Projects", which reduce the size of the VIs and prevent multiple copies of them from being installed.

 

[Attached screenshot: TestStandBuildOptions.PNG]

 

If you would like to create a common directory into which shared files are installed, you could manually set the Installation Directory of your common files in all of your projects to the same directory.  As long as you don't enable "Force File to Install", the installer will not overwrite the common files if they already exist when you install the deployment.  Also, the links between the sequences and the VIs will not be broken, since the VIs will be located in the expected directory.

 

[Attached screenshot: TestStandDeploymentOptions.PNG]

 

How large are the deployments you are looking to create?  It seems to me that the installation time for a deployment would be negligible compared to the development time.  I may be thinking too small, but I have a hard time imagining any deployment that would take up more than 1 GB of space, and even an installation of that size would likely finish in under 30 minutes.  Do you have any estimates of the deployment sizes you will have?  Have you run into problems in the past with installations taking too long?

 

Regards,

 

Brandon V.

Applications Engineering

National Instruments

www.ni.com/support

 

 

Message 2 of 8

Brandon,

 

Thanks so much for the thoughtful answer.  You're on the right track: I need the ability to control where my files are deployed, so that I can specify common folders for the shared elements.  Ideally, it would be similar to a source distribution in the LabVIEW Project Explorer, where you have much more control over where everything ends up, including keeping the same folder hierarchy.  In the TS deployment utility, there doesn't seem to be this level of control.

For instance, when I try to deploy some sequences that call VIs contained in LV projects, the deploy tool wants to put all the VIs for each project (including the vi.lib files!) in their own folder called "VIs for X", where X is the project name.  In the deployment configuration, I don't seem to have the ability to select a different destination for these files (the options are greyed out, as you'll see below).  This will replicate the vi.lib files once for every project they're used within.

 

[Attached screenshot: TS Installer.JPG]

 

Our codebase comes in at just under 1 GB.  In our manufacturing environment, 30 minutes is an eternity.  But that's only one motivator for desiring multiple deployments: the code includes tests for multiple product families that are worked on by multiple developers, so it doesn't make sense that a minor code change in one test of one product flavor in one product family should require re-deploying the entire code base.  There is just too much risk: an error that affects the entire code base could cause multiple line stoppages.  We need to mitigate risk by deploying the smallest feasible code blocks.  But we also want to be able to re-use code between product families.

 

Here's our currently proposed deployment scenario; perhaps you have some advice or ideas to help improve upon it:

 

We're just now adopting Perforce for source code control.  The test code would be developed by multiple developers and shared/versioned via Perforce.

When new code needs to be deployed to the factory floor, we'll use a previously defined source distribution in the LV Project Explorer to make a deployment image that only includes the code for the product in question.  TestStand sequences and code common to all products are also included in the LV project, so they get included in the deployment image as well, but code specific to other products does not.  The original directory structure is maintained, so that common code is always in the same place.

The deployment image is submitted to Perforce.

The test systems on the factory floor update their code by updating their deployment images through Perforce.
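
(For illustration only, the station-side update in that last step could be as simple as a scripted Perforce sync; the depot path below is hypothetical and error handling is omitted.)

# Hypothetical sketch: each test station pulls its product's deployment image from Perforce.
import subprocess

subprocess.run(
    ["p4", "sync", "//depot/TestDeployments/ProductA/..."],  # depot path is illustrative
    check=True,
)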

 

One huge disadvantage of this scenario is that it doesn't use the deployment utility, so issues with sequences (missing or broken VIs, absolute paths) are not caught.  We would have to do this checking ourselves with custom tools.  But it allows us to choose the scope of our deployment.
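
(A rough idea of one such custom check, scripted in Python against the TestStand COM API via pywin32, might look like the sketch below.  The adapter-key test and the VIPath property on the LabVIEW module are assumptions to verify against the TestStand API reference for your version.)

# Hypothetical sketch: flag Main-group LabVIEW steps whose VI paths are absolute or point
# at files that don't exist.  Requires pywin32 and a TestStand installation; relative paths
# should really be resolved against the TestStand search directories, not the current directory.
import os
import win32com.client

engine = win32com.client.Dispatch("TestStand.Engine.1")
seq_file = engine.GetSequenceFileEx(r"C:\Deploy\ProductA\MainSequence.seq", 0)  # path is illustrative
main_seq = seq_file.GetSequenceByName("MainSequence")

for i in range(main_seq.GetNumSteps(1)):            # 1 = Main step group
    step = main_seq.GetStep(i, 1)
    if "Flexible Prototype Adapter" not in step.AdapterKeyName:
        continue                                    # not a LabVIEW step
    vi_path = step.Module.VIPath                    # assumed property of the LabVIEW module
    if os.path.isabs(vi_path) and not os.path.exists(vi_path):
        print(f"{step.Name}: missing VI {vi_path}")
    elif os.path.isabs(vi_path):
        print(f"{step.Name}: absolute VI path {vi_path}")

engine.ReleaseSequenceFileEx(seq_file, 0)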

Message 3 of 8

Joe -

 

You may want to consider using LabVIEW Packed Project Libraries. Here is some information that you might find useful:

 

Effectively Using LabVIEW with TestStand - LabVIEW Packed Project Libraries

 

Hope this helps.

Manooch H.
National Instruments
Message 4 of 8

Manooch:  I've looked at packed project libraries as well, and have tried them out in our proof-of-concept setup.  They would certainly speed up the deploy process, but they lack one key aspect (as far as I can tell): they are not friendly to a developer who needs to develop both the PPL and the code that uses the PPL.  More specifically, the developer needs to be able to easily move between a PPL and a non-packed library in a given project.  (These issues are discussed here.)  Without the ability to do this, the development process for a developer working on code that uses a PPL while also working on the code in the PPL itself is extremely cumbersome.  The developer is no longer able to just hit "run" to test his code; he must recompile the PPL before every test.

 

It looks like PPLs are well-suited for shared code that won't change much and is developed quite separately from its usage.  That's not us, yet...

Message 5 of 8

Hi JoeDG,

 

You can change the directory that the subVIs are saved to by going to LabVIEW Options in the TestStand Deployment Utility.  There is an option for SubVI Location that lets you control the directory.

 

[Attached screenshot: SubVI Directory.PNG]

 

Regards,

 

Brandon V.

Applications Engineering

National Instruments

www.ni.com/support

 

 

Message 6 of 8

Brandon,

 

I figured out what was complicating things:  If I go through all my sequences and remove the path to the LabVIEW project in the step settings, I get the deploy behavior I expect (and you have been describing).  That works great.  However, when I include a Project path, the deployment does a couple of very inconvenient things:

 

1) All the VIs in the Project are moved into a folder called "VIs for XXX" where XXX is the project name.  This wouldn't be that big of a deal since the linkages are updated.  But I don't understand why it does this...

2) All the common NI VIs from vi.lib are replicated inside a "Support VIs" folder, located in the "VIs for XXX" folder.  This is unacceptable, since these VIs will be replicated once for every project used in a deployment.  It seems like strange behavior to me: I would expect the Support VIs to all be sent to the same folder, regardless of the Project they are called by.

 

I could just proceed without specifying a LabVIEW project, and everything should work just fine.  But ideally I'd like to fix (or work around) the behavior, and use Projects the way they were intended.

 

Any ideas on how to fix this, or reasons for this behavior?  

 

-Joe

Message 7 of 8

Joe,
Imagine the following: you make a seemingly harmless change to a common VI and find that some steps in the test system are broken.  You load the VIs up in LV and they work fine, but you notice the callers are marked as modified.  You can't see any reason they should have changed, as you didn't change the common VI's connector pane.  Eventually you find that the common VI's inplaceness changed, and now you have to redeploy everything.
If you use Packed Project Libraries (PPLs), you wouldn't ever have that headache, because PPLs have a checkbox: "Callers adapt at run time to Exported VI connector pane state".  This checkbox makes the calling VIs independent of inplaceness changes in the common VIs.
(Inplaceness is a LV property that determines whether LV can reuse memory for the controls and indicators on a subVI.  For example, suppose a large array is passed into a control, and the subVI modifies the array and passes it out via an indicator.  It is much faster if LV doesn't have to make a copy of that array but can modify it in place.  However, changes to the wiring of the subVI may require LV to make a copy to execute the VI correctly, and when that happens all of the VI's callers need to recompile.  For more details on inplaceness, see: https://www.ni.com/en/support/documentation/supplemental/10/ni-labview-compiler--under-the-hood.html)
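
(As a loose analogy in a text language only: the distinction is roughly the difference between modifying a buffer in place and returning a fresh copy.  LabVIEW's compiler makes this choice for you, which is why a wiring change can ripple into the callers.)

# Loose analogy only: "in place" reuses the caller's buffer, "copy" allocates a new one.
def scale_in_place(data, factor):
    # Reuses the existing (possibly very large) buffer: no allocation, no copy.
    for i in range(len(data)):
        data[i] *= factor
    return data

def scale_with_copy(data, factor):
    # Allocates a brand-new list, leaving the caller's buffer untouched: slower for big data.
    return [x * factor for x in data]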
> They are not friendly to a developer who needs to develop both the PPL and the code that uses the PPL.  More specifically, the developer needs to be able to easily move between a PPL and a non-packed library in a given project.  (These issues are discussed here.)  Without the ability to do this, the development process for a developer working on code that uses a PPL while also working on the code in the PPL itself is extremely cumbersome.  The developer is no longer able to just hit "run" to test his code; he must recompile the PPL before every test.

 

You could do the following: have a project that builds the PPL; in that project everything is in VIs.  Note that if you put unit test VIs in this project, your developers can just run them without a recompile.  It's easy to make lots of changes, and they don't have to build a PPL to test.  Once all of the unit tests pass, you build the PPL and run system tests calling into it.


Another alternative: TestStand can call VIs in PPLs directly, and you can easily change between a VI and a VI in a PPL by changing the VI Path for a step.  You can have a program use the TS API to change an entire sequence from using a PPL to using the VIs from a directory, far faster than rebuilding the PPL.  Once the developer has figured out and fixed the problem, they can rebuild the PPL and then switch the sequence back to using it.
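
(As a rough, hypothetical illustration of that kind of tool, a Python/pywin32 pass over a sequence file might look like the sketch below.  The adapter-key check, the VIPath property on the LabVIEW module, and the save call are assumptions to confirm against the TestStand API reference.)

# Hypothetical sketch: repoint every LabVIEW step in a sequence file from the PPL to loose VIs.
# Paths are illustrative; reverse the replacement to switch back to the PPL after rebuilding it.
import win32com.client

PPL_PREFIX = r"C:\Deploy\Common\Common.lvlibp"   # hypothetical packed library path
VI_DIR     = r"C:\Dev\Common"                    # hypothetical directory of loose VIs

engine = win32com.client.Dispatch("TestStand.Engine.1")
seq_file = engine.GetSequenceFileEx(r"C:\Dev\ProductA\MainSequence.seq", 0)

for s in range(seq_file.NumSequences):
    sequence = seq_file.GetSequence(s)
    for group in (0, 1, 2):                      # Setup, Main, Cleanup step groups
        for i in range(sequence.GetNumSteps(group)):
            step = sequence.GetStep(i, group)
            if "Flexible Prototype Adapter" not in step.AdapterKeyName:
                continue                         # not a LabVIEW step
            module = step.Module                 # LabVIEW module; VIPath is an assumed property
            if module.VIPath.startswith(PPL_PREFIX):
                module.VIPath = module.VIPath.replace(PPL_PREFIX, VI_DIR, 1)

SAVE_OPTION = 0  # placeholder for a SaveFileIfModifiedOptions constant; check the API help
seq_file.SaveFileIfModified(SAVE_OPTION)
engine.ReleaseSequenceFileEx(seq_file, 0)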


-Rick Francis

Message 8 of 8