
LabVIEW RT-Linux GUI and other advice

I have developed with LabVIEW RT in the past where the target was based on VxWorks or Phar Lap ETS. These systems were essentially headless, self-contained 'black boxes' that a client would connect to via Ethernet.

 

I have recently seen a LabVIEW RT-Linux system with some very impressive GUI interfaces that were designed for a touch screen. 

 

I know very little about the LabVIEW RT-Linux platform, but I am very experienced with the LabVIEW environment deployed to VxWorks and Phar Lap targets. Clearly the RT-Linux target has a lot of advantages.

 

Is there anything that I should know about GUI development for RT-Linux? Does it differ from the standard Windows GUI development? Are there caveats or limitations?

The LabVIEW RT-Linux project that I witnessed had a very unique, clean-cut GUI for a touch screen. Is there a special module for developing complex GUIs for RT-Linux?

 

I've tried searching for online training packages for RT-Linux, but could only find the traditional LabVIEW RT courses that I already took years ago.

Could somebody point me to additional resources specific to RT-Linux, in particular how it differs from the other RT targets?

Is it possible to develop on an RT-Linux platform directly, or do you need to develop on a Windows system and deploy to the target like the other RT platforms?

This RT-Linux looks like a totally different beast compared to the other RT deployment solutions.

 

Any advice on how to investigate?

 

Thank you!

 


Engineering - The art of applied creativity  ~Theo Sutton
Message 1 of 8

I have built a cRIO-based solution using a tablet as a touch-screen interface for the GUI.

 

When I started there were no cRIOs with display out; I think there are now, but I don't know anything about those.

 

What I did was get a (Windows) tablet that connects to the cRIO via Ethernet, then build the controller software running on the cRIO and the GUI software running on the tablet.

 

To get data from one to the other I used Network Streams.
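Network Streams are a LabVIEW-specific API, but the job they do, a lossless point-to-point pipe carrying structured data between the controller and the GUI, is the same framing problem you would otherwise solve by hand over raw TCP. Here is a minimal Python sketch of that pattern, purely as a conceptual analog (the JSON encoding and the helper names are my own invention, not anything LabVIEW generates):

```python
import json
import socket
import struct

def send_msg(sock, obj):
    """Length-prefix a JSON-encoded message so the reader knows where it ends."""
    payload = json.dumps(obj).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock):
    """Read the 4-byte length header, then exactly that many payload bytes."""
    header = _recv_exact(sock, 4)
    (length,) = struct.unpack(">I", header)
    return json.loads(_recv_exact(sock, length).decode("utf-8"))

def _recv_exact(sock, n):
    """TCP recv() may return fewer bytes than asked; loop until we have n."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the stream")
        buf += chunk
    return buf
```

In LabVIEW, the Create/Read/Write Network Stream VIs handle this framing and flow control for you, which is exactly the bookkeeping a hand-rolled thin client has to reimplement.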

 

The GUI was built with standard LabVIEW plus the things you can install via VIPM (e.g. the UI Control Suite); I also made a lot of XControls.

There are also some themes you can buy, but I can't find the link at the moment.

 

So if you are already familiar with RT, I don't think you will find it very different (but since I don't know VxWorks/Phar Lap, that is only a guess).

 

:cheers:

 


If Tetris has taught me anything, it's errors pile up and accomplishments disappear.
Message 2 of 8

What you described is the typical solution I have built in the past, where a second computer acts as a thin client for the GUI. That is the approach I'm most familiar with.

 

This LabVIEW RT-Linux seems to have just as many GUI options as a regular Windows-based LabVIEW solution.

 

I'm excited at the prospect of having a deterministic platform with a GUI built into it, because I can see where this would save on development time.

I have a lot of data to manage, and not having to define data-stream formats over Ethernet for the thin client to interpret would be nice.

 

If I get a PXI system with LabVIEW RT-Linux pre-installed, do you know if I could develop on the system in the Linux environment, or would I need to deploy from Windows like the traditional LabVIEW RT?

If I could develop on the Linux target itself, that would help greatly with developing the GUI.

 

 

 

 


Message 3 of 8

I have several cDAQ devices running the embedded Linux RT OS and have played around with the UI for applications a bit.

 

Things aren't as good as they are in a desktop environment like Windows. Most things work well enough, but there is plenty you can't do that you could in Windows, which is why every demo you see looks fine as a crisp, clean, basic UI: it can't do complicated things very well. Here's a quick list of things I've found issues with.

 

For instance, one major limitation is that you can only show one front panel. You won't see multiple floating windows, and you won't see splash screens, because there is only one front panel to be seen and no other can be popped up for something like a prompt.

 

Issue number two. The window you see on the embedded UI is always (and only) the main VI: your top-level VI is the one that is seen. This also means things like the Actor Framework don't really work well, because those are designed around multiple asynchronous loops where one of them might be the UI actor. That can't happen here; the top-level VI is what appears on the UI.

 

Issue number three. If you are designing a UI that can run on the embedded UI or be remoted into using remote front panels, you will be stuck polling your controls. The embedded UI does have an event structure and can register value changes as a user clicks a button, for instance, but in a remote front panel clicking that same button does not generate that event. That means I still have to design my UI with control polling in mind, since I wanted it to be flexible and deployable to both or either without having to change the code.

 

Number 4, the touch panel toolkit. Remember that cool touch panel toolkit NI made so that you could do basic gesturing? That doesn't work on Linux RT... the platform that has an embedded UI and is typically used on touch screens.

 

Number 5, .NET and ActiveX containers don't work. This should be obvious, but you might not realize a toolkit uses them to do cool UI stuff, and that won't work either.

 

Number 6. I can't remember if this is still the case, but right-click menus didn't (don't?) work. Right-click a control and your custom popup menu is gone.

 

I also had issues with several drag-and-drop operations; I'm not sure if that ever got resolved, but I just redesigned my UI not to use them.

 

The dream for me was to write my application once and deploy it to Windows, to Linux RT for the embedded UI, or to remote front panels, with the same code, the same UI, and the same UX. That isn't the case.

Message 4 of 8

Thank you for the information. 

Those are some serious drawbacks.


Message 5 of 8

Yes, this is one of the reasons I pushed NI to post a public Linux RT VM (also, here is the idea exchange) so that we can test things like Ethernet communication and UI functionality without needing the hardware. You might buy the hardware with some design in mind, only to realize it is either impossible or would take much more effort than expected.

 

Then there's the fact that you can now leverage many Linux packages and call them using the command line or call-back nodes, something we Windows guys don't have experience with on Linux. Being able to test and deploy code to a VM to see how these things work without hardware would be extremely useful. After some discussion a way was described for creating a Linux RT VM, but it doesn't include the UI and isn't very easy to upgrade since MAX doesn't recognize it.
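For reference, the "call a Linux utility from the command line" route maps to LabVIEW's System Exec VI. What that VI does is roughly the following, shown here as a hedged Python sketch (the `run_cmd` helper is my own name, and `uname` is just an arbitrary example of invoking a target-side Linux tool):

```python
import subprocess

def run_cmd(args):
    """Run a command line and capture its standard output,
    roughly what LabVIEW's System Exec VI does on a Linux RT target."""
    result = subprocess.run(args, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Query the kernel name of the system we are running on.
print(run_cmd(["uname", "-s"]))  # e.g. "Linux" on an NI Linux RT target
```

Being able to exercise exactly this kind of call inside a VM, before committing to hardware, is the point of the idea-exchange request above.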

Message 6 of 8

@Hooovahh, nice bump for the idea exchange; I starred that long ago. It would have definitely helped in the dev process.

 


Message 7 of 8

Good Morning,

I had planned to use some of the touch screens on an RT (Linux) Actor project. Can you point me to any documentation that describes the limitations you mention? I am running into some of the same problems you mentioned and would like to see if there is a workaround.

Thanks!

Message 8 of 8