
Does Labview take advantage of graphics accelerators?

We have an application that updates LOTS of indicators, Boolean and numeric, several times a second while also handling a lot of serial communication and data parsing.  We have already seen that on a relatively high-spec laptop it causes very high load and, at times, overheating.  Yes, there are plenty of optimizations still to make, and we have a developer working on them now, but in general, if a front panel displays lots of frequently refreshed data, or lots of graphs, would a graphics accelerator card take load off the processor and let the application run more efficiently?
Message 1 of 11

I believe I remember seeing something about the new 8.2 taking advantage of graphics cards, but at least until now, LV did everything in software to make platform independence easier to support.

In any case, simple numeric and boolean indicators should not be your problem.

There are several options which come to mind.

  • The loop-with-no-wait option. This seems unlikely, as it would normally take 100% of a CPU core all the time and you would notice it (see the sketch after this list).
  • The overlapping elements option. This is a tricky one. When you have overlapping elements, LV redraws them all the time. Note that the overlap will be caused by the full rectangle of the control, not just the part you see. I'm not sure whether labels and captions also cause this.
  • The bug option. I have seen cases in the past where an apparent bug with the tab control caused high CPU usage even when "the guilty party" was not displayed. This might be your case.
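To illustrate the first option, here is a minimal Python sketch of the difference (the update_indicators stub is hypothetical, and since LabVIEW itself is graphical this is only an analogy). The commented-out loop pegs one core at 100%; the wait plays the role of LabVIEW's "Wait (ms)" function:

    import time

    def update_indicators():
        """Hypothetical stand-in for refreshing front-panel values."""
        pass

    # Without a wait, the loop spins as fast as one core allows -- 100% CPU:
    #     while True:
    #         update_indicators()

    # With even a short wait, the core idles between iterations, just like
    # dropping a "Wait (ms)" function into a LabVIEW while loop:
    for _ in range(50):        # bounded here so the sketch terminates
        update_indicators()
        time.sleep(0.1)        # ~100 ms between UI refreshes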

If you try searching for "optimization" and "display" you should hopefully find some answers (like changing the Z order of your elements).

In any case, it would be hard to give concrete advice without seeing the code.


___________________
Try to take over the world!
Message 2 of 11
Another trick is to defer panel updates while you submit a block of changes.  To use it, get a reference to your VI's front panel through VI Server and use the panel's Defer Panel Updates property.  Set it to TRUE before you start a block operation, then back to FALSE afterwards.  Depending on how well you can modularize your activities, this can save a lot of processing time.
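LabVIEW is graphical, so there is no text listing for the property node itself, but the same batching idea exists in other toolkits. Here is a rough Python/PyQt5 analogue (the widget names are invented for illustration): setUpdatesEnabled(False) plays the role of Defer Panel Updates = TRUE, so the block of writes triggers a single repaint instead of one per indicator:

    import sys
    from PyQt5.QtWidgets import QApplication, QLabel, QVBoxLayout, QWidget

    app = QApplication(sys.argv)
    panel = QWidget()
    layout = QVBoxLayout(panel)
    indicators = [QLabel("0") for _ in range(20)]   # stand-ins for indicators
    for lbl in indicators:
        layout.addWidget(lbl)
    panel.show()

    panel.setUpdatesEnabled(False)          # Defer Panel Updates = TRUE
    for i, lbl in enumerate(indicators):    # block of changes, no repaints
        lbl.setText(str(i * 10))
    panel.setUpdatesEnabled(True)           # Defer Panel Updates = FALSE;
                                            # re-enabling schedules one repaint
    app.processEvents()                     # let that single repaint happen
    # a real application would now enter app.exec_()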
Message 3 of 11
In addition to the excellent answers so far, here are a few more tips:
  • Only update indicators (especially graphs) if their value actually changes. (Place the terminal inside a case structure; see the sketch after this list.)
  • It is foolish to send ten million data points to a graph indicator that is 500 pixels wide. Do some data reduction first.
  • Don't use charts with an excessive history length.
  • Disable autoscaling for your graphs and charts. Autoscaling is expensive: the scales must be recalculated and the axes redrawn on every update.
  • If you have "autoadjust scales" enabled (the default), the plot area itself might change with each scaling operation. Turn it off!
  • Don't set your indicators to "synchronous display".
  • Use simpler indicators, or even substitute a different one (for example, instead of a huge 2D array of fancy shaded LEDs, convert your Boolean array to 0/1 and display it on a fixed-size intensity graph).
  • You should definitely do your serial communication in a separate loop so it does not depend on the speed of the UI loop.
  • Don't overcomplicate the front panel. The operator might have a hard time following the action.
  • ...
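Two of those tips translate almost directly into code. Here is a rough Python/NumPy sketch (the function and class names are invented for illustration, not LabVIEW primitives): min/max decimation keeps the visible peaks while shrinking a huge trace to two points per horizontal pixel, and a small gate writes an indicator only when its value actually changes:

    import numpy as np

    def decimate_minmax(samples, pixels):
        """Keep the min and max of each pixel-wide block: peaks stay
        visible, but the trace shrinks to 2 * pixels points."""
        samples = np.asarray(samples)
        if len(samples) <= 2 * pixels:           # already small enough
            return samples
        n = (len(samples) // pixels) * pixels    # drop the ragged tail
        blocks = samples[:n].reshape(pixels, -1)
        out = np.empty(2 * pixels)
        out[0::2] = blocks.min(axis=1)
        out[1::2] = blocks.max(axis=1)
        return out

    class ChangeGate:
        """Write an indicator only when the value is new -- the moral
        equivalent of wiring the terminal inside a case structure."""
        def __init__(self):
            self.last = None
        def changed(self, value):
            if value != self.last:
                self.last = value
                return True
            return False

    trace = np.random.randn(10_000_000)
    plot_data = decimate_minmax(trace, 500)      # 10 M points -> 1000 points
    gate = ChangeGate()
    for v in (1, 1, 2):
        if gate.changed(v):
            print("update indicator:", v)        # fires only for 1 and 2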

Most of the power of today's high-end graphics adapters goes into 3D gaming and is rather irrelevant for things such as a typical LabVIEW front panel. However, the 3D graphs as well as the new 3D picture control (LabVIEW 8.20) can take advantage of 3D acceleration.

Message 4 of 11

In addition to the fine points made earlier, I am curious about the phrase:

"We have already seen that on a relatively high spec laptop, it is causing very high load and overheating of the laptop at times."

How is your memory usage?

If your application requires more memory than the physical memory present in the machine, virtual memory will be used, resulting in excessive disk I/O.
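One quick way to check from outside LabVIEW, assuming the third-party psutil package is available (a hypothetical sketch, not an NI tool): compare the process's resident memory against physical RAM; once they get close, paging starts and the disk thrashing follows:

    import psutil

    proc = psutil.Process()                  # or psutil.Process(labview_pid)
    rss_mb = proc.memory_info().rss / 2**20  # resident set size, in MiB
    total_mb = psutil.virtual_memory().total / 2**20
    print(f"process: {rss_mb:.0f} MiB of {total_mb:.0f} MiB physical RAM")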

Ben

PS: Posting code will help us help you.

 

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI
Message 5 of 11
LOL yes. If a laptop overheats, it has a mechanical flaw unless you are using it in Death Valley at noon in summer. A laptop should never overheat but might run hot if processing demands are high.
 
Open your task manager (if you are using Windows) and look at the "Processes" tab. What is the memory use of LabVIEW during a run?
Message 6 of 11


altenbach wrote:
  • Only update indicators (especially graphs) if their value actually changes. (Place their terminal inside a case structure).
  • ...
  • Don't set your indicators to "synchronous display".

Hmm... I thought that indicators were only updated if their value changed... even when set to synchronous display. At least this is what is suggested by the attached VI... 😉


Message edited by chilly charly on 09-05-2006 06:14 PM

Chilly Charly    (aka CC)
Message 7 of 11


@chilly charly wrote:

Hmm... I thought that indicators were only updated if their value changed... even when set to synchronous display. At least this is what is suggested by the attached VI... ;)


CC, I haven't studied this in detail for a long time, so maybe things have improved in recent LabVIEW versions :). Back in my early days I did extensive testing on this and there was a dramatic difference, especially with more expensive indicators such as graphs (updating a scalar is peanuts ;)). I'll do some testing....
Message 8 of 11
Not sure, but I think that indicator updates were optimized in LV 5. You are right, that was a long time ago and ... yes, LabVIEW has been slightly improved since 😄
Chilly Charly    (aka CC)
Message 9 of 11
Actually, I thought that change was made for LabVIEW 6i.  Nevertheless, I do rather agree with Christian that when in doubt, it's better to explicitly cut out the updates rather than trust the NI gnomes' optimization (just in case!).
Message 10 of 11