Dedicate and fix a part of the CPU to a while loop

Hello,
I have a problem:
in my VI there are about 10 XY graphs on screen.

The problem is that after some time (and a growing number of recorded points), the sample time I set at the beginning of the test starts to increase, because the CPU is not able to keep up with processing and refreshing the amount of points in the graphs in another while loop.
 
I verified this because when I hide the graphs on screen, the sample time returns to its original value.

My question is this:
Is it possible in LabVIEW to dedicate and fix a part of the CPU to a particular while loop, in order to give that loop priority and guarantee it executes under any conditions?

Message 1 of 14
(4,544 Views)

Hi Lesterino,

 

your problem is most likely not the priority of a while loop, but the amount of data collected and displayed in the graphs...

If you limit the amount of data, your timing problems will vanish! It makes no sense to display more than ~1000 points in a graph!
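Since a LabVIEW diagram cannot be shown as text, here is the idea as a minimal Python sketch (the function name and the keep-every-nth scheme are illustrative assumptions; in LabVIEW you would do the same with something like Decimate 1D Array or Array Subset before the graph terminal):

```python
def decimate_for_display(x, y, max_points=1000):
    """Keep every n-th sample so at most ~max_points reach the graph."""
    n = max(1, len(x) // max_points)   # decimation factor
    return x[::n], y[::n]

# The full record stays in memory; only the reduced copy is displayed.
xs = list(range(100_000))
ys = [0.5 * v for v in xs]
x_disp, y_disp = decimate_for_display(xs, ys)   # ~1000 points
```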

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 14
(4,539 Views)

You can control which CPU your code runs on by placing it inside a Timed Sequence structure and using that structure to dictate which CPU it executes on.

 

Updating the UI at greater than about 30 Hz does not make sense. If you are updating more often than that, reduce the update rate.
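As a rough illustration of that rate limiting (plain Python; the Throttle helper and the loop are my own stand-ins, not anything from LabVIEW): let data production run free, but only redraw when at least 1/30 s has elapsed.

```python
import time

UPDATE_PERIOD = 1.0 / 30.0          # ~30 Hz is plenty for the eye

class Throttle:
    """through() returns True at most once per period."""
    def __init__(self, period=UPDATE_PERIOD):
        self.period = period
        self.last = float("-inf")   # so the first call always passes

    def through(self):
        now = time.monotonic()
        if now - self.last >= self.period:
            self.last = now
            return True
        return False

redraw = Throttle()
for sample in range(1000):          # stand-in for the acquisition loop
    time.sleep(0.001)               # pretend data arrives at ~1 kHz
    if redraw.through():
        pass                        # update the graphs here, ~30x per second
```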

 

You can "defer front Panel updates" before you update a graph chart then undefer after (serach this site for Defer.FPUpdate or something like that).

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper
Message 3 of 14
(4,532 Views)

 


Lesterino wrote:

Is it possible in LabVIEW to dedicate and fix a part of the CPU to a particular while loop, in order to give that loop priority and guarantee it executes under any conditions?


As Ben mentioned, timed loops can be assigned to a specific CPU core; this also means you need a multicore CPU. You cannot dedicate "part" of a CPU to a certain task. Also, be aware that on a general-purpose OS, nothing is guaranteed. If you need a truly deterministic loop, you need LabVIEW RT.

 

In my opinion, you are looking in the wrong place for a solution to your problem. If you hammer 10 XY graphs with near-infinite amounts of ever-growing data, you are simply not doing it right. Most likely, you have constant memory reallocations of your data structures. What good are graphs if they cannot be updated because other parts of the code need the CPU?

 

Are all graphs visible at the same time? How big are they?

How much data is in the xy graphs? Can you show us some code?

 

Message 4 of 14
(4,517 Views)

Your question is one of the classic LabVIEW questions ("There is nothing new under the sun." Ecclesiastes?) and has been answered many times already. In fact, I have a Tag Cloud devoted to LabVIEW-performance-related threads that can be found here.

 

Of particular interest is Dr Grey's white paper on optimizing for large data sets (search this site for "Large Data Sets"), where he discusses a number of techniques that can help you.

 

If, after reviewing those threads, you still have trouble, post back and we will take it from there.

Ben

Retired Senior Automation Systems Architect with Data Science Automation | LabVIEW Champion | Knight of NI and Prepper
Message 6 of 14
(4,466 Views)

Thanks for the kind words, Ben.

 

As has been mentioned before, plotting more points to a graph than you have pixels to display them is rather pointless.  However, you are using an XY graph, not a waveform graph, so decimating the data for display is a bit different from the algorithm and code given in the Managing Large Data Sets in LabVIEW tutorial.

To decimate for XY, at each point determine whether the next point will fall outside the current pixel location.  If so, add it to the plot; if not, drop it.  You can also cull any points that are offscreen due to zooming.  This is most easily done by adding the first point that is offscreen, then adding a NaN data set to break the graph, then adding the point just before the first next point that is back on-screen.  This ensures your lines will be drawn onto the screen correctly.  Check for single points going offscreen, and do not add the NaN set in that case.

I have attached a VI set that implements this (LabVIEW 8.2.1).  The top-level VI is xyDec_DecimateForXYPlot.vi.  The inputs are a reference to the XY graph and the X and Y arrays of data; outputs are the original inputs and the decimated data.  Two notes on the code:

 

  1. Note that in very noisy data cases, it will actually slow you down, since all the points will be plotted, but you still run through the algorithm.
  2. The code assumes you are plotting against the first axis in both X and Y (this is hard-coded on the block diagram).

Decimation of unevenly sampled waveforms plotted on an XY graph is left as an exercise for the reader.  Note that a variant of the standard waveform decimation in the tutorial, using time instead of the number of points to determine the decimation interval, works well.  I may write a post on this early next year.
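For readers who want the algorithm in text form, here is a rough Python sketch of the pixel-based decimation described above (the function name, pixel mapping, and edge handling are my simplifications, not the attached VI set):

```python
import math

def decimate_xy(x, y, x_range, y_range, width_px, height_px):
    """Pixel-based decimation for an XY plot: keep a point only when it
    lands in a different pixel than the last kept point; replace a run of
    offscreen points with its first point, a NaN break, and its last point,
    so the entry/exit line segments still draw correctly."""

    def to_pixel(px, py):
        # Map data coordinates onto integer pixel coordinates.
        ix = int((px - x_range[0]) / (x_range[1] - x_range[0]) * width_px)
        iy = int((py - y_range[0]) / (y_range[1] - y_range[0]) * height_px)
        return ix, iy

    def onscreen(ix, iy):
        return 0 <= ix < width_px and 0 <= iy < height_px

    out_x, out_y = [], []
    last_pixel = None
    pending = []                       # current run of offscreen points

    for px, py in zip(x, y):
        ix, iy = to_pixel(px, py)
        if not onscreen(ix, iy):
            pending.append((px, py))
            continue
        if pending:                    # re-entering the visible area
            out_x.append(pending[0][0]); out_y.append(pending[0][1])
            if len(pending) > 1:       # single stray point: skip the break
                out_x.append(math.nan); out_y.append(math.nan)
                out_x.append(pending[-1][0]); out_y.append(pending[-1][1])
            pending = []
            last_pixel = None
        if (ix, iy) != last_pixel:     # new pixel: keep this point
            out_x.append(px); out_y.append(py)
            last_pixel = (ix, iy)
    if pending:                        # data ended offscreen
        out_x.append(pending[0][0]); out_y.append(pending[0][1])
    return out_x, out_y

# Example: decimate for a 500 x 300 pixel plot area.
# xd, yd = decimate_xy(x, y, (0.0, 100.0), (-1.0, 1.0), 500, 300)
```

As with the VI set, the number of surviving points is bounded by the plot's pixel area rather than by the record length, which is what keeps redraw time constant.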

 

Good luck!

Message 7 of 14
(4,454 Views)

Another option would be to map the data into a 2D array and show it as an intensity graph. Now the memory footprint is constant and you can draw an infinite number of points over time.

 

Decide on a useful x and y resolution, initialize a 2D array of zeroes, then replace elements with 1 as data appears. For multiple plots, replace with other integers, one for each plot.

 

 

As a starting point, have a look at my old example discussed here. Your situation is even simpler, because you would use a constant z and not a 2D histogram where bins accumulate over time.
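A minimal NumPy sketch of that mapping (the bin counts, data ranges, and add_point helper are my own illustrative choices, not from the linked example):

```python
import numpy as np

X_BINS, Y_BINS = 400, 300            # display resolution of the intensity graph
X_MIN, X_MAX = 0.0, 10.0             # expected data ranges (assumed)
Y_MIN, Y_MAX = -1.5, 1.5

image = np.zeros((Y_BINS, X_BINS), dtype=np.uint8)   # constant memory footprint

def add_point(x, y, plot_id=1):
    """Mark the bin containing (x, y); use a different plot_id per signal."""
    col = int((x - X_MIN) / (X_MAX - X_MIN) * (X_BINS - 1))
    row = int((y - Y_MIN) / (Y_MAX - Y_MIN) * (Y_BINS - 1))
    if 0 <= col < X_BINS and 0 <= row < Y_BINS:
        image[row, col] = plot_id    # constant z, as suggested above

# Example: two signals sharing the same x values.
for i in range(10_000):
    x = i / 1000.0
    add_point(x, float(np.sin(x)), plot_id=1)
    add_point(x, float(np.cos(x)), plot_id=2)
# 'image' is what you would wire to the intensity graph.
```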

 

Message 8 of 14
(4,430 Views)

Hello Dr Grey,

I think your attachment is very interesting.

 

 

How can I use it with an XY graph that has two (or more) signals?

In this case I have the same X but different Ys.

Message 9 of 14
(4,391 Views)

@altenbach wrote:

Another option would be to map the data into a 2D array and show it as an intensity graph. Now the memory footprint is constant and you can draw an infinite number of points over time.


 

 

It's a possibility, but how can I use it instead of an XY graph?

I don't understand where I need to put the X and Y arrays.

 

Thanks

Message 10 of 14
(4,373 Views)