Making subVIs work!

Hello,

I'm relatively new to LabVIEW and am experiencing a problem; I'd be extremely grateful to anyone who can help!

Basically, I have a number of proximity sensors attached to the back of a climbing wall. As someone climbs the wall, data about the wall deflections is sent to the computer. A VI (called data.vi) takes the data and, through a number of filters, removes the noise from the signal.

I then have another VI (called main.vi) which displays the data onscreen through its front panel. This VI is quite complex and uses a number of while loops within its block diagram.

Data.vi does not talk directly to main.vi; another VI (called channel.vi) acts as an intermediary. Channel.vi is a subVI of main.vi, and data.vi is a subVI of channel.vi.

Channel.vi exists because main.vi was built on a different computer, away from the climbing wall. Wiring data.vi into channel.vi was therefore a quick way of joining the two halves of the system.

A fourth and final VI is used (called random.vi). This VI just generates random data and can be used as a subVI in channel.vi instead of data.vi. This allowed main.vi to be tested away from the climbing wall with dummy data instead of real data.

Now, if I run main.vi with channel.vi as a subVI, and random.vi as a subVI of channel.vi supplying dummy data, then main.vi runs fine, with Windows XP reporting approximately 30% processor usage.

If I run channel.vi on its own, with data.vi as a subVI supplying real data from the wall, then the front panel of channel.vi shows the data in real time, with Windows XP again reporting about 30% processor usage.

However, if I run main.vi with channel.vi as a subVI, and data.vi as a subVI of channel.vi (the intended system), it runs very slowly, with the front panel of main.vi updating only every 5 seconds or so.

As stated above, main.vi does contain a number of while loops, and I have experimented with putting millisecond delays in each cycle of the while loops, to ensure the processor does not reach 100% usage, but to no avail.
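LabVIEW code is graphical, so the idea of putting a wait in each loop cycle can only be sketched here as a rough text analogy (all names below are made up for illustration). The point is the same as LabVIEW's "Wait (ms)" primitive: a loop with no wait free-runs and hogs the CPU, while even a small wait yields the processor between iterations so other loops can run.

```python
import threading
import time

def poll_loop(samples, stop, wait_s):
    # Analogy for one of main.vi's while loops: with wait_s = 0 the loop
    # free-runs and starves everything else; a small wait (like LabVIEW's
    # "Wait (ms)" primitive) yields the processor between iterations.
    while not stop.is_set():
        samples.append(time.monotonic())
        if wait_s:
            time.sleep(wait_s)

samples, stop = [], threading.Event()
t = threading.Thread(target=poll_loop, args=(samples, stop, 0.01))  # 10 ms wait
t.start()
time.sleep(0.2)
stop.set()
t.join()
# With the 10 ms wait the loop iterates only a handful of times in 0.2 s;
# with wait_s = 0 it would iterate millions of times and peg the CPU.
```

This is only a sketch of the pattern, not the poster's actual code; in LabVIEW the equivalent is dropping a "Wait (ms)" inside each while loop.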

I am using LabVIEW 7 Express on a fairly modest machine (Celeron 800 MHz, 256 MB of RAM, 8 MB shared video RAM) - am I asking too much of it?

Help! Any comment would be greatly appreciated.

With thanks

Lawrence
If the update rate on the front panel is faster when you use the dummy VI, then it is probably the VI that collects and analyzes the data that is slowing things down. It is still hard to tell without looking at the source code, though, so other possibilities exist as well.

joe
Since channel.vi can function either as a top-level VI or as a subVI, does anything in channel.vi's state change between the two cases?
Does channel.vi run as a parallel loop alongside the other while loops in main.vi? Or is channel.vi placed inside main.vi's while loops, so that it runs once per call as a subVI, where it runs continuously as a top-level VI? If the latter is the case, you may be forcing a reconfiguration of the data acquisition each time the VI is called. If it is running in a parallel loop, you may be missing a time delay in the version you made to run in this mode.
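The reconfiguration point can be sketched as a text analogy (LabVIEW is graphical, and `configure_daq`/`read_sample` below are hypothetical stand-ins, not real NI API calls): if expensive hardware setup happens inside the caller's loop, its cost is paid on every iteration, whereas configuring once outside the loop pays it only once.

```python
import time

def configure_daq():
    # Stand-in for hardware setup that channel.vi might redo on every
    # call when placed inside main.vi's loop -- opening channels,
    # programming filters, and so on.
    time.sleep(0.05)          # pretend configuration costs 50 ms
    return {"channels": 8}

def read_sample(task):
    return 0.0                # stand-in for reading one acquisition

# Slow pattern: the subVI reconfigures on every iteration of the caller.
t0 = time.monotonic()
for _ in range(5):
    task = configure_daq()
    read_sample(task)
slow = time.monotonic() - t0

# Fast pattern: configure once, then loop only over the read.
t0 = time.monotonic()
task = configure_daq()
for _ in range(5):
    read_sample(task)
fast = time.monotonic() - t0
```

Here the slow pattern pays the 50 ms setup five times and the fast pattern only once; in LabVIEW the fix is to keep the acquisition configured across calls (for example with a state held in the subVI) rather than reconfiguring on every call.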
Try the profiler, found in the menus under Tools»Advanced»Profile VIs. Run the profiler with timing statistics and details selected to see which VI is running away with the program.

Good luck
Randall