LabVIEW


How to keep diagram clean with 500 tags?

I can use a tabbed interface to limit the data visible on the front panel, but if I use the HMI wizard to connect to the tags, I end up with a diagram that is really unwieldy. Since I can't use clusters or arrays, how am I supposed to keep it clean?
Message 1 of 12
This is a great question. 500 tags is really close to the limit for the number of front panel objects - I believe it is a little over 550 (555?) (at least you can use tabs to hide some of them from the operator). Some of the code generated by the HMI wizard is not necessary. Have you seen this? - http://digital.ni.com/public.nsf/websearch/29adb3e95c4ab87086256b4300698b82
In general, I wouldn't recommend using tabs in high tag count apps. You'll get better performance and smaller diagrams if you split the app into multiple VIs (what is now a tab should be a VI). Then you can switch between these VIs with VI Server: for example, when the user selects another panel (say, from a list box of panels), you dynamically load the corresponding VI and unload the current VI.
This way only what is really necessary is loaded in memory. It's smaller -> it executes faster and has a smaller diagram.
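LabVIEW VI Server calls have no text form, but the load-one-panel-at-a-time idea described above can be sketched in plain Python. This is only an illustration: the `Panel`/`PanelManager` names and the tag lists are invented, and the construct/discard steps stand in for Open VI Reference / Close VI Reference.

```python
class Panel:
    """Stand-in for one sub-VI's front panel."""
    def __init__(self, name, tags):
        self.name = name
        self.tags = tags          # only this panel's tags are resident


class PanelManager:
    """Keeps at most one panel loaded, mirroring the
    'unload current VI, load selected VI' scheme."""
    def __init__(self, panel_specs):
        self.panel_specs = panel_specs  # name -> tag list (cheap metadata)
        self.current = None

    def select(self, name):
        self.current = None                                 # drop (unload) the old panel
        self.current = Panel(name, self.panel_specs[name])  # load the new one
        return self.current


# Hypothetical subsystem panels; only the selected one exists in memory.
specs = {"vacuum": ["VAC_P1", "VAC_P2"], "magnet": ["MAG_I", "MAG_T"]}
mgr = PanelManager(specs)
p = mgr.select("vacuum")
```

The point is simply that the full tag set never has to be wired onto one diagram; each "panel" carries only its own slice.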
Message 2 of 12
Hi all,

I am confused as to why you cannot use clusters and arrays.

See attached screen shot where I am using arrays of clusters to display the most interesting 576 tags of about 2500 I am watching.

Ben
Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 3 of 12
Given the current state of affordable hardware, I don't think memory size is an issue. I would rather load everything into memory and avoid the swapping.

I guess I should be more specific.
If I use the HMI wizard to create a DataSocket link to a tag (I'm connecting to an Applicom OPC server), the link gets broken when I place the tag into a cluster or array. LabVIEW 6.0 HUNG when I tried this; LabVIEW 6.1 does not hang, but it does not permit the connection. National acknowledges the problem, but so far hasn't offered many solutions.

The examples National has online only seem to demonstrate 5 or 10 tags. For that many points, the HMI wizard is fine, but it seems an inappropriate tool for large numbers of tags. I was hoping to see examples that demonstrated techniques for hundreds of points. I assume you can programmatically open the .scf file and use create tag from string to create them in a loop; I'm just not sure how that ties in to the front panel, and how you manipulate the various front panel items as a function of the current state of the points, i.e. colour, flashing, text messages...
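The "front panel items as a function of point state" question can at least be sketched outside LabVIEW. In a LabVIEW diagram these would become property node writes; here a Python stand-in computes the visual attributes from each tag's value in one pass. All names, thresholds, and the property set (`colour`, `blink`, `text`) are invented for illustration.

```python
def display_properties(value, low_alarm, high_alarm):
    """Return the visual attributes for one tag given its value.
    Alarm limits and attribute names are illustrative only."""
    if value < low_alarm:
        return {"colour": "blue", "blink": True, "text": "LOW"}
    if value > high_alarm:
        return {"colour": "red", "blink": True, "text": "HIGH"}
    return {"colour": "green", "blink": False, "text": "OK"}


# One loop over all points replaces per-tag wiring on the diagram.
points = {"TANK_LEVEL": 42.0, "PUMP_TEMP": 95.5}          # hypothetical tags
limits = {"TANK_LEVEL": (10.0, 90.0), "PUMP_TEMP": (5.0, 80.0)}
props = {name: display_properties(v, *limits[name]) for name, v in points.items()}
```

The design choice is to keep the state-to-appearance mapping in one table-driven function, so adding tag number 501 means adding a row of limits, not another cluster of wires.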

thanx
😆
tb
Message 4 of 12
I agree,

Using the HMI Wizard for a high channel count application yields a very disorganized diagram.

It is great for low channel count, simple apps.

For these high channel count systems, I will design the architecture ahead of time and then code it manually. This gives me complete control over what happens when, and also allows me to implement any of the exotic requirements like handshaking, control and status, and custom scaling or formatting.

Ben
Message 5 of 12
Ben:

Do you have any sample code you could either post or send to me directly?
I'm looking at writing a HMI for a Siemens S5 PLC.
The video card for the system is no longer available, so I'm trying to prepare for the inevitable.

Given what I see so far, I think 500 tags is pushing the limit for LabVIEW. If you visit any of the controls message boards, very few people mention LabVIEW. They talk about WinCC, Wonderware, and other SCADA-specific packages. Since most people start talking about those applications at around 500 points, I wonder where I should draw the line.

Still, I have LabVIEW, and am reasonably comfortable using it. If possible, I would prefer to save the cash and use an environment I know. If National's examples better illustrated how to do this, it would help. I could use LabWindows/CVI to build parts (is that what you meant by code it manually??) if it would help.

But my priority is building an HMI that functions correctly and is reliable, usable, clean and maintainable.

Any help or direction would be appreciated.
I'm tbe@camhpet.on.ca
😆
thanx
Message 6 of 12
I can't tell you about that message board.

I have not done a Siemens S5 PLC, but one of our engineers did an S7 app.

The limit on the number of tags is way above 500. Like I mentioned earlier, the screen shot I posted watched 2300 tags. That was running on a 1 GHz machine. The new 2 GHz machines should be able to do better than that.

I write all of my code in LabVIEW, LabVIEW-DSC, or LabVIEW-RT.

Have you thought about replacing the PLC with a FP-RT unit?

We are currently upgrading a plant where we are replacing all of the PLCs with FP-RT devices. This makes support cleaner because everything from the PLC level up through the MMI to the facility monitoring level is implemented in LV!

Just because there are not a lot of people using LV for this type of app is not an indication that this cannot be done or is a bad idea. I think it is more indicative that not many people have figured out how to do this.

LabVIEW DSC is a great tool for doing this kind of project. I have written apps that:

1) Monitor plastics factories (dryers, weigh-scale blenders, extruders, loaders, etc.) and control all functions from anywhere in the world.

2) Track customer inventories around the world.

3) SPC of DVD production.

The list goes on.

Re: sending or posting sample code.
I do not have the rights to the code I develop. To post any of my code would put my bosses' intellectual property rights in jeopardy.

If you would like to get started right on this project, I would suggest you start by contacting one of NI's Select Integrators to help you work out a design and get you started. My many repeat customers have indicated that it was well worth the money to get us involved.

Please do not get me wrong. I am not trying to drum up business, just to help. If you have specific questions, please post them here and one of us should be able to get you over any obstacles you may encounter.

Ben
Message 7 of 12
I like LabVIEW. I think it's great glue for tying all kinds of things together.

But it's still appropriate to use the right tool for the job, just like I wouldn't use a 1 lb. hammer to drive railroad spikes.

Silly examples aside, it comes down to 3 or 4 things: determinism, reliability, redundancy and scalability.
After that then consider ease of implementation.
The board I'm talking about is control.com.
The people who regularly write to that board have applications that control tens of thousands of points and have people's lives dependent upon the systems working. Nobody uses any DOS/Windows/Apple boxes for these systems. This is the territory of the big boys. The Siemens, Allen-Bradley, Bailey, GE PLC boys do this for a living. They are professional control engineers, and as such they have to deal with the big questions. To my knowledge, the only National product that begins to answer the question of determinism is FieldPoint RT, but they are limited to a max of 9 (dependent upon current draw) I/O modules per control module. Besides that, FieldPoint RT is a relatively new product, less than 2 years old if memory serves. I use FieldPoint, but in slow chemistry or control situations, where I can design in safety backups.

To use FieldPoint RT for my system would require at least 5 control modules with 8 or 9 I/O modules each. My Siemens system has been running 24 hours a day for the last 10 years. Would I get that kind of reliability out of FieldPoint??? Would I bet my job on it?

I rarely see redundancy mentioned in any of the National hardware or software literature. The big companies all have redundant systems available.

The fact that National only illustrates small examples suggests that small jobs are the intended audience.
Your bmp shows a 5 second loop time. 5 seconds for 2500 points is not a big load. My requirement is 500 points in under 150 ms. That's pretty close to two orders of magnitude of difference. A 10 year old PLC does that job just fine.

The difference is operating system, or lack thereof...

Where would programming be today if K&R had adopted the "Go talk to an integrator approach"???

We all learn from code snippets from others.

My 2 cents.
tb
Message 8 of 12
Hi tb,

I am trying to help.

Now that I have an idea of what kind of update rates you are talking about, that sheds a different light on the picture.

Please forgive my previous statements.

I grant you that FP has not been around for ten years, and it still requires a lot of work to get it to do the higher-power jobs.

The main reason I mentioned it was because of the ease of support. I will leave this point to the side.

LV-DSC (and hopefully myself) can still be quite useful. It is an excellent environment if you want to tie your PLCs to the outside world (as you said) or if you have requirements that cannot be fully met by the other packages you have mentioned.

The BMP I posted monitored a system where changes were expected every minute or so. The five second update rate was more than fast enough for the web page it served. It allowed plant operations personnel to monitor a flexible number of PLC-implemented test stations. The actual tag update rate was on the order of 500 ms. Using a dedicated 100Base-T network, we managed update rates that were quite a bit faster, and the performance of the DSC-app-to-PLC link appeared to have quite a bit of "head-room" that we were never able to tax using the hardware the customer had involved. We tried.

The PLCs, their hardware, and the user interface required only subtle changes when I served a database of customer requirements to them using LV-DSC. Before the customer had me develop the above, the operators were entering all of the requirements by hand for each of the test stations. Using LV-DSC I was able to monitor the appropriate status registers, query the database and fill in all of the blanks. When the DSC app saw the test was done, the order status DB was updated.

In this case, the customer only required "scalability". The determinism and reliability were handled by the PLCs. There was no such thing as redundancy for this customer.

I grant that the operations mentioned can be done using other packages. LV-DSC indeed may be the wrong tool for what you are trying to accomplish.



Code snippets of complete large scale applications are hard to come by. Aside from the very specific examples that ship with LV, I cannot say I have seen large examples.

If you could share a little information about what you are trying to accomplish, I or others may be able to be of more help.

Re: "integrator"
"Select Integrators" are a sub-set of the NI Alliance members. They are experienced in helping users of NI products decide what is the right sized hammer to use.

Hoping I can help,

Ben
Message 9 of 12
Hi Ben, I'm not trying to be snitty; I just find that LabVIEW's restrictions and its hidden costs make it hard to find a balance between usability of the interface, readability/maintainability of the code, and execution speed.

For example, I find that arrays are not suitable for most displays, in that I need to put some kind of text with each of the elements; the array elements are just too close together. Using clusters, I can set the display up as I wish, and using Bundle and Unbundle By Name makes the code really pretty, but when you read the fine print, National says to avoid them whenever possible to prevent memory allocation/deallocation whenever a cluster is passed to a subVI.

In my application (a medical cyclotron) the PLC queries about 500 points 7 times a second. Cyclotrons have 7 main subsystems: water cooling, magnet, vacuum, diagnostics, target and the PLC itself. So the logical approach is to segment the user screens among those subsystems. That is exactly what the original HMI does. The S5-to-video and human I/O connection is done through 256 bytes, fy0 through fy255. My intent is to have LabVIEW read and manipulate these 256 bytes to emulate the video/IO card. Many of the tags are single-bit, and are transferred on a byte-wide basis.
My intent, and my testing, is to use an Applicom Ethernet card to transfer the data to and from the PLC. The S5 Ethernet implementation permits transfer of up to 250-some bytes per transaction.

I can keep the user interface relatively clean, but the diagram is a lot trickier. At present, my direction is to try and use arrays and shift registers as much as possible, and use Array To Cluster at the end for presentation to the user. I still run into other problems, like trying to assign property nodes to provide the required visual properties; all the property nodes add up when you are looking at 20 to 40 tags per page. The diagram just ends up messy. Much of this is redoing work already done in DSC. But so far I can't find a way, using the Applicom driver, of reading a byte or series of bytes and having DSC interpret them as a series of bits. Probably I just haven't played with it enough yet. But, so far, I can only find a way to interpret them as analog values. Then I have to break them into arrays, swap them into an array that holds all the values for that subsystem, and use Array To Cluster to drop them into a display at the end. So far, I think the analog tags should be fine using DSC's built-in functions.
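The byte-to-bit expansion described above (roughly what LabVIEW's Number To Boolean Array function does per byte) is simple enough to sketch in Python. This is a minimal stand-in, assuming bit 0 of each byte is the lowest-numbered tag; the actual bit ordering on an S5 would need to be checked against the PLC documentation.

```python
def bytes_to_bits(raw):
    """Expand each byte of a byte-wide PLC transfer into 8 booleans,
    least significant bit first (ordering is an assumption)."""
    bits = []
    for byte in raw:
        for bit in range(8):
            bits.append(bool(byte >> bit & 1))
    return bits


# Two hypothetical flag bytes; 0b00000101 sets bits 0 and 2,
# 0b10000000 sets bit 7 of the second byte (overall index 15).
flags = bytes_to_bits(bytes([0b00000101, 0b10000000]))
```

Doing this once per transfer, then indexing the boolean array by tag number, keeps the per-tag wiring off the diagram entirely.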


What interested me in your original response was the word structure. I'm looking for ways to help me structure the code to balance all of those elements. The first response of many people is to break it into smaller VIs, but I'm wary of doing this. I would rather have six simple transfers of my complete data set than have to manipulate which data is transferred depending upon which user screen is active. The other thing I came across which doesn't thrill me with the multiple-VI approach is time. The couple of times I tried this approach before, neither the calling nor the called VI was accurate with respect to time. Again, that may be my error, but I do recall some postings about that being a common problem.

So far, LabVIEW looks like it can do the job. It just seems really awkward.

I've talked to some of the integrators at the NI days in Toronto. Most of them seem to know LabVIEW a little to a lot better than me. Few if any have worked with an S5 PLC, and none of them know my cyclotron. It would be a lot more work for me to get them up to speed than it would be for me to figure out how to do it.

My intent behind these postings is to try and get some idea about how and why people structured their code when tackling bigger jobs.

thanx
😆
tb
Message 10 of 12