LabVIEW


DSC plantwide architecture

I would like to get some opinions.  I am developing a large DSC application (LV 8.2, hopefully) that will use perhaps 20 different "process computers" running LV DSC and logging data back to a central "server" containing the Citadel database.  Each of these process computers will have a unique set of shared variables defined and a unique process name.  Each process will be monitoring perhaps 50 shared variables, say.

Which is better:
a)  logging ALL processes back to a single database, or
b)  having one database for each process (so the server has 20 databases running concurrently)?

I am looking for opinions in terms of PERFORMANCE but also MAINTAINABILITY.  Is there any advantage/disadvantage to either method, or are they equally good?

FYI, I will be developing a custom historical data viewing application that will reside on the server, and users on the corporate network will access it via Remote Panels.  The historical data viewer will have to be able to access data from all processes at the same time (for example, for comparing performance curves of two different process centres).
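
To make the comparison concrete, here is roughly how I picture the viewer code differing between the two options (Python-style pseudocode only; query_trace() and all database/trace names are made up, since the real access would go through the DSC or Citadel ODBC interfaces):

    # Option (a): one central database, traces distinguished by process name
    # query_trace() is a hypothetical helper, not a real DSC/Citadel call
    def read_variable_a(process, variable, t0, t1):
        return query_trace(database="PlantHistory",
                           trace=process + "\\" + variable,
                           start=t0, stop=t1)

    # Option (b): one database per process, so the viewer must also pick the right database
    DB_FOR_PROCESS = {"Mixer01": "Mixer01_DB", "Mixer02": "Mixer02_DB"}  # ...20 entries

    def read_variable_b(process, variable, t0, t1):
        return query_trace(database=DB_FOR_PROCESS[process],
                           trace=variable,
                           start=t0, stop=t1)

Either way the viewer can work; option (b) just means one more lookup table (process to database) to maintain.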

Thanks,

David Moerman
TruView Technology Integration
Message 1 of 7
Sounds like you will be busy for the next couple of years. :) How would a single hard drive failure affect 20 different databases versus a single database? Could you save the data on the 20 computers, archive it on the server, and then read the data out of the archives? Is there a way to merge databases in the 8.20 world? I'm stuck in 7.0. If the network goes down, will nothing work when using a single database? I've never dealt with a database where it was really critical if the data was lost; mine were just traces of data while parts were being machined. If one of the 20 computers gets powered down, how well will it reconnect automatically to the central database?

Message Edited by unclebump on 08-17-2006 08:36 PM

Message 2 of 7

From my experience, there are a lot of things you have to consider, such as:

- Are the computers running critical processes (control, high-speed data acquisition, etc.)? If so, do not put a database on the same computer.

- How big will Citadel become (lots of data updates or not)? If you have rapid updates from sensors everywhere, use one central database with a lot of computing power instead of 20 good computers that will just get stuck in Citadel.

- It is also better to have a single central logger for outside clients, so all the data can be read from a single location.

- For redundancy, you can log data with a lifespan of a few hours on the other 19 computers and, in case of data loss (network failure, etc.), recover it from there (see the sketch after this list).

- Also for redundancy, use a backup computer that becomes the master logger if the current logging computer fails.

- Finally, is it worth paying for 20 DSC run-time licenses?
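
Regarding the short-lifespan local log, a minimal sketch of the idea (plain Python, purely illustrative; in a real DSC system this would more likely be a small local Citadel database or file cache, and the four-hour window is just an assumption):

    from collections import deque
    import time

    RETENTION_S = 4 * 3600            # keep roughly the last four hours locally (assumption)
    local_cache = deque()             # (timestamp, tag, value) tuples

    def log_locally(tag, value):
        now = time.time()
        local_cache.append((now, tag, value))
        # discard anything older than the retention window
        while local_cache and now - local_cache[0][0] > RETENTION_S:
            local_cache.popleft()

    def recover(since):
        # samples newer than 'since', for backfilling the central logger after an outage
        return [s for s in local_cache if s[0] > since]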

cosmin

Message 3 of 7
Thanks for your ideas.

I like the idea of having the 20 DBs distributed on different machines but mirroring that data back to a central server.  Sounds great for redundancy.  The problem is, I don't know how to do that with DSC, or by any other method for that matter.

As for server failure, I could do several things.  First of all, I've already tested (in 7.1) what happens if the Citadel database "unplugs" while a process is running (say, the server suddenly loses power).  The process computers can still run normally.  (Perhaps there is an error somewhere in the chain that I could trap, but I didn't look that hard.)  Also, at the time I could bring the server back online and it just started logging again as if nothing had happened, except for a gap in the data.  This is where local caching might be a good idea -- I'll keep that in mind.

Also thinking of server failure:  we could use RAID to continuously mirror the server HD, as well as use a scheduled event to back up the database folder to somewhere else on the corporate network.  Should work, I think.
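
For the scheduled backup, something as simple as a folder copy kicked off by Windows Task Scheduler would be a starting point (the paths below are made up, and note that DSC/MAX have their own database archiving tools, which are the safer route for a live Citadel database):

    import os, shutil, time

    SRC = r"C:\Citadel\PlantHistory"              # hypothetical Citadel database folder
    DST_ROOT = r"\\corpserver\backups\citadel"    # hypothetical network share

    stamp = time.strftime("%Y%m%d_%H%M%S")
    dst = os.path.join(DST_ROOT, stamp)
    shutil.copytree(SRC, dst)                     # copying a live database may give an inconsistent snapshot
    print("backed up to", dst)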

BTW, this is a paint-mixing plant, so I will be using "data sets" to store batch-oriented data.  As for DSC run-time licences, I'm not worried -- NI is still much cheaper on this than their many competitors in this field!  But that's because DSC is, shall we say, "less proven" than alternatives like Wonderware, etc.

I will not be acquiring data super-fast; we could expect about 1 data point per second.  Still, over many years the database will get big.  And yes, having all the data on a central server is done so outside clients can access a single location.  Also, it becomes a convenient dividing point between the process network and the corporate LAN.  I was thinking of 2 ethernet cards in the server.
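
For a rough sense of scale, reading "1 data point per second" as 1 Hz per variable (the bytes-per-sample figure below is a pure assumption; Citadel's real on-disk cost depends on deadbanding and compression):

    samples_per_sec  = 20 * 50 * 1                          # 20 processes x 50 variables x 1 Hz
    seconds_per_year = 365 * 24 * 3600
    samples_per_year = samples_per_sec * seconds_per_year   # about 31.5 billion
    bytes_per_sample = 16                                    # assumption only
    print(samples_per_year * bytes_per_sample / 1e9, "GB per year, very roughly")   # ~500 GB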

Well, so far I haven't heard any technical preference for 1 big DB vs. 20 smaller DBs (albeit both on the same server).  And if anyone knows how to do automatic DB mirroring with DSC...

Thanks again,
-Dave
Message 4 of 7
'I was thinking of 2 ethernet cards in the server'
 
Don't forget that you can multihome an ethernet card and give it up to ten IP addresses on the same domain. Then each remote computer can send data to a unique IP address. You would be amazed at the amount of traffic one ethernet card can handle. We can do 20 simultaneous Ghost backups over the same ethernet port at work with Ghost version 8.2. I don't know what kind of card it is.
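 
For what it's worth, extra addresses get added to one adapter with something like netsh interface ip add address "Local Area Connection" 10.0.0.11 255.255.255.0 (the interface name and address are just examples), and the server application then binds one listener per address. A rough Python sketch, with made-up addresses and port:

    import socket

    LOCAL_IPS = ["10.0.0.11", "10.0.0.12"]     # hypothetical extra addresses on the same NIC
    PORT = 5000                                # hypothetical port

    listeners = []
    for ip in LOCAL_IPS:
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind((ip, PORT))                     # one listener per address, all on one physical card
        listeners.append(s)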
Message 5 of 7
Hi,
just keep these in mind:
- On a big Citadel database, recovering lost data from a remote Citadel can be a very long and memory-consuming process (I've had 2.4 GHz P4s with 512 MB of RAM get stuck for hours doing it), so use those Core Duos.
- Use the 2 network cards; it's common practice for redundancy in critical process networks.
- If data integrity is very important, use the second computer as a server backup, and make a hardware link between them, not a network link.
- Also, to protect data integrity, monitor the Citadel process and take action on failure (make sure you do not corrupt the data); see the watchdog sketch below.
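 
A minimal watchdog sketch for that last point (the service name is an assumption -- check what the NI Citadel service is actually called on your machine, and the "take action" part is up to you):

    import subprocess, time

    SERVICE = "Citadel"    # assumption; look up the real service name with 'sc query' or services.msc

    def service_running(name):
        out = subprocess.run(["sc", "query", name], capture_output=True, text=True).stdout
        return "RUNNING" in out

    while True:
        if not service_running(SERVICE):
            # take action: alert an operator, fail over to the backup logger, etc.
            print("Citadel service is not running!")
        time.sleep(60)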
 
cosmin
Message 6 of 7
I am working on a similar application and would be interested in talking with you.  My email is joel.jones@srnl.doe.gov.
Message 7 of 7