Hi Yukon,
I will generally jump on anything that will bring in money for my company,
BUT,
I read this memo yesterday (see below) on Info-LabVIEW.
If he is not willing to share/sell his code, then you could contact an Alliance Member or a Select Integrator (like Data Science Automation, for example) to support you in any way you feel is appropriate.
Ben
Quote:
George,
I've worked on just such a project in the past. It was a large plant-floor data collection system responsible for reading RF tags that contained production data for each part that was built. There were many "data collector" PCs distributed throughout the production line, each responsible for gathering detailed machining data for a portion of the manufacturing process.
The software for the data collectors was written in LV with the help of the Internet Toolkit. The data collectors simply gathered raw binary data and FTP'ed it to a master server; the data for each part was represented as a small (8K) binary file. The beauty of this system was that if the server went down for any reason (maintenance, upgrade, fault), the data was buffered locally at each data collector. I would balance the system by having a settable buffer threshold of X parts before opening an FTP connection to the server. This prevented a flood of incoming connections to the server. If the server was unreachable, the data would stay buffered in a local directory. Once the data was successfully transferred, it was deleted from the data collector's hard disk.

The Internet Toolkit also allowed remote status monitoring of all the data collectors. Each data collector would present its status through an HTML copy of its front panel (heartbeat, buffered parts, time alive, etc.). This scheme was a very efficient solution for daily performance monitoring, as I didn't have to walk around the entire plant to determine the status of the data collectors.
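(For illustration only: the real collectors were LV block diagrams using the Internet Toolkit's FTP VIs, but in plain Python terms the buffer-then-flush logic amounted to something like the sketch below. The directory, threshold, server address and credentials are placeholders, not values from the real system.)

# Collector-side buffer-and-forward sketch (illustrative only).
import os
import glob
from ftplib import FTP, error_temp, error_perm

BUFFER_DIR = "C:/collector/buffer"   # placeholder: local buffer directory
SERVER = "192.168.0.10"              # placeholder: master server address
PART_THRESHOLD = 25                  # placeholder: flush after X buffered parts

def buffer_part(part_id, raw_data):
    # Write one part's raw binary record (roughly 8K) into the local buffer.
    with open(os.path.join(BUFFER_DIR, part_id + ".bin"), "wb") as f:
        f.write(raw_data)

def flush_if_needed():
    # Open a single FTP connection only once the threshold is reached,
    # upload everything, and delete each file after a successful transfer.
    files = glob.glob(os.path.join(BUFFER_DIR, "*.bin"))
    if len(files) < PART_THRESHOLD:
        return  # not enough buffered parts yet; keeps the server from being flooded
    try:
        with FTP(SERVER, timeout=10) as ftp:
            ftp.login("collector", "secret")     # placeholder credentials
            for path in files:
                with open(path, "rb") as f:
                    ftp.storbinary("STOR " + os.path.basename(path), f)
                os.remove(path)                  # delete only after the transfer succeeds
    except (OSError, error_temp, error_perm):
        pass  # server down or unreachable: the data simply stays buffered locally

The two calls would sit in the collector's acquisition loop: buffer_part() after every tag read, flush_if_needed() on a timer.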
The server side, also written in LV and the Internet Toolkit, would process the incoming data, which would simply appear in a directory structure (as FTP'ed by the data collectors). The LV portion would process these raw data files, write the data to a database, and use the Internet Toolkit to handle queries and display reports to the system's users. The advantage of this method was that I only needed one database license and that the server software (LV) was responsible for decoding the raw data. This made my life easier when the data structure stored on the RF tags changed. Though these changes only occurred every few months, it was much easier to manage the change process from the server side alone rather than update all of the data collectors.
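(Again, purely an illustration in Python rather than the actual LV code: the record layout, table name and paths below are invented, but the server-side loop boiled down to decoding each dropped file and writing it to the single database.)

# Server-side processing sketch (illustrative only; the real decoder lived in LV).
import glob
import os
import sqlite3
import struct
import time

INCOMING_DIR = "C:/server/incoming"   # placeholder: where the FTP server drops the files
DB_PATH = "C:/server/parts.db"        # placeholder: the single licensed database

def decode_part(raw):
    # Unpack the raw binary record written by a collector. The "<16s I d"
    # layout (part ID, station number, timestamp) is a made-up example.
    part_id, station, stamp = struct.unpack_from("<16s I d", raw)
    return part_id.rstrip(b"\0").decode(), station, stamp

def process_incoming(conn):
    for path in glob.glob(os.path.join(INCOMING_DIR, "*.bin")):
        with open(path, "rb") as f:
            part_id, station, stamp = decode_part(f.read())
        conn.execute(
            "INSERT INTO parts (part_id, station, stamp) VALUES (?, ?, ?)",
            (part_id, station, stamp),
        )
        conn.commit()
        os.remove(path)   # the raw file is no longer needed once it is in the database

if __name__ == "__main__":
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS parts (part_id TEXT, station INTEGER, stamp REAL)")
    while True:           # keep polling the drop directory
        process_incoming(conn)
        time.sleep(5)

Because only this decoder knows the record layout, a change to the RF tag structure means updating one routine on the server instead of every collector.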
Though the system I've described above was designed around my specific constraints and features, I thought some of its design aspects might benefit you during your design process.
Milan Podhorsky
Visteon Corporation
Unquote