07-28-2011 03:28 AM
Hi,
I have LabVIEW and the LabVIEW Database Connectivity Toolkit. I am creating a test sequence and I need to log the results and the acquired data to a MySQL database. I want to ask for your opinion on the most effective approach to database logging. Shall I open the connection once, continuously stream data to the database, and close the connection only when I am done (similar to the concept of file I/O in LabVIEW)? Or shall I open a connection every time I need to communicate with the database, close it immediately, and open it again the next time I need to log data? So basically, my program flow will be:
1. Login (Operator or Engineer) ----> this information needs to be logged.
2. Select Test Sequence ----> the selected sequence name needs to be logged.
3. Start Test ----> the UUT serial number, data, and result need to be logged.
and so on.
The cycle will just repeat; the operator may continue to another test sequence or proceed to another UUT, but all the information will be logged to the database.
So what will be the most efficient approach to database communication? Is it best to open the connection before login, log the needed data on the fly, and then close the connection when the operator stops the program after finishing the test?
or
Every time I need to log to the database, open a connection, log the data, and then close it (e.g., for operator login, open the database connection, log the operator ID, then close the connection; when a test sequence is selected, open the connection again, log the sequence name, then close it; etc.).
This will be my first time dealing with a database, and I am not sure if the approach is the same as logging to a binary or text file. I know that for binary or text files, data streaming is recommended, but does the same concept apply to a database? My concern is that these tests may take hours to complete, and I am not sure how stable it is to keep a database connection open that long.
Any help or opinion from anyone would be greatly appreciated. Thanks so much in advance!
07-28-2011 07:17 AM
Hi there,
When working with databases, the same principles used in file I/O generally apply. The most efficient way to write to a database is to open the connection once, write all the data, and close it only when you're finished.
Opening and closing the connection are typically the most time-consuming operations and could therefore significantly increase the total time taken for a test sequence (not something I'd imagine you want when it could take hours as it is!). Keeping the connection open until you are finished shouldn't have any implications for the stability of the database.
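To illustrate the open-once pattern in text form, here's a minimal Python sketch using the standard-library sqlite3 module as a stand-in for MySQL (the table name `test_log` and its columns are made up for the example). With the LabVIEW Database Connectivity Toolkit, the equivalent is one "DB Tools Open Connection" at start-up and one "DB Tools Close Connection" at shutdown, with all the insert VIs in between wired to the same connection reference:

```python
import sqlite3

# Open the connection ONCE at the start of the test session.
# (sqlite3 stands in for a MySQL connection here; the schema is illustrative.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_log (step TEXT, value TEXT)")

# Stream each event to the database as it happens, on the same connection.
for step, value in [("login", "operator42"),
                    ("sequence", "power_on_test"),
                    ("result", "PASS")]:
    conn.execute("INSERT INTO test_log VALUES (?, ?)", (step, value))
    conn.commit()  # commit per event so a crash loses at most one row

# Close only once the whole session is finished.
rows = conn.execute("SELECT COUNT(*) FROM test_log").fetchone()[0]
conn.close()
print(rows)  # 3
```

The per-event commit keeps each record durable without paying the connect/disconnect cost on every write.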
Let me know if this helps / answers your question.
All the best!
07-28-2011 08:06 AM
I think both architectures are good. Choosing the best one depends on many factors: How many simultaneous connections do you have to that database? What is the network load on your LAN? How often might you lose the connection to the database? How much data do you want to transfer in each transaction?
I think a third good approach is to log the test data on the local machine and transfer it to the database only at the end of the test. This is a safe approach, and it is also what Microsoft suggests when using the ADO connector to SQL Server. A DataSet is indeed a local copy of the data, but I think it is better to save the data to a log file and, when the acquisition is finished, transfer it to SQL Server and then delete the file.
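A sketch of that third approach, again in Python with sqlite3 standing in for the real database server (file name, table name, and columns are all illustrative assumptions): log each record to a local CSV during the test, bulk-insert at the end, and delete the file only after the transfer succeeds.

```python
import csv
import os
import sqlite3
import tempfile

# 1) During the test, append each record to a local file (here a CSV).
log_path = os.path.join(tempfile.gettempdir(), "test_log.csv")
with open(log_path, "w", newline="") as f:
    writer = csv.writer(f)
    for row in [("login", "operator42"), ("result", "PASS")]:
        writer.writerow(row)

# 2) When the test finishes, open the database, bulk-insert, and close.
conn = sqlite3.connect(":memory:")  # stand-in for the real server connection
conn.execute("CREATE TABLE test_log (step TEXT, value TEXT)")
with open(log_path, newline="") as f:
    conn.executemany("INSERT INTO test_log VALUES (?, ?)", csv.reader(f))
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM test_log").fetchone()[0]
conn.close()

# 3) Remove the local file only AFTER the transfer succeeded,
#    so a lost connection never loses data.
os.remove(log_path)
print(count)  # 2
```

The key safety property is the ordering in step 3: the local copy is the source of truth until the database has acknowledged the data.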
07-28-2011 08:32 AM
Some good points there, Ricky; you're right, there are lots of factors to consider.
The other option I would consider is batch writing. If you write data in batches of a specified size to the local machine during logging, you can open the database connection once you have a complete batch, write it across in one go, and then close the connection while you collect the next batch, and repeat. The benefit of this method is that if you encounter file corruption during writing, you only lose that batch of data, not the entire lot.
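The batching idea above can be sketched like this, again in Python with a local sqlite3 file standing in for the MySQL server (the batch size, table, and record contents are illustrative assumptions): accumulate records in memory, and only open and close the connection per complete batch.

```python
import os
import sqlite3
import tempfile

BATCH_SIZE = 3  # flush to the database every 3 records (tunable)

def flush_batch(db_path, batch):
    """Open the connection, write one complete batch, then close again."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS test_log (step TEXT, value TEXT)")
    conn.executemany("INSERT INTO test_log VALUES (?, ?)", batch)
    conn.commit()
    conn.close()

db_path = os.path.join(tempfile.gettempdir(), "batch_demo.db")
if os.path.exists(db_path):
    os.remove(db_path)  # start the demo from a clean file

batch = []
for i in range(7):  # 7 simulated measurements
    batch.append((f"step{i}", "PASS"))
    if len(batch) == BATCH_SIZE:
        flush_batch(db_path, batch)  # connection is held only briefly
        batch = []
if batch:
    flush_batch(db_path, batch)  # write the final partial batch

conn = sqlite3.connect(db_path)
total = conn.execute("SELECT COUNT(*) FROM test_log").fetchone()[0]
conn.close()
print(total)  # 7
```

This is a middle ground between the two original options: you amortise the connect/disconnect cost over a whole batch, and a failure only costs you the in-flight batch.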
Hopefully that's a few things to be thinking about now anyway, let us know how you get on.
Best wishes.
07-28-2011 08:41 PM
Hi Paul, Hi Ricky,
Thanks so much for your insights! I greatly appreciate them! You have a point about data corruption; that was one of my concerns, along with what happens to my data if I lose the connection to the database. I like the idea of logging to the local PC first and then transferring to the database after I am done.
Thanks so much for all your help! Have a great weekend!
Cheers!
Anne