02-13-2013 12:51 PM
LWCVI
SQL Toolkit
I have data logging software that logs to an MS Access database (*.mdb) with 4 internal tables. As the database gets large (> 100,000 records), the write times become very long, and I have had instances of the software shutting down. Even when it does not shut down, the write times stretch to several minutes, which is longer than the data logging interval (every 2 minutes).
I am currently archiving and emptying the records manually on a monthly basis, but that makes it difficult for the user to find the data they need.
Is this an inherent problem with MS Access? This is the older version of the SQL Toolkit running under CVI 5.1, so an upgrade of CVI, the toolkit, or both might remedy it, but I do not want to spend the $$ if this is a database structure problem.
I could move to another format, such as MySQL. Previously, I used a dBase structure with ODBC and did not have these issues.
Any suggestions, help, etc. are welcome. Thanks in advance,
David E.
02-14-2013 01:09 PM
Hi Diverdke,
I doubt this is a CVI issue. My impression is that SQL Server handles large databases better than Access does. I'd probably look into optimizing your database first (a quick search brought up several references on this: http://www.granite.ab.ca/access/performancefaq.htm). Also, if your previous structure had no issues, what prompted the switch?
02-14-2013 04:20 PM
Humphrey,
I moved to MS Access because I had no way to remotely view the data in the old dBase-format system. dBase is long gone; Access will open a dBase database, but it converts it.
I created some nice custom reports in Access to display the data for my customers.
--
I have actually found a solution to the large-file problem. Rather than logging all the data into a single table within the database, I create a new table each month. This keeps each table a reasonable size while still allowing a search routine to find the data (e.g., May13, June13, etc.).
If I keep the number of records in a table under 1M, the write times stay reasonable.
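In case it helps anyone else, here is a minimal sketch of that monthly-table scheme using the CVI SQL Toolkit. The table and column names, the Log_YYYY_MM naming format, and the error handling are illustrative assumptions, not my production code:

/* Sketch of the monthly-table scheme (CVI SQL Toolkit).  Table and
   column names are placeholders; adapt to your own schema. */
#include <ansi_c.h>
#include "cvidb.h"   /* SQL Toolkit header; adjust to your install */

static void LogReading (int hdbc, double value)
{
    char tableName[32], tsbuf[32], sql[256];
    time_t now = time (NULL);
    struct tm *lt = localtime (&now);

    /* Per-month table name, e.g. "Log_2013_05" */
    strftime (tableName, sizeof tableName, "Log_%Y_%m", lt);
    strftime (tsbuf, sizeof tsbuf, "%Y-%m-%d %H:%M:%S", lt);

    /* Create this month's table on first use; Jet SQL has no
       IF NOT EXISTS, so just ignore the "already exists" error. */
    sprintf (sql, "CREATE TABLE %s (ts DATETIME, reading DOUBLE)",
             tableName);
    DBImmediateSQL (hdbc, sql);

    /* Append the record, using the ODBC {ts ...} timestamp escape */
    sprintf (sql, "INSERT INTO %s (ts, reading) VALUES ({ts '%s'}, %g)",
             tableName, tsbuf, value);
    DBImmediateSQL (hdbc, sql);
}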
Thanks for the help. I appreciate the quick reply.
PS: the optimization link is dead.
David
02-15-2013 10:37 AM
Great - glad you found a solution.
(BTW, the forum for some reason appended the ")." to the end of the link, which is why it was broken.)
02-20-2013 04:10 PM
I have had no problems with MySQL.
At present we have 572,380,165 individual reading records in the database and are adding 1,508,409 reading records a day, at a peak rate of 4 records every 0.1 seconds. Each reading record holds a key and a single (double) reading value. We use the default MyISAM storage engine. Obviously we use a very simple schema for this, and our queries are simple; it is rare for a query to touch data older than 24 hours.
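For reference, a reading table along those lines (a key plus a single double value, MyISAM engine) might be created through the toolkit over a MySQL ODBC connection like this; the table and column names here are illustrative, not our actual schema:

/* One possible shape for the reading table described above.
   Names are illustrative only. */
int CreateReadingTable (int hdbc)
{
    return DBImmediateSQL (hdbc,
        "CREATE TABLE IF NOT EXISTS reading ("
        "  channel_id INT NOT NULL,"            /* key for the source    */
        "  ts DATETIME NOT NULL,"               /* time of the reading   */
        "  val DOUBLE NOT NULL,"                /* the reading value     */
        "  INDEX idx_chan_ts (channel_id, ts)"  /* keeps recent-data queries fast */
        ") ENGINE = MyISAM");
}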
02-17-2016 09:18 AM
Diverdke,
Could you perhaps share your insights into how you set up your client LabWindows application to write to your external MS Access database?
I've got an application that I've been maintaining for several years, and its data logging scheme is antiquated. ASCII log files! It's a long story.
I've been wanting to graduate to this level of network logging for some time now, particularly for the reporting capability it would provide me. Thanks for any help you can provide.
02-17-2016 09:17 PM
Will do. It's not simple, but it is workable. You will need the SQL Toolkit, but you do NOT need MS Access itself, as the ODBC driver is built into XP and Win 7.
I have run into the 4 GB file size limit under XP, but that is a HUGE number of records.
Writes and searches are relatively fast.
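The connection itself is just a DSN-less ODBC string passed to the toolkit; something like this sketch (the path is a placeholder):

/* Open the .mdb through the Jet ODBC driver that ships with Windows;
   no Access install needed.  The path is a placeholder. */
#include "cvidb.h"   /* SQL Toolkit header; adjust to your install */

int OpenLogDatabase (void)
{
    int hdbc = DBConnect ("DRIVER={Microsoft Access Driver (*.mdb)};"
                          "DBQ=C:\\Data\\datalog.mdb");
    /* A negative value indicates a connection error in the toolkit */
    return hdbc;
}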
Dave
02-26-2016 09:39 AM
Based on my experience, MS Access is not a realistic option when the number of records gets huge: the larger the .mdb file becomes, the longer it takes to add a new record (up to several seconds).
My suggestion is to install a local instance of SQL Server Express and use the CVI SQL Toolkit with it, just as you would with MS Access.
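With SQL Server Express, the only change on the CVI side is the connection string; something along these lines, where the instance and database names are placeholders:

/* Same toolkit calls, pointed at a local SQL Server Express instance
   instead of the .mdb.  Instance/database names are placeholders. */
int hdbc = DBConnect ("DRIVER={SQL Server};"
                      "SERVER=localhost\\SQLEXPRESS;"
                      "DATABASE=DataLog;"
                      "Trusted_Connection=Yes");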