Database connectivity toolkit delay prevents missing records?

I noticed that when I execute a large SQL statement (mine contains around 7500 INSERT INTO commands) using the DB Tools Execute Query VI followed by a DB Tools Free Object VI, I lose records. I say "lose records" because when I look at the data in my database manager, only about 550 records are present. I experimented by placing a delay between the two VIs and found that the number of records that make it into the database is roughly proportional to the length of the delay. Since my LV and database skills are a bit lacking, I am wondering if any of you gurus might have a suggestion for how to solve this problem. I hate to resort to a canned (or even a calculated) delay value, since I could be wasting a bunch of time unnecessarily, or end up with not enough. What is the preferred method of ensuring that the DB Tools Execute Query VI has successfully done its job?
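[Editor's note] The usual fix for this class of problem is to make the batch transactional and then verify completion by querying the row count, rather than tuning a delay. The DB Tools VIs are graphical, so the following is only a rough text-language sketch, with Python's built-in sqlite3 standing in for SQL Server; the column layout mirrors the TestDetails rows discussed later in the thread, but all names are illustrative.

```python
# Illustrative sketch only: sqlite3 stands in for SQL Server, and the
# column layout mirrors the TestDetails rows discussed in this thread.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE TestDetails "
    "(test_id, source_id, f1, f2, f3, wavelength, intensity, passed)"
)

rows = [(610, 1010, 0, 10, 0, "879.781671", "1936.666667", 1)] * 7500

# Wrap the whole batch in one transaction: it either all commits or all
# rolls back, so there is no window where only some rows are visible.
with conn:
    conn.executemany(
        "INSERT INTO TestDetails VALUES (?, ?, ?, ?, ?, ?, ?, ?)", rows
    )

# Verify completion by counting rows instead of waiting a fixed delay.
count = conn.execute("SELECT COUNT(*) FROM TestDetails").fetchone()[0]
assert count == 7500
```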
Message 1 of 28

So you have a single SQL command string that contains 7500 separate INSERT statements in it?

 

Well, that is problem #1. What exactly are you trying to do? What does the data look like that you are trying to save?

 

Also, what database are you using and what version of LV?

 

Mike...


Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

For help with grief and grieving.
Message 2 of 28

"So you have a single SQL command string that contains 7500 separate INSERT statements in it?"

 

Yes.

 

"Well, that is problem #1."

 

Please help me understand why. This method actually works, except for the timing issue that I am exploring. I initially issued one command per query, but that took forever to execute. Then I issued one INSERT command with many records' worth of data; that worked, but the code was a little more complex because of adding the INSERT preamble and suffix. So I tried the multiple-commands method, and it has worked as far as I could tell. This timing issue with missing records is not something I noticed before, but it could have always been a problem.
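[Editor's note] For reference, the "one INSERT command with many records' worth of data" variant can be written as a single multi-row VALUES statement. A sketch using Python's sqlite3 as a stand-in for SQL Server (which, as I recall, caps a VALUES list at 1000 rows per statement); all names here are assumptions.

```python
# Illustrative only: sqlite3 as a stand-in; table/column names are assumed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE TestDetails "
    "(test_id, source_id, f1, f2, f3, wavelength, intensity, passed)"
)

points = [("879.781671", "1936.666667"), ("881.501556", "1929.000000")]

# Build one INSERT with one (...) group per record, using parameter
# placeholders for the per-point wavelength/intensity values.
groups = ", ".join("(610, 1010, 0, 10, 0, ?, ?, 1)" for _ in points)
params = [v for point in points for v in point]
conn.execute(f"INSERT INTO TestDetails VALUES {groups}", params)

n = conn.execute("SELECT COUNT(*) FROM TestDetails").fetchone()[0]
assert n == len(points)
```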

 

"What exactly are you trying to do?"

 

I am trying to save the intensity vs. wavelength data from three spectrometers, each with a different range and resolution.

 

"What does the data look like that you are trying to save?"

 

A snippet of the 7500 lines looks like this:

 

...

INSERT INTO TestDetails VALUES (610, 1010, 0, 10, 0, '879.781671', '1936.666667', 1);

INSERT INTO TestDetails VALUES (610, 1010, 0, 10, 0, '881.501556', '1929.000000', 1);

INSERT INTO TestDetails VALUES (610, 1010, 0, 10, 0, '883.221459', '1918.666667', 1);

INSERT INTO TestDetails VALUES (610, 1010, 0, 10, 0, '884.941382', '1915.000000', 1);

...

 

where 610 is the ID of the test, 1010 identifies this as data from spectrometer x, the [0, 10, 0] fields are not yet developed, 879.78 is the wavelength, 1936.66 is the intensity, and 1 means the test passed.

 

"Also, what database are you using and what version of LV?"

 

I am using MS SQL Server and LV 8.5.1 with the Database Connectivity Toolkit.

Message 5 of 28

kc64,

 

Have you looked at DB Tools Insert Data.vi, located in the Database palette? If you look into the depths of this VI, it uses the SQL command "INSERT INTO". Putting this VI in a loop could speed up your application.
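[Editor's note] A rough text-language analogue of "this VI in a loop": one parameterized INSERT per iteration over a shared connection, with the loop wrapped in a single transaction so commit overhead is paid once. sqlite3 here is only a stand-in for the toolkit's ADO connection, and all names are illustrative.

```python
# Illustrative only: sqlite3 stands in for the toolkit's ADO connection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE TestDetails "
    "(test_id, source_id, f1, f2, f3, wavelength, intensity, passed)"
)

data = [
    (610, 1010, 0, 10, 0, "879.781671", "1936.666667", 1),
    (610, 1010, 0, 10, 0, "881.501556", "1929.000000", 1),
]

# One INSERT per loop iteration, as a looped insert VI would issue,
# but inside a single transaction so commit overhead is paid only once.
with conn:
    for row in data:
        conn.execute(
            "INSERT INTO TestDetails VALUES (?, ?, ?, ?, ?, ?, ?, ?)", row
        )

total = conn.execute("SELECT COUNT(*) FROM TestDetails").fetchone()[0]
assert total == len(data)
```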

Regards,

Jon S.
National Instruments
LabVIEW NXG Product Owner
Message 6 of 28

Why are you saving the data as individual datapoints? Logically it sounds like the data is actually a set of three waveforms, not a collection of individual values. Save it that way and you get all your data into the database with three inserts (one for each waveform).

 

The database needs to mirror the logical structure of your data. You are generating a huge amount of redundant data. In addition, you will run into similar speed issues getting the waveforms out of the database.
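[Editor's note] A sketch of what that mirrored structure might look like: one row per spectrometer sweep, with the wavelength axis reduced to a start and step, and the intensity array serialized into a single column. Everything here (table name, columns, JSON as the serialization, sqlite3 for SQL Server) is an assumption for illustration.

```python
# All names here are illustrative assumptions, not from the thread;
# sqlite3 stands in for SQL Server and JSON for a flattened BLOB.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Spectra (test_id, spectrometer_id, "
    "start_wl, wl_step, intensities)"
)

# One row per spectrometer sweep: three INSERTs replace ~7500.
sweeps = {1010: [1936.67, 1929.0], 1020: [512.3, 508.1], 1030: [88.4, 87.9]}
for spec_id, intensities in sweeps.items():
    conn.execute(
        "INSERT INTO Spectra VALUES (?, ?, ?, ?, ?)",
        (610, spec_id, 879.781671, 1.719885, json.dumps(intensities)),
    )

# The wavelength for point i is reconstructed as start_wl + i * wl_step.
n_rows = conn.execute("SELECT COUNT(*) FROM Spectra").fetchone()[0]
assert n_rows == 3
```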

 

Mike...


Message 7 of 28

Jon S--

 

As I read the docs, the insert method you mentioned simply performs the same function as an INSERT INTO SQL statement. I don't see an advantage to using it over the general SQL command.

 

Am I wrong on this?

 

Kevin

Message 8 of 28

mikeporter--

 

I appreciate the advice, but I don't understand how that would work with SQL Server. Is there a built-in data type for LV waveforms in SQL Server? The only way I know to save data like a collection of data points is to break it down into individual points or to encode it into another data type, like a string or XML. Neither of those appeals to me from a normalization point of view.

 

Maybe I am missing your point.  In any case, the database structure is already set even though there are several fields yet to be defined as mentioned earlier.

 

kc64

Message 9 of 28

You don't need a built-in datatype, because SQL Server supports a datatype called a BLOB, or Binary Large OBject. The idea is that, from the database's standpoint, a BLOB is just a chunk of memory that you can put stuff into and then get back when you need it. The database has no idea what the data's internal structure is, and it doesn't really care, because it assumes that when you read it out you will know what to do with it. BLOBs can be used to store binary datasets, files, pictures, documents - you name it.

 

In the case of your data, you would take the waveform, flatten it to a string, and write the string to the BLOB field in your database. To read it back, simply take the data from the field (which will be in the form of a string), unflatten it, and, presto, you've got your data. Because LV's fundamental data structures are based on industry standards, this even works if you want to read your data into something other than LV.
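[Editor's note] A text-language analogue of the flatten/unflatten round trip, with Python's struct standing in for LabVIEW's Flatten To String (LabVIEW flattens numeric data big-endian, which the ">d" format matches) and sqlite3 standing in for a SQL Server BLOB column; all names are assumptions.

```python
# Illustrative analogue: struct stands in for LabVIEW's Flatten To String,
# and sqlite3 stands in for a SQL Server BLOB (image/varbinary) column.
import sqlite3
import struct

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Waveforms (test_id INTEGER, data BLOB)")

samples = [1936.666667, 1929.0, 1918.666667]

# "Flatten" the waveform to bytes and write it to the BLOB column.
flattened = struct.pack(f">{len(samples)}d", *samples)
conn.execute("INSERT INTO Waveforms VALUES (?, ?)", (610, flattened))

# Read it back and "unflatten": the round trip is exact for doubles.
(raw,) = conn.execute(
    "SELECT data FROM Waveforms WHERE test_id = 610"
).fetchone()
restored = list(struct.unpack(f">{len(raw) // 8}d", raw))
assert restored == samples
```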

 

Mike...


Message 10 of 28