
Database connectivity toolkit limitations

Hi

I am using LabVIEW 9 (LabVIEW 2009) and the matching version of the Database Connectivity Toolkit.

 

I would like to use the connectivity toolkit to create a visualization of data from a large MySQL database.

 

The database contains 250 tables, each with more than 10 million rows.

Each row consists of a timestamp (a double representing seconds, with millisecond resolution, from a given reference time) and a value (a double), so each row is 16 bytes.

Each table is dedicated to a specific variable, and the rows are in fact the progression of that variable over time.

 

What I need is to SELECT data from 10 tables in one go, up to 1 million rows from each table, and to plot all of that data on the same graph.

The graph plots one column against the other: the value of each variable as a function of time.
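For scale: 10 tables × 1,000,000 rows × 16 bytes ≈ 160 MB of raw data. On top of that, the toolkit and the graph typically need their own copies of it (the recordset, the conversion into a LabVIEW array, the plot buffer), so the working set in 32-bit LabVIEW can easily end up several times larger.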

 

I have managed to do this when selecting 100,000 rows from 6 tables, but I ran into inconsistent behaviour when I tried to increase the number of rows returned through the Connectivity Toolkit:

1. Long, unpredictable processing times

2. 'Type mismatch' errors

3. A LabVIEW pop-up saying 'not enough memory'

 

In theory I could loop over smaller SELECT queries (see the sketch below), but that will still take a lot of time.

Can I do better, or am I asking too much of LabVIEW and the Connectivity Toolkit?
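For reference, this is the kind of chunked read I have in mind, sketched in Python with mysql-connector purely to show the query pattern; the table and column names are placeholders, and in LabVIEW it would be a For Loop around the toolkit's Execute Query VI issuing the same parameterized SELECT.

# Chunked-read sketch, assuming direct access to the MySQL server with
# mysql-connector-python. Table/column names (var_001, ts, val) are placeholders.
import mysql.connector

CHUNK = 100_000          # rows per query, small enough to stay within memory
TOTAL = 1_000_000        # rows wanted from this table

conn = mysql.connector.connect(host="localhost", user="user",
                               password="pw", database="mydb")
cur = conn.cursor()

data = []
for offset in range(0, TOTAL, CHUNK):
    # A WHERE range on the indexed timestamp column is usually faster than
    # OFFSET on huge tables, but OFFSET keeps the sketch simple.
    cur.execute("SELECT ts, val FROM var_001 ORDER BY ts LIMIT %s OFFSET %s",
                (CHUNK, offset))
    rows = cur.fetchall()
    if not rows:
        break
    data.extend(rows)    # or decimate each chunk here before keeping it

cur.close()
conn.close()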

 

Message 1 of 6

You probably don't have enough continuous memory if you are getting the 'not enough memory' error.

 

You really need to read smaller chunks. You can't possibly display that number of points in a graph. Think about how many horizontal pixels your display has. How much of that is the graph and how much is the rest of the front panel? Now think about how many of those 100,000 rows are being discarded because of the limitations in your display.
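If it helps, the idea is to reduce each chunk to roughly the number of pixels the plot can actually show before wiring it to the graph. Below is a minimal min/max decimation sketch in Python/NumPy, just to illustrate the principle; in LabVIEW a plain every-Nth decimation can be done with Decimate 1D Array, while the min/max version sketched here also preserves the peaks.

# Min/max decimation sketch (NumPy), assuming equal-length ts/val arrays.
# Reduces n points to at most 2*buckets points while keeping the extremes,
# so a graph a few thousand pixels wide never needs millions of points.
import numpy as np

def decimate_minmax(ts, val, buckets=2000):
    ts = np.asarray(ts, dtype=float)
    val = np.asarray(val, dtype=float)
    n = ts.size
    if n <= 2 * buckets:
        return ts, val                      # already small enough to plot
    edges = np.linspace(0, n, buckets + 1, dtype=int)
    keep = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        if hi <= lo:
            continue
        seg = val[lo:hi]
        keep.append(lo + np.argmin(seg))    # index of the bucket minimum
        keep.append(lo + np.argmax(seg))    # index of the bucket maximum
    keep = np.unique(keep)                  # sort and drop duplicate indices
    return ts[keep], val[keep]

# Example: 1,000,000 points collapse to at most ~4,000 before plotting.
# ts_small, val_small = decimate_minmax(ts_full, val_full, buckets=2000)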

Message 2 of 6

Hi Dennis,

Thank you for your reply,

 

but what exactly is continuous memory?

Can I change something in the configuration, or should I buy a stronger PC?

Right-clicking My Computer shows:

 

Intel(R) Core(TM)2 Duo CPU

E8400 @ 3.00 GHz

2.99GHz, 3.25 GB of RAM

 

Are you familiar with the way NI does it in MAX when viewing graphs from Citadel?

You can look at a variable trace and browse a very large number of points very quickly...

Message 3 of 6

I think you have enough memory.

You have to be careful with array data, because whenever the size of an array changes, LabVIEW has to allocate a new array in memory.

So if you work with arrays, set their size up front and overwrite the data in place (see the sketch below).

Could you also split (disperse) the data and work on it in smaller pieces?

In practice LabVIEW can run into memory overflows when it handles this much data at once; that is one of its weaknesses.

LabVIEW's function nodes themselves also use a lot of memory.
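To illustrate the preallocation point, here is a small Python/NumPy analogy; in LabVIEW the corresponding pattern is Initialize Array before the loop plus Replace Array Subset inside it, instead of growing the array with Build Array.

# Growing an array inside a loop forces a reallocation and copy on every
# iteration (Build Array in LabVIEW); preallocating once and overwriting
# in place does not (Initialize Array + Replace Array Subset in LabVIEW).
import numpy as np

N_CHUNKS, CHUNK = 100, 10_000

# Slow pattern: the buffer is reallocated and copied on every append.
grown = np.empty((0, 2))
for i in range(N_CHUNKS):
    chunk = np.random.rand(CHUNK, 2)               # stand-in for one query result
    grown = np.vstack((grown, chunk))              # reallocate + copy each time

# Better pattern: allocate the full buffer once, then fill it in place.
filled = np.empty((N_CHUNKS * CHUNK, 2))
for i in range(N_CHUNKS):
    chunk = np.random.rand(CHUNK, 2)
    filled[i * CHUNK:(i + 1) * CHUNK, :] = chunk   # overwrite, no reallocation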

Message 4 of 6

What I meant to say was 'contiguous' and there is an article here.

Message 5 of 6

I'm having a similar problem. My database is simpler but my query is more complex. I have 3 tables, and my query uses aliasing and joins to return a data set of 65k records. The query works, but it takes some time to execute, and the Execute Query VI often times out; I don't see any way to override this. When I'm lucky enough to execute the query without it timing out, I also get an out-of-memory error. This application is targeted at a server running a 64-bit OS, so I'm not worried about the memory problem. I'd just like to know how to override the timeout inside the Execute Query VI.
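If the timeout itself turns out not to be adjustable, one workaround is to page the joined query so that each execute returns a slice small enough to finish well inside the default timeout. Here is a rough sketch of the idea in Python against MySQL directly; every table, alias, and column name is a placeholder.

# Paging a joined query so each execute stays small. Assumes direct MySQL
# access with mysql-connector-python; all names below are placeholders.
import mysql.connector

PAGE = 5_000

conn = mysql.connector.connect(host="server", user="user",
                               password="pw", database="mydb")
cur = conn.cursor()

query = """
    SELECT a.ts, a.val, b.name, c.units
    FROM readings AS a
    JOIN channels AS b ON b.id = a.channel_id
    JOIN meta     AS c ON c.id = b.meta_id
    ORDER BY a.ts
    LIMIT %s OFFSET %s
"""

records = []
offset = 0
while True:
    cur.execute(query, (PAGE, offset))
    page = cur.fetchall()
    if not page:
        break
    records.extend(page)
    offset += PAGE

cur.close()
conn.close()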

Message 6 of 6