I'm writing a VI for a lab which writes measurement results to a database. The number of measurements varies depending on which fixture is set up in the lab and whether the experimenter decides to add extra ones. This results in a varying-size array of doubles that gets written into a single database table with generic column names c1-c200. The column names are cross-referenced in another table keyed to the lab fixture.
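For anyone unfamiliar with that layout, here is a rough sketch of the two tables as plain SQL driven from Python rather than LabVIEW. Only the generic c1-c200 columns and the fixture-keyed cross-reference come from the description above; the table names `measurements` and `column_map`, the fixture name, and the measurement names are made up for illustration.

```python
import sqlite3

conn = sqlite3.connect("lab.db")

# Wide table of generic measurement columns. Only c1-c3 are shown here;
# the real table described above runs c1 through c200.
conn.execute("CREATE TABLE IF NOT EXISTS measurements (c1 REAL, c2 REAL, c3 REAL)")

# Cross-reference table mapping the generic column names to meaningful
# measurement names, keyed to the lab fixture in use.
conn.execute("""CREATE TABLE IF NOT EXISTS column_map (
    fixture TEXT, column_name TEXT, measurement TEXT)""")
conn.executemany(
    "INSERT INTO column_map VALUES (?, ?, ?)",
    [("fixture_A", "c1", "inlet_temperature"),
     ("fixture_A", "c2", "outlet_pressure")])
conn.commit()
```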
The method I found here is to convert array -> cluster -> variant -> database insert. The problem with this is that a cluster's size must be known at compile time, which makes the varying array size awkward to handle. The array-to-cluster conversion also caps the array (and therefore the database entry) at 256 elements.
I know I could just pad the array to 200 elements and write the whole thing, but that increases database activity, is inelegant, and forces zeros into the unused columns instead of leaving them NULL.
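For comparison, the effect I'm after is easy to express in a text language: build the INSERT statement from only the columns actually measured, so the remaining columns are never touched and stay NULL. This is a minimal Python/sqlite3 sketch, not the Database Connectivity Toolkit, and the `measurements` table name is an assumption carried over from the sketch above.

```python
import sqlite3

def insert_measurements(conn, values):
    """Insert a variable-length list of doubles into columns c1..cN,
    leaving every column beyond len(values) untouched (NULL)."""
    if not 0 < len(values) <= 200:
        raise ValueError("expected between 1 and 200 measurements")
    cols = ", ".join(f"c{i}" for i in range(1, len(values) + 1))
    params = ", ".join("?" for _ in values)
    conn.execute(f"INSERT INTO measurements ({cols}) VALUES ({params})", values)
    conn.commit()

conn = sqlite3.connect("lab.db")
insert_measurements(conn, [1.25, 3.7, 0.02])   # only c1-c3 are written
```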
Attached is a VI which demonstrates what I would like, though it only works for doubles and is still subject to the 256-element limit. It could very easily be changed to handle integers or strings, but ideally it would accept an array of any type, with a much larger size limit.