02-19-2009 10:58 AM
Hi all,
Here is my question. I'm currently working with some very fast data acquisition instruments, some with sample rates of 0.1 ms. The problem I'm having is that it is possible to select that sample rate and measure for, say, 30 minutes, which adds up to about 18 million data points. The way my data acquisition works now, the data is written instantly by the device's drivers to a binary temp file on my computer. However, each sample rate gets its own binary file, and no two devices can share the same file even if their sample rates are the same.
To clarify, an example would be: Device 1 has sample rates of 0.1 ms for channels 1-4 and 20 ms for channels 5-8, so this device has two binary files, one for channels 1-4 and one for channels 5-8. Device 2 has sample rates of 0.1 ms on channels 1-2 and 10 ms on channels 3 and 4, so this device also has two binary files. The setup now has 4 binary files total being written by the driver.
I also need to be able to align the timestamps from the binary files with each other. Attached is an image of how the file needs to look.
The problem I'm having is that I need to take these binary files and merge them into a formatted Excel spreadsheet file, AND do it as fast as possible.
Things I can do so far:
I can read in the binary files and get them into a 2D array; however, because some of the files are so big, I receive a memory error from LabVIEW most of the time.
I also created a VI that aligns all the timestamps, but again, the result is currently stored in an array sized to the fastest sample rate (for the 30-minute example, that would be a 2D array of dimensions 18 million by 13, including the timestamp column). Again, a memory error occurs.
Again, I've attached what the file should look like. Any help you guys could give me would be greatly appreciated.
Let me know if I need to explain this a little better as I'm not sure if I did a good enough job.
Thanks!
Brad
02-19-2009 11:11 AM
First, do some math on how big that final file is going to be (consider that each number will take multiple bytes, and don't forget the delimiters).
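For instance, taking your 30-minute example at face value (the 12 characters per value here is just a guess; plug in the real field width your formatting produces):

```python
rows = 30 * 60 * 10000   # 30 min at 0.1 ms per sample = 18,000,000 rows
cols = 13                # 12 channels + the timestamp column
chars_per_value = 12     # assumed width including the delimiter
print(rows * cols * chars_per_value / 2**30)   # roughly 2.6 GiB of text
```

That's far more than you can hold in memory at once, which is why building the whole thing as one array fails.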
LV has some limits on array size in that the buffer allocated for an array must be contiguous in memory. The best I was ever able to get into a single array was about 1.2 GB before LV ran out of memory.
If this is going to work, I suspect you will have to read one record at a time from each of the files, checking the timestamps as you go, and build the final file one line at a time.
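Here is a minimal sketch of that idea in text form (Python, since I can't paste a block diagram here). The file names, the channel counts, and the assumed record layout (a timestamp followed by the channel values, all little-endian doubles) are placeholders; match them to whatever your driver actually writes. With two files per device you'd repeat the same pattern for the second pair:

```python
import struct

def records(path, n_channels):
    """Yield (timestamp, values) one record at a time from a binary file,
    assumed here to be laid out as [timestamp, ch1, ..., chN] doubles."""
    fmt = "<%dd" % (n_channels + 1)
    record_size = struct.calcsize(fmt)
    with open(path, "rb") as f:
        while True:
            chunk = f.read(record_size)
            if len(chunk) < record_size:
                break
            row = struct.unpack(fmt, chunk)
            yield row[0], row[1:]

# Hypothetical file names from the example in the question.
fast = records("dev1_ch1-4.bin", 4)   # 0.1 ms channels drive the output rate
slow = records("dev1_ch5-8.bin", 4)   # 20 ms channels fill in when they match

with open("merged.txt", "w") as out:
    pending = next(slow, None)
    for ts, fast_vals in fast:
        # Emit the slow record on the first fast row at or past its
        # timestamp; otherwise leave those columns blank for this row.
        if pending is not None and pending[0] <= ts:
            slow_vals = [str(v) for v in pending[1]]
            pending = next(slow, None)
        else:
            slow_vals = [""] * 4
        out.write("\t".join([str(ts), *map(str, fast_vals), *slow_vals]) + "\n")
```

The point is that only one record per file is in memory at any moment, no matter how long the acquisition ran. You can do the same thing in LV with Read from Binary File in a loop, appending to the output file as you go.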
Ben