04-13-2016 06:12 PM
Hello,
I am trying to save all of my voltage data points, over 6 million points, that I have collected using the LabVIEW DAQmx functions into a file. The code I have made works fine and will plot all of the points onto an XY graph, but when I try to export the plot data into a txt or Excel file, LabVIEW wants to shut down. I have figured out how to use the TDMS functions in the LabVIEW functions palette, and that keeps the program from shutting down; however, it returns a large set of data points but not all of them. I was wondering if there is a way to split the data into smaller quantities and save them into multiple files. I was thinking I could index my data points into different array rows and save each row into a file of its own, but I am not sure how fast the program can create and save new files. Any suggestions? My sampling rate is ~412,714 samples per second over a time span of 15 seconds, and I am collecting 6,190,708 samples total.
04-13-2016 06:37 PM
Have you looked at a producer/consumer architecture that streams the data to a TDMS file as you collect it, rather than waiting until the end?
04-13-2016 06:45 PM
I haven't checked that out yet but that might be the issue. Where would I find the producer/consumer architecture?
04-13-2016 07:15 PM
You haven't said what version of LabVIEW you are using, but try this: go to File » New..., and under VI » From Template » Frameworks » Design Patterns, select the Producer/Consumer Design Pattern (Data) template.
This will produce a sample program that you can study and adapt to your needs. The key element is that there are two loops: a Producer loop where data are "produced" (typically the DAQmx code) and placed into a Queue, and a Consumer loop where the data are processed (such as written to a file). The Queue "buffers" the data and makes use of the "free time" the Producer spends waiting for data to be acquired. The two loops run at independent rates, with the Consumer at most "keeping up" with the Producer (since if there is nothing to Consume, the Consumer just waits).
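If it helps to see the structure in text form, here is a rough Python sketch of the same idea (LabVIEW implements it graphically with two While Loops and a Queue). The chunk size and file name are just placeholders, and the acquisition is simulated, so treat it as an illustration of the pattern rather than working DAQ code:

```python
# Rough Python analogue of the Producer/Consumer template described above.
# The acquisition is simulated; in the real program the producer loop holds
# the DAQmx Read and the consumer loop holds the file-writing code.
import queue
import random
import threading

CHUNK = 100_000          # samples handed to the queue per iteration (placeholder)
TOTAL = 6_190_708        # total samples, from the original post
data_queue = queue.Queue()

def acquire_chunk(n):
    """Stand-in for a DAQmx Read of n samples."""
    return [random.random() for _ in range(n)]

def producer():
    """Acquire data in chunks and enqueue each chunk immediately."""
    remaining = TOTAL
    while remaining > 0:
        n = min(CHUNK, remaining)
        data_queue.put(acquire_chunk(n))
        remaining -= n
    data_queue.put(None)         # sentinel telling the consumer to stop

def consumer():
    """Dequeue chunks and append them to the file at its own pace."""
    with open("voltages.txt", "a") as f:
        while True:
            chunk = data_queue.get()
            if chunk is None:
                break
            f.writelines(f"{v}\n" for v in chunk)

threading.Thread(target=producer).start()
consumer()                       # runs in the main thread until the sentinel arrives
```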
Bob Schor
04-13-2016 07:26 PM
My bad, I'm still new to LabVIEW. I am using the 2015 32-bit version of LabVIEW, a PCI-6115 DAQ card, and the computer has 8 GB of RAM. I never knew about the frameworks and design patterns of LabVIEW code... that seems like it would make life a lot easier. Thanks for the tip! I am going to test it out when I get back to my lab!
04-13-2016 07:41 PM
Do you know/understand the concept of Data Flow, central to LabVIEW? One consequence of Data Flow is that if you have two While Loops not connected serially to each other (i.e. you don't have an output from one going to the input of the other), they will both run in parallel, "sharing" the CPU. So suppose one loop is acquiring data from a DAQ card, and the card is programmed to deliver 1000 points at 1 kHz. That loop will basically "wait" maybe 998 milliseconds, then spend 2 milliseconds delivering the 1000 points (it is probably even faster than that). If one loop requires 2 milliseconds per second, that leaves 998 milliseconds per second of CPU time for the other loop to do whatever it needs to do.
That's the "magic" behind the Producer/Consumer pattern. The two loops are connected at their inputs by a Queue, so they can run in parallel. The Producer, when it gets its 1000 points, simply puts all the points on the Queue and it's done. The points "travel" in the Queue (if you want to think of it that way) to the Consumer, where they are dequeued ("reconstituted") and then plotted, written to disk, FFT'd, or whatever during the "copious free time" left over after the Producer had its aliquot.
Bob Schor
04-13-2016 08:15 PM
I am a HUGE supporter of the Producer/Consumer. But in this case, there is a much better alternative. DAQmx can log the data for you! There is a function in the DAQmx Advanced Task Options called DAQmx Configure Logging. Call that function before starting your task. DAQmx will actually stream the data straight to a TDMS file for you. This eliminates a lot of overhead and makes things A LOT simpler on you.
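For what it's worth, the same capability is exposed in the text-based NI-DAQmx APIs, which is easier to show here than a block diagram. Below is a hypothetical sketch using the Python nidaqmx package; the device name, channel, buffer size, and file path are assumptions, not details from this thread. In LabVIEW the equivalent is simply wiring DAQmx Configure Logging.vi between creating the task and DAQmx Start Task.

```python
# Hypothetical sketch: configure DAQmx logging before starting the task so the
# driver streams every sample straight into a TDMS file. Device/channel name,
# buffer size, and file path are assumptions.
import time

import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

RATE = 412_714            # samples per second, from the original post
DURATION_S = 15           # acquisition length, from the original post

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(
        RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=RATE,                      # roughly 1 s of buffer
    )
    # The logging call must come before task.start(); DAQmx then writes the
    # data to disk itself, so the program never handles the 6M+ samples.
    task.in_stream.configure_logging(
        "voltages.tdms",
        logging_mode=LoggingMode.LOG,             # log only, no reads into the program
        operation=LoggingOperation.CREATE_OR_REPLACE,
    )
    task.start()
    time.sleep(DURATION_S)
    task.stop()
```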
And doing the math, your 6M+ samples (6,190,708 samples × 8 bytes per DBL) come out to about 47 MB. That is not enough to worry about splitting between files.
04-13-2016 09:02 PM
04-14-2016 04:16 AM
@lmille32 wrote:
and it did attempt to file data points but I received an error saying that I had too many data points to collect.
That is because the hardware buffer cannot handle that many data points. You will need to read the data in a loop.
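Here is a minimal sketch of reading in a loop, again in the Python nidaqmx API for readability (device name and chunk size are assumptions): the task samples continuously, and each iteration pulls a manageable chunk out of the DAQmx buffer instead of requesting all 6,190,708 samples as a single finite acquisition.

```python
# Minimal sketch of chunked reads from a continuous task. Device/channel name
# and chunk size are assumptions; each chunk would be enqueued for the
# consumer loop (or appended to the TDMS file) as it arrives.
import nidaqmx
from nidaqmx.constants import AcquisitionType

RATE = 412_714
TOTAL = 6_190_708
CHUNK = 100_000                    # keep each read well under the buffer size

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    collected = 0
    while collected < TOTAL:
        n = min(CHUNK, TOTAL - collected)
        data = task.read(number_of_samples_per_channel=n)  # blocks until n samples arrive
        collected += n
        # ...hand `data` to the consumer queue or write it to the file here...
    task.stop()
```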