LabVIEW


Exporting csv data in timed batches doable?

Hi everyone, just wanted to say thanks for all the help so far in resolving the random issues/questions I've posted here. With that said, here's another question I've recently run into. Maybe I'm just not using the right terminology for what I'm trying to accomplish, so if that's the case, please let me know. Onto the question:

In my program, I am currently able to connect to an external device (Keysight), gather the data based on the mode of control I want, and export it to a CSV file (DataFile2.csv). My question is: do I have to append and write the data every loop cycle, or is there a way to, say, record the data in an array and, every 100 loop iterations or after every minute of recording, open DataFile2, append a batch of all the accumulated data, and then close it? That way the file isn't being written to every 250 ms. Think long-term testing: if I'm running a test that lasts a week, writing to the file every 250 ms seems like overkill, but dumping all the recorded data every minute is much more manageable.
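To put the idea in rough text form, this is the logic I'm picturing (just a Python-style sketch, not my actual LabVIEW code; read_measurement() is a placeholder for the Keysight query, and the 250 ms loop, 100-row batch, and 60 s timeout are the numbers from above):

import random, time

def read_measurement():
    # Placeholder for the Keysight query; returns one CSV row as text.
    return f"{time.time():.3f},{random.random():.6f}"

buffer = []
last_flush = time.monotonic()

for _ in range(1000):                        # stands in for the week-long test loop
    buffer.append(read_measurement())

    # Flush every 100 rows or every 60 seconds, whichever comes first.
    if len(buffer) >= 100 or time.monotonic() - last_flush >= 60:
        with open("DataFile2.csv", "a") as f:
            f.write("\n".join(buffer) + "\n")
        buffer.clear()
        last_flush = time.monotonic()

    time.sleep(0.25)                         # 250 ms loop period

if buffer:                                   # flush whatever is left when the test ends
    with open("DataFile2.csv", "a") as f:
        f.write("\n".join(buffer) + "\n")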

Message 1 of 4

If you are ultimately dumping all the data, it does not really matter how you do it. The disk caching algorithms of the OS will optimize the actual writes. Just make sure you open the file once and use the low-level File I/O functions; do not open and close the file for each write. If you need less time resolution, you can average the data over an interval and only occasionally save the averaged values.
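In text form, the "open once, write many, close once" pattern looks roughly like this (a Python sketch of the idea only; in LabVIEW it would be the low-level Open/Write/Close File primitives, and read_measurement() is a placeholder):

import random, time

def read_measurement():
    # Placeholder for the instrument read; returns one CSV row as text.
    return f"{time.time():.3f},{random.random():.6f}"

# Open once before the loop, write each iteration, close once afterwards.
# The OS file cache coalesces the small 250 ms writes into larger disk writes.
with open("DataFile2.csv", "a") as f:
    for _ in range(1000):                    # stands in for the long-running test loop
        f.write(read_measurement() + "\n")
        time.sleep(0.25)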

Message 2 of 4

How you do File I/O in LabVIEW depends a lot on the nature of the data: its size, format, and structure, how much needs to be written, and how crucial it is not to lose data because of a hardware or software issue (think power failure).

 

How are you writing your .csv files?  If you are using LabVIEW's Write Delimited Spreadsheet routines and your data rates are not too high, they are pretty efficient, even with their built-in "open, append, close" cycle.  If you add a Producer/Consumer design, where you stream a "record" (one tab-delimited row) to a Consumer that collects, say, 1000 records before calling Write Delimited Spreadsheet, you cut the per-record open/append/close overhead almost 1000-fold, at the cost of a slightly more complex piece of code.
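In text form, the Producer/Consumer batching looks roughly like this (a Python sketch of the pattern; in LabVIEW the producer and consumer would be parallel loops connected by a queue, and the write would be Write Delimited Spreadsheet):

import queue, random, threading, time

BATCH = 1000                                 # records collected before each write
DONE = None                                  # sentinel telling the consumer to stop
q = queue.Queue()

def producer(n_records):
    # Stands in for the acquisition loop streaming one record at a time.
    for _ in range(n_records):
        q.put(f"{time.time():.3f}\t{random.random():.6f}")   # one tab-delimited row
    q.put(DONE)

def consumer():
    batch = []
    while True:
        record = q.get()
        if record is DONE:
            break
        batch.append(record)
        if len(batch) >= BATCH:              # one file append per 1000 records
            with open("DataFile2.csv", "a") as f:
                f.write("\n".join(batch) + "\n")
            batch.clear()
    if batch:                                # write whatever remains at shutdown
        with open("DataFile2.csv", "a") as f:
            f.write("\n".join(batch) + "\n")

threading.Thread(target=producer, args=(5000,), daemon=True).start()
consumer()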

 

I'm currently acquiring behavioral data where events happen up to several times a second, with event times recorded as the number of milliseconds since the start of the trial (which, as an I32 value, can handle elapsed times of several weeks), and I find Write Delimited Spreadsheet works just fine. At the end of the trial I read the text file back and re-save the data as a (possibly multi-sheet) Excel workbook using the Report Generation Toolkit.

 

Bob Schor 

Message 3 of 4

@brar45 wrote:

do I have to append and write the data every loop cycle, or is there a way to, say, record the data in an array and, every 100 loop iterations or after every minute of recording, open DataFile2, append a batch of all the accumulated data, and then close it? That way the file isn't being written to every 250 ms. Think long-term testing: if I'm running a test that lasts a week, writing to the file every 250 ms seems like overkill, but dumping all the recorded data every minute is much more manageable.


Sure there is. I'd send every measurement to a queue and have a Consumer loop that reads it and adds it to an array; when the array reaches 100 elements, write it to disk and empty it.
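The consumer side of that, sketched in Python rather than G (the 100-element batch and the DataFile2.csv name just mirror the earlier posts):

def buffer_and_write(value, buffer, path="DataFile2.csv", batch_size=100):
    # Add the dequeued measurement to the array; at 100 elements,
    # write the batch to disk and empty the array.
    buffer.append(str(value))
    if len(buffer) >= batch_size:
        with open(path, "a") as f:
            f.write("\n".join(buffer) + "\n")
        buffer.clear()

# Each value here stands in for one measurement read from the queue.
buf = []
for i in range(250):
    buffer_and_write(i * 0.25, buf)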

Check out Help --> Find examples --> Queued message handler 

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 4 of 4