LabVIEW

Best way to pause and restart saving to log file

Solved!

Hi, I'm working on a program to control two different serial instruments simultaneously and also save data collected by them. I have accomplished this through parallel loops and queues and everything is working smoothly.

 

When the VI first executes, it creates a log file in which to save all the data. Each of the instrument control loops feeds data into this loop via a queue, and the data is written to a .csv file. I am trying to implement a "save as" button that saves only the desired data out of the dozen or so columns in the log file at any point during execution. However, I'm running into an issue with this process because LabVIEW is still writing to the log in one loop while another loop is trying to read it.

 

I am looking for suggestions on how best to implement this "save as" feature without the issue described above. I have considered pausing the log file so that no data is added to it while the "save as" subVI is running, but I have been unable to get logging to restart using queues, local variables/property nodes, or semaphores. It would also work to start a new log file under a new name when "save as" begins, but again, I've had no luck. I'm probably missing something, but I thought I would ask about best practices before I go digging for bugs.
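
For reference, here is a rough text sketch of the pattern I'm describing, written as a Python stand-in for the LabVIEW producer/consumer loops (all names, rates, and values are placeholders, not taken from the attached VI):

import csv, queue, threading, time

data_q = queue.Queue()        # instrument loops enqueue rows here
stop = threading.Event()

def instrument_loop(name):
    while not stop.is_set():
        data_q.put([time.time(), name, 42.0])   # placeholder measurement row
        time.sleep(1.0)                          # instruments report ~1 Hz

def logging_loop(path):
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        while not stop.is_set():
            try:
                writer.writerow(data_q.get(timeout=0.1))  # dequeue at ~10 Hz
                f.flush()
            except queue.Empty:
                pass

threads = [threading.Thread(target=instrument_loop, args=(n,)) for n in ("instrA", "instrB")]
threads.append(threading.Thread(target=logging_loop, args=("log.csv",)))
for t in threads:
    t.start()
time.sleep(5)     # run briefly for the sake of the example
stop.set()
for t in threads:
    t.join()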

Screenshot.PNG

 

In the attached .zip, there is an abbreviated form of my VI (MAIN VI.vi) containing only the saving features (but none of the instrument communication stuff) as well as an example log file and abbreviated "save as" file.

Many thanks.

Message 1 of 11

Hi

The best you can do is to start with a log file for every instrument.

So, in your case this means two logfiles.

After the measurement you can select from both log files the data that needs to be merged. And be sure to add a timestamp to each log entry.

 

greetings from the Netherlands
Message 2 of 11

Hi Albert.Geven, thanks for the idea. However, I'm not sure this will resolve the problem, as I'm trying to save while the measurements are running, not after. I already merge data from the two instruments in real time without an issue; it's trying to read the log while it's still being written to that is causing me problems.

 

Unless you're suggesting that trying to read from this log any time before the final measurement is impossible.

Message 3 of 11

Another option is to log to a second queue and read that queue to save the data that was temporarily buffered there to the log file. Instead of a queue you could also buffer in a second file, but I would use a queue.
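
As a rough sketch of what I mean, in Python pseudo-form rather than LabVIEW (the writer, the pause flag, and the names are placeholders):

import csv, queue

buffer_q = queue.Queue()      # second queue: parks rows during the pause

def log_row(row, writer, paused):
    if paused:
        buffer_q.put(row)                     # "pause saving": park the row
    else:
        while not buffer_q.empty():           # "continue saving": drain the
            writer.writerow(buffer_q.get())   # buffered rows first so the
        writer.writerow(row)                  # order in the log file is kept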

How much data is generated while reading the log file, or, maybe better, how big does that queue grow?

greetings from the Netherlands
Message 4 of 11

The current queue that collects instrument data never grows very large because the instruments send data about every 1 second, while elements can be dequeued every 0.1 sec by the "save to log" loop. Each write to the log file records 1 row of 10 columns of data.

 

However, this program may need to run for several hours so the log file or a queue of all data collected so far may grow quite large.

Message 5 of 11

The queue is only meant to buffer data while the file is being read; call it a "pause saving" command accompanied by a "continue saving" command. At that data rate you can pause for a long time.
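
For scale, a rough estimate based on the rates mentioned above and an assumed row size of about 100 bytes (the row size is a guess, not from the earlier posts):

rows_per_sec = 1                              # roughly one logged row per second
pause_rows = 5 * 60 * rows_per_sec            # a 5-minute pause buffers only ~300 rows
eight_hour_rows = 8 * 3600 * rows_per_sec     # ~28,800 rows for an 8-hour run
approx_log_mb = eight_hour_rows * 100 / 1e6   # ~2.9 MB of CSV text at ~100 bytes/row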

greetings from the Netherlands
Message 6 of 11

So the second queue is meant to hold all the data that should be written to the master log file while the master is being read? And then, once reading is complete, the data in the second queue is written in?

 

That sounds plausible, although then I'd need to implement something so that new data entering the original queue after the master read is complete gets written to the log at the appropriate time. As in, not before the old data in the second queue is written.

 

Is this what you're suggesting?

Message 7 of 11

Hi

You need just one queue per system. That queue is either written to the file or not, depending on whether you have blocked the file because you want to do something with it in between. Once you stop interfering, the new status of each system determines whether data is thrown away or is still logged to the file.

greetings from the Netherlands
Message 8 of 11

I can't see your attached VIs, but I'm assuming you are using Read Delimited Spreadsheet inside the "array to data" VI, which opens the file as read-only, which would be what you want. I'm not sure what the exact issue could be, but maybe there is a problem with file properties being adjusted by the logging process as you are attempting to read.

 

I would work around this problem by just saving a full copy of the source file to a temp directory, and then performing your data column extract and save-as routine on that.  That way the source file can remain in its current logging session without having to worry about starting/stopping.
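
Something along these lines, sketched in Python rather than LabVIEW (the file names and column indices are placeholders):

import csv, os, shutil, tempfile

def save_as(source_log, dest_path, wanted_cols=(0, 2, 5)):
    # Snapshot the live log so the logging loop never has to stop.
    tmp = os.path.join(tempfile.gettempdir(), "log_snapshot.csv")
    shutil.copy2(source_log, tmp)
    # Extract only the wanted columns from the snapshot into the "save as" file.
    with open(tmp, newline="") as src, open(dest_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            writer.writerow([row[i] for i in wanted_cols])
    os.remove(tmp)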

Message 9 of 11

@Albert.Geven wrote:

Hi

You need just one queue per system. That queue is either written to the file or not, depending on whether you have blocked the file because you want to do something with it in between. Once you stop interfering, the new status of each system determines whether data is thrown away or is still logged to the file.


OK, that's what I had originally intended to do. Just not sure how to practically implement that.

 

Perhaps a case structure such that there are no elements being dequeued and written when my "save as" feature is running? I will try that and see if it works.
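
Roughly this, as a text stand-in for that case structure (Python, placeholder names):

import queue

def logging_iteration(data_q, writer, save_as_running):
    if save_as_running:
        return                                     # "pause" case: leave rows in the queue
    try:
        writer.writerow(data_q.get(timeout=0.1))   # normal case: dequeue and log
    except queue.Empty:
        pass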

Message 10 of 11