LabVIEW


Logging more or less samples than required

Solved!

@Amr95 wrote:

 

One idea: do not evaluate the local variable of "Current state". Just use a Boolean: set it to TRUE at the start of the "logging" state, and have the file logging set it back to FALSE when it finishes (or use a queue or a notifier for that).


Sounds good. I just wonder how reading a local variable of a Boolean will resolve the synchronization issue any better than reading a local variable of an enum. Using a queue or a notifier makes sense too.


A Boolean is a "solution" if you create an extra logging loop, independent of your main state machine: the main state machine tells the DAQ loop to start sending data to the logging loop, and the logging loop tells the DAQ loop that it has saved enough samples.
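The three-loop handshake described here is graphical in LabVIEW, but the control flow can be sketched in a text language. Below is a minimal Python analogue (the packet size, sample count, and thread layout are illustrative assumptions, not taken from the actual VI): a DAQ loop streams packets onto a queue while a shared flag is set, and the logging loop clears the flag once it has collected enough samples.

```python
import queue
import threading
import time

PACKET_SIZE = 100        # samples per simulated DAQmx Read (assumed)
SAMPLES_NEEDED = 3000    # e.g. 3 s of data at an assumed 1 kHz rate

data_q = queue.Queue()
logging_active = threading.Event()   # plays the role of the Boolean

def daq_loop(stop):
    """Producer: pushes packets only while the logging flag is TRUE."""
    sample = 0
    while not stop.is_set():
        if logging_active.is_set():
            data_q.put(list(range(sample, sample + PACKET_SIZE)))
            sample += PACKET_SIZE
        else:
            time.sleep(0.001)        # idle until logging starts again

def logging_loop(result):
    """Consumer: saves packets until it has enough, then drops the flag."""
    saved = []
    while len(saved) < SAMPLES_NEEDED:
        saved.extend(data_q.get())   # block until the next packet arrives
    logging_active.clear()           # tell the DAQ loop logging is done
    result.extend(saved[:SAMPLES_NEEDED])

stop = threading.Event()
result = []
threads = [threading.Thread(target=daq_loop, args=(stop,)),
           threading.Thread(target=logging_loop, args=(result,))]
for t in threads:
    t.start()
logging_active.set()                 # the "logging" state begins
threads[1].join()                    # wait for the logging loop to finish
stop.set()
threads[0].join()
print(len(result))                   # 3000
```

The key point matches the post: the DAQ loop never counts samples itself, it only watches the flag; the logging loop is the single authority on when exactly enough data has been saved.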

 

Regards, Jens

Kudos are welcome...
Message 11 of 12

@JensG69 wrote:

@Amr95 wrote:

Hi Jens, 
thanks a lot for the thorough explanation. Flushing the queue before the "logging" state and requesting 100 samples almost solved the issue.
I just have a couple more questions; I would really appreciate it if you could take the time to answer them.

2) Make sure that you read the value of the local variable "Current State" only after DAQmx Read has finished.


I am not sure how to do that; could you suggest a way to make sure that I read the local variable only after DAQmx Read has finished? I am now getting 2900 samples per file, sometimes 3000, and no samples from previous iterations, which is great but not perfect. Ideally I would get the whole 3 s worth of data.

Just enforce dataflow, e.g. with something like this:

[Attached image: JensG69_1-1626985332044.png]

 


3) Do not use a software timer in the "logging" state of your upper loop; instead, calculate the waiting time from the number of DAQ packages acquired: for a 50 ms DAQ loop time and 3 s of data, you need to receive and save 60 packages. Keep in mind that the first package will include some values that were measured before you switched to "logging", so if you do not want to include these values, omit the very first queue element.
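The arithmetic in point 3 can be written out explicitly. A small sketch using the numbers from the post:

```python
DAQ_LOOP_MS = 50     # one queue element ("package") every 50 ms
LOG_SECONDS = 3      # desired length of the logged record

packages_needed = LOG_SECONDS * 1000 // DAQ_LOOP_MS
print(packages_needed)               # 60

# The first package may contain samples measured before the "logging"
# state began, so discard it and dequeue one extra package to still
# cover the full 3 s window:
packages_to_dequeue = packages_needed + 1
```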

I tried implementing what you said by timing the logging state by the number of samples I flush: I kept track of the number of samples in the array, and once it reached 3k I sent the data to be logged. But then it took way too long to get the 3k samples, around 11 or 12 s for the logging state to finish. I was expecting a little more time to elapse, but not four times longer than it should take.


Several ideas might improve or solve this:

1) Create the TDMS file in the state before.

2) Do not use Flush Queue anymore; just wait for the next queue element.

OR

Create a separate file logging loop. (EDIT: Although this again might lead to the case that your result file does not incorporate exactly 3 s of data. Think about a way to "tell" your DAQ loop that your logging loop has acquired the required amount of data. One idea: do not evaluate the local variable of "Current state". Just use a Boolean: set it to TRUE at the start of the "logging" state, and have the file logging set it back to FALSE when it finishes, or use a queue or a notifier for that.)
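The difference between Flush Queue and waiting for single elements (idea 2 above) can be illustrated in Python; the packet contents here are made up for the example. Flushing only drains whatever happens to be in the queue at that instant, while a plain dequeue blocks until the next packet actually arrives:

```python
import queue

q = queue.Queue()
for i in range(5):
    q.put([i] * 100)     # five pre-filled packets of 100 samples each

def flush(q):
    """Non-blocking drain: returns only what is in the queue right now."""
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except queue.Empty:
            return items

first = q.get()          # blocking dequeue: waits for a packet if empty
rest = flush(q)
print(len(first), len(rest))   # 100 4
```

Counting blocking dequeues therefore tracks real acquisition time; counting flushed samples in a tight loop does not.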

 

Regards, Jens


The separate file logging loop worked out great. I just had a problem passing the file reference from the upper state machine loop to the logging loop: the loop always got stuck. I fixed it by using a local variable for the file reference. I am pretty sure that's not the best way to pass a reference from one loop to another, but it works. I am curious how I would ideally pass it between different loops.
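One common alternative to a local variable (a sketch, not necessarily what was done in this thread): hand the file reference to the logging loop over the same queue it already reads, e.g. as the very first element, so the transfer obeys dataflow instead of racing through a variable. In Python terms:

```python
import queue
import tempfile
import threading

q = queue.Queue()

def logging_loop(done):
    f = q.get()                      # first element: the open file reference
    for _ in range(3):
        f.write(str(q.get()) + "\n") # subsequent elements: data packets
    f.flush()
    done.set()

done = threading.Event()
threading.Thread(target=logging_loop, args=(done,)).start()

f = tempfile.TemporaryFile(mode="w+")
q.put(f)                             # hand the reference to the other loop
for packet in ([1, 2], [3, 4], [5, 6]):
    q.put(packet)
done.wait()

f.seek(0)
lines = f.read().splitlines()
print(len(lines))                    # 3
```

Because the reference travels through the queue, the logging loop cannot start writing before the reference exists, which avoids the race a local variable can introduce.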

Message 12 of 12