07-01-2021 09:20 AM - edited 07-01-2021 09:22 AM
I have a simple program that controls some valves of a test rig using cDAQ and also does some data acquisition and logging. I am having a problem with the logging: it is a bit random, and the number of samples logged is inconsistent. The logging works like this: when the current state is the logging case, the sensor data gets queued; the queue is then flushed and the data is sent to the data-logging subVI, where the samples are written to a TDMS file. The acquisition rate is 1 kHz, so for a 3 s logging duration I expect 3000 samples, but sometimes I get 200-300 samples more or less in the TDMS file, and those 200-300 samples seem to be from a previous iteration. I am not sure why this is happening or how to get exactly 3000 samples for the 3 s logging duration. I attached the project so you can see what I am talking about. The main VI's name is INJ TR2.
Any advice to how to improve my design is also very welcome 🙂
07-01-2021 10:39 AM
A few things I see that at least could be an issue:
1. When you read from your DAQ, read a specific number of samples. Do not use the "-1" to just read everything in the buffer. Instead, specifically read 100ms worth of data. You currently have a 1kS/s rate, so read 100 samples at a time. Since the DAQ read will now set the loop rate, you do not need the wait in your DAQ loop.
2. After you read the 100ms worth of data, do not just read another sample. You can use Delete From Array to get the last data points of the channel you care about.
3. You are only really logging the first set of data you get from the queue per second. The TDMS Open should actually be throwing errors when you try to write to the same file since you cannot create a file that is already there. What you probably want to do is create the file before the "Logging" state, read from the queue and write the data as it comes in until your time is up, then close the file.
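Since LabVIEW is graphical, the structure described in points 1 and 3 can only be sketched in text. Here is a hypothetical Python sketch (all names are mine, not from the VI): a `queue.Queue` stands in for the LabVIEW queue, a plain list stands in for the TDMS file, and the logging state opens the file once, drains the queue chunk by chunk until 3 s of data is written, then closes it, instead of flushing everything in one go.

```python
import queue

SAMPLE_RATE = 1000        # 1 kS/s, as in the original VI
CHUNK = 100               # read 100 samples = 100 ms per DAQmx Read
LOG_SECONDS = 3

data_queue = queue.Queue()

def daq_loop_iteration(i):
    """One iteration of the DAQ loop: a DAQmx Read with a fixed sample
    count paces the loop, so no extra Wait function is needed."""
    samples = [i * CHUNK + n for n in range(CHUNK)]   # stand-in for DAQmx Read
    data_queue.put(samples)

def logging_state():
    """Open the file before logging starts, write each chunk as it
    arrives until 3 s of data is collected, then close the file."""
    tdms_file = []                                    # stand-in for TDMS Open
    chunks_needed = LOG_SECONDS * SAMPLE_RATE // CHUNK
    for _ in range(chunks_needed):
        tdms_file.extend(data_queue.get())            # stand-in for TDMS Write
    return tdms_file                                  # stand-in for TDMS Close

# Simulate 3 s of acquisition, then the logging state draining the queue.
for i in range(LOG_SECONDS * SAMPLE_RATE // CHUNK):
    daq_loop_iteration(i)
logged = logging_state()
```

The key point of the sketch is that the chunk count, not a separate software timer, decides when logging stops, so the file always ends up with exactly `LOG_SECONDS * SAMPLE_RATE` samples.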
07-01-2021 10:43 AM
You could use the TDMS Logging inherent to DAQmx. Unless that's too easy.
07-21-2021 09:31 AM
Hi Crossrulz,
thanks for the reply.
@crossrulz wrote:
A few things I see that at least could be an issue:
1. When you read from your DAQ, read a specific number of samples. Do not use the "-1" to just read everything in the buffer. Instead, specifically read 100ms worth of data. You currently have a 1kS/s rate, so read 100 samples at a time. Since the DAQ read will now set the loop rate, you do not need the wait in your DAQ loop.
I actually need to read samples at a 1 kS/s rate, as I want to log 3000 samples in 3 s of logging duration. I just added the 100 ms wait to update the values on the front panel at a slower rate.
2. After you read the 100ms worth of data, do not just read another sample. You can use Delete From Array to get the last data points of the channel you care about.
I couldn't quite understand this part.
3. You are only really logging the first set of data you get from the queue per second. The TDMS Open should actually be throwing errors when you try to write to the same file since you cannot create a file that is already there. What you probably want to do is create the file before the "Logging" state, read from the queue and write the data as it comes in until your time is up, then close the file.
That's not true. The logging duration is set to 3 s, and during this time the data is being enqueued; once the logging time elapses, the queue is flushed and the data is logged to the TDMS file.
07-21-2021 01:08 PM
@Amr95 wrote:
Hi Crossrulz,
thanks for the reply.
@crossrulz wrote:
A few things I see that at least could be an issue:
1. When you read from your DAQ, read a specific number of samples. Do not use the "-1" to just read everything in the buffer. Instead, specifically read 100ms worth of data. You currently have a 1kS/s rate, so read 100 samples at a time. Since the DAQ read will now set the loop rate, you do not need the wait in your DAQ loop.
I actually need to read samples at a 1 kS/s rate, as I want to log 3000 samples in 3 s of logging duration. I just added the 100 ms wait to update the values on the front panel at a slower rate.
I just realized that this is wrong. I applied your advice and set the DAQmx Read function to read 1000 samples. I am getting more consistent results, but I still have a problem with the file saved on the first run: sometimes it saves 2000 samples and sometimes 3000 samples. I am still not sure what's causing this issue. The rest of the files look good, and exactly 3000 samples get logged.
07-21-2021 01:38 PM
Hello Amr95,
that is a neat little race condition between the two loops.
You start the transfer of data to the queue by reading the local variable of "Current State". That indicator gets updated most of the time every millisecond.
The DAQ loop runs with a loop time of 100 ms (or 1s in your last trial).
For the further explanation, let's stay with a 1 s loop time for the DAQ loop; that makes it easier:
It might happen that inside the DAQ loop a new DAQmx Read has just started (waiting up to 1 s to read 1000 samples) while, in parallel, "Current State" has already been read, so no data will be put into the queue at the end of that DAQmx Read.
Now it might happen that "Current State" becomes "logging" just shortly after the start of a DAQ loop iteration. So the first time data is transferred into the queue might be nearly two seconds after your upper loop starts the 3 s software timer inside the "logging" state. So only one other data packet will be received during your first "Start sequence".
It is highly probable that another 1 s data set gets sent to the queue even after the upper loop leaves the "logging" case.
So on your second sequence you will get 3s of data BUT the first package is the last data set of the previous sequence...
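This timing argument can be checked with a small simulation (the numbers are illustrative assumptions, not taken from the VI: reads start every 1 s, and the state machine enters "logging" at t = 0.3 s). The state is sampled when a read starts, but the chunk only reaches the queue when the read finishes 1 s later, and the queue is flushed when the software timer expires at t = 3.3 s:

```python
READ_PERIOD = 1.0                      # 1 s per DAQmx Read of 1000 samples
LOG_START, LOG_LEN = 0.3, 3.0          # "logging" entered at t = 0.3 s (assumed)
log_window_end = LOG_START + LOG_LEN   # software timer expires at t = 3.3 s

enqueued_at = []                       # times at which chunks reach the queue
for read_start in [0.0, 1.0, 2.0, 3.0, 4.0]:
    # "Current State" is evaluated when the read STARTS, not when it finishes
    state_is_logging = LOG_START <= read_start < log_window_end
    if state_is_logging:
        enqueued_at.append(read_start + READ_PERIOD)  # data arrives 1 s later

flushed = [t for t in enqueued_at if t <= log_window_end]   # ends up in the file
stale   = [t for t in enqueued_at if t >  log_window_end]   # left for next run

seconds_logged = len(flushed)          # each chunk holds 1 s of data
```

With these numbers only 2 s of data makes it into the first file, and one chunk arrives after the flush and sits in the queue as the "previous iteration" data seen in the next sequence.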
I hope you could follow my explanation.
Regards, Jens
07-21-2021 03:07 PM
Hello Amr95,
here's a suggestion how you could fix your problem:
1) Reduce the loop time of your DAQ-loop to 50 ms or 100 ms by always requesting 50 or 100 samples.
2) Make sure that you read the value of the local variable "Current State" only after DAQmx Read has finished.
3) Do not use a software timer in the "logging" state of your upper loop; instead, calculate the waiting time from the number of DAQ packages acquired. So for a 50 ms DAQ loop time and 3 s, you need to receive and save 60 packages. Keep in mind that the first package will include some values that were measured before you switched to "logging", so if you do not want to include these values, omit the very first queue element.
4) Flush the queue some time after the end of "logging" (or alternatively at the start of the waiting case before "logging").
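In text form (names and numbers are mine, chosen to match the 50 ms variant), steps 3 and 4 might look like this sketch: the logging state counts packages rather than watching a timer, drops the very first queue element because it can contain pre-"logging" samples, and leaves any later packages to be flushed afterwards.

```python
import queue

CHUNK = 50                             # 50 samples per package = 50 ms at 1 kS/s
PACKAGES = 3000 // CHUNK               # 60 packages of 50 samples = 3 s of data

def logging_state(data_queue):
    """Step 3: count dequeued packages instead of running a software timer.
    The first package may contain pre-'logging' samples, so drop it."""
    data_queue.get()                   # omit the very first queue element
    tdms_file = []                     # stand-in for the TDMS file
    for _ in range(PACKAGES):
        tdms_file.append(data_queue.get())
    return tdms_file

# Simulate the DAQ loop: one leftover package from before "logging",
# then exactly 60 packages acquired during the logging window.
q = queue.Queue()
q.put([-1] * CHUNK)                    # partial/stale chunk from before "logging"
for i in range(PACKAGES):
    q.put([i] * CHUNK)
logged = logging_state(q)
total_samples = sum(len(p) for p in logged)
```

Because the stop condition is "60 packages received", the file holds exactly 3000 samples regardless of when the state machine entered "logging" relative to the DAQ loop.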
Regards, Jens
(THINK Dataflow)
07-22-2021 01:04 PM
Hi Jens,
thanks a lot for the thorough explanation. Flushing the queue before the "logging" state and requesting 100 samples almost solved the issue.
I just have a couple more questions, would really appreciate if you could take the time to answer them.
2) Make sure that you read the value of the local variable "Current State" only after DAQmx Read has finished.
I am not sure how to do that; could you suggest a way to make sure that I read the local variable only after DAQmx Read has finished? I am now getting 2900 samples per file, sometimes 3000, with no samples from previous iterations, which is great but not perfect. Ideally I would get the whole 3 s worth of data.
3) Do not use a software timer in the "logging" state of your upper loop; instead, calculate the waiting time from the number of DAQ packages acquired. So for a 50 ms DAQ loop time and 3 s, you need to receive and save 60 packages. Keep in mind that the first package will include some values that were measured before you switched to "logging", so if you do not want to include these values, omit the very first queue element.
I tried implementing what you said by timing the logging state by the number of samples I flush: I kept track of the number of samples in the array, and once it reached 3000 I sent the data to be logged. But then it took way too long to get the 3000 samples; the logging state took around 11 or 12 s to finish, which is way too long. I was expecting a little more time to elapse, but not four times more than it should take.
07-22-2021 03:26 PM - edited 07-22-2021 03:37 PM
@Amr95 wrote:
Hi Jens,
thanks a lot for the thorough explanation. Flushing the queue before the "logging" state and requesting 100 samples almost solved the issue.
I just have a couple more questions, would really appreciate if you could take the time to answer them.
2) Make sure that you read the value of the local variable "Current State" only after DAQmx Read has finished.
I am not sure how to do that; could you suggest a way to make sure that I read the local variable only after DAQmx Read has finished? I am now getting 2900 samples per file, sometimes 3000, with no samples from previous iterations, which is great but not perfect. Ideally I would get the whole 3 s worth of data.
Just enforce dataflow, e.g. with something like this:
3) Do not use a software timer in the "logging" state of your upper loop; instead, calculate the waiting time from the number of DAQ packages acquired. So for a 50 ms DAQ loop time and 3 s, you need to receive and save 60 packages. Keep in mind that the first package will include some values that were measured before you switched to "logging", so if you do not want to include these values, omit the very first queue element.
I tried implementing what you said by timing the logging state by the number of samples I flush: I kept track of the number of samples in the array, and once it reached 3000 I sent the data to be logged. But then it took way too long to get the 3000 samples; the logging state took around 11 or 12 s to finish, which is way too long. I was expecting a little more time to elapse, but not four times more than it should take.
Several ideas might improve or solve this:
1) Create the tdms file in the state before.
2) Do not use flush queue anymore, but just wait for another queue element.
OR
Create a separate file logging loop. (EDIT: Although this again might lead to the case that your result file does not incorporate exactly 3 s of data. Think about a way to "tell" your DAQ loop that your logging loop has acquired the required amount of data. One idea: do not evaluate the local variable of "Current State". Just use a boolean: you can set it to TRUE at the start of the "logging" state, but it gets set to FALSE when the file logging finishes. Or use a queue or a notifier for that.)
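The boolean-handshake idea can be sketched in Python with threads standing in for the two loops (a rough analogy, not LabVIEW: `threading.Event` plays the role of the shared boolean, and a short sleep stands in for the 100 ms DAQ read). The DAQ loop enqueues only while the flag is TRUE, and the logging loop itself clears the flag once it has received exactly 3 s of data, so the file cannot come up short or pick up stale chunks:

```python
import queue
import threading
import time

CHUNK = 100                            # 100 samples = 100 ms at 1 kS/s
CHUNKS_NEEDED = 30                     # 30 x 100 samples = 3 s of data

log_enable = threading.Event()         # stands in for the shared boolean
data_queue = queue.Queue()
tdms_file = []                         # stand-in for the TDMS file

def logging_loop():
    """Separate file-logging loop: dequeue until it owns 3 s of data,
    then clear the boolean itself ('logging finished' handshake)."""
    for _ in range(CHUNKS_NEEDED):
        tdms_file.extend(data_queue.get())
    log_enable.clear()                 # tells the DAQ loop to stop feeding

log_enable.set()                       # state machine enters "logging"
logger = threading.Thread(target=logging_loop)
logger.start()

i = 0
while log_enable.is_set():             # DAQ loop: enqueue only while TRUE
    data_queue.put([i] * CHUNK)
    time.sleep(0.001)                  # stand-in for the timed DAQmx Read
    i += 1
logger.join()
```

Because only the logging loop clears the flag, and it does so only after receiving its 30 packages, there is no deadlock: the DAQ loop keeps producing until the logger has everything it needs.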
Regards, Jens
07-22-2021 05:07 PM - edited 07-22-2021 05:07 PM
Thanks a lot again Jens.
@JensG69 wrote:
Several ideas might improve or solve this:
1) Create the tdms file in the state before.
2) Do not use flush queue anymore, but just wait for another queue element.
So you are saying that creating the TDMS file and flushing the queue is what causes the 3 s to expand to 11 s? That might be true, but I wonder why it doesn't take that long with my previous implementation with the software timing. And what do you mean by "wait for another queue element"? Do you mean I should use the Dequeue function instead?
One idea: do not evaluate the local variable of "Current state". Just use a boolean. You can set this to TRUE with the start of the "logging" state, but it gets set to FALSE by the finish of the file logging - or use a queue or a notifier for that..).
Sounds good. I just wonder how reading a local variable of a boolean is going to resolve the synchronization issue compared with reading a local variable of an enum. Using a queue or a notifier makes sense too.