02-26-2012 04:41 PM
We are using a timed loop to read, display, and write data from a CompactDAQ device. Normally the data reads, displays, and writes correctly; however, there have been several occurrences where the write function failed without any notice. When we went back to open the data after a test, we found that it ended early, cutting off anywhere from 30 seconds to a minute into the test. Has anyone encountered an error like this? Thank you for your help.
02-26-2012 05:06 PM
Hi ANelson,
Thanks for joining the forums!
How are you reading/writing the data? If you are using buffered acquisition/generation, then you could be having overflow/underflow issues. Do you have any error handling in your code? If so, are any errors reported?
You mention opening the data after the test - is this data that you have logged from the acquisition? We need to try to narrow this down a little.
02-26-2012 06:08 PM
Thank you for the quick response. I have attached our VI for reference. To answer your questions:
I apologize for the vague first post. Hopefully the VI sheds some light on the problem. Thank you for your help.
02-27-2012 09:52 AM
You need to create an error out indicator. Simply right-click on the error out of the timed loop and select 'Create Indicator'.
I think Peter's guess is correct. I'm not sure why you are even using a timed loop, but with the write to file inside it, that is almost certain to cause overflow problems. You will have to change to a producer/consumer architecture if you want to keep using the slow Write to Measurement File.
p.s. Doing the Split Signals and then a Merge of the same signals is silly. You could simply have wired the output of the DAQ Assistant straight to the write file.
02-27-2012 10:15 AM
Thanks for the advice. I will create the error out indicator and fix the split signals architecture for a cleaner program.
Regarding preventing overflow, is there a better loop architecture to use for our application?
02-27-2012 11:02 AM - edited 02-27-2012 11:04 AM
@ANelson wrote:
Regarding preventing overflow, is there a better loop architecture to use for our application?
As Dennis said, you may want to try a Producer-Consumer architecture. There are several great examples if you search through the forums. This essentially splits the data acquisition and the writing to file into two separate, parallel tasks.
Edit: perhaps "parallel" is not the correct word, because they are not running independently. Rather, the consumer loop will wait until data is available from the producer loop. Or if the producer loop is faster, the data will queue up and feed through the consumer loop at its own speed. This way they do not need to run at exactly the same speed.
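To make the idea concrete outside of a LabVIEW block diagram, here is a rough text-based sketch of the same pattern in Python, using a thread plus a queue in place of the two loops and a LabVIEW queue. The 22-channel chunk, the loop rate, and the file name are placeholders for illustration, not your actual settings.

```python
import queue
import threading
import time
import random

data_queue = queue.Queue()   # stands in for the LabVIEW queue between the two loops
STOP = object()              # sentinel telling the consumer the test is over

def producer(iterations=100, period_s=0.033):
    """Acquisition loop: grab a chunk of samples each period and enqueue it."""
    for _ in range(iterations):
        chunk = [random.random() for _ in range(22)]  # placeholder for a DAQ read of 22 channels
        data_queue.put(chunk)                         # hand off immediately; never blocks on file I/O
        time.sleep(period_s)
    data_queue.put(STOP)

def consumer(path="log.csv"):
    """Logging loop: write chunks at its own pace; any backlog simply queues up."""
    with open(path, "w") as f:
        while True:
            chunk = data_queue.get()   # waits here until the producer has data
            if chunk is STOP:
                break
            f.write(",".join(f"{x:.6f}" for x in chunk) + "\n")

acq = threading.Thread(target=producer)
log = threading.Thread(target=consumer)
acq.start()
log.start()
acq.join()
log.join()
```

In LabVIEW terms, the enqueue is the only thing inside your acquisition loop, and the slow Write to Measurement File lives in the second loop, so a slow disk write can never stall the DAQ read.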
02-28-2012 08:26 AM
Thank you for the advice. After reading through many examples of producer-consumer architectures, it appears that they are beyond my current skill set. We are doing short-term testing at this time (at most 10-minute tests) and have a very tight deadline (2/29 at midnight), so a complete re-write of the DAQ program is not the best use of our time right now. I will certainly start learning more about it once we have passed this deadline.
For the short term, are there more stable read/write settings that we can use? Currently we are reading 22 samples (some voltage through the 9205 and some thermocouple through the 9213) with the following specs:
We are also running our "write to spreadsheet" function within the same timed loop with a period of 0.0333 seconds.
Are these parameters within a reasonable range to keep the buffer from being overwritten? Thank you for your help.
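For reference, here is a quick back-of-the-envelope check of the timing budget; the sample rate and samples-per-read below are assumed values, since the exact task settings are not listed above.

```python
# Back-of-the-envelope check: does one iteration's work fit in the loop period?
# The acquisition numbers below are assumptions for illustration, not the actual task settings.
loop_period_s = 0.0333           # timed-loop period from the post (~30 iterations/s)
assumed_sample_rate_hz = 1000    # assumed hardware sample rate per channel
channels = 22                    # 22 signals mentioned in the post
samples_per_read = int(assumed_sample_rate_hz * loop_period_s)   # ~33 samples/channel/iteration

values_per_iteration = channels * samples_per_read
print(f"Each iteration must read, format, and write about {values_per_iteration} values "
      f"within {loop_period_s * 1000:.1f} ms.")
# If formatting and writing that text takes longer than the period, the DAQ buffer backlog
# grows every iteration until it overflows -- which matches data that simply stops partway
# through a test.
```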
02-28-2012 02:28 PM
You can switch from continuous to finite samples. That way no acquisition is going on while you do your file write, so the buffer will not overflow. You could also replace the Write to Measurement File with just about any other file write function; you've chosen the slowest option. Even selecting Binary (TDMS) might work. Lastly, you could try using the Append Signals function with a simple shift register and do the file write outside the loop.
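As a rough text analogy of that last option (accumulate everything in memory while the loop runs, then write a single file afterwards), here is a minimal Python sketch; the chunk contents, iteration count, and file name are placeholders.

```python
import csv
import random

# Stand-in for the acquisition loop: keep every chunk in memory (the role of the
# shift register / Append Signals) instead of touching the disk inside the loop.
all_rows = []
for _ in range(300):                                  # e.g. ~10 s of a 30 Hz loop
    chunk = [random.random() for _ in range(22)]      # placeholder for one DAQ read of 22 channels
    all_rows.append(chunk)                            # cheap append; no file I/O in the loop

# One file write after the loop finishes, so it can never slow the acquisition down.
with open("test_log.csv", "w", newline="") as f:
    csv.writer(f).writerows(all_rows)
```

The trade-off is memory: for a 10-minute test at these channel counts the accumulated data is still modest, but for much longer runs the producer/consumer approach is the better fit.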