11-02-2009 12:53 AM
I have a lot of questions...but essentially here is what I want to do:
Using the USB-6251 I want to continuously sample 8 analog input channels with each channel sampled at 80 kHz, and I want to write this data to a file (probably just a .txt file).
Let's assume I decide to modify the DAQmx C example "ContAcq-IntClk". So, every N samples I'm going to enter a callback function that calls DAQmxReadAnalogF64 and loads a buffer of size N of my choosing.
1st Question: What buffer size N should I be choosing since I'll be collecting data from 8 channels?
2nd Question: Will fopen(filename, "a+") (to append) followed by fwrite be fast enough? If I am sampling 8 analog channels at 80k samples a second with 16-bit resolution, I'm looking at having to transfer and write about 1.28 MB/s of data. I guess I'm afraid that fwrite will take too long in the callback function and will slow down the "everyNSamples" event, eventually resulting in a buffer overflow.
Are there other ways of writing the data I should be thinking of, or is fwrite too basic for my implementation?
The reason I am so worried: I am unable to use SignalExpress to read the 8 analog channels at 80 kHz and write to a txt file; I get a buffer overflow error. Is this an error with SignalExpress or could I be doing something wrong? (I've tried tons of different values for the buffer size, hence my 1st question above.)
The USB-6251 can sample up to 1 MS/s aggregate...meaning the maximum sampling frequency for each channel (assuming 8 channels) is 125,000 S/s...correct?
Thanks,
D
11-02-2009 06:46 PM
Hi dasaniisgood,
To answer your first question, it depends on your application, but most of the time I start with a samples-per-channel value equal to about 1/10 of the sample rate, then adjust it up or down depending on what I am doing. You may need a somewhat larger samples-per-channel value because of the file I/O, but I would still start in the 1/10 range.
For your second question, what is the error code you are getting? 200279? A better option for writing to file is the producer/consumer architecture. This will let you perform your acquisition in one loop and write to a file in another. The following link describes how to do this in LabVIEW, but the ideas are the same.
Application Design Patterns: Producer/Consumer
I have also found a C# Example that might be helpful:
11-03-2009 07:59 AM
Hey,
To make things even simpler, install DAQmx 9.0 and use the DAQmxConfigureLogging API, which will stream data directly to disk.
int32 DAQmxConfigureLogging (TaskHandle taskHandle, const char filePath[], int32 loggingMode, const char groupName[], int32 operation);
This is an amazing feature that lets you stream data straight to disk.
Cheers
Lab
11-03-2009 04:17 PM
Thank you guys for your responses:
The Producer/Consumer C code supplied is great, but it doesn't log data continuously since you have to input a maximum number of samples, and (to be quite honest) the use of classes etc. is a little over my head, so I don't really know how to approach the code.
DAQmxConfigureLogging only writes TDMS...this bothers me because the headers appended on every write increase the file size by A LOT. ~10 seconds of data comes out to 122 MB...I need to be able to sample 8 channels at 80 kHz for at least an hour and hopefully not go over 1 GB.
D
11-03-2009 10:48 PM
The logging feature built into DAQmx 9.0 (used by calling DAQmxConfigureLogging) does not increase file size at all. Most of the time, it will save space dramatically. A header is generally only added in between calls to DAQmxRead if you change the read size (which most people don't). Even so, a header is 512 bytes. If you have a decent read size, this cost is a fraction of the entire streaming operation (and is almost negligible).
If you are reading at 80kHz on 8 channels for 10 seconds, I don't see any possible way that you could have a file size of 122 MB. The file size should be about 1/10th that size at about 12.8 MB. Aside from the fact that the file will generally only have one header (at the beginning), raw data is streamed (rather than scaled data). This means that each sample only takes up 2 bytes on disk (as opposed to 8 bytes of double data, or 1 byte per digit of resolution in a text file). The file size cannot really get any smaller than this.
Even with this compact raw format, if you are acquiring for an hour at this rate, you will still exceed 1 GB worth of data. 80 kHz * 8 channels * 3600 seconds * 2 bytes per sample = 4.6 GB. For comparison, if you save this in a text file and care about (for example) 10 significant digits, you would be looking at 23.04 GB worth of data. I would highly recommend using this feature if disk space or performance is a concern.
As well, note that if you do not need to access the data while streaming (that is, you only care about the data being logged to disk and don't need graph/analysis), I'd recommend using the "Log" mode as opposed to "Log and Read". In that case, you would just configure logging, register a done event, and start the task (the done event would be used for any error reporting by stopping the task in the event callback).
Let me know if you have any follow up questions on this.
11-04-2009 12:07 AM
"For comparison, if you save this in a text file and care about (for example) 10 significant digits, you would be looking at 23.04 GB worth of data. I would highly recommend using this feature if disk space or performance is a concern."
Actually, I forgot about the decimal point and a tab or new line (to separate samples). Therefore, that would be 27.648 GB.