10-28-2009 12:14 PM - edited 10-28-2009 12:19 PM
I am trying to sample 2 channels at 400 kHz using an NI 6251 USB DAQ, and I want to acquire data for approximately 20 minutes, for which I am using continuous sampling mode. But I keep getting buffer overflow errors. To solve this I used the DAQmx Configure Input Buffer VI so I can manually set the buffer size, which I set to 100 M samples, but I still get an error. If I set the buffer size higher than 100 M samples I get another error that says the requested memory can't be allocated (Error -50352 occurred at DAQmx Start Task.vi:1).
My first question is: how is the buffer size allocated (in the DAQ or in the computer's memory), and what is the maximum? My second question relates to saving the data to a file: does the number of samples per channel determine how many samples are saved in each file, and if not, how can I determine this? Lastly, what value should I use for number of samples per channel (samples in each file)? I am new to LabVIEW, so I would really appreciate any information or suggestions. I have also attached my VI.
Thanks,
10-28-2009 12:52 PM - edited 10-28-2009 12:53 PM
With such a high sample rate, you would be much better off using a binary save instead of the very time-intensive and inefficient Write to Measurement File. That's why you keep getting the buffer overflow. Look at the shipping example called Cont Acq&Graph Voltage - To File (Binary). You might also want to think about a producer/consumer architecture; see File>New>From Template>Frameworks>Design Patterns.
If you want all samples, you can leave the number of samples unwired.
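For reference, here is a rough sketch of the same producer/consumer idea in text form, using the nidaqmx Python package rather than LabVIEW (the package, the "Dev1" device name, and the chunk sizes are assumptions, not taken from your VI). The producer loop only pulls data out of the DAQmx buffer; the consumer loop only writes raw binary to disk, so slow file I/O can't make the buffer overflow.

    import queue
    import threading

    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 400_000        # samples/s per channel
    CHUNK = 40_000        # samples per channel per read (~0.1 s of data)
    DURATION_S = 20 * 60  # 20 minutes

    data_q = queue.Queue()

    def consumer(path):
        # Drain the queue and append raw samples to a flat binary file.
        with open(path, "ab") as f:
            while True:
                block = data_q.get()
                if block is None:      # sentinel tells the writer to stop
                    break
                np.asarray(block).tofile(f)

    writer = threading.Thread(target=consumer, args=("acq.bin",))
    writer.start()

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")   # "Dev1" is a placeholder
        task.timing.cfg_samp_clk_timing(RATE,
                                        sample_mode=AcquisitionType.CONTINUOUS,
                                        samps_per_chan=CHUNK)
        task.start()
        for _ in range(DURATION_S * RATE // CHUNK):          # producer loop
            data_q.put(task.read(number_of_samples_per_channel=CHUNK))
        task.stop()

    data_q.put(None)      # stop the consumer and wait for it to finish
    writer.join()

The important part is the split itself: the read loop never touches the disk, which is exactly what the producer/consumer design pattern template gives you in LabVIEW.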
10-29-2009 09:50 AM
Hi Dennis,
Thanks for your reply. Is there a particular reason why the buffer size has to be such a huge value (100 M samples)? And how do you determine its size for other sampling rates? Could you please upload the VI for Cont Acq&Graph Voltage - To File (Binary), as I don't have this example. Also, in the producer/consumer template, do I place the DAQ-related blocks in the producer loop and the Write to Measurement File block in the consumer loop?
Thanks,
10-29-2009 10:06 AM
The buffer does not have to be that large. In the majority of cases, you let DAQmx automatically determine the size of the buffer. It will use the sample rate and number of channels to set it.
What version of LabVIEW do you have? I don't understand why you would not have the example. If you can't find it by going to Help>Find Examples>Hardware Input & Output>DAQmx, then you should be able to locate it at \Examples\DAQmx\Analog In\Measure Voltage.llb.
Yes, the producer would be the DAQ code and the consumer would be the file write code.
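If you ever do need to override the automatic buffer size, the equivalent in text form looks roughly like the sketch below (nidaqmx Python package, with the device name and the 10-second figure as assumptions; in LabVIEW it is the same Configure Input Buffer call you already used). A few seconds of headroom at your rate is plenty; 100 M samples is far more than you need.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")   # placeholder device name
        task.timing.cfg_samp_clk_timing(400_000,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        # DAQmx picks a default buffer from the rate and channel count; only
        # override it if the default really is too small for your read loop.
        task.in_stream.input_buf_size = 4_000_000   # samples per channel, ~10 s at 400 kS/s
        task.start()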
10-29-2009 11:13 AM
Hi Dennis,
Thanks, I found the example; as I am not that familiar with LabVIEW, I was looking in the wrong place. In my VI, if I were to change the file format in the Write to Measurement File block to Binary (TDMS), would it be the same as using Cont Acq&Graph Voltage - To File (Binary).vi?
10-31-2009 10:54 PM
You might find it worthwhile to update to DAQmx 9.0 for this development. A new feature in DAQmx 9.0 will log all the data to a TDMS file for you. Besides the main benefit of speed, this feature also produces a file about 1/4 the size of one written with scaled data. It's really easy to use (compared with other options) and it can save you a lot of time. If you do install DAQmx 9.0, examples can be found in the example finder under Hardware Input/Output>>DAQmx>>Analog Measurements>>Voltage>>TDMS Streaming - ... (there are 5 examples there).
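For anyone following along in text form, the same feature is exposed through DAQmx's logging attributes; a minimal sketch with the nidaqmx Python package (device name, file name, and loop counts are assumptions) looks like this:

    import nidaqmx
    from nidaqmx.constants import (AcquisitionType, LoggingMode,
                                   LoggingOperation)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")    # placeholder device
        task.timing.cfg_samp_clk_timing(400_000,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        # LOG_AND_READ: the driver streams the raw data straight into the TDMS
        # file while your loop still gets each chunk back for display.
        task.in_stream.configure_logging(
            "acq.tdms",
            logging_mode=LoggingMode.LOG_AND_READ,
            operation=LoggingOperation.CREATE_OR_REPLACE)
        task.start()
        for _ in range(20 * 60 * 400_000 // 40_000):    # 20 minutes of 0.1 s reads
            chunk = task.read(number_of_samples_per_channel=40_000)
        task.stop()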
11-02-2009 10:31 AM
Hi Andy,
Thanks for your reply. I can now save data for 20 minutes while sampling at 400 kHz, but when I run the VI in highlight execution mode I get an error. Why does this happen? I also got this error when I was using the TDMS streaming examples in highlight execution mode. The settings I used were as follows:
Sampling rate: 400 kHz
Samples to read: 400 k
Also, could you please tell me how I can save the TDMS data in a series of files?
Thanks,
11-02-2009 10:51 AM
Highlight execution is great for debugging, but it sometimes doesn't work well with data acquisition applications in which you need to keep up with the hardware (or you will get errors indicating that you are not reading fast enough). My only recommendation here is to put a diagram disable structure around the DAQmx Read calls while debugging, so that you can debug the other parts of your application.
What do you mean by "in a series of files"? Do you mean that after a certain amount of time, you want to switch to a different file?
As a side note, the TDMS streaming examples that show "Log" mode would be unaffected by highlight execution. With "Log" mode, all data is streamed to disk in a background thread. Use this mode if you don't need access to your data through DAQmx Read, or if you need the most optimal streaming rate.
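To make the distinction concrete, a "Log"-only configuration in text form would look roughly like this (again the nidaqmx Python package, with names assumed); there is no read loop at all, which is why highlight execution can't slow it down:

    import time

    import nidaqmx
    from nidaqmx.constants import (AcquisitionType, LoggingMode,
                                   LoggingOperation)

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")    # placeholder device
        task.timing.cfg_samp_clk_timing(400_000,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        # LOG (not LOG_AND_READ): all data goes to disk in the background;
        # DAQmx Read is not used at all in this mode.
        task.in_stream.configure_logging(
            "acq_log_only.tdms",
            logging_mode=LoggingMode.LOG,
            operation=LoggingOperation.CREATE_OR_REPLACE)
        task.start()
        time.sleep(20 * 60)     # let it acquire for 20 minutes
        task.stop()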
11-02-2009 11:08 AM
Hi Andy,
Thanks for such a quick response, I really appreciate it. What I meant by "saving a series of files" was to save a certain number of samples in one file and then create a new file; or, alternatively, to be able to set the size of the file so that when the file reaches that size, a new file is created and the remaining data is saved in it.
The reason I want to do this is because I am acquiring data for 20 minutes and currently I am saving all the data to one file; the file is so huge (approx. 500 MB) that I can't view the full dataset. I get an error saying there is not enough memory.
Thanks,
11-02-2009 02:30 PM
With the logging feature, the only way to separate data would be to stop the task, change the file path, and start the task again.
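In text form, that stop / change path / restart pattern would look roughly like the sketch below (nidaqmx Python package; the property names, device name, and 5-minute file length are assumptions). Note that restarting the task means there is a small gap in the data at each file boundary:

    import nidaqmx
    from nidaqmx.constants import (AcquisitionType, LoggingMode,
                                   LoggingOperation)

    RATE, CHUNK = 400_000, 40_000
    FILE_SECONDS = 5 * 60          # one file per 5 minutes -> 4 files in 20 minutes

    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")        # placeholder
        task.timing.cfg_samp_clk_timing(RATE,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        task.in_stream.configure_logging(
            "acq_part0.tdms",
            logging_mode=LoggingMode.LOG_AND_READ,
            operation=LoggingOperation.CREATE_OR_REPLACE)
        for part in range(4):
            task.in_stream.logging_file_path = "acq_part%d.tdms" % part
            task.start()
            for _ in range(FILE_SECONDS * RATE // CHUNK):
                task.read(number_of_samples_per_channel=CHUNK)
            task.stop()                                           # then switch files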
That being said, I'm a bit confused on your file size. Even if you are writing raw data to disk, 400 kS/s * 2 channels * 2 bytes per sample = 1.6 MB/s. At 1.6 MB/s, that would be 96 MB/minute and 1920 MB over the 20 minute period. I don't understand how your file is only ~500 MB.
As well, you should not get an error saying there is not enough memory. You should be able to read through your file without running into an out-of-memory error. If you try to read the entire file back into memory, you might run into the cap on 32-bit applications; however, you could run into that regardless of whether multiple files are involved. The key here is that you probably shouldn't be loading ALL of that data into memory at the same time. If you want to do some post-acquisition presentation or analysis on the data, you should probably work on one chunk at a time. Where are you trying to read the file? What are you trying to do with the data?
Ideally, this data should all be part of the same file. You might have to read parts of the file at a time, but doing so is pretty common.
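For the out-of-memory problem, reading parts of the file at a time looks roughly like this in text form, using the third-party npTDMS Python package as one example of a TDMS reader (the package choice, file name, and chunk size are assumptions; in LabVIEW you would do the same thing with the offset and count inputs of TDMS Read):

    from nptdms import TdmsFile

    CHUNK = 1_000_000                 # samples held in memory at any one time

    with TdmsFile.open("acq.tdms") as tdms:        # streaming open, no full load
        channel = tdms.groups()[0].channels()[0]
        total = len(channel)
        for start in range(0, total, CHUNK):
            block = channel[start:start + CHUNK]   # reads just this slice from disk
            # ...plot or analyse 'block' here, then move on to the next slice...

Either way, the point is the same: pull in a manageable slice, work on it, and let it go before loading the next one.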