09-18-2013 03:25 PM
Hello,
My computer: Windows 7 Enterprise, 64-bit OS, 3.33 GHz, 6.00 GB of RAM
My hardware: NI PCI-6123 with NI BNC-2110, NI PCIe-1430
I am trying to acquire data on 8 channels at 500 kS/s, and I get this error when I run my application: "Not enough memory to complete this operation."
Structure of my application: events start/stop the acquisition and stop the application. These events generate messages that go to a message loop, which controls the other loops with messages. It starts/stops the creation of a video and the acquisition of data: one loop reads the channels and sends the data into a queue, and another loop reads this queue and logs the data. For the video, a parallel loop acquires images from a camera at 60 FPS and puts the image references into a queue, and another loop reads this queue to build the video. The output is a TDMS file plus an AVI file.
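Since I cannot paste the block diagram here as text, here is a rough Python-style sketch of the producer/consumer structure (the names, block size, and pacing are just placeholders, not my actual code; the camera/AVI pair works the same way as the DAQ/TDMS pair shown):

import queue
import threading
import time

daq_queue = queue.Queue()          # placeholder for the queue between acquisition and logging
stop_event = threading.Event()     # placeholder for my stop message

def daq_producer():
    """Stand-in for the acquisition loop: enqueue blocks of samples."""
    while not stop_event.is_set():
        samples = [0.0] * 1000     # dummy block instead of a real channel read
        daq_queue.put(samples)
        time.sleep(0.002)          # ~500 blocks of 1000 samples per second, i.e. ~500 kS/s

def daq_logger():
    """Stand-in for the logging loop: dequeue blocks and write them to file."""
    while not stop_event.is_set():
        try:
            samples = daq_queue.get(timeout=0.1)
        except queue.Empty:
            continue
        # the real loop would write `samples` to the TDMS file here

threading.Thread(target=daq_producer, daemon=True).start()
threading.Thread(target=daq_logger, daemon=True).start()
time.sleep(1)
stop_event.set()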
At a low rate (e.g. 2 kS/s) everything works fine, but at a high rate (e.g. 500 kS/s) the application stalls, pops up the error message, and freezes LabVIEW.
Can anyone help me to find a solution to this issue please?
09-18-2013 05:14 PM
Without posting your code we can only guess, but I suspect that if you create an indicator showing your queue lengths, you will see them grow until you run out of memory. In other words, your file creation cannot keep up with the higher frame rates and the queue floods.
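In pseudo-code terms, this is all I mean by an indicator (a Python-ish sketch; "data_queue" is a stand-in for whichever of your queues you want to watch):

import queue

data_queue = queue.Queue()   # stand-in for your acquisition or frame queue

def backlog_indicator():
    # Equivalent to wiring the element count from Get Queue Status to a front-panel indicator.
    # If this number climbs steadily, the consumer (file writing) cannot keep up with the
    # producer (acquisition) and memory will eventually run out.
    return data_queue.qsize()

print("elements waiting:", backlog_indicator())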
09-19-2013 03:05 PM
Hello,
Here is my code for a better understanding of my issue. I created an indicator but it doesn't show me anything special. I noticed that even at an acquisition rate of 50 kS/s it does the same thing. The GUI becomes slow, the display refresh of both the 8 AI channels and the camera is very slow, and finally it stalls.
09-19-2013 03:30 PM
Sorry, I can't download VIs here; someone else may be able to. Can you post a pic of the main VI? The slowdown is understandable if the system memory is close to full.
So did you use 'Get Queue Status' and read 'no of elements in queue'? Is it continually rising?
09-19-2013 05:25 PM
When exactly do you get the error? As soon as you launch the application, or does something trigger it?
Mike...
09-20-2013 10:04 AM
I put a "Get Queue Status" for reading the number of elements in the queue but it did not show me a continually increasing number. The code is too complex to only post a pict.
09-20-2013 10:19 AM
Everything works fine when I run the application until I start the acquisition at a high rate (more than 50 kS/s). A slow rate works fine.
At a high rate, the GUI becomes very slow. Because my "stop" button is only enabled while the acquisition is running, it takes a while before I can press it to stop the application. Did you have some time to look at my code?
Could it be a case of hard disk writing speed?
Is the problem caused by the GUI trying to update the display of both the 8 channels (500 kS/s each) and the camera (60 FPS at 640x484, 32-bit)?
09-20-2013 10:24 AM
"Is the problem caused because the GUI tries to update the display of either the 8 channels (500kS/s each) and the camera (60FPS at 640x484 32-bit)?"
I don't know, but it's very easy to find out. Just rip the display updates out, or disable them with a case structure or a diagram disable structure, and see what impact it has.
09-20-2013 11:09 AM
Hi,
OK.
Test #1 (100 kS/s): I disabled the display of both the 8 AI channels and the camera before starting the acquisition, and it worked fine! But no live display.
Test #2 (100 kS/s): I only disabled the display of the 8 AI channels and kept the live display of the camera, and it worked fine too.
Test #3 (500 kS/s): I only disabled the display of the 8 AI channels and kept the live display of the camera. It worked, but the camera display was scattered and the video was scattered and very fast (not all the frames, I guess).
Test #4 (500 kS/s): I disabled the display of both the 8 AI channels and the camera before starting the acquisition. It worked, but the camera display was scattered and the video was scattered and very fast (not all the frames, I guess).
09-20-2013 04:39 PM
It sounds like you have established that you can acquire all the data but not display it.
The question becomes this: how fast can your user's eyes detect and interpret display updates? Most human brains process discrete images at roughly 100 ms intervals. Your charts for displaying the AI data are about 650 pixels wide and the chart history length is 1024. When more than 650 data points are in the chart, code behind the scenes must reduce the data to generate the display. Feeding the displays hundreds of times more data than they can show seems like a recipe for a slowdown.
Consider decimating the data before sending it to the displays. Send every other or every third frame from the camera to the image display (a 30 or 20 FPS display rate). Switch to a graph and, every 100 ms, send no more data points than the number of pixels across the width of the graph. The logging loops will still process all of the data.
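Here is the idea in a rough Python sketch (the 650-pixel width and the frame-skip factor are assumptions you would tune to your front panel; this only illustrates the decimation, it is not LabVIEW code):

def decimate_for_graph(samples, graph_width_px=650):
    """Keep at most one point per pixel column before updating the graph.
    The full-rate data still goes to the logging loop untouched."""
    if len(samples) <= graph_width_px:
        return samples
    step = len(samples) // graph_width_px
    return samples[::step][:graph_width_px]

def should_display_frame(frame_index, skip=2):
    """Send every 2nd (30 FPS) or 3rd (20 FPS) camera frame to the display;
    every frame still goes to the AVI writer."""
    return frame_index % skip == 0

# Example: a 50,000-sample block collapses to at most 650 points for the display.
block = list(range(50_000))
print(len(decimate_for_graph(block)))   # prints 650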
Lynn