LabVIEW


dynamically resize queue

Is there a way to dynamically resize the maximum number of elements in a queue? For example, suppose I have a queue initialized with 10 elements; is there any way to change it to a 5-element queue while the program is running? In my case, I have a queue that stores arrays of data waiting to be saved to disk. In some cases the user of my program may choose to acquire millions of data points, so I would like to shrink the queue to avoid running out of memory. Thanks for your help. Cheers, Andrew
0 Kudos
Message 1 of 11
(5,385 Views)

Hello,

 

First question: does the array in each queue element have a fixed size, or can your program put arrays of different sizes into two different elements of the same queue?

 

As far as I know, the memory is not allocated when you set the maximum queue size, but only when you actually fill the queue. I assume that is because LabVIEW has no way to guess how much memory it will need for each queue element (especially if the element contains an array).

 

Whatever the answer to my first question, what I think you could do is set a limit on the total amount of array data contained in the queue rather than on the maximum number of queue elements. Every time the acquisition routine adds elements to the queue, update a "queue size" variable, and do the same when the save routine extracts data from the queue.

That way, whatever the user sets in terms of points to acquire, you have a maximum size that you can't go over.
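
Since G code can't be pasted as text, here is a rough Python sketch of that bookkeeping: a queue bounded by a total byte budget instead of an element count. The class name, the 100 MB budget, and the use of NumPy blocks are illustrative assumptions, not anything from LabVIEW itself.

```python
import threading
from collections import deque

import numpy as np


class ByteBudgetQueue:
    """Queue bounded by total payload size rather than by element count."""

    def __init__(self, max_bytes):
        self._items = deque()
        self._bytes = 0                      # the "queue size" variable
        self._max_bytes = max_bytes
        self._cond = threading.Condition()

    def enqueue(self, chunk):
        """Block the producer until the chunk fits in the byte budget."""
        with self._cond:
            # Allow an oversized chunk only when the queue is empty,
            # otherwise the producer would wait forever.
            while self._items and self._bytes + chunk.nbytes > self._max_bytes:
                self._cond.wait()
            self._items.append(chunk)
            self._bytes += chunk.nbytes
            self._cond.notify_all()

    def dequeue(self):
        """Remove the oldest chunk and release budget back to the producer."""
        with self._cond:
            while not self._items:
                self._cond.wait()
            chunk = self._items.popleft()
            self._bytes -= chunk.nbytes
            self._cond.notify_all()
            return chunk


# Usage: cap the pending data at ~100 MB, regardless of how many
# points the user asked to acquire.
pending = ByteBudgetQueue(max_bytes=100 * 1024 * 1024)
pending.enqueue(np.zeros(1_000_000))     # one acquisition block
saved = pending.dequeue()
```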

 

Hope this helps


We have two ears and one mouth so that we can listen twice as much as we speak.

Epictetus

Antoine Chalons

0 Kudos
Message 2 of 11
(5,368 Views)

When creating the queue with "Obtain Queue" you can set the maximum number of elements.

 

How do you plan to avoid losing data, since your acquisition loop will probably keep producing elements for the saving queue? Do you have the possibility to wait in the experiment?

0 Kudos
Message 3 of 11
(5,356 Views)

@Questionmarker wrote:

When creating the queue with "Obtain Queue" you can set the maximum number of elements.

 

How do you plan to avoid losing data, since your acquisition loop will probably keep producing elements for the saving queue? Do you have the possibility to wait in the experiment?


They could use a lossy queue, which would simply push the oldest data out of the queue. Even before native lossy queues existed you could achieve this behavior with a bit of extra code when you enqueue the data; NI has just made it easier now.
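
As a rough illustration of the lossy behavior (in Python, since this is only the concept and not LabVIEW's Lossy Enqueue Element itself): a fixed-length buffer that silently drops the oldest block when a new one arrives.

```python
from collections import deque

lossy_queue = deque(maxlen=10)          # holds at most 10 blocks

for block_number in range(25):
    lossy_queue.append(block_number)    # blocks 0-14 get pushed out

print(list(lossy_queue))                # [15, 16, ..., 24] - only the newest 10 survive
```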

 

Either way, as mentioned earlier, memory allocation is much more dynamic when your queue data contains arrays. It has to be, since arrays are dynamically sized and memory allocation is handled by LabVIEW. Even if you preallocate an array, you can grow it by putting more data into it. Without code of your own to control things, there is no way to force an array to a fixed size.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
0 Kudos
Message 4 of 11
(5,347 Views)
Thank you to everybody for the help and advice. I do not think it is possible to do what I was asking; however, adding a wait function may work. Maybe I should have explained my problem better. I have 3 PXI boards in a chassis that are basically 3 digitizers with 2 channels each; each board can be thought of as an oscilloscope. The user of my program can select how many points to download from each board. So one part of my program acts as a producer of arrays of data points and another part as a consumer that saves the data to disk. Presently the write-to-disk speed is the bottleneck, even when saving in a binary format, so I have a queue that temporarily stores the data waiting to be written. What I would like to do is change the size of this queue if the user decides to download many millions of points, so I do not run out of memory. However, adding a wait function as Questionmarker suggested could slow down the data acquisition rate so that the file save can keep up.
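
For anyone reading along, the "wait" falls out naturally from a size-limited queue: if the enqueue blocks when the queue is full, the producer is throttled to whatever rate the disk can sustain. A minimal Python sketch of that producer/consumer pattern follows; the block count, point count, and file name are made-up placeholders.

```python
import queue
import threading

import numpy as np

data_queue = queue.Queue(maxsize=10)     # bounded queue: put() blocks when full


def producer(num_blocks, points_per_block):
    """Stands in for the digitizer readout loop."""
    for _ in range(num_blocks):
        block = np.random.rand(points_per_block)   # pretend acquisition
        data_queue.put(block)            # waits here whenever the queue is full
    data_queue.put(None)                 # sentinel: acquisition finished


def consumer(path):
    """Stands in for the save-to-disk loop."""
    with open(path, "wb") as f:
        while True:
            block = data_queue.get()
            if block is None:
                break
            f.write(block.tobytes())     # raw binary write


acq = threading.Thread(target=producer, args=(100, 1_000_000))
save = threading.Thread(target=consumer, args=("capture.bin",))
acq.start(); save.start()
acq.join(); save.join()
```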
0 Kudos
Message 5 of 11
(5,340 Views)

But this also slows your ability to capture the data. How are you saving the data to file? Are you passing the entire array to something like the Write to Spreadsheet File VI? File operations on very large buffers are slow. You can speed this up, though, by writing smaller chunks of data to the file at a time and using a loop to iterate over all of the data.
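
A short Python sketch of that chunked-write idea (the chunk size and file name are arbitrary placeholders, not values from this thread):

```python
import numpy as np

CHUNK_POINTS = 65_536                    # arbitrary; tune to the disk/controller


def write_in_chunks(path, data):
    """Write a large array in fixed-size slices instead of one huge call."""
    with open(path, "ab") as f:
        for start in range(0, data.size, CHUNK_POINTS):
            f.write(data[start:start + CHUNK_POINTS].tobytes())


# Example: 5 million doubles written 64k points at a time.
write_in_chunks("capture.bin", np.random.rand(5_000_000))
```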



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
0 Kudos
Message 6 of 11
(5,334 Views)

Or buy an SSD :) The smaller ones are not too expensive and they are really fast.

 

The low-budget solution is RAID 0, which almost doubles the write speed.

0 Kudos
Message 7 of 11
(5,327 Views)
The data is being saved in a binary format, in particular a MATLAB file. The MATLAB format adds little overhead to the file save; basically only a header needs to be added, and the header is not appended to the data array but written separately before the data goes to disk. The file-save subVI uses low-level file functions. My disk has a write speed of approximately 80 MB/s according to the LabVIEW benchmark, so 1 million points in each channel plus 1 million time points gives 3 million points, which at 8 bytes per point is roughly 24 MB and takes approximately 300 ms to save. I run into problems when I want to save 5 million points in each channel: my 10-element queue fills up and the computer runs out of memory (32-bit system).
0 Kudos
Message 8 of 11
(5,316 Views)

Hey mcduff, 

 

I haven't heard anyone mention data compression; were you aware that you can compress data in DAQmx? A discussion like this may help.

Jesse Dennis
Engineer
INTP
0 Kudos
Message 9 of 11
(5,300 Views)

How often are you writing to the file?

You are talking about 3 million data points, 1 million for each signal. Are you only saving to file every second?

Try to take fewer points but save more often, so you keep the same sample rate.

 

Besides a loop for saving to file and one for data collection, what else is running?

Is there some data processing going on, or is there any loop with no wait in it?

How are you collecting the data, with DAQmx?

 

0 Kudos
Message 10 of 11
(5,289 Views)