LabVIEW

Regular spikes in backlog every 2GB when streaming large files to disk

Dear Community,

I am trying to stream a lot of data (10-20 MB/s) to disk for long periods of time (hours). I'm using eleven NI 4472 boards in a PXI chassis and "AI Read" for acquisition. When I graph the backlog from one of the boards, it spikes from about 2,000 to 20,000 for a couple of frames. The spikes happen quite regularly, and the interval between them appears to correspond to 2 GB increments in the file size.

I am using the "Write File" VI to write the data to disk. Has anyone seen these spikes before? Do they come from the "Write File" VI or from the OS? I'm using Windows XP and NTFS.

I'd like to avoid these spikes because they make the performance of my application unstable.

Thanks!

Cas
Message 1 of 2
Cas,

2 GB is a special number in Windows: it is the file size at which a signed 32-bit byte offset rolls over. There must be some buffer shuffling that goes on when that happens, causing your momentary backlog spike. There have been discussions about handling large files on Info-LabVIEW, and I believe on OpenG.org and LAVA as well. For you, the best solution may be to start a new log file automatically when the file gets up around the 2 GB mark. You can use the "offset" output from the Write File function to watch the current file size.
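A LabVIEW block diagram can't be reproduced in text, so here is a minimal C sketch of the rollover idea, assuming a fixed block size per write; the file-naming scheme and the helper names (open_next_segment, block) are illustrative only and not part of any NI API.

/* Sketch: roll to a new log file before the byte offset reaches 2^31,
 * the point at which a signed 32-bit file offset overflows.           */
#include <stdio.h>
#include <stdint.h>

#define ROLLOVER_LIMIT ((int64_t)1 << 31)   /* 2 GB = 2,147,483,648 bytes */

/* open the next numbered segment file, e.g. stream_000.bin, stream_001.bin */
static FILE *open_next_segment(int *index)
{
    char name[64];
    snprintf(name, sizeof name, "stream_%03d.bin", (*index)++);
    return fopen(name, "wb");
}

int main(void)
{
    int     segment = 0;
    int64_t offset  = 0;                 /* bytes written to the current file */
    FILE   *fp      = open_next_segment(&segment);
    char    block[65536] = {0};          /* stand-in for one frame of acquired data */

    if (fp == NULL)
        return 1;

    for (int frame = 0; frame < 1000; frame++) {
        /* ...acquire one frame of data into block here... */

        fwrite(block, 1, sizeof block, fp);
        offset += sizeof block;

        /* if the next write would push past the 2 GB mark, start a new file */
        if (offset + (int64_t)sizeof block > ROLLOVER_LIMIT) {
            fclose(fp);
            fp = open_next_segment(&segment);
            offset = 0;
            if (fp == NULL)
                return 1;
        }
    }
    fclose(fp);
    return 0;
}

In LabVIEW terms, the equivalent would be comparing the Write File "offset" output against the 2 GB limit on each loop iteration and, in a case structure, closing the current file refnum and opening the next segment when the limit is about to be crossed.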

Daniel L. Press
Certified LabVIEW Developer
PrimeTest Corporation
www.primetest.com
Message 2 of 2