LabVIEW


How to handle 12 GB/minute data while writing to a binary file.

Solved!

Hello James_McN,

Like you said, it is a cool product, but I think I am not using it correctly. I am attaching the only screenshot of the code. In the screenshot you can see that I am acquiring the data from the FIFO and writing it directly into a binary file. If I write to a single file from a single FIFO, it works perfectly, but when I try to write into two or more files at once from different FIFOs, I am not getting all the data into the files. I thought that placing a "Wait Until Next ms Multiple" set to 1 ms might help decrease the load on the processor, but because of the wait function it takes more time to acquire and record, so I removed it. I am also attaching my code; in it I am doing little more than acquiring the data and writing it into the file.

Message 11 of 24

Hi kiranteja,

 

Why don't you just use ONE FIFO for ALL channels? Why don't you write ALL data into ONE file?

 

Btw: why is the stop condition of the lower loop named "Start"? Why do you need to negate the "stop" value? And why do you negate the resulting "start" value again, in each loop? Silly Rube Goldberg constructions…

 

If I write to a single file from a single FIFO, it works perfectly, but when I try to write into two or more files at once from different FIFOs, I am not getting all the data into the files.

Using just one data stream to your hard drive creates much less work than starting two (or more) simultaneous data streams! With two streams, the file system needs to "jump" across the hard drive for each access…

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 12 of 24

Sorry for the wrong screenshot, my bad. I have since modified the code and am uploading the new version now.

Why don't you just use ONE FIFO for ALL channels? Why don't you write ALL data into ONE file?

I tried that as well, but I am getting the same result. I appended all elements to one FIFO, and as before there is no problem with the data while acquiring; I am getting each and every sample, matching my calculations. But while recording, the write time fluctuates between 4 and 8 seconds and sometimes even more. I am out of options, which is why I posted in the discussion forums.

Message 13 of 24

As the DAQ Read and File Write are synchronized, the loop speed isn't as high as it could be. Usually, you put the DAQ data on a queue and read the queue elements in a second loop that writes them to file (a producer/consumer pattern; see the sketch below). Not sure if that is what's happening here, but it would be the first thing on my list. As an alternative, you could put the data in a shift register, so the DAQ Read and File Write execute in parallel. The queue method has the added benefit that the queue can grow when/if the write loop stalls.
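A minimal producer/consumer sketch, written in Python purely to illustrate the structure (in LabVIEW this would be two parallel while loops sharing a queue reference); the block size, loop count, and file name are placeholders, not taken from the poster's VI:

```python
import queue
import threading

data_q = queue.Queue()            # unbounded: can grow if the write loop stalls

def read_fifo():
    # Placeholder for the FPGA FIFO read; returns a dummy 100 kB block.
    return bytes(100_000)

def producer():
    # Acquisition loop: read from the FIFO and enqueue, nothing else.
    for _ in range(100):
        data_q.put(read_fifo())
    data_q.put(None)              # sentinel: acquisition finished

def consumer():
    # Write loop: dequeue blocks and stream them into the binary file.
    with open("data.bin", "wb") as f:
        while True:
            block = data_q.get()
            if block is None:
                break
            f.write(block)

acq = threading.Thread(target=producer)
log = threading.Thread(target=consumer)
acq.start(); log.start()
acq.join(); log.join()
```

The point is that the acquisition side never waits on the disk: if a write takes longer than usual, the queue simply grows instead of the FIFO overflowing.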

Message 14 of 24

Hi 

Message 15 of 24

@kiranteja93 wrote:

Hi 


Queues should be able to perform better than a shift register. If queues did not help, then they weren't the bottleneck, or you didn't use them correctly.

 

If you want to try a shift register, you might consider a feedback node instead; that can make the wiring a bit clearer. In both cases, make sure to initialize properly with an empty array.

Message 16 of 24

Queues should be able to perform better than a shift register. If queues did not help, then they weren't the bottleneck,

Actually, the queues are working perfectly while the data is being acquired; I am getting each and every sample in the queue. But when I try to write the data from the queues into a binary file, I get a "Not enough memory to complete this operation" error. My problem is not with the data, but with writing it into a file...

Message 17 of 24
Solution
Accepted by topic author kiranteja93

It is really worth running a benchmark to make sure the disk you are writing to is set up properly. I normally use CrystalDiskMark from https://crystalmark.info/en/software/crystaldiskmark/

 

If that comes back nice and high, then we know we have a software issue; if it is too low, then we know we need to address the disk setup.

James Mc
========
CLA and cRIO Fanatic
My writings on LabVIEW Development are at devs.wiresmithtech.com
Message 18 of 24

Some Questions:

 

  1. You are writing to the binary file in a "strange" way. You are writing 100k elements every time and also writing the array size information with every block, even though you already know it. Is this really what you want to do?
  2. Is the data from the FIFO in little-endian format? If not, you should write it in its native format and have your file reader convert the data.
  3. Find out the sector size of your disk and write the data in multiples of it (see the sketch after this list). This gives a dramatic improvement in write speed.
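To make point 3 concrete, here is a rough sketch (again in Python, only to show the idea) of accumulating data and flushing only whole multiples of the sector size, with no per-block size header; the 512-byte sector size and the dummy data source are assumptions, not values read from your system:

```python
SECTOR = 512                      # assumed sector size; query your actual disk
pending = bytearray()             # acquired data not yet written

def write_aligned(f, new_bytes):
    # Buffer incoming samples and flush only whole sector-sized multiples.
    pending.extend(new_bytes)
    n = (len(pending) // SECTOR) * SECTOR   # largest writable sector multiple
    if n:
        f.write(pending[:n])      # raw samples only, no array-size prefix
        del pending[:n]

with open("data.bin", "wb") as f:
    for _ in range(100):
        write_aligned(f, bytes(100_000))    # stand-in for one FIFO read
    if pending:
        f.write(pending)          # flush the unaligned tail once, at the end
```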

mcduff

Message 19 of 24

You haven't mentioned whether you are running the controller under Windows or RT. If Windows, then you might want to look at this example, which provides non-buffered file access (no caching) by calling a different file-open routine in Windows. I can write at over 800 MB/s using this, so it should easily handle your data rate if the disk is up to it, which it sounds like it is. The only limitation is that writes must be in multiples of the sector size, usually 512 bytes.
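For reference, here is a minimal ctypes sketch (Python, Windows only) of what non-buffered access looks like at the Win32 level, using CreateFileW with FILE_FLAG_NO_BUFFERING; the file name, chunk size, and zero-filled buffer are placeholders, and the LabVIEW example mentioned above works by calling a Windows file-open routine of this kind:

```python
import ctypes

kernel32 = ctypes.windll.kernel32     # Windows only

GENERIC_WRITE           = 0x40000000
CREATE_ALWAYS           = 0x00000002
FILE_FLAG_NO_BUFFERING  = 0x20000000  # bypass the file-system cache
FILE_FLAG_WRITE_THROUGH = 0x80000000
MEM_COMMIT_RESERVE      = 0x00003000
PAGE_READWRITE          = 0x00000004

SECTOR = 512                          # assumed sector size
CHUNK  = 2048 * SECTOR                # 1 MiB; must be a multiple of SECTOR

# With FILE_FLAG_NO_BUFFERING the buffer address must also be sector-aligned;
# VirtualAlloc returns page-aligned (4 KiB) memory, which satisfies that.
kernel32.VirtualAlloc.restype = ctypes.c_void_p
buf = kernel32.VirtualAlloc(None, CHUNK, MEM_COMMIT_RESERVE, PAGE_READWRITE)

kernel32.CreateFileW.restype = ctypes.c_void_p
flags = ctypes.c_uint32(FILE_FLAG_NO_BUFFERING | FILE_FLAG_WRITE_THROUGH)
h = kernel32.CreateFileW("stream.bin", GENERIC_WRITE, 0, None,
                         CREATE_ALWAYS, flags, None)

written = ctypes.c_ulong(0)
# In the real application the buffer would be filled from the DAQ FIFO;
# here it is just the zeroed memory returned by VirtualAlloc.
kernel32.WriteFile(ctypes.c_void_p(h), ctypes.c_void_p(buf), CHUNK,
                   ctypes.byref(written), None)
kernel32.CloseHandle(ctypes.c_void_p(h))
```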

Message 20 of 24