Save continuously without losing data

Hello all,

I am using LabVIEW 7.1, Windows XP, and a PCI-6115.  I am trying to save chunks of data (~30,000 samples) continuously for 100-150 s, clocked externally, once a trigger has been received.  I have attached the current VI I have written for this application.  Almost everything is working fine, but when I look at my data it seems that I am missing some chunks, which means that I am not saving continuously and am experiencing some lag between saving the chunks of data.  Can you tell me whether I have used the appropriate approach (see attached VI) or recommend a proven approach for accomplishing this task?

Thanks a lot!!!


Azazel

Pentium 4, 3.6 GHz, 2 GB RAM, LabVIEW 8.5, Windows XP, PXI-5122, PCI-6259, PCI-6115
Message 1 of 13

Perhaps you are losing data during the time it takes to write to a file.  Consider collecting all your data first, then saving it to a file afterwards.  Also, your while loop is hogging 100% of the CPU time.  To avoid this, put a delay in your loop; even 1 ms will free the CPU to do other things, and it won't seriously affect your outcome.

Here is one more piece of advice, and it pertains to everyone who submits code for suggestions.  Please take the time to clean up your wiring.  Wires should not be hidden behind structure borders.  Use left-to-right wiring when possible.  Leave some room between loop borders and case borders, especially so that wires can be displayed.  Otherwise, we volunteers have to take extra time to trace the wiring; it is not as obvious to someone looking at it for the first time as it is to the creator.  Thanks for listening.

- tbob

Inventor of the WORM Global
Message 2 of 13
Thanks for the response, and sorry about the wiring; I always forget to clean up my code at the design stage.  If I insert even a 1 ms wait in my while loop, won't that affect my data acquisition, given that I want to save all the data?  I know that the amount of data one can write to a file before saving is limited (i.e., by RAM and hard drive), but are there any inherent size limits in LabVIEW itself?  The files that I have been saving so far are about 2 GB.

Thanks,

Azazel
Azazel

Pentium 4, 3.6GHz, 2 GB Ram, Labview 8.5, Windows XP, PXI-5122, PCI-6259, PCI-6115
Message 3 of 13
You should not be saving data to a file inside your while loop.  That is probably why you are losing data.  While the Write File function is writing to the hard drive, no other part of the code is executing, so you miss acquisitions.  Move this function to after the loop.  Then, when the data acquisition is done, the entire file can be written.  The 1 ms delay may or may not affect the data acquisition; it depends on how fast the data is coming in.  If it changes faster than the loop can execute, you will miss data.  Try the delay after moving the Write File function.  If you still lose data, remove the delay.  If you still lose data after that, you will have to optimize the code or find some other, faster way to collect the data, perhaps an assembly language routine compiled into a DLL and called from LabVIEW.
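LabVIEW is graphical, so there is no literal code to paste here, but the pattern tbob describes (no file I/O inside the acquisition loop, one write afterwards) can be sketched in Python; `acquire_chunk` is a hypothetical stand-in for the DAQ read:

```python
# Sketch of the "write after the loop" pattern: accumulate chunks in
# memory inside the loop, write the file once after acquisition ends.

def acquire_chunk():
    # hypothetical placeholder for reading ~30,000 samples from the board
    return [0.0] * 30000

chunks = []
for _ in range(10):              # acquisition loop: no file I/O here
    chunks.append(acquire_chunk())

with open("run.dat", "w") as f:  # single write, after the loop finishes
    for chunk in chunks:
        f.write(" ".join(str(s) for s in chunk) + "\n")
```

The trade-off is memory: every chunk must fit in RAM until the loop ends, which is exactly the limit Azazel runs into below.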
- tbob

Message 4 of 13
When I moved my Write File outside of my loop, I was only saving the last iteration.  I then tried indexing the loop output and reshaping it to save as a 2D array at the end of the sequence structure, but I filled up LabVIEW's memory very fast, well before my required 150 s of continuous acquisition.  Unfortunately, if I insert a 1 ms wait in the while loop I will miss 1 ms worth of data, and since I am trying to acquire continuously I need to find an alternative.  Any other takers or ideas?

Thanks for the ideas, tbob 🙂

Azazel
Message 5 of 13

You have not told us how fast you are sampling.

We will need to know that to answer your question.

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 6 of 13
Once externally triggered (by a sine wave at 13 kHz), my function generator produces 300 pulses at a frequency of 10 MHz, which I use as my external clock.  This produces a sampling rate of ~3.9 MS/s.  I have read that the limit of the 6115 is 4 MS/s, so I think this is pushing the limit but should be possible.
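The arithmetic behind that rate (300 samples per trigger, triggers arriving at 13 kHz) works out as follows:

```python
trigger_rate_hz = 13_000       # external sine-wave trigger
samples_per_trigger = 300      # pulses from the function generator
rate = trigger_rate_hz * samples_per_trigger
print(rate)                    # 3900000 samples/s, i.e. ~3.9 MS/s
```

That is indeed just under the 6115's 4 MS/s single-channel maximum.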

Azazel
Message 7 of 13
Gathering 150 seconds' worth of data at that rate would produce 585 megasamples.  That is a lot of data.  If your samples are 8 bits each, that is 585 MB of data.  If the samples are 16 bits, that would be 1.17 GB of data.  I'm not sure how much memory LV allocates to its process, or how to change it.
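Those figures follow directly from the sample rate and duration:

```python
rate = 3_900_000             # samples/s from the external clock
seconds = 150
samples = rate * seconds
print(samples)               # 585000000 samples (585 megasamples)
print(samples * 1)           # 585000000 bytes ≈ 585 MB at 8 bits/sample
print(samples * 2)           # 1170000000 bytes ≈ 1.17 GB at 16 bits/sample
```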
I thought about using a producer/consumer loop, with the producer gathering the data and stuffing it into a queue, and the consumer dequeuing the data and writing it to a file.  However, your speed requirements do not allow putting a delay into the producer loop, so it would hog the CPU time, the consumer loop would never execute, and the queue would fill up very fast.
I'm afraid I don't know an easy answer to your problem, except lots of memory to store all the data before writing to a file.
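For reference, the producer/consumer structure tbob describes can be sketched in Python with a bounded queue.  The sketch shows only the structure (producer enqueues, consumer dequeues and would write to disk), not LabVIEW's scheduling behavior, and `acquire_chunk` is a hypothetical stand-in for the DAQ read:

```python
import queue
import threading

def acquire_chunk(i):
    # hypothetical stand-in for reading one ~30,000-sample chunk
    return [float(i)] * 30000

N_CHUNKS = 20
q = queue.Queue(maxsize=8)          # bounded: blocks if the consumer lags

def producer():
    for i in range(N_CHUNKS):
        q.put(acquire_chunk(i))     # blocks when the queue is full
    q.put(None)                     # sentinel: acquisition finished

written = []

def consumer():
    while True:
        chunk = q.get()
        if chunk is None:
            break
        written.append(len(chunk))  # a real consumer would write to disk here

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(written))                 # 20 chunks consumed
```

The bounded queue is the key point: it gives the consumer back-pressure on the producer, which is exactly what a producer loop that hogs the CPU defeats.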
- tbob

Message 8 of 13
I do have 2 GB of RAM on the computer, and about 1.5 GB free when the program is running.  Is it complicated to buffer the data in RAM and then dump the whole file to the hard disk at the end of the VI?  I also had a VI with queues, but as you stated, and as I found out, this did not solve the problem.

Azazel
Message 9 of 13

You are pushing the limits of PCs and PC hardware.

1) LV will have difficulty allocating a buffer that big.  LV needs contiguous memory for each of its buffers.  As a quick experiment, try writing a VI that just does an Initialize Array with a buffer that big.  You can sidestep this issue by decreasing the duration of your test.

2) If you decrease the duration AND can allocate the buffer (Initialize Array), then go with tbob's suggestion of buffering the readings in a shift register that is

  A) Initialized before the acquisition starts,

  B) Updated with "Replace Array Subset" to move the readings into the appropriate offset within the buffer, and

  C) Only saved after the acquisition has completed.

3) If you REALLY need to collect for all of that time, then consider "divide and conquer" approaches: use two or three PCs such that the first PC grabs the first batch, then the second PC takes over, etc.
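Ben's preallocate-and-replace pattern (steps A-C above) has a close analogue in NumPy: allocate the whole contiguous buffer once, copy each chunk into its offset, and save only at the end.  A scaled-down sketch, with `acquire_chunk` as a hypothetical stand-in for the DAQ read:

```python
import numpy as np

CHUNK = 30_000
N_CHUNKS = 50                       # scaled down from the real 150 s run

# A) initialize the full buffer before acquisition starts
#    (one contiguous allocation, mirroring LabVIEW's Initialize Array)
buf = np.zeros(N_CHUNKS * CHUNK, dtype=np.int16)

def acquire_chunk(i):
    # hypothetical stand-in for one DAQ read
    return np.full(CHUNK, i, dtype=np.int16)

# B) copy each chunk into its offset (the Replace Array Subset analogue);
#    this writes in place and never grows the buffer
for i in range(N_CHUNKS):
    buf[i * CHUNK:(i + 1) * CHUNK] = acquire_chunk(i)

# C) save only after the acquisition has completed
buf.tofile("run.bin")
```

The in-place replacement is what avoids the reallocation-and-copy cost that building the array incrementally (as Azazel did with indexing and reshaping) incurs.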

Ben

Message 10 of 13