
Data collection rate slowing down after hours of collection

Hello, 

I am a novice to lab-view and am trying to troubleshoot a problem I have with labview. The experimental setup is a Michelson interferometer in which a laser shines onto a sensor. This sensor measures the intensity of the laser light and gives me data in the form of a voltage. I am trying to collect data once per second over a day or longer to see which resonant frequencies we have.

 

The issue:

After several hours, instead of saving data every second, it slows down to saving data every 20 seconds. As you can see in the figure below, the time difference between measurements is relatively constant during hours 0-1, but soon slows down. Each sudden shift downwards marks a 9-minute interval.

 

Things I have tried:

At first I believed it was an issue with the saving methodology. The data is saved onto a hard drive, so I thought the data points weren't being appended correctly and the hard drive had to spin up every second. I then switched the saving system to buffer the data in RAM and only save it to the hard drive every 9 minutes (these are the sudden drops in the graph). The issue persists, however, and now I am guessing there is some sort of issue within the code. I thought we might be doing some operations over the whole set of data, and after 10,000 lines that operation simply takes longer. Since I am a novice, it is difficult for me to test this. I also switched from the 32-bit to the 64-bit version of lab view 2018 SP1 (which is what this program was designed in) to try to fix it, but it did not help.

 

[Attached image: image1.png]

Message 1 of 10

Btw, it is spelt "LabVIEW" and not lab-view.

 

The reason it is slowing down is due to your choice of architecture to put everything in a single loop. Ideally, you want to split the data collection and data logging parts into separate loops so that any delay in logging will not impact the acquisition part. In your case, any slowdown in the logging (file size getting bigger or inefficient file format or logging technique) will directly impact your data acquisition as it cannot acquire before it completes the logging.
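LabVIEW is a graphical language, so the two-loop architecture can't be shown as text here, but the idea maps directly onto the classic producer/consumer pattern. Below is a minimal Python sketch of the same concept (all names hypothetical): one thread "acquires" at a fixed pace and pushes each sample onto a queue, while a second, independent thread drains the queue and "logs" at its own pace. A slow consumer only lets the queue grow; it never delays the producer.

```python
import queue
import threading
import time

data_q = queue.Queue()   # analogue of the LabVIEW queue between the two loops
STOP = object()          # sentinel that tells the consumer loop to finish

def acquire():
    """Producer loop: one 'sample' per tick, never blocked by the logging."""
    for i in range(5):
        sample = (time.time(), i)   # stand-in for the DAQ read
        data_q.put(sample)
        time.sleep(0.01)            # acquisition pacing (1 s in the real VI)
    data_q.put(STOP)

def log(results):
    """Consumer loop: drain the queue and 'write to disk' at its own pace."""
    while True:
        item = data_q.get()
        if item is STOP:
            break
        results.append(item)        # stand-in for the file write

rows = []
producer = threading.Thread(target=acquire)
consumer = threading.Thread(target=log, args=(rows,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(len(rows))  # 5
```

In LabVIEW terms, the queue plays the role of a Queue refnum shared between the acquisition While Loop and the logging While Loop.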

 

[Attached image: santo_13_0-1674240328698.png]

 

Santhosh
Soliton Technologies

Message 2 of 10

Hi kdolghier,

 


@kdolghier wrote:

I am trying to collect data once per second over a day or longer to see what resonant frequencies we have. 

 

The issue:

After several hours, instead of saving data every second, it slows down to saving data every 20 seconds. As you can see in the figure below, the time difference between measurements is relatively constant during hours 0-1, but soon slows down. Each sudden shift downwards marks a 9-minute interval.


Why do you calculate that "once a second" part in such a convoluted way?

Why not simply use ElapsedTime with autoreset?

How many samples do you collect in each second?

Why do you initialize that array with 1 column? Why not use a completely empty array?

Why does your (missing) subVI output clusters with 4 elements instead of arrays?

 

Edit: now you need to calculate the "overall elapsed time" on your own, as the ElapsedTime function will reset automatically - which is quite simple: subtract the values from two GetDateTime functions…

Btw. as you set a "rate" for your subVI, it would be easier to simply divide the sample number by the sample rate to get the elapsed time…
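The sample-count-over-rate idea is just arithmetic; here is a tiny Python illustration (the numbers are made up for the example, not taken from the original VI):

```python
# Hypothetical numbers: the subVI reads 1000 samples per call at 1000 S/s.
samples_per_read = 1000
sample_rate = 1000          # samples per second

total_samples = 0
for _ in range(5):          # pretend the acquisition loop has run 5 times
    total_samples += samples_per_read

# Elapsed time falls out of the bookkeeping; no timestamp math needed.
elapsed_seconds = total_samples / sample_rate
print(elapsed_seconds)  # 5.0
```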

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 3 of 10

Spell check... 🙂

I have implemented two loops, and I will write back if I see the characteristic slowdown and whether or not it has improved. 

Message 4 of 10

Ya, the code is definitely inefficient. It wasn't originally mine, and I just started using LabVIEW a few months back.

 

At the moment I just get 1 data point per second. I can obviously increase the sampling rate and get 10 data points every second. It's not that much data, but after hours I get 20,000 rows, and there's something that takes a while, I suppose.

 

With regards to the arrays vs. elements question: I am not really sure. It works, and that's all that matters at the moment. If fixing that can help my issue, then I will fix it.

 

And that timing suggestion would definitely involve fewer calculations and, I'm guessing, improve the performance. Can you elaborate on what your suggestion would be? With your edits and everything, it is a bit confusing for me.

 

Thanks!

Message 5 of 10

Something you should note -- one of the worst (meaning "slowest") ways to construct arrays in LabVIEW using a For Loop is to create an empty array, wire it to a Shift Register on the For Loop, generate values inside the For Loop and use Build Array to add the new value to the (growing) Array.  Slow.

 

Better (maybe 100 times faster!) is to simply generate the new Array Elements in the For Loop and bring them out through an Indexing Tunnel.  Here's some code you can try for yourself.  Since you were using a 2D Array of Dbl, I did the same, and filled them with Random Numbers (to slow down the code).

[Attached image: Building Arrays.png]

 

I like to attach "pure pictures" of simple code like this to give "early LabVIEW learners" something to copy so they can learn "neatness, straight wires, etc.", and maybe figure out what that "silly unused wire" is doing going from the top Frame Sequence to the bottom one.  [Maybe they'll even say "Why is there a Frame Sequence here?  I thought they were Evil ..."]  It's still a Good Idea, but I relented and saved this as a LabVIEW 2021 Snippet.

 

So on my laptop, the top (better) version ran in 0.0023 seconds.  The bottom version (your way) took 0.198 seconds, roughly 100 times slower.  Give it a try.
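The same effect is easy to reproduce outside of LabVIEW, since it comes from the algorithm, not the language: growing an array one element at a time forces a reallocate-and-copy on every pass (quadratic work overall), while letting the loop collect its outputs is linear. A rough Python analogue (timings will vary by machine):

```python
import time

N = 20_000

# "Build Array into a shift register" analogue: copy-and-grow every iteration.
t0 = time.perf_counter()
arr = []
for i in range(N):
    arr = arr + [i]          # allocates a new, larger list every pass
t_grow = time.perf_counter() - t0

# "Indexing tunnel" analogue: the loop itself collects the outputs.
t0 = time.perf_counter()
arr2 = [i for i in range(N)]
t_index = time.perf_counter() - t0

print(f"grow: {t_grow:.3f}s  index: {t_index:.4f}s")
```

The copy-and-grow version does on the order of N²/2 element copies, which is exactly why the shift-register Build Array pattern gets slower as the dataset accumulates over hours.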

 

The other thing to consider is to use a wonderful feature of LabVIEW, a Data Flow programming language: "parallel processing", with one loop generating these arrays and a separate, independent loop saving them to disk.  Notice that you are using "Write Delimited Spreadsheet", which can be easily read by Excel but has two "inefficiencies" -- it saves the data as text (rather than the often-more-compact binary), and every "Write" is really an "Open File, find End-of-File, Write, Close File" (which has the virtue that if the program crashes, you at least (probably) have a partial data file on disk).

So look up LabVIEW and "Producer/Consumer Design Pattern".  This, coupled with the 100-fold speed increase in building arrays, should overcome your File Writing woes.

 

Bob Schor

Message 6 of 10

Without digging too deep, I'd say inefficient arrays and constant file open/close might cause this.

Open the result file once and then reuse the reference instead of using a single Write to spreadsheet file which'll open/close every time.

Combine that with the smarter arrays and make sure none grows to infinity and it should be solved.

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 7 of 10

Hi,

As many suggest, here is an easy way to get the open and close outside your loop (with the 'Write Delimited Spreadsheet.vi'), which may help you understand it. Don't forget to make a copy 😉
You can right-click the 'Write Delimited Spreadsheet.vi' and select 'replace with subvi content'; definitely tick the first check mark, the second is your choice. (Ctrl+Z works.)

What you see is the preparation of the data before writing it to a file; you can cut this down to just fit your data and check if it still works.

But we want the writing part; you can replace this with the subVI content, too. Now you see the 'open', 'go to end', 'write' and 'close' operations. Move the 'open' and 'go to end' (before) and the 'close' (after) outside the loop.

Tip: to shrink the diagram, hold Ctrl + Alt, press the left mouse button, and drag.

This saves the time to open and close the file.


I don't know what the '?' VI does, but with 1000 samples and a rate of 1000, I would say that's a 1-second measurement. Without switching to producer/consumer this can't work.
Sadly I couldn't find it in the Example Finder, but the transition shouldn't take too long: your '?' VI is the producer, which sends the data to the consumer, where it gets saved (and/or processed).

Good luck,
 Timo

Message 8 of 10

@Yamaeda wrote:

Open the result file once and then reuse the reference instead of using a single Write to spreadsheet file which'll open/close every time.

Combine that with the smarter arrays and make sure none grows to infinity and it should be solved.


I presume @Yamaeda means "Do not use Write to Spreadsheet File", which forces the file to close after the write.  In your case, this is probably good advice.

 

"Write to Spreadsheet File" does the following for every call:  Open File, go to End of File (except when Append to File is False), Write Data, Close File.  If you are doing all of this in a loop, you have three "slow" operations (Open, Go to End, Close) and one fast operation (Write) for every loop.

 

Much faster is to first open a new file, then enter the loop, generate each line, formatting lines (yourself) with commas or tabs, end lines with EOLs, and carry the File Reference to the right edge of the While Loop.  When the Loop exits, grab that File Reference and close the File.  Now, inside the Loop, you are only doing the (fast) operation of writing the current Loop's data, which isn't growing in size.  It does mean you have to do the formatting yourself, and you can't use the "Transpose rows and columns" option (which takes a whole lot more time).  If you need that operation, do it after you exit the loop and close the File: reopen the file, read the entire Array into memory, close the File, transpose the Array, and then save the Transposed Array.  I'd use another file name, like "My Array T" (for Transpose), just in case something goes awry.
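Again the structure is language-independent, so here is a hedged Python sketch of the open-once/write-many/close-once pattern (the file name, tab delimiter, and dummy data are all made up for illustration):

```python
import csv
import os
import tempfile

# Hypothetical file and data; the point is WHERE the open and close happen.
path = os.path.join(tempfile.gettempdir(), "my_data.csv")

with open(path, "w", newline="") as f:      # open ONCE, before the loop
    writer = csv.writer(f, delimiter="\t")
    for i in range(10):                     # the acquisition/logging loop
        row = [i, i * 0.5]                  # stand-in for one reading
        writer.writerow(row)                # fast: just a write, no open/seek/close
# the file reference is closed ONCE here, when the with-block exits
```

In the LabVIEW version, the `with` block corresponds to carrying the File Reference through the While Loop on a wire and calling Close File only after the loop exits.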

 

Bob Schor

Message 9 of 10

@Bob_Schor wrote:

@Yamaeda wrote:

Open the result file once and then reuse the reference instead of using a single Write to spreadsheet file which'll open/close every time.

Combine that with the smarter arrays and make sure none grows to infinity and it should be solved.


I presume @Yamaeda means "Do not use Write to Spreadsheet File", which forces the file to close after the write.  In your case, this is probably good advice.


Yes. "Instead of using" means "don't use" in this case. 🙂

Message 10 of 10