LabVIEW


My LabVIEW DAQ is maxing out my CPU usage, why?

Solved!
Go to solution
I am maxing out my CPU usage while trying to acquire 3 channels of strain data and a counter pulse (with the interrupt data transfer mechanism).  I'm not doing any elaborate calculations or anything else like that on my data.  In my LabVIEW training classes, I recall an example of placing a 10 millisecond wait timer in the for-loop of the DAQ program to reduce the CPU usage.  When I tried this, it had no effect on CPU usage.  What am I overlooking?  Is there an issue with using MAX tasks that sample continuously?
Message 1 of 10
Without seeing your code it is really difficult to give you a definitive answer. You are correct about putting a wait into loops (While or For), but not just the DAQ one: any loop that doesn't have a wait of some form is a prime candidate for using a lot of CPU.   If you can save your "offending" VI (ideally via "Save with Options" > "Development Distribution", then zip the resulting ".llb" file) it would help us give a better answer.
 
 
P.M.
Putnam
Certified LabVIEW Developer

Senior Test Engineer North Shore Technology, Inc.
Currently using LV 2012-LabVIEW 2018, RT8.5


LabVIEW Champion



Message 2 of 10

PM,

Thanks for the response.  Attached is my program below.  I don't have the option of saving as a .llb file type.  (I only have the Base package, which is missing the majority of the nice functions I've learned about and could use, so it wouldn't surprise me if that is why I don't have it available.)  Anyway, here it is.

Message 3 of 10

Your big while loop has no wait if...

"Zero Transducers" is false AND the big case structure is not in state "5". (The normal run states seem to be states 2 and 3.)

Add a small delay somewhere where it executes on every iteration of the big while loop (e.g. in the false case of the case structure in the upper right, or next to it).
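A plain-Python sketch of this point (a text-language stand-in for the block diagram; the state numbers and zero-length sleeps are illustrative only, and in a real loop you would wait a few milliseconds): if the wait lives inside only one case of the case structure, most iterations of the loop execute no wait at all, whereas a wait on every path fires on every iteration.

```python
import time

# Hypothetical stand-in for the big while loop: a "case structure"
# cycling through states 0-5, with the wait originally reached only
# in state 5.

def run_loop(wait_every_iteration, iterations=100):
    """Return how many iterations actually executed a wait."""
    waited = 0
    state = 0
    for _ in range(iterations):
        if state == 5:
            time.sleep(0)          # wait only reached in state 5
            waited += 1
        elif wait_every_iteration:
            time.sleep(0)          # fix: a wait on every path of the loop
            waited += 1
        state = (state + 1) % 6    # cycle through the states
    return waited
```

With the wait only in state 5, just 16 of 100 iterations delay at all; with a wait on every path, all 100 do, which is why the loop stops spinning.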

 

Message Edited by altenbach on 07-12-2005 02:32 PM

Message 4 of 10

I'm not seeing where you put the 10 ms wait. If it isn't in the code now, try your code with the wait, but make sure it isn't inside one of the cases. With a test example of a while loop containing just a 4-case "state machine" that iterates through the cases, with no wait the CPU is at 50% on my machine; with a 1 ms wait it drops to less than 1%.  When you had it in before, where was it in the code? If you had it in one of the conditional cases it might not have been executed often, so it wouldn't have much effect. I haven't looked at the actual DAQ too closely yet.
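The same effect can be reproduced outside LabVIEW. This Python sketch (a stand-in for the state-machine test, not actual LabVIEW code; the window and wait lengths are arbitrary) counts how many iterations a loop completes in a fixed window with and without a 1 ms wait. The wait-free loop spins orders of magnitude more often, which is what shows up as pegged CPU.

```python
import time

def iterations_in(window_s, wait_s=None):
    """Count loop iterations completed within window_s seconds."""
    count = 0
    deadline = time.perf_counter() + window_s
    while time.perf_counter() < deadline:
        count += 1
        if wait_s is not None:
            time.sleep(wait_s)   # analogue of the Wait (ms) primitive
    return count

busy = iterations_in(0.05)           # no wait: spins flat-out
slept = iterations_in(0.05, 0.001)   # 1 ms wait: a handful of iterations
```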

 

P.M.

Message 5 of 10

The Wait (ms) command was in the big while loop when I tried it before.  I've since tried a couple different things, without any success either.  These included changing sampling rates, data transfer array sizes, the sampling mode from continuous to finite, and maybe even a few others that I've forgotten already.  All without any luck.  I've also tried the Wait Until Next ms Multiple command, now that I think about it.

I've written short tryout programs in the past that reduce the CPU usage, but they had no DAQ functions in them.  That is why I'm curious whether the DAQ configuration functions or possibly the math functions are also causing this.

Message 6 of 10
I use a technique I call "throttling".  It only performs a read on an input channel array when there is sufficient backlog data.  This really seems to cut back on what the processor is doing, since you don't need to read scans from the buffer every iteration to keep up, i.e. as long as your scan backlog is not continually climbing.

It's simple and provides another parameter to tweak in order to optimize a DAQ loop.  I generally adjust the following to tune a DAQ loop:
- Wait Until Next ms Multiple
- Read size
- Allowable backlog size (the integer in the case structure)


Note: these snapshots are Traditional DAQ but can easily be implemented in DAQmx using the "AvailSampPerChan" property.
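A rough illustration of the throttling idea in Python (the `SimulatedBuffer` class and all numbers here are invented for illustration; with a real driver the backlog would come from the scan backlog output, or in DAQmx from the AvailSampPerChan property): the loop only reads when the backlog reaches a threshold and otherwise yields the CPU.

```python
import time

class SimulatedBuffer:
    """Stands in for the driver's acquisition buffer (illustrative)."""
    def __init__(self, rate_hz):
        self.rate_hz = rate_hz
        self.start = time.perf_counter()
        self.consumed = 0

    def backlog(self):
        """Samples produced so far minus samples already read."""
        produced = int((time.perf_counter() - self.start) * self.rate_hz)
        return produced - self.consumed

    def read(self, n):
        self.consumed += n
        return [0.0] * n           # placeholder sample values

def throttled_reads(buf, read_size, min_backlog, total):
    """Read `total` samples, waiting whenever the backlog is too small."""
    out = []
    while len(out) < total:
        if buf.backlog() >= min_backlog:
            out.extend(buf.read(read_size))
        else:
            time.sleep(0.001)      # yield the CPU instead of spinning
    return out

data = throttled_reads(SimulatedBuffer(rate_hz=10000),
                       read_size=100, min_backlog=100, total=500)
```

Because the loop sleeps whenever there is nothing worth reading, the processor idles between bursts instead of polling the buffer flat-out.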

 

Message 7 of 10
Second image attached:
Message 8 of 10
I am not sure what version of DAQmx you are using.  If you are using a version earlier than 7.4 then I would try upgrading.  We updated the default way in which DAQ polls/sleeps.  There have been other discussions in this forum referring to that, so I will let you peruse those for further details.  If that is not it then I am not sure what else might be going on.
 
StuartG
Message 9 of 10
Solution
Accepted by AMP12
 

Hi-

Stuart's suggestion is perhaps the most useful.  NI-DAQmx prior to 7.4 used a "Yield" mode of operation by default.  This meant that DAQmx would automatically use all available CPU time, but would yield to other applications without resistance whenever the O/S requested it.  With NI-DAQmx 7.4 and later, "Sleep" mode is the default operation, and apparent CPU usage is usually decreased very significantly.
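The difference between the two modes can be sketched in plain Python (illustrative only, not the driver's actual internals): "Yield" mode polls continuously, giving up its timeslice with a zero-length sleep, so the CPU still reads as pegged even though other applications are not blocked; "Sleep" mode idles for a real interval between checks, so measured CPU usage drops.

```python
import time

def wait_until(deadline, mode):
    """Poll until the deadline passes; return how many polls it took."""
    checks = 0
    while time.perf_counter() < deadline:
        checks += 1
        if mode == "yield":
            time.sleep(0)        # relinquish the timeslice, retry at once
        else:
            time.sleep(0.001)    # actually idle between polls
    return checks

yield_checks = wait_until(time.perf_counter() + 0.02, "yield")  # many polls
sleep_checks = wait_until(time.perf_counter() + 0.02, "sleep")  # far fewer
```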

An upgrade of your NI-DAQmx driver would be very useful in determining whether this is a driver problem or a programming problem.  NI-DAQmx 7.5 was just released and is a free download from ni.com.

Please let us know if you're still seeing the problem after this driver upgrade.

Thanks-

Tom W
National Instruments
Message 10 of 10