01-07-2013 01:31 AM - edited 01-07-2013 01:33 AM
Hi,
I'm trying to use the "Cont Gen Voltage Wfm-Int Clk-Non Regeneration" example in LabVIEW 2009 with a PCI-6154 DAQ. The VI runs fine with default parameter values. However, CPU usage increases sharply as the sampling rate approaches 100 kHz. Here are a few values showing the CPU usage of the LabVIEW process:
50 kHz: < 1%
80 kHz: < 1%
90 kHz: 1%
95 kHz: 3%
97 kHz: 15%
98 kHz: 20%
99 kHz: 30%
100 kHz: 45%
Beyond that, I can go up to 250 kHz, but the CPU usage only reaches 55%. I've also tried with a simulated device and I don't get this issue there. The test was done on an AMD Athlon 64 X2 2.6 GHz CPU.
Is there anything that explains this high CPU usage?
Thanks,
Alex
01-08-2013 03:23 AM
Dear Alex,
It seems that your computer is too slow for this sample rate. In the example you are using, automatic regeneration of data has been disabled, so new data has to be provided continuously throughout the analog output operation. This could be too much for your Athlon 64 X2.
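As a rough illustration of why the load climbs: with regeneration disabled, the host must refill the output buffer before the device drains it, so writing chunks of N samples at rate f leaves a time budget of N/f per write. A minimal Python sketch of this arithmetic (the chunk size of 1000 samples is an assumption for illustration, not a value from your VI):

```python
# Back-of-the-envelope check of the host's write deadline in a
# non-regenerative continuous output loop (chunk size is assumed).

CHUNK_SAMPLES = 1000  # assumed samples written per loop iteration

def write_deadline_ms(rate_hz, chunk=CHUNK_SAMPLES):
    """Time budget (ms) the host has to produce each chunk of data."""
    return 1000.0 * chunk / rate_hz

for rate in (50_000, 90_000, 100_000, 250_000):
    print(f"{rate / 1000:>5.0f} kHz -> refill every "
          f"{write_deadline_ms(rate):.1f} ms "
          f"({rate / CHUNK_SAMPLES:.0f} writes/s)")
```

The budget shrinks from 20 ms at 50 kHz to 10 ms at 100 kHz, so the write loop has to run twice as often and the driver spends correspondingly more time servicing the buffer; the exact CPU figures will of course depend on the driver and machine.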
For benchmarking single-point performance, you can look at this white paper:
http://www.ni.com/white-paper/5423/en
Regards,
Oleg Scherling, M.Eng | Applications Engineering | National Instruments | NIG |