How can I increase the speed of LabView computing?

Solved!
Dear all,
 
I have run into a problem that I didn't expect to face.
It is the following: I have implemented some applications in LabVIEW, but to my great surprise the programs are incredibly slow!
One of them was just a simple attempt to produce a sinusoidal signal: a While loop in which the sine of the iteration counter "i" is calculated and then
sent to an NI-6559 DAQ over a USB port via the DAQmx Assistant.
This simple program runs at 300 Hz maximum, while a similar program in C can reach 20000 Hz without problems!
Now I am asking whether LabVIEW is the wrong choice for applications above 100 Hz, and whether it would be better to switch directly to C.
In particular, the applications I need require a loop speed of at least 800 Hz.
I really cannot understand why the system is so slow; if this were the norm, LabVIEW would be unusable for a large part of applications.
Can you suggest what I am doing wrong, or tell me the maximum speed reachable on a normal 3 GHz Pentium with 1 GB RAM?
 
Best regards
 
Principiant
Message 1 of 7
Hi Principiant,

well, 800 Hz should be no problem as far as the calculations are concerned...

Could you attach your VI? That would make it much easier to give comments/hints/help...

One tip:
Do calculations outside of loops whenever possible (this holds in every programming language); LabVIEW can also calculate with whole arrays...
And try to send more than one value per write to your DAQ card. It seems you send just one value per iteration...
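Since LabVIEW block diagrams cannot be shown as text, here is the same idea sketched in Python as a stand-in (all names and numbers here are illustrative, not the poster's actual VI): compute the whole table of sine values once, outside the loop, instead of one value per iteration.

```python
import math

# Precompute one full sine period OUTSIDE the loop (360 points, one per degree)
sine_table = [math.sin(math.radians(deg)) for deg in range(360)]

# The output loop then only indexes the table; no per-iteration sine calculation
for i in range(1000):
    value = sine_table[i % 360]  # this is what would be written to the DAQ
```

The same restructuring applies on a LabVIEW block diagram: build the array before the While loop and index it inside.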
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 7
Hi Principiant,
I tried to reproduce your LabVIEW VI, which calculates the sine of the loop counter, and got an execution time of 0.125 microseconds per sine. Calculating the sine of every degree (0..359) therefore takes 45 µs per cycle, which corresponds to 22222 Hz.



So it is likely that the communication with your USB device is the bottleneck. Is this device capable of buffered output? That would be preferable: you put a block of data (a pattern) into a FIFO and the device sends it to the DAC on its own.
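A toy Python model (the MockDAQ class is purely illustrative, not a real driver API) shows why buffered output helps: each write call costs one USB transaction, no matter how many samples it carries.

```python
import math

class MockDAQ:
    """Toy device model: every write() costs one USB transaction."""
    def __init__(self):
        self.transactions = 0

    def write(self, samples):
        self.transactions += 1  # fixed per-call cost, independent of len(samples)

pattern = [math.sin(math.radians(d)) for d in range(360)]

unbuffered = MockDAQ()
for s in pattern:
    unbuffered.write([s])      # one sample per call: 360 transactions

buffered = MockDAQ()
buffered.write(pattern)        # whole pattern into the FIFO at once: 1 transaction

print(unbuffered.transactions, buffered.transactions)  # 360 1
```

With a real per-transaction cost in the millisecond range, the unbuffered version is hundreds of times slower for the same data.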



Message Edited by daveTW on 01-23-2008 10:09 AM
Greets, Dave
Message 3 of 7
Without the code, it is hard to tell what is wrong with it,
but like Gerd, I dare say:
- The bottleneck is not the device, the USB bus, or anything else in hardware.
- The bottleneck is entirely self-made by the way the application is implemented.

First (suspected) "failure":
The output signal is created within the same loop that performs the output itself.

Second "failure":
You are using the DAQmx Assistant; I suspect (as Gerd does) that you configured the Assistant for ONE SAMPLE. That means: you open a communication channel to the device, write one single value, close the communication, and repeat. Opening and closing the communication takes, depending on the PC's power, somewhere around 1-3 ms, which explains your 300 Hz maximum very well.
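The arithmetic behind that estimate is easy to check (the 3 ms figure is the upper bound quoted above, not a measured value):

```python
# If each iteration pays ~3 ms to open a channel, write one sample, and close,
# the loop rate cannot exceed the reciprocal of that per-iteration overhead.
overhead_s = 3e-3
max_rate_hz = 1.0 / overhead_s
print(round(max_rate_hz))  # 333 -> consistent with the observed ~300 Hz ceiling
```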

Maybe you should first look at some of the examples shipped with LabVIEW and DAQmx to see how such applications can be implemented without the Assistant...

hope this helps,
Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 4 of 7

"... or maximum speed reachable on a normal 3 Ghz Pentium, 1 Gb RAM?"

Ditto the request to see the code.

If running under LabVIEW Real-Time, you could probably reach update rates of about 150 kHz.

If running under Windows with hardware-timed loops, 2 kHz is reasonable.

If running under Windows with no hardware-timed loop, then the deterministic maximum is 1 kHz, because Windows is not deterministic.

So please post your code. The contributors to this forum are great at spotting performance issues.

If you don't or cannot post code, then try reviewing the posts tagged "LabVIEW_Performance".

Trying to help,

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 5 of 7

Thanks to all for your support,

I am attaching my code here so you can see better what happens.

 

Principiant

Message 6 of 7
Solution
Accepted by topic author Principiant
Hi Principiant,

as we suspected, you are sending single values to your DAQ device. Try to build an array of values (or a waveform), for example a full sine period, and send that to your device...
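In text form, the accepted fix looks like the sketch below (Python stands in for the block diagram; the write_to_device function is a hypothetical placeholder for the single buffered DAQmx write): build one full sine period up front and hand the whole array to the device in a single call.

```python
import math

def write_to_device(samples):
    """Placeholder for one buffered write to the DAQ; returns samples written."""
    return len(samples)

# One full sine period as an array (the "waveform")
n_samples = 360
waveform = [math.sin(2 * math.pi * i / n_samples) for i in range(n_samples)]

# One call instead of n_samples separate calls
written = write_to_device(waveform)
print(written)  # 360
```

With buffered output configured, the device then clocks the samples out on its own, so the software loop rate no longer limits the signal frequency.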
Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 7 of 7