How to find out if my current DAQ setup has reached its performance limit !


@Bob_Schor wrote:

Sigh.  If you plan to go to a PXI system, go "all the way" and allow the PXI to do "what it does best", and the PC do what it does best...

Bob Schor


I'm not sure the OP needs the complexity of an RT system; RT should be used when deterministic timing is required. I have a PXI system running Windows that continuously handles 32 channels of 16-bit data, each at a sample rate of 20.833 MSa/s. That's a data rate of 1.33 GB/s. It was actually easier to set up under Windows than under RT. The only limitation is disk space: I can stream continuously for only about 6 hours before I run out. RT has its advantages and a Windows system has others; it depends on the requirements.
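The throughput figures quoted above can be sanity-checked with quick arithmetic. A minimal sketch (the disk size is an assumption chosen to match the ~6-hour figure; it is not stated in the post):

```python
# Check the quoted streaming figures: 32 channels of 16-bit data at 20.833 MSa/s.
channels = 32
bytes_per_sample = 2           # 16-bit samples
sample_rate = 20.833e6         # Sa/s per channel

rate_bytes = channels * bytes_per_sample * sample_rate
print(f"Stream rate: {rate_bytes / 1e9:.2f} GB/s")    # ~1.33 GB/s

# Assumed disk size (hypothetical 30 TB volume) to reproduce the ~6 h limit:
disk_bytes = 30e12
hours = disk_bytes / rate_bytes / 3600
print(f"Recording time: {hours:.1f} h")
```

At 1.33 GB/s, continuous streaming fills tens of terabytes in a working day, which is why disk space, not CPU, becomes the limit.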

Message 11 of 20

You guessed right. I certainly don't need RT-level performance for the job at hand.

 

Being a 24/7 test rig, it has to control the RPM and torque of a gearbox in a set pattern over time (the fastest ramp time is around 500 ms) and capture analog values of RPM, pressure, temperature, and flow. That, in short, describes the whole process. In all I need to record some 14 analog channels. Of course there are very many patterns to program and gearbox types to contend with, so the application grows big with time: it has to handle many variants and also run three to four such tests in parallel to save time.

 

I have managed up to 3 parallel tests, but with anything more the PCI cards start to buckle... I have to resort to switching off plot updates on the monitor to reduce CPU load.

Raghunathan
LabVIEW to Automate Hydraulic Test rigs.
Message 12 of 20

@MogaRaghu wrote:

You guessed right. I certainly don't need RT-level performance for the job at hand.

 

Being a 24/7 test rig, it has to control the RPM and torque of a gearbox in a set pattern over time (the fastest ramp time is around 500 ms) and capture analog values of RPM, pressure, temperature, and flow. That, in short, describes the whole process. In all I need to record some 14 analog channels. Of course there are very many patterns to program and gearbox types to contend with, so the application grows big with time: it has to handle many variants and also run three to four such tests in parallel to save time.

 

I have managed up to 3 parallel tests, but with anything more the PCI cards start to buckle... I have to resort to switching off plot updates on the monitor to reduce CPU load.


I don't know much about your test system or requirements, so I'm just guessing here. If you are collecting RPM / pressure / temperature / flow / torque, then these are "low bandwidth" measurements, under 1 MSa/s I assume. (Unless you are doing shock physics, I'm not sure why you would need to measure above 1 MSa/s; you can probably stay in the 100s of kSa/s range.)

 

So let's assume you have 14 channels at 1 MSa/s and you use the double format for ease of use; that is a 112 MB/s data rate. This should be entirely doable. You will need to design your analysis to occur in parallel; I do not know what update rate you need. "Have to resort to switching off plot updates" is a big red flag: you need to do a min-max decimation rather than plot the whole data stream, which will slow you down immensely. See this link for an example of min-max decimation. Keep a buffer of the required data, then decimate and plot.
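The idea behind min-max decimation can be sketched outside LabVIEW as well. A minimal Python/NumPy version (not the linked example, just an illustration of the technique): split the waveform into buckets and keep only each bucket's min and max, so peaks survive but the plot gets a few thousand points instead of millions.

```python
import numpy as np

def min_max_decimate(data, n_points):
    """Reduce a 1-D waveform to 2*n_points samples for plotting,
    keeping the min and max of each bucket so no peaks are lost."""
    bucket = len(data) // n_points
    trimmed = data[:bucket * n_points].reshape(n_points, bucket)
    # Interleave per-bucket min and max to preserve the signal envelope.
    out = np.empty(2 * n_points, dtype=data.dtype)
    out[0::2] = trimmed.min(axis=1)
    out[1::2] = trimmed.max(axis=1)
    return out

# 1 MSa of data squeezed to 2000 plot points instead of drawing 1e6.
wave = np.random.randn(1_000_000)
plot_data = min_max_decimate(wave, 1000)
print(plot_data.size)  # 2000
```

The graph then receives only as many points as it has pixels to show, while spikes in the raw stream remain visible.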

Message 13 of 20

Maybe it's just my own experience, but several projects I've worked on (including my "introduction to LabVIEW" after being an "RT-11 guru") involved running LabVIEW + LabVIEW RT on PXI and RIO systems. I found a natural division of labor in having one CPU devoted to the "human" side of things (a PC running Windows, handling TCP/IP messaging with the RT processor, managing the user interface, and saving as much data as you can cram onto a 500 GB drive), and a separate CPU running a real-time OS that handled the (sometimes complex) timing of (sometimes independent) protocols: A/D, D/A, and DIO timed sequences running without front-panel "interruptions". A fairly simple, flexible messaging system based on TCP/IP (in my case, Network Streams) kept the two processors working "together", yet not so tightly linked that everything stalled if the Host needed a little more time to open another file, or to redraw a graph/chart because the operator wanted to see more (or less). It was worth the effort.

 

But certainly McDuff has been using LabVIEW and doing Testing longer than I (I've never actually done "testing", per se).  I'll stop beating this "tired" horse.  NI has recently announced some new hardware that might provide you the extra "edge" that you need ...

 

Bob Schor

Message 14 of 20

@Bob_Schor wrote:

Maybe it's just my own experience, but several projects I've worked on (including my "introduction to LabVIEW" after being an "RT-11 guru") involved running LabVIEW + LabVIEW RT on PXI and RIO systems. I found a natural division of labor in having one CPU devoted to the "human" side of things (a PC running Windows, handling TCP/IP messaging with the RT processor, managing the user interface, and saving as much data as you can cram onto a 500 GB drive), and a separate CPU running a real-time OS that handled the (sometimes complex) timing of (sometimes independent) protocols: A/D, D/A, and DIO timed sequences running without front-panel "interruptions". A fairly simple, flexible messaging system based on TCP/IP (in my case, Network Streams) kept the two processors working "together", yet not so tightly linked that everything stalled if the Host needed a little more time to open another file, or to redraw a graph/chart because the operator wanted to see more (or less). It was worth the effort.

 

Bob Schor


There is absolutely nothing wrong with what you are proposing; all I was suggesting is that sometimes all of that may not be needed. There is a lot of work and overhead in designing the system you propose, and it takes time and money, both of which can be in short supply.

 

I have a PXIe system with an 8-core Xeon processor; it can handle a good workload whether in Windows or RT. Updating the UI can be expensive, but the UI is single-threaded, so at worst it should occupy only one core.

 

Some examples where Windows ended up being easier and less expensive.

  1. Installing a persistent monitoring system at a remote work location. This case seemed ideal for cRIO/RT. But a cRIO cannot be plugged into the work network, communicating with a cRIO from a work computer is a hassle due to firewall problems, and it takes more time to program a cRIO/RT system. Solution: make an EXE for Windows and run it as a service as soon as the computer boots, without anyone having to log in. The computer is now on the network, there is some downtime due to IT updates (not a major issue), users can remote-desktop into the system to get files, it cost a lot less in hardware and time, and it is easy to transfer or duplicate to another system.
  2. A PXI system: some hardware drivers were not available for Linux, some nice features like the file watcher were not available, and Windows was easier to program and debug.

It really depends on what you need. Time-critical tasks need an RT system; DAQ systems that aren't used for active control can tolerate a little latency from time to time.

Message 15 of 20

@MogaRaghu wrote:

have to resort to switching off plot updates on monitor to reduce CPU load


Are you using lots of Value Property nodes?

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 16 of 20

To check out the fastest possible loop time, I put together a simple project with a Main VI and a Sub VI, the way I normally structure all my applications.

 

And here I have brought in two queues: one to move data from Main to Sub, and the other in reverse, from Sub to Main.

 

You can just open the project (LV2024) and run the Main. After that you will have a button to load the Sub, and the period of the Main can also be controlled. On my system I get stable performance down to a 10 ms loop time (Win 11 on an Intel i7 laptop, display at 2880 x 1800). Anything lower and the loop struggles and the Late LED lights up.
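The structure described, a timed producer loop feeding a worker over a queue, with a "Late" flag when an iteration overruns its period, can be sketched in Python for illustration (hypothetical names; a threading analogue of the Main/Sub VIs, not the actual project):

```python
# Timed "Main" loop pushes data to a "Sub" worker over a queue and flags
# itself late whenever an iteration misses its deadline.
import queue, threading, time

def main_loop(period_s, n_iters, to_sub, late_flags):
    next_deadline = time.perf_counter() + period_s
    for i in range(n_iters):
        to_sub.put(i)                           # stand-in for acquired data
        now = time.perf_counter()
        late_flags.append(now > next_deadline)  # the "Late" LED
        time.sleep(max(0.0, next_deadline - now))
        next_deadline += period_s

def sub_loop(to_sub, results):
    while True:
        item = to_sub.get()
        if item is None:                        # sentinel: shut down
            break
        results.append(item * 2)                # stand-in for processing

to_sub, results, late = queue.Queue(), [], []
worker = threading.Thread(target=sub_loop, args=(to_sub, results))
worker.start()
main_loop(0.01, 50, to_sub, late)               # 10 ms period, as in the post
to_sub.put(None)
worker.join()
print(sum(late), "late iterations out of", len(late))
```

On a desktop OS the sleep granularity, not the queue, is usually what limits how short the period can get, which matches hitting a wall around 10 ms on Windows.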

 

Any suggestions and ideas to further load the code are welcome!

 

 

Raghunathan
LabVIEW to Automate Hydraulic Test rigs.
Message 17 of 20

I understand this is an example, but a few things:

  1. No local variables of arrays in tight loops: local variables create a data copy. Just moving the local variable outside the loop, since the value never changed, helped with speed.
  2. Indicators should be updated in a separate loop. You want to update your display 200 times a second, like a high-end video game; it's not happening. Suggestion: make a temporary buffer outside the loop and connect it to the loop with a shift register. The buffer is a 2-D array of 3 rows and 20 columns. Use replace-element to update a column in the buffer, and on every 20th iteration of your loop update your indicator. You still see every 5 ms of data, but your indicator updates only every 100 ms.
  3. Since you are using Windows and a timer, make sure no power throttling is occurring. See this link.
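Suggestion 2 above can be sketched as follows (a Python/NumPy illustration of the buffer logic with the shapes from the post; the 100-iteration run is an assumed example, and the `updates` list stands in for indicator writes):

```python
# Preallocate a 3 x 20 buffer, overwrite one column per 5 ms iteration
# in place, and refresh the "indicator" only every 20th iteration (100 ms).
import numpy as np

ROWS, COLS = 3, 20
buffer = np.zeros((ROWS, COLS))      # lives on the shift register
updates = []                         # stand-in for indicator writes

for i in range(100):                 # 100 iterations = 500 ms of data
    sample = np.full(ROWS, float(i)) # stand-in for one acquisition
    buffer[:, i % COLS] = sample     # in-place replace, no reallocation
    if i % COLS == COLS - 1:         # every 20th iteration...
        updates.append(buffer.copy())  # ...push the buffer to the UI

print(len(updates), "indicator updates instead of 100")
```

The in-place column replacement is the key: no array is grown or copied inside the tight loop, and the expensive UI write happens at a rate the display can actually keep up with.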
Message 18 of 20

Great inputs ... will check out one by one !

 

Thanks so much. 😍

Raghunathan
LabVIEW to Automate Hydraulic Test rigs.
Message 19 of 20

Update on suggested hacks... 

 

1. Indicators should be updated in a separate loop. I'm trying to figure out how this is done. Right now we collect all data arriving at the 5 ms rate, accumulate it, and display after 20 iterations, then clear the buffer and start over. Let me see how it goes.

 

2. As to the hack below: I tried it out with the proper path for my installation, with Admin rights, and got the message "powercfg/powerthrottling.... no such command". Maybe this is not supported on new-generation CPUs?

 

Disable background throttling (need admin on command prompt):

powercfg /powerthrottling disable /path "C:\Program Files (x86)\National Instruments\LabVIEW 2017\LabVIEW.exe"

LabVIEW.exe will then keep a 1 ms system timer even when it is in the background.

Raghunathan
LabVIEW to Automate Hydraulic Test rigs.
Message 20 of 20