06-27-2025 08:27 AM
@Bob_Schor wrote:
Sigh. If you plan to go to a PXI system, go "all the way" and allow the PXI to do "what it does best", and the PC do what it does best...
Bob Schor
Not sure the OP needs the complexity of an RT system; RT is warranted when deterministic timing is needed. I have a PXI system running Windows that handles 32 channels continuously, each channel 16-bit data at a sample rate of 20.833 MSa/s. That's a data rate of about 1.33 GB/s, and it was actually easier to set up under Windows than under RT. The only limitation is disk space: I can only stream continuously for ~6 hours before the disk fills. RT has its advantages and Windows has others; it depends on the requirements.
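For anyone who wants to sanity-check numbers like these against their own setup, here is a quick back-of-the-envelope calculation (plain Python, nothing LabVIEW-specific; the 30 TB disk size is just an assumption for illustration, so plug in your own channel count, sample width, rate, and disk size):

channels = 32
bytes_per_sample = 2                 # 16-bit samples
rate_sa_per_s = 20.833e6             # 20.833 MSa/s per channel

stream_rate = channels * bytes_per_sample * rate_sa_per_s   # bytes per second
print(f"Stream rate: {stream_rate / 1e9:.2f} GB/s")         # ~1.33 GB/s

disk_bytes = 30e12                   # hypothetical 30 TB volume; use your own size
hours = disk_bytes / stream_rate / 3600
print(f"Continuous streaming before the disk fills: {hours:.1f} h")   # ~6 h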
06-29-2025 01:23 AM - edited 06-29-2025 01:24 AM
You guessed right. I for sure don't need RT-level performance for the job at hand.
Since this is a 24/7 test rig, I will have to control the RPM and torque of a gearbox in a set pattern over time (the fastest ramp time is around 500 ms) and capture analog values of RPM / Pressure / Temperature / Flow. That, in short, describes the whole process. In all I need to record some 14 analog channels. Of course there are very many patterns to program and gearbox types to contend with, so the application grows big with time: it has to handle very many variants and also run three to four such tests in parallel to save time.
I have managed up to 3 parallel tests, but with anything more the PCI cards start to buckle... I have to resort to switching off plot updates on the monitor to reduce CPU load.
06-29-2025 09:41 AM
@MogaRaghu wrote:
You guessed right. I for sure don't need RT-level performance for the job at hand.
Since this is a 24/7 test rig, I will have to control the RPM and torque of a gearbox in a set pattern over time (the fastest ramp time is around 500 ms) and capture analog values of RPM / Pressure / Temperature / Flow. That, in short, describes the whole process. In all I need to record some 14 analog channels. Of course there are very many patterns to program and gearbox types to contend with, so the application grows big with time: it has to handle very many variants and also run three to four such tests in parallel to save time.
I have managed up to 3 parallel tests, but with anything more the PCI cards start to buckle... I have to resort to switching off plot updates on the monitor to reduce CPU load.
I don't know much about your test system or requirements, so I am just guessing here. If you are collecting RPM / Pressure / Temperature / Flow / Torque, these are "low bandwidth" measurements, under 1 MSa/s I assume. (Unless you are doing shock physics, I am not sure why you would even need to measure faster than 1 MSa/s; you can probably stay in the 100s of kSa/s range.)
So let's assume you have 14 channels at 1 MSa/s and you use doubles for ease of use; that is a 112 MB/s data rate, which should be entirely doable. You need to design your analysis to occur in parallel; I do not know what update rate you need. "Have to resort to switching off plot updates" is a big red flag: do a min-max decimation instead of plotting the whole data stream, which will slow you down immensely. See this link for an example of min-max decimation. Keep a buffer of the required data, then decimate and plot.
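(The linked example is written in G; purely as a text illustration of the same idea, a minimal min-max decimation might look like the Python/NumPy sketch below. The function name and the 2000-point target are my own choices for the sketch, not taken from the linked example.)

import numpy as np

def min_max_decimate(y, max_points=2000):
    # Reduce a long waveform to at most ~max_points for plotting,
    # keeping the min and max of each bin so peaks are not lost.
    n = len(y)
    if n <= max_points:
        return y
    bins = max_points // 2
    usable = (n // bins) * bins          # drop the ragged tail
    blocks = y[:usable].reshape(bins, -1)
    out = np.empty(bins * 2, dtype=y.dtype)
    out[0::2] = blocks.min(axis=1)
    out[1::2] = blocks.max(axis=1)
    return out

# Example: a 1 MSa buffer shrunk to ~2000 points before it ever reaches the graph
raw = np.random.randn(1_000_000)
print(len(raw), "->", len(min_max_decimate(raw)))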
06-29-2025 10:11 AM
Maybe it's just my own experience, but several projects I've worked on (including my "Introduction to LabVIEW", having come from being an "RT-11 Guru") involved running LabVIEW + LabVIEW RT on PXI and RIO systems. I found a natural "division of labor": one CPU devoted to the "Human" side of things (a PC running Windows that sets up the TCP/IP messaging with the RT processor, manages the User Interface, and saves "as much data as you can cram onto a 500 GB drive"), and a separate CPU running a Real-Time OS that handles the (sometimes complex) timing of (sometimes independent) protocols and the A/D, D/A, and DIO timed sequences, all running without Front Panel "interruptions". A fairly simple, flexible "messaging system" based on TCP/IP (in my case, Network Streams) keeps the two processors working "together", yet not so tightly linked that anything breaks if the Host needs a little more time to open another file, or to redraw a graph/chart because the Operator wants to see more (or less). To me, that split was worth the effort.
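(To make the messaging idea concrete for readers outside LabVIEW, here is a loose Python stand-in for that host/target split, using a plain TCP socket where the real systems use Network Streams. The message names, port number, and "processing" are invented for this sketch only.)

import json, socket, threading

PORT = 61557                                        # arbitrary port for the sketch
srv = socket.create_server(("127.0.0.1", PORT))     # listening before the host connects

def rt_target():
    # Stand-in for the RT side: run timed work, answer host commands, no UI.
    conn, _ = srv.accept()
    with conn:
        for line in conn.makefile():
            cmd = json.loads(line)
            # ...start/stop the deterministic A/D, D/A, DIO loops here...
            conn.sendall((json.dumps({"ack": cmd["cmd"]}) + "\n").encode())
            if cmd["cmd"] == "stop":
                break

threading.Thread(target=rt_target, daemon=True).start()

# Host side: owns the UI and the data files, talks to the target only by messages.
with socket.create_connection(("127.0.0.1", PORT)) as host:
    replies = host.makefile()
    for cmd in ({"cmd": "start", "ramp_ms": 500}, {"cmd": "stop"}):
        host.sendall((json.dumps(cmd) + "\n").encode())
        print("target replied:", replies.readline().strip())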
But McDuff has certainly been using LabVIEW and doing testing longer than I have (I've never actually done "testing", per se), so I'll stop beating this "tired" horse. NI has recently announced some new hardware that might give you the extra "edge" that you need...
Bob Schor
06-29-2025 12:47 PM
@Bob_Schor wrote:
Maybe it's just my own experience, but several projects I've worked on (including my "Introduction to LabVIEW", having come from being an "RT-11 Guru") involved running LabVIEW + LabVIEW RT on PXI and RIO systems. I found a natural "division of labor": one CPU devoted to the "Human" side of things (a PC running Windows that sets up the TCP/IP messaging with the RT processor, manages the User Interface, and saves "as much data as you can cram onto a 500 GB drive"), and a separate CPU running a Real-Time OS that handles the (sometimes complex) timing of (sometimes independent) protocols and the A/D, D/A, and DIO timed sequences, all running without Front Panel "interruptions". A fairly simple, flexible "messaging system" based on TCP/IP (in my case, Network Streams) keeps the two processors working "together", yet not so tightly linked that anything breaks if the Host needs a little more time to open another file, or to redraw a graph/chart because the Operator wants to see more (or less). To me, that split was worth the effort.
Bob Schor
There is absolutely nothing wrong with what you are proposing; all I was suggesting is that sometimes all of that may not be needed. There is a lot of work and overhead in designing a system the way you propose, and that takes time and money, both of which can be in short supply.
I have a PXIe system with an 8-core Xeon processor; it can handle a good workload whether in Windows or RT. Updating the UI can be expensive, but the UI is single-threaded, so at worst it should only occupy one core.
Some examples where Windows ended up being easier and less expensive.
It really depends on what you need. Time-critical tasks need an RT system; DAQ systems that aren't used for active control can tolerate a little latency from time to time.
07-03-2025 08:42 AM
@MogaRaghu wrote:
have to resort to switching off plot updates on the monitor to reduce CPU load
Are you using lots of Value Property nodes?
07-08-2025 06:26 AM - edited 07-08-2025 06:27 AM
To check out the fastest loop time I can achieve, I put together a simple project with a Main VI and a Sub VI, structured the way I normally build all my applications.
Here I have brought in two queues: one to move data from Main to Sub, and the other in reverse, from Sub to Main.
You can just open the project (LV2024) and run the Main. After that you will have a button to load the Sub, and the period of the Main loop can also be controlled. On my system I get stable performance down to a 10 ms loop time (Win 11 on an Intel i7 laptop, display at 2880 x 1800). Anything faster and the loop struggles and the Late LED lights up.
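(Since the attached project is graphical G code, here is only a rough text analogue of its shape, sketched in Python: one queue each way between Main and Sub, a settable loop period, and a "late" flag when an iteration overruns. The names and numbers are illustrative, not taken from the actual VIs.)

import queue, threading, time

to_sub, to_main = queue.Queue(), queue.Queue()
PERIOD_S = 0.010                      # 10 ms, where the real project is still stable

def sub_vi():
    # Stand-in for the Sub VI: consume data from Main, send results back.
    while (item := to_sub.get()) is not None:
        to_main.put(item * 2)         # pretend processing

threading.Thread(target=sub_vi, daemon=True).start()

late = False
next_tick = time.monotonic()
for i in range(200):                  # the "Main VI" loop
    next_tick += PERIOD_S
    to_sub.put(i)                     # data to Sub via queue 1
    while not to_main.empty():
        to_main.get_nowait()          # drain results from Sub via queue 2
    remaining = next_tick - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)
    else:
        late = True                   # the equivalent of the "Late" LED
to_sub.put(None)
print("ran late at least once:", late)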
Any suggestions and ideas to further load the code are welcome!
07-08-2025 08:31 AM - edited 07-08-2025 08:43 AM
I understand this is an example, but a few things:
07-08-2025 11:31 AM
Great inputs... I will check them out one by one!
Thanks so much. 😍
07-09-2025 04:37 AM
Update on suggested hacks...
1. Indicators should be updated in a separate loop. I am trying to figure out how this is done. Right now we are trying to collect all the data arriving at a 5 ms rate, accumulate it, and display it after 20 iterations, then clear the buffer and start over. Let me see how it goes. (A rough sketch of this idea is at the end of this post.)
2. As for the hack below, I tried it out with the proper path for my installation and with Admin rights, and got the message " powercfg/powerthrottling.... no such command ". Maybe this does not work on the newer-generation CPUs??
Disable background throttling (need admin on command prompt):
powercfg /powerthrottling disable /path "C:\Program Files (x86)\National Instruments\LabVIEW 2017\LabVIEW.exe"
LabVIEW.exe will have a 1 ms system timer even when it is in the background
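(A rough text sketch of point 1 above, the accumulate-then-display idea, written in Python rather than LabVIEW; the queue stands in for whatever mechanism feeds the display loop, and the names and counts are illustrative only.)

import queue, threading, time

acq_to_ui = queue.Queue()

def acquisition_loop():
    # Data arrives roughly every 5 ms; accumulate 20 readings before shipping to the UI.
    batch = []
    for i in range(200):
        time.sleep(0.005)             # stand-in for a 5 ms DAQ read
        batch.append(i)               # stand-in reading
        if len(batch) == 20:
            acq_to_ui.put(batch)      # one UI update per ~100 ms of data
            batch = []                # clear the buffer and start over
    acq_to_ui.put(None)               # tell the display loop to stop

def display_loop():
    # Separate loop: only this one touches indicators / graphs.
    while (batch := acq_to_ui.get()) is not None:
        print(f"update graph with {len(batch)} points, latest = {batch[-1]}")

threading.Thread(target=acquisition_loop, daemon=True).start()
display_loop()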