LabVIEW


Why is the application execution cycle disturbed when a remote connection is opened?

Hi, 

 

LV 2009

 

I have a simple application with some static arithmetic inside a timed loop. The application cycle is 1000 ms, and I constantly measure how much time is spent executing one cycle. Usually this value is about 300-320 ms. The application runs on one workstation (2 CPUs). I noticed that when I open a Windows Remote Desktop connection to the workstation, the application cycle is disturbed; sometimes it takes even more than 1000 ms to execute one cycle. The Remote Desktop connection is quite slow because it goes through a cellular phone. If my Remote Desktop connection is slow, why is the cycle disturbed? Is the front panel waiting until all the data has been updated on my cellular phone's display? Has anyone else noticed this behaviour?
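
For reference, here is a rough text-based analogue of what the timed loop does (plain Python rather than LabVIEW, and the arithmetic and timing values are only placeholders, not my real code):

import time

PERIOD_S = 1.0                      # target cycle time, 1000 ms like the timed loop

def static_arithmetic():
    # placeholder for the "static arithmetic" done each cycle
    total = 0
    for i in range(200_000):
        total += i * i
    return total

next_deadline = time.monotonic() + PERIOD_S
while True:
    start = time.monotonic()
    static_arithmetic()
    work_ms = (time.monotonic() - start) * 1000.0
    print("cycle work took %.1f ms" % work_ms)   # this is the value that jumps when Remote Desktop connects
    remaining = next_deadline - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)                    # wait for the next 1000 ms boundary
    next_deadline += PERIOD_S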

 

BR, Jim 

0 Kudos
Message 1 of 6
(3,063 Views)

I have a similar situation. I have a PC running a LabVIEW executable carrying out DAQ tasks on a test facility, and Remote Desktop is used to view and control the exe during a trial. It appears that if the network connection is lost, the running of the executable on the remote PC is interrupted. I'm monitoring the DAQ device buffer during continuous operation, and it fills up when the connection is lost, indicating the executable is not looping at the correct rate. It even struggles when the remote desktop window is scrolled!! It makes no sense, as the remote PC should run quite happily with or without a Remote Desktop connection?!
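
The backlog check itself is simple. A rough sketch of the idea in Python is below (this assumes the nidaqmx Python package and a device named Dev1, which are just stand-ins; the real application is a LabVIEW exe, so this only illustrates the buffer-monitoring approach, not the actual code):

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # hypothetical device/channel
    task.timing.cfg_samp_clk_timing(rate=10_000,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    while True:
        task.read(number_of_samples_per_channel=1_000)     # consume 0.1 s of data per iteration
        backlog = task.in_stream.avail_samp_per_chan        # samples still waiting in the buffer
        print("backlog: %d samples" % backlog)               # grows steadily when the loop falls behind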

0 Kudos
Message 2 of 6
(2,780 Views)

What's the CPU Usage with and without the remote desktop connection?

 

Regards,

Marco

0 Kudos
Message 3 of 6
(2,776 Views)

I can't remember the exact values, but the CPU usage is low in both cases.

0 Kudos
Message 4 of 6
(2,770 Views)

It would be nice if you could share some code.

That might help us work out what is going on.

0 Kudos
Message 5 of 6
(2,764 Views)

I suspect part of the answer would be shown in the Windows Task Manager >>> Performance >>> Show Kernel Times.
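
If you would rather log it than watch Task Manager, a small sketch like the following does the same job (this assumes the psutil Python package is installed; the field it reports as "system" is the kernel time):

import psutil

# Print the user/kernel split once per second; watch whether the kernel
# ("system") share jumps when the Remote Desktop session connects.
while True:
    t = psutil.cpu_times_percent(interval=1.0)
    print("user %5.1f%%   kernel %5.1f%%   idle %5.1f%%" % (t.user, t.system, t.idle))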

 

The OS (operating system) exposes an environment in which our code runs. That environment OPERATES the hardware as required to provide that environment. One of the operations it performs for us is controlling memory allocation and mapping in order to provide the virtual memory our code runs in. Virtual memory relies on hardware that translates memory address fetches into physical memory fetches. VM allows you to run multiple applications on the same machine at the same time, where each of the processes has access to up to 4 GB of memory even though the machine has far less than the total of what all of the processes THINK they have access to.

 

When the OS discovers a need to set up a new environment for yet another process, or to expand the mapped memory space of an existing thread, the OS drops into kernel mode where it can access the memory-mapping hardware and twiddle the bits to set up the new memory.

 

While in kernel mode, normal processing is stopped until the mapping is complete.
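
You can see that cost directly. A small sketch (plain Python, just an illustration of the idea, nothing LabVIEW-specific): the first pass over a fresh anonymous mapping forces the OS into kernel mode to map each page, and it is much slower than the second pass over the same, already-mapped memory.

import mmap
import time

SIZE = 256 * 1024 * 1024            # 256 MiB anonymous mapping
PAGE = mmap.PAGESIZE

buf = mmap.mmap(-1, SIZE)           # address space reserved, physical pages not yet mapped

def touch_every_page():
    start = time.perf_counter()
    for offset in range(0, SIZE, PAGE):
        buf[offset] = 1             # first write to a page forces the OS into kernel mode to map it
    return (time.perf_counter() - start) * 1000.0

print("first pass  (pages faulted in):     %.1f ms" % touch_every_page())
print("second pass (pages already mapped): %.1f ms" % touch_every_page())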

 

So...

 

You have found out why running time-critical processes under Windows is a "hit and miss" game.
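
A quick way to demonstrate the "hit and miss" part (plain Python, just asking the OS for a 1 ms sleep over and over and recording how late it wakes us up; on a desktop Windows box the worst case is typically far worse than 1 ms):

import time

worst_ms = 0.0
for _ in range(5_000):
    t0 = time.perf_counter()
    time.sleep(0.001)                                    # request a 1 ms sleep
    late_ms = (time.perf_counter() - t0 - 0.001) * 1000.0
    worst_ms = max(worst_ms, late_ms)
print("worst wake-up latency: %.2f ms" % worst_ms)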

 

But then again, I am just guessing.

 

Ben 

Retired Senior Automation Systems Architect with Data Science Automation. LabVIEW Champion, Knight of NI, and Prepper.
0 Kudos
Message 6 of 6
(2,752 Views)