
Skipping of time in milliseconds in a while loop

I have a while loop with a 1 ms wait delay. I am reading the current time in the format (dd/mm/yy hh:mm:ss:microsecond), but I observed that the timestamp is not updated every 1 ms; it sometimes jumps by 2 ms or 3 ms.

How can I solve this issue?

 


Message 1 of 13

@msabah38 wrote:

I have a while loop with a 1 ms wait delay. I am reading the current time in the format (dd/mm/yy hh:mm:ss:microsecond), but I observed that the timestamp is not updated every 1 ms; it sometimes jumps by 2 ms or 3 ms.

How can I solve this issue?

 

 


You can't, because Windows is NOT a real-time operating system.

 

Not to mention Windows IS a multitasking operating system, so every application, including LabVIEW and LabVIEW-built executables, has to share the CPU through "time slicing" and other CPU scheduling methods.
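
LabVIEW diagrams can't be pasted as text, so here is a rough Python sketch of the same experiment; it shows the effect on any desktop OS, and the exact numbers will vary with machine and load:

```python
# Measure how long a "1 ms" sleep actually takes on a desktop OS.
# The sleep time is a minimum, not a guarantee; the scheduler is
# free to resume the process late.
import time

deltas = []
last = time.perf_counter()
for _ in range(1000):
    time.sleep(0.001)                     # request a 1 ms wait
    now = time.perf_counter()
    deltas.append((now - last) * 1000.0)  # actual period in ms
    last = now

print(f"min {min(deltas):.3f} ms  max {max(deltas):.3f} ms  "
      f"mean {sum(deltas) / len(deltas):.3f} ms")
# Typical desktop result: a mean above 1 ms and a max of several ms --
# the same 2-3 ms "jumps" the original post describes.
```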

========================
=== Engineer Ambiguously ===
========================
Message 2 of 13

As has been said, you are trying to make paper-thin prosciutto slices using a chainsaw. Not reasonable!

 

Windows cannot reliably do what you want, and that's not the fault of LabVIEW. You actually get a more reliable result using a timed loop (see image). Have you tried? (Still, there are no guarantees under Windows. LabVIEW RT or FPGA would be suitable.)

 

[Image: timed loop example]
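
For readers who can't open the image: the point of a timed loop is that each iteration is scheduled against an absolute deadline, so one late wake-up does not shift the whole schedule. A rough Python analogue of that scheduling idea (the real timed loop also offers priorities and processor assignment, which this sketch does not attempt):

```python
# Schedule iterations against absolute deadlines so timing errors
# do not accumulate, similar in spirit to a LabVIEW timed loop.
import time

PERIOD = 0.001                        # 1 ms period
deadline = time.perf_counter()
for i in range(1000):
    deadline += PERIOD                # next absolute deadline
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)         # still subject to OS jitter
    # ... do the per-iteration work here ...
```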

 

 

 

You are also auto-indexing at the output tunnel, which gets expensive because the array built at the output tunnel grows without limit. The memory manager occasionally needs to allocate larger and larger contiguous blocks (at high cost!) and eventually you'll run out of memory. Not reasonable!
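
To illustrate the cost (a Python/NumPy sketch, since NumPy arrays are contiguous like LabVIEW arrays; sizes are arbitrary): appending to a contiguous array forces repeated reallocation and copying, while preallocating once and replacing in place (the usual LabVIEW pattern: Initialize Array outside the loop, Replace Array Subset inside) does not.

```python
import numpy as np

N = 20_000

# Anti-pattern: np.append copies the whole array on every call,
# so total work grows quadratically -- like auto-indexing forever.
grown = np.empty(0)
for i in range(N):
    grown = np.append(grown, float(i))

# Preallocate once, replace in place: one allocation, O(1) per write.
buf = np.empty(N)
for i in range(N):
    buf[i] = float(i)
```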

Message 3 of 13

What is the right way to get timestamps separated by exactly 1 ms?

Message 4 of 13

LabVIEW Real-Time, plus the hardware/OS to run it on, is the only way to be sure.

 

If you're stuck with a desktop OS like Windows, the best you can hope for is to get close.  You can try setting the VI properties to turn off debugging and use a higher priority:

 

[Image: VI Properties dialog, Execution settings]

Even still, it's not a guarantee.

 

You can also try deliberately NOT using a "wait" function, and instead just call the "check time" function in an infinite while loop until 1 ms has passed (a sketch follows below). Calling a "wait" function tells LabVIEW (and the OS) that the program doesn't need to do anything for a bit, so it briefly surrenders its run state to let other processes use system resources, and there's no guarantee that it will wait ONLY the time you set; it will go over whenever the OS decides it needs to. If you never deliberately surrender your run state, you're more likely (but still not guaranteed!) to avoid skipping timing points.
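
A minimal Python sketch of that polling idea (in LabVIEW you would spin on a tick/time function in a tight loop); note that it pins one CPU core at 100%, which is the drawback raised in the next reply:

```python
# Busy-wait on a high-resolution clock instead of sleeping: never
# yield the CPU, just spin until the next 1 ms boundary passes.
import time

PERIOD = 0.001
deadline = time.perf_counter() + PERIOD
for _ in range(1000):
    while time.perf_counter() < deadline:
        pass                          # spin: no sleep, no yield
    deadline += PERIOD
    # ... take the timestamp / do the work here ...
```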

Message 5 of 13

@Kyle97330 wrote:

You can also try deliberately NOT using a "wait" function, and instead just call the "check time" function in an infinite while loop until 1 ms has passed. [...]


At the expense of starving every other application and task of CPU time. Not a good approach. If the OP truly needs 1 ms resolution then they need to go to LabVIEW RT.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 6 of 13

No special execution states, etc. Try the High Resolution Polling Wait. Results will be system-dependent and depend on whatever else is running.

 

[Image: High Resolution Polling Wait snippet]
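
Conceptually, a high-resolution wait splits the difference between a pure sleep (coarse but cheap) and a pure spin (precise but CPU-hungry). The sketch below shows that general technique in Python; it is an assumption for illustration, not a statement about how the NI VI is actually implemented:

```python
import time

def hires_wait(seconds, spin_margin=0.0005):
    """Sleep through most of the interval, then spin the remainder.
    Illustrative only; on Windows the sleep itself can overshoot
    unless the system timer resolution has been raised."""
    deadline = time.perf_counter() + seconds
    coarse = seconds - spin_margin
    if coarse > 0:
        time.sleep(coarse)            # cheap, imprecise part
    while time.perf_counter() < deadline:
        pass                          # precise, CPU-burning part
```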

 

mcduff

Message 7 of 13

@msabah38 wrote:

What is the right way to get timestamps separated by exactly 1 ms?


 

As I said, use a timed loop. It will get you reasonably close, but there are no guarantees. (And don't mess with priority settings as others have suggested.) As I already said, you are also unreasonably hammering the memory manager, so timing will degrade as the array in the shift register grows over time. Never auto-index at the output tunnel of a while loop unless you know exactly what you are doing!

 

Please explain in detail what you are trying to do. Why is timing so important? What happens if timing is off (nothing? Nuclear core meltdown? Anything in between?)? How much code will be in the loop? Can it comfortably execute in much less than 1 ms?

 

Many times you can do, e.g., hardware-timed continuous acquisition, so the timing is done on the DAQ device and Windows timing does not matter.

Message 8 of 13

I am acquiring data at a rate of 1 kHz (1 ms) from a PXI-based DAQ card, which is an RT-based system.

 

Once the data is received at the host PC (Windows), I am saving it with a binary write operation along with a Windows timestamp.

 

For the timestamp I am using the Windows PC clock (host side); hence timestamps are being skipped, since PC timing is not real-time.

 

Is there any way for the timing to also come from the RT NI-DAQmx program, so that I can use that same time?

Message 9 of 13

"...make paper-thin prosciutto slices using a chainsaw" 😄

Made my day, thanks Altenbach!

 

But really, of course the timestamp should come from the RT system. If you apply timestamps afterwards, they will not match the actual acquisition time of the data and will thus be incorrect. Although I don't use RT, I can google: https://forums.ni.com/t5/LabVIEW/how-to-get-timestamp-of-DAQmx-Read-data/m-p/3212978/highlight/true#...
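
Sketching the linked thread's idea generically (the names and values below are illustrative): with hardware-timed acquisition the per-sample times follow from the acquisition start time t0 and the sample interval dt, so there is no need to stamp samples with the host clock as they arrive.

```python
from datetime import datetime, timedelta

# Illustrative values: t0 would come from the device/RT side (e.g. the
# waveform returned by DAQmx Read), dt from the configured sample rate.
t0 = datetime(2021, 6, 2, 12, 0, 0)   # acquisition start
dt = 0.001                            # 1 kHz sample clock -> 1 ms

samples = [0.12, 0.15, 0.11, 0.09]    # placeholder data block
timestamps = [t0 + timedelta(seconds=n * dt) for n in range(len(samples))]

for ts, value in zip(timestamps, samples):
    print(ts.isoformat(timespec="milliseconds"), value)
```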

Certified LabVIEW Architect
Message 10 of 13