06-02-2021 12:22 PM
I have a while loop with a 1 ms wait time delay. I am reading the current time in the format (dd/mm/yy hh:mm:ss:microsecond), but I observed that the microsecond field is not updated every 1 ms; instead it sometimes jumps by 2 ms or 3 ms.
How can I solve this issue?
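For reference, the same effect is easy to reproduce outside LabVIEW. Below is a rough Python stand-in for the loop described above (time.sleep in place of the 1 ms wait; the loop count is arbitrary):

import time

# Rough stand-in for a "wait 1 ms, read current time" loop.
# On a desktop OS the measured gaps will often be 2-3 ms or more.
prev = time.perf_counter()
for _ in range(20):
    time.sleep(0.001)              # request a 1 ms wait
    now = time.perf_counter()
    print(f"gap: {(now - prev) * 1000:.3f} ms")
    prev = now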
06-02-2021 12:42 PM
@msabah38 wrote:
I have a while loop with a 1 ms wait time delay. I am reading the current time in the format (dd/mm/yy hh:mm:ss:microsecond), but I observed that the microsecond field is not updated every 1 ms; instead it sometimes jumps by 2 ms or 3 ms.
How can I solve this issue?
You can't, because Windows is NOT a real-time operating system.
Not to mention that Windows IS a multitasking operating system, so every application, including LabVIEW and LabVIEW-built executables, has to share the CPU through "time slicing" and other CPU scheduling methods.
06-02-2021 12:58 PM
As has been said, you are trying to make paper-thin prosciutto slices using a chainsaw. Not reasonable!
Windows cannot reliably do what you want, and that's not the fault of LabVIEW. You will actually get a more reliable result using a timed loop (see image). Have you tried? (Still, there are no guarantees under Windows. LabVIEW RT or FPGA would be suitable.)
You are also autoindexing at the output tunnel, which gets expensive as the array built at that tunnel grows without limit. The memory manager occasionally needs to allocate larger and larger contiguous blocks (at high cost!), and eventually you'll run out of memory. Not reasonable!
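As a text analogue of the usual fix (initialize the array once up front, then overwrite elements in place inside the loop, like Initialize Array + Replace Array Subset), here is a minimal Python/numpy sketch; the buffer size is an assumed placeholder:

import numpy as np

N = 100_000                        # expected number of iterations (assumed)
buf = np.zeros(N)                  # preallocate once, like Initialize Array
for i in range(N):
    buf[i] = i * 0.001             # overwrite in place, like Replace Array Subset
# No per-iteration reallocation, so memory use and loop timing stay predictable.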
06-02-2021 01:09 PM
What is the right way to get timestamps separated by exactly 1 ms?
06-02-2021 01:19 PM
LabVIEW Real-time, plus the hardware/OS requirements, is the only way to be sure.
If you're stuck with a desktop OS like Windows, the best you can hope for is to get close. You can try setting the VI properties to turn off debugging and use a higher priority:
Even still, it's not a guarantee.
You can also try deliberately NOT using a "wait" function, and instead just call the "check time" function in an infinite While loop until 1 ms has passed. Calling a "wait" function tells LabVIEW (and the OS) that the program doesn't need to do anything for a bit, so it briefly gives up its time slice to let other processes use some system resources... and there's no guarantee that it will wait ONLY the time you set; it will run over whenever the scheduler decides. If you never deliberately give up your time slice, you're more likely (but still not guaranteed!) to avoid skipping timing points.
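A rough Python sketch of that polling idea (illustrative only, not the LabVIEW diagram itself):

import time

def busy_wait_until(deadline):
    # Poll the clock instead of sleeping; burns CPU but never hands
    # the time slice back to the scheduler voluntarily.
    while time.perf_counter() < deadline:
        pass

period = 0.001                      # 1 ms
next_tick = time.perf_counter() + period
for _ in range(10):
    busy_wait_until(next_tick)
    print(f"{time.perf_counter():.6f}")
    next_tick += period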
06-02-2021 02:19 PM
@Kyle97330 wrote:
LabVIEW Real-time, plus the hardware/OS requirements, is the only way to be sure.
If you're stuck with a desktop OS like Windows, the best you can hope for is to get close. You can try setting the VI properties to turn off debugging and use a higher priority:
Even still, it's not a guarantee.
You can also try deliberately NOT using a "wait" function, and instead just call the "check time" function in an infinite While loop until 1 ms has passed. Calling a "wait" function tells LabVIEW (and the OS) that the program doesn't need to do anything for a bit, so it briefly gives up its time slice to let other processes use some system resources... and there's no guarantee that it will wait ONLY the time you set; it will run over whenever the scheduler decides. If you never deliberately give up your time slice, you're more likely (but still not guaranteed!) to avoid skipping timing points.
At the expense of starving other applications and tasks of CPU time. Not a good approach. If the OP truly needs 1 ms resolution, then they need to go to LabVIEW RT.
06-02-2021 02:54 PM
No special execution states, etc., needed. Try the High Resolution Polling Wait. Results will be system dependent and will depend on whatever else is running.
mcduff
06-02-2021 03:15 PM
@msabah38 wrote:
What is the right way to get timestamps separated by exactly 1 ms?
As I said, use a timed loop. It will get you reasonably close, but there are no guarantees. (And don't mess with priority settings as others have suggested.) As I already said, you are also unreasonably hammering the memory manager, so timing will degrade as the array in the shift register grows over time. Never auto-index at the output tunnel of a while loop unless you know exactly what you are doing!
Please explain in detail what you are trying to do. Why is timing so important? What happens if timing is off (nothing? A nuclear core meltdown? Anything in between?) How much code will be in the loop? Can it comfortably execute in much less than 1 ms?
Often you can use, for example, hardware-timed continuous acquisition, so the timing is handled by the DAQ device and Windows timing does not matter.
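For illustration, here is a minimal hardware-timed continuous acquisition using the nidaqmx Python API as a text stand-in for the DAQmx VIs; the channel name "Dev1/ai0" and the read size are placeholders:

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")       # placeholder channel
    # The sample clock on the DAQ hardware paces the acquisition at exactly
    # 1 kHz; the PC only has to pick the samples up, so OS jitter doesn't matter.
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    data = task.read(number_of_samples_per_channel=1000)   # one second of data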
06-03-2021 12:55 AM
I am acquiring data from a PXI-based DAQ card, which is an RT-based system, at a rate of 1 kHz (1 ms).
Once the data is received at the host PC (Windows), I am saving it using a binary write operation along with a Windows timestamp.
For the timestamp I am using the Windows PC time (host side), and because PC timing is not real-time, timestamps get skipped.
Is there any way in which the timing can also come from the RT NI-DAQmx program, so that I can use that same time?
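One possible pattern, sketched in Python with a made-up field layout and names (not the OP's actual protocol): let the RT side stamp each block with its own clock and ship that stamp inside the binary record, so the host never consults its own clock:

import struct

RATE = 1000.0                                   # 1 kHz, as in the setup above

def pack_block(t0_rt, samples):
    # RT side: prepend its own timestamp (seconds) and the sample count,
    # then the samples, so the host just logs what it receives.
    return struct.pack(f"<dI{len(samples)}d", t0_rt, len(samples), *samples)

def unpack_block(blob):
    t0_rt, n = struct.unpack_from("<dI", blob)
    samples = struct.unpack_from(f"<{n}d", blob, struct.calcsize("<dI"))
    # Host side: reconstruct per-sample times from the RT stamp and the rate.
    times = [t0_rt + i / RATE for i in range(n)]
    return times, samples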
06-03-2021 03:44 AM
"...make paper-thin prosciutto slices using a chainsaw" 😄
-Made my day, thanks Altenbach!
But really, of course the timestamp should come from the RT system. If you apply timestamps afterwards, they will not match the actual time for the data and will thus be incorrect. Although I don't use RT, I can google: https://forums.ni.com/t5/LabVIEW/how-to-get-timestamp-of-DAQmx-Read-data/m-p/3212978/highlight/true#...
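For reference, the idea behind the linked thread, sketched in Python; the t0 and dt values here are placeholders for what the DAQmx Read waveform actually reports:

from datetime import datetime, timedelta

# t0 and dt would come from the DAQmx Read waveform (acquisition start time
# and sample interval reported by the driver); these values are made up.
t0 = datetime(2021, 6, 3, 10, 0, 0)
dt = 0.001                                   # 1 kHz sample clock -> 1 ms

samples = [0.12, 0.15, 0.11, 0.14]           # fake data for illustration
for i, v in enumerate(samples):
    ts = t0 + timedelta(seconds=i * dt)      # per-sample timestamp from t0 + i*dt
    print(ts.isoformat(timespec="milliseconds"), v)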