04-10-2025 04:01 PM
Hi,
I have a LabVIEW program for I–V measurement. At the moment the measurement waiting time can only be adjusted in milliseconds. How could I change it to microseconds?
Thanks
04-10-2025 04:12 PM - edited 04-10-2025 04:13 PM
There is a VI that does this, the "High Resolution Polling Wait VI":
Unless you have a very old LabVIEW version, it should be present in the Timing palette. Its input is in seconds as a floating-point number, so just enter 0.00001 for 10 microseconds (or whatever you need), and you're good.
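The VI itself is graphical, but conceptually it just polls a high-resolution clock until the requested floating-point number of seconds has elapsed. A minimal sketch of that idea in Python, purely to illustrate the concept (the function name and values are mine, not anything from the LabVIEW palette):

import time

def high_resolution_polling_wait(seconds: float) -> None:
    # Accept a floating-point duration instead of rounding to whole milliseconds,
    # then spin on a high-resolution clock until that much time has elapsed.
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        pass  # busy-wait; this burns a CPU core for the duration

# e.g. wait roughly 10 microseconds
high_resolution_polling_wait(0.00001)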
However, a warning: Windows can and will throw off any exact timing you try to do. You can always write code to wait "at least" a certain amount of time, but if your code requires timing accuracy on a very fine scale, it will fail eventually.
Windows, being a consumer-grade multitasking system, can interrupt LabVIEW at any time to do something it considers more important, no matter how high a priority you set your LabVIEW process to run at.
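If you want to see that jitter for yourself, a quick sketch like the one below (Python, just to illustrate the point; the iteration count and sleep time are arbitrary) asks the OS for a 1 ms sleep repeatedly and records the worst actual duration. On desktop Windows the worst case is typically well above the 1 ms you asked for:

import time

REQUESTED_SLEEP_S = 0.001   # ask the OS for a 1 ms wait
ITERATIONS = 1000

worst = 0.0
for _ in range(ITERATIONS):
    start = time.perf_counter()
    time.sleep(REQUESTED_SLEEP_S)
    elapsed = time.perf_counter() - start
    worst = max(worst, elapsed)

print(f"requested {REQUESTED_SLEEP_S * 1e3:.3f} ms, worst observed {worst * 1e3:.3f} ms")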
If you truly need consistent measurements at a fine resolution, then you need to do one of the following:
1. Do all your measurements in hardware, then download them at the end (see the sketch after this list). This depends on having hardware that can capture multiple signals on a timed trigger and store them internally, so it might not be a possibility.
2. Run your code on a "real-time" operating system. You will need special hardware such as a CompactRIO or a PXI chassis + controller to do this.
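Purely as an illustration of option 1 (not your actual program), here is a minimal sketch using NI's Python DAQmx API (nidaqmx); the device/channel name "Dev1/ai0", the sample rate, and the sample count are placeholders for whatever your hardware actually is. The point is that the board's onboard sample clock, not a software wait, spaces the points:

import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE_HZ = 100_000      # 10 us between samples, timed by the hardware clock
SAMPLES_TO_READ = 1000

with nidaqmx.Task() as task:
    # Placeholder channel; substitute your real device and channel.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # The DAQ's sample clock times every point; software only starts
    # the task and reads the finished buffer afterwards.
    task.timing.cfg_samp_clk_timing(
        rate=SAMPLE_RATE_HZ,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=SAMPLES_TO_READ,
    )
    data = task.read(number_of_samples_per_channel=SAMPLES_TO_READ)

print(f"acquired {len(data)} hardware-timed samples")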
04-10-2025 06:36 PM
As has been said, you won't get any "regularity" on plain Windows, even at 1 ms. Typically, faster acquisition rates must be hardware-timed. If you need to acquire point by point and make decisions based on each point, you need to use LabVIEW RT, or possibly even FPGA.
Can you take a step back and explain the experiment, and what led you to the conclusion that a shorter wait time would be a reasonable solution? While we cannot see your code, the presence of a stacked sequence structure with wires running in all directions already scares me. 😄
Tell us what you want to do, not how you want to do it. Also tell us your hardware.
04-10-2025 11:18 PM
I’ve attached the block diagram of my LabVIEW program. I’m performing I–V measurements on my devices using a DAQ system and a Keithley 427 current amplifier. The current amplifier allows me to set the rise time between 0.01 ms and 300 ms.
Currently, if I enter a decimal value less than 1 ms for the waiting time in the program, it rounds down to zero. However, I need to use rise times below 1 ms, so I want to ensure that my LabVIEW program correctly handles and sends these sub-millisecond values.
My question is: how can I modify the program to accept decimal (sub-millisecond) values for the waiting time?
Thanks!
04-11-2025 12:01 AM
All you did was attach a picture. We can't even see the I/O. Where do you get your data points, and how?
The wait input is a U32, so you cannot enter decimal milliseconds. The value will round to the nearest whole millisecond.
04-11-2025 03:39 AM
@Saxo987 wrote:
Hi,
I have a LabVIEW program for I–V measurement. At the moment the measurement waiting time can only be adjusted in milliseconds. How could I change it to microseconds?
Thanks
You don't. Why would you need to? If you're using the wait to set your sampling speed, you're doing it wrong! Set a sample rate and grab 100 or 1000 samples at once; the hardware will handle the timing with much better precision than software can.