05-27-2014 07:04 AM - edited 05-27-2014 07:05 AM
Hello,
I am making a VI in which I want to control an LED light (plugged into a Vernier SensorDAQ).
I want the LED to be ON for 10 or 100 microseconds and then to be off.
Attached is a picture of my actual VI.
Now I have a problem: I can't manage to get a delay shorter than a millisecond. I think LabVIEW cannot run my code faster than one millisecond (with the Wait (ms) function in the loop). I am using Windows.
Do you know a way I could wait for 10 or 100 microseconds? Or do you have another program structure in mind that would solve my problem?
Thank you !
Solved! Go to Solution.
05-27-2014 07:22 AM
Hi greg,
quite apart from whether you can get waits in the microsecond range: Windows is NO realtime OS. It will hurt your timing accuracy sooner or later!
- Did you notice the coercion dot? It has a purpose: to warn you about coercions occurring in your code…
- You could use the high-resolution clock to build your own wait function that can wait for e.g. 100 µs. But as said before: Windows will heavily hurt your timing accuracy.
Most appropriate solution:
Use DAQ hardware that supports hardware-timed DO channels. There you set a sample rate and the hardware does the job on its own, instead of relying on a heavily jitter-influenced software wait…
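The "build your own wait from the high-resolution clock" idea mentioned above is a busy-wait (spin) loop. A minimal sketch of the technique in Python (not LabVIEW; in LabVIEW you would poll a high-resolution timestamp VI inside a While loop, and the function name below is my own):

```python
import time

def busy_wait_us(microseconds: float) -> float:
    """Spin on the high-resolution clock until `microseconds` have passed.

    Returns the actually elapsed time in microseconds. On a non-realtime
    OS the scheduler can still preempt the loop, so the elapsed time can
    overshoot the request; it will never undershoot it.
    """
    start = time.perf_counter()
    target = start + microseconds / 1e6
    while time.perf_counter() < target:
        pass  # burn CPU until the target time is reached
    return (time.perf_counter() - start) * 1e6

elapsed = busy_wait_us(100)  # request a 100 µs wait
print(f"requested 100 us, got {elapsed:.1f} us")
```

Note the trade-off: this pins one CPU core at 100% for the duration of the wait, and the jitter warning above still applies in full.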
05-27-2014 07:28 AM
With Windows, you are not going to get any better wait resolution than 1ms. And even then, you can't count on that accuracy.
What DAQ card do you have?
If you want that kind of accuracy, you need to move to an FPGA card or a Real-Time target. Though, if you have one of the high-speed DIO cards, you can pass a waveform to the card and it will clock it out for you.
05-27-2014 07:34 AM
I'm curious, will an LED even illuminate if turned on for 10 microseconds?
I suggest you design a circuit that will give you the desired pulse when you enable an output on your Vernier SensorDAQ.
05-27-2014 08:27 AM
It is not uncommon for LEDs to have rise times in the single digit or fractional nanosecond range, so they will indeed turn on for 10 microseconds. The rise time can be limited more by the inductance and capacitance of the circuit surrounding them than the inherent capabilities of the device.
The least expensive way to do this is to use a DAQ board with digital waveform capabilities at a high enough frequency, or with a counter/timer output that can drive the LED directly (most can; 20 mA is all it takes to fully saturate most LEDs). Use the hardware capabilities of the device to determine the pulse width. Barring that, you would need an FPGA device to get that level of timing accuracy; an RT system is not good enough. The single-pulse counter/timer approach is the most commonly available.
Good luck and let us know if you need more help.
05-27-2014 08:34 AM
Thank you for all your answers !
First thank you for the tip about the coercion dot, I didn't know. Now I'll pay more attention.
About the accuracy: it isn't a real problem for my application. If I ask for 10 microseconds and obtain 30 or 35 instead, that's not a problem.
Do you know if I can find a sub VI using the high accuracy clock to make a timer ?
Using timed DO channels would be great! But I don't have much hardware, the budget for my project is limited, and I don't think my current hardware gives me this possibility. (The documentation is quite limited, and there is no mention of timed DO channels.)
So far I am using a Vernier SensorDAQ : http://www.vernier.com/products/interfaces/sdaq/
About the illumination if turned on for 10 microseconds, I don't know either ! Will see !
It is true that I could use another circuit, but I have limited time to perform this project. So if I could find a way to do it from LabVIEW… Once again, I don't need extreme accuracy.
Do you know how I could make a wait-for-microseconds-timer VI ?
Thank you !
05-27-2014 08:39 AM - edited 05-27-2014 08:39 AM
Hi greg,
that device doesn't (seem to) support timed DIO channels. It looks more or less like the cheaper USB devices you can get directly from NI (USB-600x). To make it even worse: if your hardware is similar to those USB-600x devices, you are limited to a max sample rate of ~120 Hz for your DO channels!
That leaves you with ~8 ms pulses at best…
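As a quick sanity check on that figure (my arithmetic, not from the thread): at a 120 Hz update rate, the shortest pulse is one sample period.

```python
# Minimum pulse width achievable at a 120 Hz software-timed DO update rate:
# one sample period = 1 / rate.
sample_rate_hz = 120
min_pulse_s = 1 / sample_rate_hz
print(f"{min_pulse_s * 1000:.1f} ms")  # prints "8.3 ms", far above the 10-100 us goal
```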
Please re-confirm if all this really applies to your hardware, I only have experience with the USB6008/9 devices.
05-27-2014 08:40 AM - edited 05-27-2014 08:44 AM
I just figured out there is a pulse output on my device, I'll try to use it instead ! Thank you !
Edit: There is an output, but it doesn't allow microsecond operation…
I'll try to see if I can get another device, but if not, I am still interested if someone has an idea about how to make a microsecond timer in LabVIEW.
Thank you !
05-27-2014 08:41 AM
I've never seen that DAQ before. Nice to know there is something like it out there.
Based on the specs, there is a counter on the board. So you want to set up a counter task that will output a pulse for your 10 µs.
05-27-2014 08:45 AM - edited 05-27-2014 08:47 AM
@greg765 wrote:
Thank you for all your answers !
First thank you for the tip about the coercion dot, I didn't know. Now I'll pay more attention.
About the accuracy: it isn't a real problem for my application. If I ask for 10 microseconds and obtain 30 or 35 instead, that's not a problem.
Do you know if I can find a sub VI using the high accuracy clock to make a timer ?
Do you know how I could make a wait-for-microseconds-timer VI ?
Thank you !
The subject does come up now and then, and this Community nugget offers some advice. But be careful with the technique: there is often (perhaps always) a more effective hardware solution.