
Timing sub-millisecond events with a USB-6009

I asked my students to create a LabVIEW VI that, among other things, controlled a servo. After assigning the task, I found a post on here somewhere saying it was not possible because there is no hardware timing available on the device we were using: a USB-6009. I thought perhaps we could work around this limitation using an external clock source and the counter, but that doesn't seem to work either.

 

The objective is to put out a pulse between 1 and 2 ms in duration every 20 ms. Our idea was to use the counter with a 100 kHz square-wave input to time the 1-2 ms interval, and to enclose that timing loop in an outer loop executing every 20 ms. On its first iteration, the inner timing loop read the counter value and passed it to the next iteration through a shift register. The VI then repeatedly read the counter and took the difference from that initial value; while the difference was less than the target count, it held a digital output pin (the servo input) high. When the difference exceeded the target, it turned the output pin off and exited the inner loop. Attached are two images of the VI, showing what happens on the first trip through the timing loop (store the original count) and on each iteration after (wait until the count passes the target).
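(For readers following along in text rather than on the block diagram, a rough NI-DAQmx C sketch of the same logic might look like the following. It is untested, "Dev1/ctr0" and "Dev1/port0/line0" are placeholder channel names, and error checking is omitted for brevity.)

/* Software-timed servo pulse, mirroring the block diagram: count
   edges of the external 100 kHz clock on ctr0 and hold a digital
   output line high until the count difference reaches the target.
   Note: the 6009's event counter counts falling edges.            */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle ctrTask = 0, doTask = 0;
    uInt32 start = 0, now = 0;
    uInt8 high = 1, low = 0;
    int32 written;
    uInt32 target = 150;   /* 150 edges of a 100 kHz clock = 1.5 ms */

    DAQmxCreateTask("", &ctrTask);
    DAQmxCreateCICountEdgesChan(ctrTask, "Dev1/ctr0", "",
                                DAQmx_Val_Falling, 0, DAQmx_Val_CountUp);
    DAQmxCreateTask("", &doTask);
    DAQmxCreateDOChan(doTask, "Dev1/port0/line0", "",
                      DAQmx_Val_ChanForAllLines);
    DAQmxStartTask(ctrTask);
    DAQmxStartTask(doTask);

    /* One pulse; the outer 20 ms loop is omitted for brevity. */
    DAQmxReadCounterScalarU32(ctrTask, 10.0, &start, NULL);
    DAQmxWriteDigitalLines(doTask, 1, 0, 10.0,
                           DAQmx_Val_GroupByChannel, &high, &written, NULL);
    do {
        DAQmxReadCounterScalarU32(ctrTask, 10.0, &now, NULL);
    } while (now - start < target);
    DAQmxWriteDigitalLines(doTask, 1, 0, 10.0,
                           DAQmx_Val_GroupByChannel, &low, &written, NULL);

    DAQmxClearTask(ctrTask);
    DAQmxClearTask(doTask);
    return 0;
}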

 

The concept works, but it's far too slow. The inner timing loop seems to execute only about every 2.5 ms. The output pulse width changes nicely, but only in increments of around 2.5-3 ms.

 

Is there some inherent timing limitation with loops in LabVIEW? Or did we, in our ignorance, choose some terribly slow element to put in the timing loop? We assumed that not putting a wait timer in the loop meant it would execute at something like 'normal' computer speed, which should be more than fast enough to read a 100 kHz clock. Can anyone shed some light on why this thing runs so slowly? Thanks; we appreciate your help.

Message 1 of 16

Ever hear the expression about trying to make a silk purse from a sow's ear? Your expectations of this very inexpensive hardware and a non-deterministic OS are way too high. The 6009 simply cannot operate fast enough with its software-only timing. It has nothing to do with anything inherent in LabVIEW, though you could possibly squeeze a bit more performance out of it if you did not use the DAQ Assistant.

Message 2 of 16

Thanks for the response. We own a lot of sow's ears, so we routinely try to make something else of them.

 

Could you expand on your comment about 'software timing' a little more? The 6009 appeared plenty fast enough to do what we needed. It counted perfectly at 100 kHz; the limiting factor seemed to be either the speed of information transfer between host and device, or the host VI loop executing only once every couple of milliseconds, or some combination of the two. What exactly is being software-timed in this case?

 

Thanks again, we really appreciate the information you provide.

Message 3 of 16

The 6009 will only do a digital read (or write) when the software commands it to do so. I am not familiar with the intimate details of the driver or hardware, but I believe it is a fundamental limitation of the transfer speed in this design, which in turn slows down any loop it is in.

Message 4 of 16

@Dennis Knutson wrote:

The 6009 will only do a digital read (or write) when the software commands it to do so. I am not familiar with the intimate details of the driver or hardware, but I believe it is a fundamental limitation of the transfer speed in this design, which in turn slows down any loop it is in.



I believe Dennis is correct. We have one in-house, and I read its specs a while ago. If memory serves, it's entirely software-timed, so the best performance you will get is whatever performance the software gets, plus any latency from the USB bus. The counter is a bit different: it may count at 100 kHz, but you can only read it at, say, 10 Hz in software. You would then see the counter value as 10,000, 20,000, 30,000, and so on with each software read over one second. Don't quote me on that last part, as I haven't actually tried it, but I think this is the case.
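(The experiment is easy to run. A quick NI-DAQmx C sketch along these lines, with "Dev1/ctr0" as a placeholder and the 100 kHz clock wired to the counter input, would show how many reads per second the software path actually sustains:)

/* Measure how often software can actually read the 6009 counter:
   read it in a tight loop for about one second and count the reads. */
#include <NIDAQmx.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    TaskHandle task = 0;
    uInt32 value = 0;
    long reads = 0;
    clock_t end;

    DAQmxCreateTask("", &task);
    DAQmxCreateCICountEdgesChan(task, "Dev1/ctr0", "",
                                DAQmx_Val_Falling, 0, DAQmx_Val_CountUp);
    DAQmxStartTask(task);

    end = clock() + CLOCKS_PER_SEC;            /* run for ~1 s */
    while (clock() < end) {
        DAQmxReadCounterScalarU32(task, 10.0, &value, NULL);
        reads++;
    }
    printf("%ld counter reads in 1 s (last value %u)\n", reads, value);

    DAQmxClearTask(task);
    return 0;
}

(If the software path tops out around 400 reads per second, that would match the ~2.5 ms loop period reported above.)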

Message 5 of 16

Thanks again for the feedback. Your analysis seems entirely correct: we could see the counter counting, but only so often.

 

So, if I understand this correctly, we're limited by how often the 'software' can read the counter. The software is LabVIEW, so the question becomes: why are LabVIEW loops so slow? Is it simply overhead from the UI?

 

If so, maybe we should start teaching LabWindows/CVI. The students would complain about having to learn C, but it seems one could avoid some of the speed limitations. I suppose we need to build the same thing in CVI and compare performance.

 

Anyway, we'll stop bothering you folks with questions. We appreciate all the prompt and erudite responses. Thanks.

Message 6 of 16

The software in this case is not LabVIEW at all, as I already mentioned. The software controlling how fast the device can operate is the DAQmx driver and the firmware on the device itself. The other pertinent software is Windows, which controls how often you can read from or write to the device; its non-deterministic nature means you see a lot of jitter.

Message 7 of 16

VitusTreatise wrote:

 

So, if I understand this correctly, we're limited by how often the 'software' can read the counter. The software is LabVIEW, so the question becomes: why are LabVIEW loops so slow? Is it simply overhead from the UI?

 


I suspect you will see very similar timing with CVI (or any software-timed program running on your OS, for that matter). Yes, you are using LabVIEW, but Windows controls the thread scheduling, as Dennis mentioned previously. You will be hard-pressed to find any software that can read at the rates you want; that is what hardware timing and data buffering are for. If you find an OS that can read at 100 kHz, please tell me! I'll use couple-hundred-dollar USB DAQs for everything!
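(You can see the OS side of this without any DAQ hardware at all. A minimal Win32 sketch that asks for a 1 ms sleep and measures what it actually gets:)

/* Show Windows scheduling granularity: request a 1 ms sleep and
   time it with the high-resolution performance counter. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);

    for (int i = 0; i < 10; i++) {
        QueryPerformanceCounter(&t0);
        Sleep(1);                              /* request 1 ms */
        QueryPerformanceCounter(&t1);
        printf("Sleep(1) took %.3f ms\n",
               (t1.QuadPart - t0.QuadPart) * 1000.0 / freq.QuadPart);
    }
    return 0;
}

(On a stock Windows machine the default timer resolution is around 15.6 ms, so Sleep(1) routinely returns much later than requested unless something in the process has called timeBeginPeriod(1).)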

Message 8 of 16

You should try dropping the DAQ Assistant and using the lower-level LabVIEW DAQmx functions. I believe Express VIs like the DAQ Assistant do a "Start" and "Stop" of the task on every call; this could be most of the slowness you're seeing. If you set up and start the DAQ task outside the loop, you might get much better performance (though, being a USB device, there is a limit to how fast it can go).
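(In terms of the lower-level calls, the pattern looks roughly like this NI-DAQmx C sketch; the channel name is a placeholder and error checking is again omitted:)

/* Create and start the task ONCE, outside the loop; only the write
   happens per iteration.  The DAQ Assistant, by contrast, effectively
   starts and stops a task on every call. */
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;
    uInt8 state = 0;
    int32 written;

    DAQmxCreateTask("", &task);                /* setup: once */
    DAQmxCreateDOChan(task, "Dev1/port0/line0", "",
                      DAQmx_Val_ChanForAllLines);
    DAQmxStartTask(task);

    for (int i = 0; i < 100; i++) {            /* your timing loop */
        state = !state;                        /* toggle the line  */
        DAQmxWriteDigitalLines(task, 1, 0, 10.0,
                               DAQmx_Val_GroupByChannel, &state,
                               &written, NULL);
    }

    DAQmxClearTask(task);                      /* cleanup: once */
    return 0;
}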

 

-- James

Message 9 of 16

Never been a fan of the DAQ Assistant; it carries way too much overhead for simple DAQ calls. It's great for getting started, but not for production software.

 

The 600x series is all software-timed; you'd need to step up to the 621x series to get hardware timing, unfortunately. I have some test equipment that just needs a 5 V pulse over a fairly accurate time (±500 µs); the 600x won't do that, so I've got a 6211 in there and I'm only using one analog output. 😕
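(For anyone who does step up, the original 1-2 ms pulse every 20 ms becomes a pulse train that a 621x counter can generate entirely on its own. A sketch, with "Dev1/ctr0" as a placeholder and the pulse appearing on the counter's default PFI output terminal:)

/* Hardware-timed servo pulse train on a 621x: 1.5 ms high every
   20 ms, generated on the device with no software loop at all. */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;

    DAQmxCreateTask("", &task);
    DAQmxCreateCOPulseChanTime(task, "Dev1/ctr0", "",
                               DAQmx_Val_Seconds, DAQmx_Val_Low,
                               0.0,             /* initial delay */
                               0.0185,          /* low time      */
                               0.0015);         /* high time     */
    DAQmxCfgImplicitTiming(task, DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(task);

    printf("Generating pulses; press Enter to stop.\n");
    getchar();

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}

(The 18.5 ms low time plus the 1.5 ms high time gives the 20 ms period; because the device's counter does the timing, Windows jitter no longer matters.)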


Message 10 of 16