Multifunction DAQ

counting time inside a loop - longer duration for some iterations

I am using a USB joystick to provide user feedback in a training program. The joystick position and target information are displayed on the front panel during the test. My desired output is the joystick position and the time at which it was read. Because this is a USB device and its signal never enters the DAQ card (a 6024E), I am using software timing (reading a counter) to timestamp each sample. The resulting data shows that most iterations take about 1 ms, but every third one or so takes much longer, 14 ms for instance.

Is there any way to improve this performance? Is it possible to sample the joystick through the DAQ card, or to time the loop with the DAQ card, so that I don't have to compete with system performance? We are hoping to achieve the equivalent of a 200 Hz sampling rate (a point every 5 ms), so the current behavior isn't acceptable. I have tried raising the VI's execution priority as well as closing other applications. I am running LabVIEW 7 on a laptop with Windows XP.
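Since LabVIEW code is graphical, here is a minimal sketch in Python of the software-timed loop described above. `read_joystick()` is a hypothetical stand-in for the USB joystick read, and the per-iteration deltas correspond to the tick-count differences reported in this thread.

```python
import time

def read_joystick():
    """Hypothetical placeholder for the USB joystick read."""
    return (0.0, 0.0)

samples = []
deltas = []
prev = time.perf_counter()
t_end = prev + 2.0  # poll for two seconds

while time.perf_counter() < t_end:
    pos = read_joystick()
    time.sleep(0.001)  # approximate the observed ~1 ms iteration time
    now = time.perf_counter()
    deltas.append((now - prev) * 1000.0)  # elapsed time per iteration, ms
    samples.append((now, pos))
    prev = now

# On a desktop OS most deltas cluster near the loop period, but some will
# spike to 10-15 ms whenever the scheduler preempts the loop -- the jitter
# described above.
print(f"min {min(deltas):.2f} ms, max {max(deltas):.2f} ms")
```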
Message 1 of 4
You are experiencing the wonders of a nondeterministic operating system: you cannot be certain how much time will pass between loop iterations. You might try setting LabVIEW's process priority higher (see "Programmatically Changing the Process Priority in LabVIEW"). However, the best solution is to use LabVIEW RT on an RT controller; that product was made to overcome exactly this problem of nondeterminism.
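As a rough illustration of the priority-raising idea (not the linked LabVIEW method itself), here is a sketch assuming the third-party psutil package; it bumps the current process's priority, which can reduce, but never eliminate, scheduling jitter on a desktop OS.

```python
import sys
import psutil

proc = psutil.Process()
if sys.platform == "win32":
    # Avoid REALTIME_PRIORITY_CLASS: it can starve the rest of the OS.
    proc.nice(psutil.HIGH_PRIORITY_CLASS)
else:
    # Unix niceness: lower is higher priority (may require privileges).
    proc.nice(-10)
print("priority raised for PID", proc.pid)
```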
Message 2 of 4
While processing the data collected by the program described in the original post, we noticed that the longer gaps between iterations seem to be very regular. I was wondering whether this can be attributed to the OS issue mentioned above (lack of deterministic timing) or to something else, such as LabVIEW allocating more memory as I index the collected points into a growing array. Attached is a sample of the timing data; the numbers are the elapsed times between loop iterations in ms. It was collected with the loop running as fast as possible and the tick count read every iteration. Any ideas would be appreciated.
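One quick way to test the "very regular" observation is to check whether the long iterations recur at a fixed interval. A minimal sketch, where `deltas_ms` stands in for the attached per-iteration data (the values below are illustrative only, not the real attachment):

```python
deltas_ms = [1, 1, 14, 1, 1, 13, 1, 1, 14]  # illustrative values only

threshold = 5.0  # ms; anything above this counts as a "long" iteration
spikes = [i for i, d in enumerate(deltas_ms) if d > threshold]
gaps = [b - a for a, b in zip(spikes, spikes[1:])]

# If the gaps are all (nearly) the same, the stalls are periodic, which
# points to a recurring cause (a scheduler tick, a repeated reallocation)
# rather than random preemption.
print("spike indices:", spikes)
print("gaps between spikes:", gaps)
```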
Message 3 of 4
Either memory allocation or a nondeterministic OS could cause the jitter you are seeing. You would most likely see this behavior any time you run on a nondeterministic OS, regardless of whether you perform memory allocations, but allocations exacerbate the effect. Memory allocations can also introduce jitter even on a real-time OS. So, to eliminate the problem completely, you will have to move to a real-time platform (such as LabVIEW RT) and also eliminate any memory allocations inside your loop.
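To make the allocation point concrete, here is a sketch in Python (with NumPy arrays standing in for LabVIEW arrays): preallocate the buffer once and write into it in place, instead of growing the array every iteration, which forces periodic reallocation and copying.

```python
import numpy as np

n = 10_000

# Jitter-prone: the array grows each iteration, so it must be
# reallocated and copied periodically.
grown = np.empty(0)
for i in range(n):
    grown = np.append(grown, float(i))  # allocates a new array each call

# Preferred: allocate the full buffer once, then index into it.
buf = np.empty(n)
for i in range(n):
    buf[i] = float(i)  # in-place write, no allocation
```

In LabVIEW terms this corresponds to using Initialize Array once before the loop and Replace Array Subset inside it, rather than Build Array or auto-indexing into a growing array.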

Best of luck,
Joe
Message 4 of 4