Acquisition timing of data acquired from parallel port.

Hello,
I am currently developing a LabVIEW application that reads the state of timing gates through the parallel port, and I was wondering if anyone could comment on the timing accuracy of data acquired this way. The gates will be used to measure sprint performance, which needs to be accurate at the millisecond level. Is it correct that operations performed by LabVIEW on machines running Windows may only occur every 15 ms, limiting LabVIEW's timing accuracy to about 15 ms? (I am developing the application using LabVIEW 6.1 on Windows 98.) If that is the case, is the only solution to purchase a DAQ board and use its onboard counters for the timing, or is there a way to work around this limitation in LabVIEW (e.g. using the priority functions)?
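For what it's worth, here is a minimal C sketch (untested, purely illustrative) of how one might observe that timer granularity directly, by watching how GetTickCount() advances on the machine in question:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD prev = GetTickCount();
    int observed = 0;

    /* Record the size of the first 20 increments of the default
     * Windows millisecond timer. */
    while (observed < 20) {
        DWORD now = GetTickCount();
        if (now != prev) {
            printf("tick advanced by %lu ms\n", (unsigned long)(now - prev));
            prev = now;
            observed++;
        }
    }
    return 0;
}

Whatever step size this prints is the granularity that any software-timed read on this machine inherits.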

Thank you,
Shane W.
Message 1 of 2
I do not have numbers to give you, but I can give you my two cents.

Using the parallel port for timing is not going to be desirable. The problem is that Windows is an interrupt-driven OS, so you could be in the middle of a timing measurement when the OS decides to check the hard drive. If your program is polling the port for a change, one iteration may therefore take longer than another. This applies to both reading and writing in software-timed applications. This jitter is OK if you do not need high precision.
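To make that concrete, below is a rough C sketch of the kind of polling loop involved. Direct port I/O works from user mode on Windows 98 (NT-based systems need a kernel driver), and 0x379 as the LPT1 status-register address is an assumption about the machine, not a given. Tracking the worst gap between consecutive reads shows exactly the jitter I mean, since any edge the loop detects is only located in time to within that gap:

#include <windows.h>
#include <conio.h>   /* _inp() in the Microsoft C run-time */
#include <stdio.h>

#define LPT_STATUS 0x379   /* assumed LPT1 status-register address */

int main(void)
{
    LARGE_INTEGER freq, prev, now;
    double gap, worst = 0.0;
    unsigned char last, bits;
    long i;

    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);
    last = (unsigned char)_inp(LPT_STATUS);

    /* Poll flat out; record the worst gap between consecutive reads. */
    for (i = 0; i < 1000000L; i++) {
        bits = (unsigned char)_inp(LPT_STATUS);
        QueryPerformanceCounter(&now);
        gap = (double)(now.QuadPart - prev.QuadPart) * 1000.0
              / (double)freq.QuadPart;             /* gap in ms */
        if (gap > worst)
            worst = gap;
        if (bits != last) {                        /* gate state changed */
            printf("state change: 0x%02X -> 0x%02X\n", last, bits);
            last = bits;
        }
        prev = now;
    }
    printf("worst inter-poll gap: %.3f ms\n", worst);
    return 0;
}

On an otherwise idle machine the typical gap will be tiny, but the worst-case gap is what matters for a one-shot event like a sprinter crossing a gate.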

Since every machine is a little different, you might use a benchtop function generator to test your resolution. The idea is to adjust the frequency of the generator and see what LabVIEW says the frequency is. This can give you an idea of the amount of error associated with your system.
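Here is a rough C sketch of that benchmark, assuming the generator's TTL output is wired to the nACK status line (bit 6 of 0x379); the address and bit choices are assumptions about the wiring, not requirements:

#include <windows.h>
#include <conio.h>   /* _inp() in the Microsoft C run-time */
#include <stdio.h>

#define LPT_STATUS 0x379   /* assumed LPT1 status-register address */
#define ACK_BIT    0x40    /* bit 6 = nACK input; assumed signal pin */

int main(void)
{
    LARGE_INTEGER freq, start, now;
    long edges = 0;
    int last, cur;

    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    last = _inp(LPT_STATUS) & ACK_BIT;

    /* Count rising edges on the chosen status bit for ~5 seconds. */
    do {
        cur = _inp(LPT_STATUS) & ACK_BIT;
        if (cur && !last)
            edges++;
        last = cur;
        QueryPerformanceCounter(&now);
    } while (now.QuadPart - start.QuadPart < 5 * freq.QuadPart);

    /* Compare this figure with the generator's dial setting. */
    printf("measured frequency: %.2f Hz over 5 s (%ld edges)\n",
           edges / 5.0, edges);
    return 0;
}

Sweep the generator upward until the measured value stops agreeing with the dial; the highest frequency that still reads correctly bounds the timing resolution you can expect from the port.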
Message 2 of 2