Hi,
I'm writing an application that samples two analog signals, does some calculations on them, and sets a bit on the DIO according to the status of the signals. For this I have set up an event with Config_DAQ_Event_Message (event 1) to call my callback routine every N scans (I have set N to 5 because I want low latency). In the callback I check the data and, if the conditions are met, set an output bit on the DIO directly. This works very well, but...
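In outline, the setup looks something like the sketch below. It is simplified: check_signals() just stands in for my calculations, the channel string and the commented-out SCAN_Start call are only illustrative, and the Config_DAQ_Event_Message argument list and the callback prototype are written from memory, so please check them against the NI-DAQ function reference rather than taking them literally.

#include <windows.h>
#include "nidaq.h"            /* Traditional NI-DAQ header */

#define DEVICE     1
#define N_SCANS    5          /* fire the event every 5 scans */
#define N_CHANS    2
#define BUF_SCANS  10000

static i16 buffer[BUF_SCANS * N_CHANS];   /* acquisition buffer passed to SCAN_Start */

/* Placeholder for the real signal test on the newest scans. */
static int check_signals(const i16 *latest, int nScans)
{
    (void)latest; (void)nScans;
    return 1;                 /* the real code evaluates both channels here */
}

/* Callback invoked by NI-DAQ every N_SCANS scans (event type 1).
   Prototype written in the Windows-message style used by the NI-DAQ
   event-message examples; check the exact declaration and calling convention. */
static void scan_callback(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (check_signals(buffer, N_SCANS)) {
        DIG_Out_Line(DEVICE, 0, 0, 1);    /* drive digital line 0 on port 0 high */
    }
}

static void setup(void)
{
    /* Request the callback (event type 1 = every N scans) on the scanned channels.
       Argument order should be verified against the function reference. */
    Config_DAQ_Event_Message(DEVICE, 1, "AI0:1", 1, N_SCANS, 0, 0, 0, 0,
                             0, 0, (u32)scan_callback);

    /* Start the 2-channel scan at 100 kHz per channel; timebase/interval
       values omitted here for brevity. */
    /* SCAN_Start(DEVICE, buffer, BUF_SCANS * N_CHANS, ...); */
}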
To test the latency of my application I have connected an oscilloscope to the AI and to the DIO. Then I measure how long it takes from when the pulse is fed into the AI until my application sets the output on the DIO.
In 90% of the cases the latency is 0.5 ms to 1.5 ms, but values up to 4 ms have been observed.
It doesn't matter whether the latency is 0.5 or 3 ms, but it has to be the same every time! This variation makes the system practically unusable for me.
I sample the two channels at 100 kHz each, which means the callback is called 20,000 times per second.
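Just to spell out the numbers, the callback period itself is tiny compared with the jitter I am measuring:

T_{\text{callback}} = \frac{N}{f_{\text{scan}}} = \frac{5\ \text{scans}}{100\,000\ \text{scans/s}} = 50\ \mu\text{s}

which is why I suspect the scheduling in the OS rather than the acquisition timing itself.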
I use NT 4.0, and I understand that NT is not well suited to real-time applications.
Is Windows 98 any better?
Can I improve NT with real-time extensions (such as iNTime), and would that help in this case?
Is it possible to replace the DAQ ISR, so that instead of using the callback routine I could put my code directly in the interrupt service routine?
Questions upon questions... any help is much appreciated!
(I know there are real-time systems I could buy, but I would rather not, as this is quite a simple task.)
/Thomas