01-24-2011 03:45 PM
Hello,
I'm trying to generate a pulse on a digital line using the USB-6009 board. The pulse must be accurate to the millisecond, but there seems to be a delay of nearly 5-6 milliseconds in simply requesting the digital line to change. There seems to be a similar delay in reading the digital line too.
I find the slow digital read especially strange, since I can read the available samples on an analog line with sub-millisecond latency.
What are my options?
01-25-2011 01:34 AM
Hello sampler,
My guess is that you don't start your task explicitly, and therefore you could see increased time between two consecutive calls of DAQmx Write. I would suggest creating the DAQmx task, starting it, and then writing new values to the output in a loop. Keeping the DAQmx task open might significantly lower the time to execute, as the board will stay in the committed state (as far as I remember).
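Roughly like this (a sketch only; the device name "Dev1" and the line string are placeholders, so adjust them to your wiring):

```csharp
// Sketch: assumes the NI-DAQmx .NET assembly (NationalInstruments.DAQmx)
// and a device named "Dev1".
using NationalInstruments.DAQmx;

Task doTask = new Task("MyDigitalOutputTask");
doTask.DOChannels.CreateChannel("Dev1/port0/line0:1", "",
    ChannelLineGrouping.OneChannelForAllLines);
DigitalSingleChannelWriter writer =
    new DigitalSingleChannelWriter(doTask.Stream);

// Commit and start once, up front. Otherwise each write implicitly
// transitions the task through its states, which can cost several
// milliseconds on a USB device.
doTask.Control(TaskAction.Commit);
doTask.Start();

for (int i = 0; i < 10; i++)
{
    // Only the write happens in the loop; the task stays committed.
    writer.WriteSingleSampleMultiLine(true,
        new bool[] { i % 2 == 0, i % 2 == 1 });
}

doTask.Dispose();
```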
regards,
Stefo
01-25-2011 10:36 AM
Below are my steps (in C#); correct me if they are wrong. I've also noticed that if I toggle the digital lines too frequently while sampling at the board's maximum rate (48 kHz), the analog sampling effectively freezes. At 45 kHz, it doesn't seem to matter how frequently I toggle the digital lines.
Initialization of Application.
I declare a task (named 'MyDigitalOutputTask') and store it as a member variable (it persists for the life of the application).
I call 'CreateChannel' on the digital lines with a string of concatenated lines and the setting 'OneChannelForAllLines'.
I declare a DigitalSingleChannelWriter (named 'MyDigitalOutputWriter') based on the stream of that task.
I call 'MyDigitalOutputTask.Control(TaskAction.Verify);'
I call 'MyDigitalOutputTask.Start();'
Toggling the Digital IO.
I call MyDigitalOutputWriter.WriteSingleSampleMultiLine(true, new bool[] { TheLine1Bool, TheLine2Bool })
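In code, the steps above look roughly like this (the device name "Dev1" and the line string are my assumptions, not from my actual setup):

```csharp
// Sketch of the steps described above, using the NI-DAQmx .NET API.
using NationalInstruments.DAQmx;

// Initialization (once, at application start-up); the task is a member
// variable that persists for the life of the application.
Task MyDigitalOutputTask = new Task("MyDigitalOutputTask");
MyDigitalOutputTask.DOChannels.CreateChannel(
    "Dev1/port0/line0, Dev1/port0/line1",   // concatenated lines
    "",
    ChannelLineGrouping.OneChannelForAllLines);
DigitalSingleChannelWriter MyDigitalOutputWriter =
    new DigitalSingleChannelWriter(MyDigitalOutputTask.Stream);
MyDigitalOutputTask.Control(TaskAction.Verify);
MyDigitalOutputTask.Start();

// Toggling the digital I/O (called repeatedly at runtime).
MyDigitalOutputWriter.WriteSingleSampleMultiLine(
    true, new bool[] { TheLine1Bool, TheLine2Bool });
```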
01-25-2011 09:24 PM
There is no sample clock associated with the digital I/O on this device. It is strictly software-timed and subject to a great deal of jitter because you are not using a deterministic OS. I don't believe a consistent 1 ms accuracy is a realistic expectation for this device.
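You can see the software-timing jitter for yourself by timing a nominally 1 ms software-paced loop on the host PC. This sketch needs no DAQ hardware at all; it just shows how far off the OS scheduler lets a "1 ms" tick drift:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class JitterDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        double periodMs = 1.0;
        double next = periodMs;
        double maxErrorMs = 0;

        for (int i = 0; i < 200; i++)
        {
            // Wait for the next 1 ms tick; the OS wakes us when it
            // pleases, not exactly on time.
            while (sw.Elapsed.TotalMilliseconds < next)
                Thread.Sleep(0);

            double error = sw.Elapsed.TotalMilliseconds - next;
            if (error > maxErrorMs) maxErrorMs = error;
            next += periodMs;
        }

        // On a general-purpose desktop OS the worst-case error is
        // often on the order of milliseconds.
        Console.WriteLine($"worst-case timing error: {maxErrorMs:F3} ms");
    }
}
```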