06-20-2013 06:21 PM - edited 06-20-2013 06:30 PM
The calls to key DAQmx functions are taking longer than our application can allow. Here is a diagnostic dump showing some typical times. The first number on each line is the time in ms from the beginning of the log. The second number (in parentheses) is the elapsed time since the previous entry, so it indicates how long the previous event took (in ms). Are these times typical, or is something wrong?
0.01 ( 0.00) - DateTime: 19.822 - System seconds/ms for reference
0.03 ( 0.01) - Create read task... - create task, configure channels, configure timing
3.47 ( 3.44) - Create write task... - create task, configure channels, configure timing, generate and write waveform (took 15.3mS)
18.76 ( 15.29) - Create digital task... - create and configure digital IO task. (took 3.1mS)
--- task creation times are not too bad.
21.85 ( 3.09) - Start operation
21.86 ( 0.01) - Powerup start
21.87 ( 0.00) - Digital write start -- Beginning of a simple call: digWriter.WriteSingleSampleMultiLine(true, digCtrlBits);
27.94 ( 6.08) - Digital write end.... - first problem. 6mS just to write out one byte of digital data. This consistently takes about the same time +/- 1mS
29.45 ( 1.51) - Powerup done
29.45 ( 0.00) - Digital write start -- Beginning of a simple call: digWriter.WriteSingleSampleMultiLine(true, digCtrlBits);
35.02 ( 5.56) - Digital write end
35.03 ( 0.01) - Tx
35.83 ( 0.80) - Start Write Task - writeTask.Start(); takes 38.5 mS
74.38 ( 38.55) - Start Read Task - readTask.Start(); took 22.5mS
96.92 ( 22.54) - Start Read Waveform
96.96 ( 0.04) - Now Running - write timing is slaved from the read clock so data writing begins (about) now.
103.62 ( 6.66) - Rx -- Callback for 'writeTask.Done'. Took 6.6mS but there was exactly 3mS of data in the write buffer to start.
103.63 ( 0.01) - Digital write start -- Beginning of a simple call: digWriter.WriteSingleSampleMultiLine(true, digCtrlBits);
109.31 ( 5.68) - Digital write end - another digital write: 5.7mS
132.44 ( 23.13) - Rx done
156.57 ( 24.13) - Finished
156.60 ( 0.03) - DateTime: 19.979
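The log format above (total ms since start, then the delta since the previous entry in parentheses) can be reproduced with a small helper. This sketch is not from the original post (the poster's application is C#/.NET); it is a minimal Python illustration of the same logging scheme:

```python
import time

class EventLog:
    """Produce log lines in the format used above:
    total ms since the log began, then (ms since the previous entry)."""

    def __init__(self):
        self._t0 = time.perf_counter()
        self._prev = self._t0
        self.lines = []

    def log(self, message):
        now = time.perf_counter()
        total_ms = (now - self._t0) * 1000.0   # time since log start
        delta_ms = (now - self._prev) * 1000.0 # duration of the previous event
        self._prev = now
        line = f"{total_ms:7.2f} ({delta_ms:6.2f}) - {message}"
        self.lines.append(line)
        return line

log = EventLog()
log.log("Create read task...")
time.sleep(0.003)                      # stand-in for a ~3 ms driver call
print(log.log("Create write task..."))
```

Because the delta is attached to the *next* line, each entry's parenthesized number measures how long the previously logged event took, exactly as in the dump above.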
In a typical scenario I need to enable a transmitter, wait 1.5 ms, transmit for less than 10 ms, switch to receive, wait 1 ms, and then receive data.
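Not part of the original post, but it is worth putting the scenario's timing budget next to the per-call overheads measured in the log above. The per-call numbers come from this post; the arithmetic is illustrative:

```python
# Timing budget from the scenario vs. overheads measured in the log above.
budget_ms = 1.5 + 10 + 1                  # enable wait + transmit + receive wait
digital_write_ms = 6.0                    # one WriteSingleSampleMultiLine call
write_task_start_ms = 38.5                # writeTask.Start()
read_task_start_ms = 22.5                 # readTask.Start()
overhead_ms = 2 * digital_write_ms + write_task_start_ms + read_task_start_ms
print(budget_ms, overhead_ms)             # 12.5 vs 73.0
```

Two digital writes plus one start of each analog task already cost roughly 73 ms, nearly six times the ~12.5 ms the whole transmit/receive sequence is allowed to take, which is why the per-call times matter so much here.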
06-21-2013 09:23 AM
I have been able to improve the performance to some extent by explicitly starting the digital I/O task, but the times still seem longer than they should be. Here is an operation log from the current version (see questions below).
I have changed the log output to include the elapsed time for each event, in parentheses after the event description.
0.0 ( 0.0) - DateTime: 00.955
0.1 ( 0.0) - Create read task (10.58)
10.7 ( 10.6) - Create write task (29.54)
40.2 ( 29.6) - Create digital task (2.30)
45.6 ( 5.4) - Start Digital Task (4.42)
50.1 ( 4.4) - Start operation
50.1 ( 0.0) - Digital write (2.75)
54.3 ( 4.3) - Powerup
54.3 ( 0.0) - Digital write (2.55)
57.7 ( 3.4) - Seq #1 begin
57.7 ( 0.0) - Start Write Task (38.02)
95.8 ( 38.1) - Start Read Task (26.59)
122.4 ( 26.6) - Start Read Waveform (0.35)
122.8 ( 0.5) - Now Running
154.1 ( 31.2) - Read done
154.2 ( 0.1) - Write stop (9.66)
163.9 ( 9.7) - Read stop (8.87)
172.8 ( 8.9) - Pause begin
172.8 ( 0.0) - Output data reload (9.87)
182.7 ( 9.9) - Pause end
182.7 ( 0.0) - Digital write (3.45)
186.9 ( 4.3) - Seq #2 begin
187.0 ( 0.0) - Start Write Task (7.67)
194.7 ( 7.7) - Start Read Task (22.61)
217.3 ( 22.6) - Start Read Waveform (0.04)
217.3 ( 0.1) - Now Running
222.4 ( 5.1) - Write done
222.5 ( 0.0) - Digital write (3.40)
233.1 ( 10.6) - Read done
233.2 ( 0.1) - Write stop (6.47)
239.7 ( 6.5) - Read stop (8.99)
248.7 ( 9.0) - Finished
248.7 ( 0.0) - DateTime: 01.204
The digital output write operations were taking ~5 ms before (as shown in the previous post). With the digital write task explicitly started at the beginning, the digital writes take less time but vary between ~1 and ~4 ms, which is still a long time. This seems high for a lightly loaded 1.8 GHz computer with 8 GB of RAM.
I am investigating clock options for the digital output to see if they have some effect but it seems that a single digital output operation should not need a clock.
The performance of the analog out / analog in is mainly limited by the task start and stop times. I am currently starting and stopping the tasks because the beginnings of the read and write operations need to be synchronized. This is done by using the read clock as the clock source for writing, loading the write buffer with the output waveform, and then starting the read task. Is it possible to halt the read clock without stopping the task?
06-21-2013 03:28 PM
Hi Lorne.V.,
You are correct that you can improve the write/read speed of a task by explicitly starting it rather than letting the write/read function start it implicitly. This is because the task must move all the way from the Unverified state to the Running state every time it is started implicitly. I'm including a link below to some information about the NI-DAQmx Task State Model.
http://zone.ni.com/reference/en-XX/help/370466V-01/mxcncpts/taskstatemodel/
You are also correct that you do not need a clock if you are performing a single-point write/read. This is called software timing, in contrast to hardware timing, where the DAQ device uses a clock to pace the read/write. If you can let your device handle the timing, you will get much more precise results.
If you need to use software timing on your read/write functions, you may want to consider a real-time operating system to get better determinism. A non-real-time operating system such as Windows gives much less control over when a function executes; Windows decides when these calls are allowed to run.
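The scheduling jitter described here is easy to demonstrate. The following sketch is not from the thread (and the poster's application is C#/.NET); it is a small Python illustration of how much a non-real-time OS can overshoot a requested 1 ms delay:

```python
import time

# Rough illustration of software-timing jitter on a non-real-time OS:
# request a 1 ms sleep repeatedly and measure how much the OS overshoots.
target_ms = 1.0
overshoots = []
for _ in range(50):
    t0 = time.perf_counter()
    time.sleep(target_ms / 1000.0)
    elapsed_ms = (time.perf_counter() - t0) * 1000.0
    overshoots.append(elapsed_ms - target_ms)

print(f"worst overshoot: {max(overshoots):.2f} ms")
```

On a loaded desktop OS the worst-case overshoot can reach several milliseconds, which is the same order as the digital-write times reported above; hardware timing moves that variability off the CPU entirely.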
Hopefully this information is helpful!
06-21-2013 04:39 PM
Thanks Josh,
I've been able to set up a digital waveform to clock out the control-line states that I need. That leaves one issue to resolve: finding a way to repeat my process without a long delay between repeats. Since the analog out and digital out have finite waveforms that are clocked out by the analog read process, it seems the only way to repeat the process is to stop and restart all three tasks. Is there another way?
Lorne
06-21-2013 04:56 PM
Hi Lorne,
One way to speed up the process of stopping and restarting a task is to put the task into the Committed state before starting it initially. When a task is stopped, it reverts to the state it was in before it was started. This means that if it was not explicitly put into the Committed state before being started, it will have to go through all of the states again the next time it is started. To put the task into the Committed state, use the DAQmx Control Task.vi in LabVIEW or the DAQmxTaskControl function in C.
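The effect of committing first can be illustrated with a toy model of the task state machine. This is not the real driver and is not from the thread; it is a Python sketch that only counts state transitions as a crude proxy for start/stop cost:

```python
# Toy model of the NI-DAQmx task state model (not the real driver):
# Unverified -> Verified -> Reserved -> Committed -> Running.
STATES = ["Unverified", "Verified", "Reserved", "Committed", "Running"]

class ToyTask:
    def __init__(self):
        self.state = "Unverified"
        self.transitions = 0              # crude proxy for start/stop cost

    def _go(self, target):
        self.transitions += abs(STATES.index(target) - STATES.index(self.state))
        self.state = target

    def commit(self):
        self._go("Committed")

    def start(self):
        self._pre_start = self.state      # stop() reverts to this state
        self._go("Running")

    def stop(self):
        self._go(self._pre_start)

plain, committed = ToyTask(), ToyTask()
committed.commit()                        # pay the transitions once, up front
for _ in range(3):                        # three start/stop cycles each
    plain.start(); plain.stop()
    committed.start(); committed.stop()
print(plain.transitions, committed.transitions)   # 24 vs 9
```

The uncommitted task falls all the way back to Unverified on every stop, so each restart repeats the whole climb; the committed task only ever moves between Committed and Running.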
06-21-2013 05:22 PM
I've done that, but it still takes about 40 ms to stop all three tasks and start them again.
I can improve that by about 10 ms: since the analog and digital out finish earlier than the analog in, I could stop them while the analog in finishes. Is there a way to start them again without having them trigger until the analog in is restarted?
The other option might be to let everything free run and auto-repeat. What is the practical limit on the size of the output waveforms for analog and digital?
Is it possible to switch to a new .TDMS file with the auto-repeat?
Thanks again.
06-24-2013 03:29 PM
Hi Lorne,
You can certainly have the analog and digital output tasks share the start trigger of the analog input task. You would use a trigger property to set the source of their start triggers to the analog input's start trigger.
There is not really a limit on the waveform size as long as you can continue to write data to the device at least as fast as you are outputting the waveform.
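The "write at least as fast as you output" condition is easy to quantify. The numbers below are hypothetical (not from the thread); the arithmetic just shows how buffer size and sample rate set the refill deadline:

```python
# Hypothetical numbers: a continuously regenerated output only works
# if you refill the buffer faster than the device drains it.
sample_rate = 100_000                     # output samples per second
buffer_samples = 50_000                   # samples currently buffered
drain_time_s = buffer_samples / sample_rate   # headroom before underflow
chunk_samples = 10_000                    # samples written per refill call
refill_deadline_s = chunk_samples / sample_rate  # max average gap between writes
print(drain_time_s, refill_deadline_s)    # 0.5 s of headroom, one write per 0.1 s
```

So a larger waveform does not hit a hard limit; it simply trades memory for how much scheduling slack the writing loop gets before the buffer underflows.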
If you want to create a new TDMS file for every iteration of a loop, there are TDMS file I/O functions that you can use to create a new file.