The waveform carries a dt (delta time) component.
Use the GET WAVEFORM COMPONENTS function to extract the dt. That should be the actual dt used, not necessarily the one implied by the rate you asked for.
Once you have the dt value, you create the time array with a FOR loop:
For i = 0 to NSamples - 1
    Time[i] = i * dt
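If you end up doing this in text code instead of G, the same loop looks like this (Python/NumPy sketch; dt and n_samples here are placeholders for whatever your read actually returned):

import numpy as np

# dt = actual sample interval from Get Waveform Components (placeholder value)
# n_samples = number of points in the Y array (placeholder value)
dt = 0.001
n_samples = 1000

# Same as the For loop above: Time[i] = i * dt
time = np.arange(n_samples) * dt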
As far as the inter-channel time goes, the minimum value is a property of the card, but it's also controlled by software: LabVIEW can extend it.
I don't know about the Express VIs, as I never use them, but the original AI CLOCK CONFIG has an input where you can specify it.
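I can't paste a G snippet here, but for comparison, here is a rough sketch of the same idea in the modern nidaqmx Python API rather than the Traditional-DAQ AI CLOCK CONFIG VI. The ai_conv_rate property name and the "Dev1" device name are assumptions on my part, so check them against your driver's documentation:

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:10")   # hypothetical device name
    # Per-channel sample clock: 1 kS/s, 1000 samples per channel.
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=1000)
    # Assumed property: the AI convert (inter-channel) clock rate in Hz.
    # 100 kHz would mean 10 usec between successive channels in a scan.
    task.timing.ai_conv_rate = 100_000.0
    data = task.read(number_of_samples_per_channel=1000)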
If that doesn't work, and you have access to some test equipment, you can find it out for yourself:
Set up a ramp generator for 0.0 - 1.0 V at 1000 Hz.
This means the voltage is changing by 1 volt per msec, or 1 mV per µsec.
Wire the ramp generator to both channel 0 and channel 10.
Perform a DAQ operation with all channels 0-10 active.
Pick out a portion of the waveform where the voltage is increasing on all channels (exclude the drop at the end of the cycle). Ignore channels 1-9.
Subtract the channel 0 array from the channel 10 array (channel 10 is sampled later, so on the rising ramp it reads higher).
Average the result array and divide the average by 10.
That result, in mV, is the inter-channel delay time in µsec.
By comparing channels 0 and 10, you're measuring 10 intervals, not one, so you're more accurate.
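If you prefer to do that arithmetic in text code, it boils down to something like this (Python/NumPy sketch; the ch0 and ch10 arrays stand for the rising-ramp portions you picked out, in volts, with made-up values here):

import numpy as np

# Rising-ramp portions of channel 0 and channel 10 (placeholder data, volts).
ch0 = np.array([0.100, 0.101, 0.102, 0.103])
ch10 = np.array([0.110, 0.111, 0.112, 0.113])

diff_mv = (ch10 - ch0) * 1000.0      # channel 10 minus channel 0, per scan, in mV
delay_us = diff_mv.mean() / 10.0     # 10 inter-channel intervals between ch 0 and ch 10
print(f"inter-channel delay: {delay_us:.3f} usec")   # 1 mV of ramp = 1 usec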