LabVIEW

Q: Multiple channel delay, DAQmx Read, multiple arrays on one chart

    I am acquiring 12 analog input channels with an NI DAQPad-6015 using DAQmx. Everything works normally, but I am interested in this issue:

    Of the 12 channels, three (selected in software) are displayed on a chart. I am using DAQmx Read (N samples, N channels, 2D DBL), wired directly to the chart.
    The chart's x-axis multiplier is set to 1/sampling rate. I want to know whether there is some DELAY between the signals displayed on the chart, introduced by DAQmx Read.

    How does the chart work? If I apply three identical signals to three channels on my DAQPad and acquire them like this, then when I read them through DAQmx and display them on the chart, should there be some delay of signal #2 relative to signal #1, and a larger delay of signal #3 relative to signal #1?

The sampling rate is up to 1000 S/s, and the "number of samples per channel" parameter of DAQmx Read is up to 200.

How can I overcome this kind of problem, if it turns out to be an issue when using my simple measurement VI?
Message 1 of 5
With a 6015 you've got a single ADC, and the inputs are multiplexed into it: ch0 gets sampled, then ch1, ch2, and so on. I believe the default behavior of DAQmx is to spread the conversions evenly across the sample period. With a 1 kHz sample rate and 12 channels, you should see roughly an 83 µs delay between adjacent channels. One way to reduce the delay between channels is to increase the sample rate.
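The arithmetic above can be sketched in a few lines. This is a plain Python illustration of the evenly-spread case, not DAQmx code; the variable names are my own:

```python
# Inter-channel skew on a multiplexed ADC, assuming DAQmx spreads the
# conversions evenly across one sample-clock period (the behavior
# described above; newer DAQmx versions do this differently).
sample_rate = 1000   # Hz, sample clock
n_channels = 12      # channels scanned per sample-clock tick

sample_period_us = 1e6 / sample_rate     # 1000 us between scans
skew_us = sample_period_us / n_channels  # how much ch[i+1] lags ch[i]

print(f"{skew_us:.1f} us between adjacent channels")  # 83.3 us
```

So on the chart, signal #2 really is acquired about 83 µs after signal #1, and signal #3 about 167 µs after it, even though all three share the same x-axis.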
Message 2 of 5

I just wanted to reply to Dennis's post. How the channel spacing works depends on which version of DAQmx you are using. If you are using 7.4 or greater (somewhere around there), the channel spacing is NOT spread evenly across the entire sample period; depending on your sample rate, it is more like spread evenly across half your sample period. There are other, more complex heuristics I can post here if people are really interested.

 

StuartG

Message 3 of 5

Stuart,

Thanks for the update. I haven't had to do a DAQ task since I upgraded to DAQmx 8. I'll keep the changes in mind. I'd be interested in knowing more.

Message 4 of 5

I thought you might want more info. The help for the AI Convert Rate property does somewhat describe the behavior. It states: "By default, NI-DAQmx selects the maximum convert rate supported by the device, plus 10 microseconds per channel settling time. Other task settings, such as high channel counts, can result in a faster default convert rate." So let's look at an example.

Let's assume your board's maximum rate is 1 MHz, so the fastest convert period is 1 µs, and that you have 2 channels. Given the definition above, a scan takes (1 + 10) µs × 2 = 22 µs, or a convert rate of roughly 45 kHz. So if you set a sample clock rate of 1 kHz, you are well within this range; the conversions use only a small fraction of the sample period, 22/1000 µs, or about 2%. As you increase the sample clock rate, however, the convert period takes a greater percentage of the time. Finally, if you set your sample clock rate to, say, 50 kHz and we kept the algorithm the same, you obviously wouldn't be able to convert all the samples for both channels in one period (50 kHz gives a 20 µs period, and the conversions take 22 µs). So in that case we remove the 10 µs settling time so that the conversions happen within the given sample clock period.
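Stuart's heuristic can be sketched as a small helper. This is my own illustrative Python, not NI code; the function name and the exact fallback behavior are assumptions based only on the description above:

```python
def default_convert_period_us(max_rate_hz, n_channels, sample_rate_hz,
                              settling_us=10.0):
    """Per-channel convert period per the heuristic described above:
    maximum device convert rate plus 10 us settling time, dropping the
    settling time when the full scan no longer fits in one sample-clock
    period. Hypothetical sketch, not the actual NI-DAQmx algorithm."""
    base_us = 1e6 / max_rate_hz        # fastest conversion the ADC allows
    period_us = base_us + settling_us  # default: add per-channel settling
    sample_period_us = 1e6 / sample_rate_hz
    if period_us * n_channels > sample_period_us:
        period_us = base_us            # settling dropped to fit the scan
    return period_us

# 1 MHz board, 2 channels:
print(default_convert_period_us(1e6, 2, 1_000))   # 11.0 us each, 22 us scan
print(default_convert_period_us(1e6, 2, 50_000))  # 1.0 us, settling dropped
```

At 1 kHz the scan fits with room to spare; at 50 kHz the 22 µs scan exceeds the 20 µs period, so the settling time is removed and each conversion takes only 1 µs.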

I hope this all makes sense.

StuartG

Message 5 of 5