Multifunction DAQ


retriggerable analog acquisition data transfer

I am trying to set up a retriggerable analog acquisition, as described at http://zone.ni.com/devzone/cda/tut/p/id/5382, using a DAQPad-6015.  I am attempting to acquire 500 samples at a 20 kHz sample rate per trigger, with triggers arriving at 10 Hz.  I have noticed that "AvailSampPerChan" does not update as I would expect; instead it jumps suddenly to 2000+ samples.  The buffer then contains data from four triggers of the 10 Hz input.  My guess is that the data is being held on board and transferred to the PC in large chunks.  I would like to know if it is possible to read the data after every 10 Hz trigger.
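For readers without the VI in front of them, here is a minimal sketch of the same architecture in text form (Python nidaqmx rather than LabVIEW), assuming a device named "Dev1" with the external 10 Hz trigger wired to /Dev1/PFI0; both names are placeholders, not from the original post.  A counter generates a retriggerable 500-sample finite pulse train at 20 kHz on each trigger, the AI task uses that pulse train as its sample clock, and the loop polls the equivalent of AvailSampPerChan.

```python
import time

import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

SAMPLES_PER_TRIGGER = 500   # samples per 10 Hz trigger
RATE = 20_000.0             # AI sample clock rate in Hz

with nidaqmx.Task() as co_task, nidaqmx.Task() as ai_task:
    # Counter output: finite 500-pulse train, re-armed by every rising edge on PFI0.
    co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=RATE, duty_cycle=0.5)
    co_task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.FINITE, samps_per_chan=SAMPLES_PER_TRIGGER)
    co_task.triggers.start_trigger.cfg_dig_edge_start_trig(
        "/Dev1/PFI0", trigger_edge=Edge.RISING)
    co_task.triggers.start_trigger.retriggerable = True

    # Analog input: continuous acquisition clocked by the counter's internal output.
    # Keep a handle to the channel in case its properties need tweaking later.
    chan = ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0",
                                                   min_val=-10.0, max_val=10.0)
    ai_task.timing.cfg_samp_clk_timing(
        RATE, source="/Dev1/Ctr0InternalOutput",
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10 * SAMPLES_PER_TRIGGER)

    ai_task.start()
    co_task.start()
    for _ in range(100):
        # Poll the buffered sample count, analogous to the AvailSampPerChan property.
        if ai_task.in_stream.avail_samp_per_chan >= SAMPLES_PER_TRIGGER:
            data = ai_task.read(number_of_samples_per_channel=SAMPLES_PER_TRIGGER)
            # ... process one trigger's worth of data here ...
        time.sleep(0.005)
```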

 

Attached is a screenshot of a slightly modified copy of Multi-Function-Ctr Retrigg Pulse Train Generation for AI Sample Clock.vi, with a waveform chart showing "AvailSampPerChan" at each loop iteration.  After the data appears (a jump to 2000+ samples), it is read out in 500-sample chunks, so it takes 4+ loop iterations to empty the buffer again.

Message 1 of 8
Can you post your code?
Randall Pursley
Message 2 of 8
Attached is the modified version of Multi-Function-Ctr Retrigg Pulse Train Generation for AI Sample Clock.vi.  As you can see, the only changes are a waveform chart to watch the number of available samples, a calculation of the rep rate (which doesn't really work so well), and limiting the number of samples read to the number of samples per trigger.
Message 3 of 8

I do not see the problem you are describing; your code works fine for me.  If I reduce the delay you put in the FALSE case, I can see the number of available samples grow just as you would expect until it reaches 500, then drop to 0.  I am not using the DAQPad, so maybe that is the issue, but to me it looks like the delay in your FALSE case is larger than it needs to be.  Reducing it to something around 450 ms gave me similar-looking waveforms in the Available Samples chart.

 

Give the attached a try.  It registers a DAQmx event with an event structure so that a read occurs every time a set number of samples has been acquired.  See if this works better for you.
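Since the attached VI can't be shown here, a rough text equivalent of that idea (Python nidaqmx, not Randall's actual VI) is to register a DAQmx "Every N Samples Acquired Into Buffer" event so the driver calls back once per trigger's worth of samples instead of the loop polling; `ai_task` and `SAMPLES_PER_TRIGGER` are from the earlier sketch.

```python
def on_every_n_samples(task_handle, event_type, num_samples, callback_data):
    # Read exactly the samples that triggered this event.
    data = ai_task.read(number_of_samples_per_channel=num_samples)
    # ... chart/process one trigger's worth of data here ...
    return 0  # DAQmx callbacks must return 0

# Register before starting the task; the callback then fires each time
# SAMPLES_PER_TRIGGER new samples land in the PC-side buffer.
ai_task.register_every_n_samples_acquired_into_buffer_event(
    SAMPLES_PER_TRIGGER, on_every_n_samples)
```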

Randall Pursley
Message 4 of 8
Unfortunately, no joy.  The event structure may be more efficient, but the behavior is the same.  The buffer stays empty for almost 1 s after starting, and then suddenly five triggers' worth of data (2500 samples) appears in it.  The VI then rapidly plots the buffered data in 500-sample blocks until the buffer is empty again, then waits for the next chunk.  I can only assume this is something inherent in using a USB DAQ.  I've worked before with an NI PCI DAQ (PCI-MIO-16E-4) and had no trouble with 10 Hz operation; in fact, there I simply issued start-acquire-stop commands in a loop at 10 Hz.  Doing the same with the DAQPad, I can only achieve 3.3 Hz.
Message 5 of 8

The documentation does mention that this device will not transfer data until its FIFO is at least half full (the FIFO size is 4,096 samples).  The link below is about the DAQPad-6020E, but it probably applies to the DAQPad-6015 as well.  That would explain what you are seeing: half of 4,096 is 2,048 samples, and at 500 samples per trigger that threshold is not crossed until the fifth trigger, at which point roughly 2,500 samples appear at once.

 

http://digital.ni.com/public.nsf/allkb/7C289D2F9484878986256D5100617EF8

 

 

See what happens if you acquire 5 analog input channels but only keep the first channel's data, or acquire at 5 times the desired rate and throw out 4 of every 5 points.  Either way the FIFO fills 5 times faster, so one of these tests will at least verify whether this is the problem.
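As a rough illustration of the second test (a hypothetical Python nidaqmx fragment, building on the earlier sketch with the counter reconfigured to run at 5 * RATE and produce 5 * SAMPLES_PER_TRIGGER pulses per trigger), the read then keeps only every fifth point to recover the intended 20 kHz data:

```python
# Oversample-and-decimate test: the FIFO fills five times faster, then 4 of
# every 5 samples are discarded to get back to the original 20 kHz rate.
raw = ai_task.read(number_of_samples_per_channel=5 * SAMPLES_PER_TRIGGER)
per_trigger_data = raw[::5]
```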

Message Edited by rpursley8 on 09-25-2008 10:59 AM
Randall Pursley
Message 6 of 8

It looks like there is a property you can set to change when the FIFO transfers data.  If you drag down the bottom of the DAQmx Channel Property Node in your block diagram, the next item is

 

Analog Input: General Properties: Advanced: Data Transfer and Memory: Data Transfer Request Condition (AI.DataXferReqCond) Property.

 

This property defaults to Onboard Memory More than Half Full, but changing it to Onboard Memory Not Empty might give you the desired behavior.
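For reference, the same property in text form (a hedged Python nidaqmx sketch; `chan` is the AIChannel object returned by add_ai_voltage_chan in the earlier sketch, and the exact enum member spelling may differ between driver versions):

```python
# Hypothetical equivalent of setting AI.DataXferReqCond so the device transfers
# data as soon as its onboard FIFO is not empty, rather than half full.
from nidaqmx.constants import InputDataTransferCondition

chan.ai_data_xfer_req_cond = InputDataTransferCondition.ON_BOARD_MEMORY_NOT_EMPTY
```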

 

 

 

Message Edited by rpursley8 on 09-25-2008 12:12 PM
Randall Pursley
Message 7 of 8

Hurray! I think it's working.  Changing the AI.DataXferReqCond property seemed to have no effect; the data still transferred in chunks of 4096 samples.  To get the correct behavior I had to set (number of channels) x (samples per channel) exactly equal to 4096 (or a multiple thereof).  Going under or over results in either some data being left behind until the next trigger event, or more than one trigger being needed before any data is transferred.  With the total number of samples per trigger equal to 4096, all data from one trigger event gets transferred before the next trigger, as needed.
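A small hypothetical helper (not from the original post) makes the arithmetic of that workaround explicit: pick samples per channel so that channels x samples per channel lands exactly on the 4096-sample FIFO, or a multiple of it, so each trigger's data is flushed before the next trigger arrives.

```python
FIFO_SIZE = 4096  # onboard FIFO depth in samples

def samps_per_chan_for_fifo(num_channels, fifo_multiples=1):
    """Return a samples-per-channel count that exactly fills the FIFO."""
    total = fifo_multiples * FIFO_SIZE
    if total % num_channels:
        raise ValueError("channel count must divide evenly into the FIFO size")
    return total // num_channels

print(samps_per_chan_for_fifo(1))  # 4096 samples/channel with 1 channel
print(samps_per_chan_for_fifo(4))  # 1024 samples/channel with 4 channels
```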

 

This is a little cumbersome, but as long as I leave notes in my data acquisition VIs, future users should be able to follow easily enough.

 

Thank you very much for your help. 

 

Sincerely,

Chris

Message 8 of 8