Multifunction DAQ


Using the Task.Stream.AvailableSamplesPerChannel property...

Hi

 

I have an application where I'm sampling 5 counters on two NI 6602 PCI cards (2 on card 1 and 3 on card 2). The counters take continuous pulse width measurements, and the pulses they measure arrive at 50 kHz. A callback handles the acquired data every time 100 samples are available on a single counter, in the following way:

 

void hundredSamplesReadyCallback(IAsyncResult ar)
{
    // ...
    double[] measurements = reader.EndReadMultiSampleDouble(ar);

    // very fast thresholding of data

    // re-arm the asynchronous read for the next 100 samples
    reader.BeginReadMultiSampleDouble(100, hundredSamplesReadyCallback, null);
}

 

It seems like I'm falling a bit behind (error 200279); that is, the buffer is filling up more quickly than I can empty it. I changed the size of the buffer to 50k and everything seemed fine. But then I put in some checks on the currently available samples in the buffer, like this:

 

void hundredSamplesReadyCallback(IAsyncResult ar)
{
    // ...
    // logging task.Stream.AvailableSamplesPerChannel
    double[] measurements = reader.EndReadMultiSampleDouble(ar);
    // logging task.Stream.AvailableSamplesPerChannel

    // very fast thresholding of data

    // re-arm the asynchronous read for the next 100 samples
    reader.BeginReadMultiSampleDouble(100, hundredSamplesReadyCallback, null);
}

 

The logging shows that the buffer has 40321 samples before EndReadMultiSampleDouble() and 40323 after. How can this be true? Did the DAQ put 102 samples into the buffer while I was reading out 100 samples, or does it take some time for AvailableSamplesPerChannel to become valid? Does anyone with experience using the AvailableSamplesPerChannel property know what causes it to change its value, and whether I can trust it or not?

 

/mola

Message 1 of 8

Hey mola,

 

Sampling pulse width measurements continuously on the 6602 means you are using implicit timing (that is, every pulse of your input signal generates a sample).  If the signal is 50 kHz, and you are reading 100 samples per callback, that means that the callback must execute 500 times per second (every 2 ms).  Since data is being transferred to the buffer asynchronously with your software calls, it seems quite possible that an extra 100 samples are transferred to the buffer between when you read the data and the next time you check the buffer.
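To see that the numbers in your post are consistent with this, here is the arithmetic as a quick sketch in Python, using the values you logged and assuming the 50 kHz rate:

```python
rate_hz = 50_000      # pulse rate, so one sample arrives every 20 us
before = 40_321       # AvailableSamplesPerChannel logged before the read
samples_read = 100    # samples removed by EndReadMultiSampleDouble
after = 40_323        # AvailableSamplesPerChannel logged after the read

# Samples that must have arrived while the read (and the second
# property query) were in progress:
arrived = after - (before - samples_read)
print(arrived)                      # 102

# At 50 kHz, 102 samples correspond to about 2 ms of wall-clock time,
# which is plausible for a driver call plus a property query.
elapsed_ms = arrived / rate_hz * 1000
print(round(elapsed_ms, 2))         # 2.04
```

So nothing is "wrong" with the property; roughly 2 ms elapsed between your two queries, and the hardware kept filling the buffer the whole time.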

 

The error you are seeing indicates that you cannot keep up with all of the data.  Increasing the buffer size is one thing you can do to avoid the error, but if your reads don't keep up in the long run, the buffer will eventually overflow no matter how big it is.  The best way to ensure that you can keep up is to read more data per callback, so you end up with fewer interrupts (and less overhead).  My general rule of thumb is to start out with about 1/10 of the sampling rate, which would give a callback every 100 ms on each channel.
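As rough arithmetic (a Python sketch, assuming the 50 kHz rate from your post):

```python
rate_hz = 50_000  # implicit-timed sample rate: one sample per input pulse

def callback_interval_ms(samples_per_read: int) -> float:
    """Time between callbacks for a given read size, in milliseconds."""
    return samples_per_read / rate_hz * 1000

# Reading 100 samples per callback -> a callback every 2 ms (500/s):
print(callback_interval_ms(100))            # 2.0

# Rule of thumb, 1/10 of the rate -> a callback every 100 ms (10/s):
print(callback_interval_ms(rate_hz // 10))  # 100.0
```

The larger read trades latency for throughput: each callback carries fixed overhead, so 10 callbacks per second leave far more headroom than 500.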

 

 

Best Regards,

John Passiak
Message 2 of 8

Hi John

 

Thanks for your answer. Unfortunately the application requires the data to be evaluated every 2ms to be able to respond quickly to pulse width changes, so I have to find another solution.

 

/mola

Message 3 of 8

By the way...

 

It takes 15 ms for me to access the Task.Stream.AvailableSamplesPerChannel property, and likewise 15 ms to access the Task.IsDone property. Do you have an explanation for this?

 

/mola

Message 4 of 8

Hi mola,

 

Could you describe the action that you need to perform every 2 ms?  If you have strict timing requirements such as this, it's best to avoid relying on non-deterministic software calls.

 

That being said, 15 ms to query those properties does seem a bit high.  Which version of Visual Studio and DAQmx are you using?  How are you benchmarking the 15 ms?

 

 

Best Regards,

John Passiak
Message 5 of 8

Hi John,

 

I would like to read out a number of samples every 2 ms and check whether any of the samples lie above a certain threshold.

In order to know how “old” a specific sample is, I need to know the number of new samples sampled by the NI card since the last read.

That’s why I tried to use the AvailableSamplesPerChannel property.
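The bookkeeping I have in mind looks roughly like this (an illustrative sketch in Python; the function name and the numbers are made up, and `available` stands for whatever Task.Stream.AvailableSamplesPerChannel returned just before the read):

```python
def sample_timestamps(t_read, available, n_read, rate_hz):
    """Estimate wall-clock times of the n_read samples just read.

    t_read:    time when the read returned (seconds)
    available: backlog reported just before the read
    n_read:    samples returned by the read (the oldest in the buffer)
    rate_hz:   sample rate (one sample per pulse under implicit timing)

    Assumes the newest buffered sample was acquired at about t_read and
    ignores samples arriving during the read itself; the i-th returned
    sample (oldest first) is then (available - 1 - i) periods old.
    """
    period = 1.0 / rate_hz
    return [t_read - (available - 1 - i) * period for i in range(n_read)]

# Example: a backlog of 150 samples at 50 kHz; the oldest of the 100
# samples read is about (150 - 1) * 20 us = 2.98 ms old.
ts = sample_timestamps(t_read=1.0, available=150, n_read=100, rate_hz=50_000)
print(round((1.0 - ts[0]) * 1000, 2))   # 2.98
```

Without the backlog figure, the best I can do is assume the last sample read is "now", which is off by however far behind I am running.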

 

Accessing the AvailableSamplesPerChannel property, I consistently measure ~15 ms using a Stopwatch with a high-resolution timer.

(I have checked that the timer can measure shorter events than this.)

 

The same goes for accessing Task.IsDone!

 

What happens when the AvailableSamplesPerChannel property is accessed? Is the NI-card accessed over the PCI-bus?

 

I'm using Visual Studio, C#, .NET 4, and DAQmx version 9.23.

 

In the following thread it is also reported that accessing AvailableSamplesPerChannel takes a long time:

http://forums.ni.com/t5/Multifunction-DAQ/stop-takes-a-long-time-using-multiple-tasks/td-p/1023546/p...

 

Cheers,

mola

Message 6 of 8

Hi mola,

 

After data is acquired, it is transferred to a pre-allocated memory buffer in RAM.  AvailableSamplesPerChannel queries this buffer in your PC memory and does not need to access the hardware over the PCI bus.

 

What action do you need to take if a sample is above a certain threshold?  Is there any reason you cannot read a larger number of samples per loop iteration and analyze the data in larger chunks?

 

 

Best Regards,

John Passiak
Message 7 of 8

Hi John

 

When the threshold is exceeded, the transition is timestamped (this is where I need AvailableSamplesPerChannel, to compute a more accurate estimate of when the transition actually happened), and in some cases a digital output is set to trigger some hardware components. None of this is computationally heavy, and it does not change the fact that accessing AvailableSamplesPerChannel takes ~15 ms.

I would also like to use AvailableSamplesPerChannel to check that I'm emptying the buffer fast enough. But since accessing AvailableSamplesPerChannel takes so long, I'm falling behind because of it. Any clue why this takes so long? If it simply queries the size of an in-memory buffer, it puzzles me why it is so slow.

 

I cannot analyze the data in larger chunks, since I need to be able to respond quickly (<2 ms) to a transition.

 

/mola

Message 8 of 8