LabVIEW

sample by sample signal analysis delayed at high sampling rates

Hi,

I have developed an algorithm that analyzes a signal sample by sample; the code only works this way. At high sampling rates the DAQ returns the signal as a group/array, whose size is defined by the "samples to read" setting of the DAQ task. To feed the signal to my code sample by sample, I use a FOR loop with an Index Array function that reads each sample one by one and passes it to the analysis. Since the data is collected into an array before my code analyzes it, the analysis is always out of date/delayed. The only way I see to reduce that delay is to reduce the samples to read. The problem arises when I reduce the samples to read setting: I get an error message saying “the sample you are attempting to read is no longer available; try reducing the sampling rate or increasing the samples to read”. Although the sampling rate is within the limits of the DAQ, to acquire the samples without this error I have to set the samples to read to 1000.
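To make the structure concrete, here is a rough sketch of what the loop does, written as plain Python pseudocode rather than my actual block diagram (read_chunk() and analyse_sample() are only placeholders for the DAQmx Read VI and my point-by-point algorithm):

```python
import time

SAMPLE_RATE = 200_000       # Hz (my DAQ rate)
SAMPLES_TO_READ = 1000      # chunk size I have to use to avoid the buffer error

def read_chunk(n):
    """Placeholder for the DAQmx Read VI: blocks until n samples have been acquired."""
    time.sleep(n / SAMPLE_RATE)   # 1000 samples at 200 kS/s = 5 ms per chunk
    return [0.0] * n              # dummy data standing in for the measured signal

def analyse_sample(sample):
    """Placeholder for my point-by-point algorithm."""
    pass

while True:
    chunk = read_chunk(SAMPLES_TO_READ)   # blocks until a whole chunk has been acquired
    for sample in chunk:                  # the FOR loop + Index Array part
        analyse_sample(sample)            # every result here lags the live signal by up to one chunk
```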

Is there any way to reduce the samples to read aiming at less data delay while keeping the sampling rate the same?

Is that a limitation of my computer or of the DAQ card?

I have also tried the acquire on demand setting but it didn’t work fast enough.

Many thanks in advance for your time,

Regards,

Christos

0 Kudos
Message 1 of 4
(2,735 Views)

How fast is your sampling rate?

You are running into a conflict between a high sampling rate and how fast your computer can iterate through the loops to do a sample-by-sample calculation.  In other words, your computer can't keep up.

If you offload all the data acquisition points to another loop (using queues, I would guess), you get all the data, but since the analysis loop runs slower, the queue builds up and the calculations lag.

If you offload the data in smaller chunks and wait to read more only once the calculations are complete, then the DAQ buffer fills up and overwrites itself.  Then you are losing data and getting the error message.
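Not LabVIEW, but a minimal text sketch of that first scenario (plain Python threads and a queue standing in for the producer/consumer loops; the chunk size and sleeps are only placeholders): if analysing one chunk takes longer than acquiring the next one, the queue depth keeps climbing and the results drift further behind the live signal.

```python
import queue
import threading
import time

SAMPLE_RATE = 200_000
CHUNK = 1000                                # one chunk every 5 ms at 200 kS/s

data_q = queue.Queue()                      # unbounded queue between the two loops

def acquisition_loop(stop):
    """Producer: pulls chunks off the (simulated) DAQ at the hardware rate."""
    while not stop.is_set():
        time.sleep(CHUNK / SAMPLE_RATE)     # stand-in for DAQmx Read
        data_q.put([0.0] * CHUNK)

def analysis_loop(stop):
    """Consumer: if one chunk takes longer than 5 ms to analyse, the backlog only grows."""
    while not stop.is_set():
        chunk = data_q.get()
        for sample in chunk:
            pass                            # per-sample analysis goes here
        print("chunks waiting:", data_q.qsize())   # a rising number means the results lag further behind

stop = threading.Event()
threading.Thread(target=acquisition_loop, args=(stop,), daemon=True).start()
threading.Thread(target=analysis_loop, args=(stop,), daemon=True).start()
time.sleep(2)                               # let it run for a couple of seconds
stop.set()
```

Nothing in that pattern can shrink the backlog unless the analysis gets faster or some of the data is discarded, which is the trade-off described above.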

I don't think there is anything you can do unless you can come up with a more efficient data analysis algorithm.  Or live with a loss of synchronization.  Or decide to toss out data and not process all of it.

Several contributors to the forum are experts at processing large datasets efficiently.  You can try posting your code along with an explanation of what you are trying to do.  You could even save the code with some indicators that contain sample data saved as default values.

0 Kudos
Message 2 of 4
(2,729 Views)

Many thanks for your reply.

My sampling rate is 200 kS/s and I need to do a sample-by-sample analysis. I made two experiments: one with my code analysing each sample, and another one using the same technique which simply sends the samples to a graph. The result showed me that there is no difference in response to a change in the buffer (samples to read). To me this indicates that it's not a matter of processing power, it's just a matter of memory handling.

It is a bit strange, since DMA is used and my RAM runs at 266 MHz, which is much higher than the required sampling rate.

Many thanks once again,

Christos

0 Kudos
Message 3 of 4
(2,723 Views)
