06-23-2021 04:04 AM
Dear all,
This is probably a very stupid question. It has been many, many years since the last time I used LabVIEW, so go easy on me. 😃
I'm using a USB-6341 DAQ. I would like to change the integration time of my measurements, i.e., by doubling the time I would get double the reading, and so on.
But simply changing the measurement time (below) does nothing.
I hope someone can help me with this. Cheers
07-07-2021 02:55 AM
Hello RadGent,
What do you mean by "change the integration time of the measurements"? Could you describe it in more detail?
Best Regards,
Edgar
07-07-2021 03:01 AM
Dear Edgar, thank you for the answer.
Currently, my system measures events (counts) within specific sampling times.
I would like to collect these counts with a variable integration time. For example, if I integrate counts for 1 ms I get a value X; if I double the integration time (via some sort of buffer, I believe) I should get (around) 2X, and with it a better signal-to-noise ratio.
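To illustrate what I mean, here is a quick back-of-the-envelope simulation (pure Python, with a made-up count rate): since counting noise is Poisson, doubling the integration time should improve the signal-to-noise ratio by roughly a factor of sqrt(2).

```python
# Back-of-the-envelope sketch (not DAQ code): Poisson counting statistics.
# The event rate below is invented purely for illustration.
import numpy as np

rng = np.random.default_rng(0)
rate_hz = 50_000  # assumed event rate

for t_int in (0.001, 0.002):  # 1 ms vs. 2 ms integration time
    counts = rng.poisson(rate_hz * t_int, size=100_000)
    snr = counts.mean() / counts.std()  # for Poisson data, SNR ~ sqrt(mean)
    print(f"{t_int * 1e3:.0f} ms: mean counts = {counts.mean():.1f}, SNR = {snr:.1f}")
```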
I hope that makes it clear enough?
cheers
07-07-2021 03:52 AM - edited 07-07-2021 03:58 AM
Sounds like you are talking about a buffered counter application. There is nothing magic about that, and if your hardware supports it you can do it in DAQmx very easily. But debugging a photo is not possible, so all I can do is recommend that you check out the DAQmx examples, specifically the counter examples such as "Counter - Count Edges (Continuous Clock).vi".
The Sample Clock determines the interval at which the counter value is latched into the buffer. The resulting values are cumulative, so if you only want the number of pulses per sample interval you have to differentiate afterwards. That is as simple as calculating the difference between consecutive samples.
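For reference, the same pattern expressed in text with the nidaqmx Python API (a minimal sketch; the device name "Dev1", counter ctr0, and the PFI0 sample-clock terminal are assumptions to adapt to your wiring):

```python
# Minimal sketch of buffered edge counting, analogous to the LabVIEW example
# "Counter - Count Edges (Continuous Clock).vi". Device, counter, and the
# terminal supplying the sample clock are assumptions; adjust to your setup.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE_HZ = 1000.0  # 1 kHz sample clock -> 1 ms integration per sample

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
    # A count-edges task has no internal sample clock; here we assume one is
    # supplied on PFI0 (e.g. a pulse train generated by a second counter).
    task.timing.cfg_samp_clk_timing(
        SAMPLE_RATE_HZ,
        source="/Dev1/PFI0",
        sample_mode=AcquisitionType.CONTINUOUS,
    )
    task.start()
    cumulative = np.array(task.read(number_of_samples_per_channel=1000))

# The counter value is cumulative, so differentiate to get counts per interval.
counts_per_interval = np.diff(cumulative)
```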
07-07-2021 08:07 AM
I will try my hand at this. Some data acquisition devices can adjust the time that a signal is allowed to fill up a bucket (a capacitor), and by dividing the quantity by the time you can get a more accurate result, as long as the bucket doesn't overflow. I like to think of it as how a camera takes an image: in low light, let the bucket fill up longer and get better results. That is how "integration" is being used here. Most NI analog DAQs don't have a bucket where you can change the fill time, but you can do it after the fact instead: either by grabbing multiple samples at once and averaging them, or by creating a buffer and averaging that.
In this case we have a digital signal and we are trying to measure frequency. There isn't a bucket to fill, per se; we can only count. So to get a better result we count over a longer period of time, then divide by the total time to get frequency. The measurement time in your image above is the integration time. Don't expect to "see" twice the counts if you double the time, however; that math is being done for you. You could also increase the number of read samples and then average them to further increase accuracy, but pay attention to how many data points per second you are receiving, if that matters to you. 1 sample with a 5 ms measurement time gives you 200 chunks of data per second; 200 samples with a 5 ms measurement time will still give you 200 samples per second, but only 1 chunk of data per second. To me it is easier to average those 200 samples in one chunk down to 1 value per second, as they are already in a buffer for you.
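As a rough sketch of that averaging step (plain NumPy on invented data, using the numbers from the example above):

```python
# Hypothetical post-processing sketch: average one chunk of 200 frequency
# readings (5 ms measurement time each) down to a single value per second.
import numpy as np

MEAS_TIME_S = 0.005        # 5 ms measurement time -> 200 readings per second
SAMPLES_PER_CHUNK = 200    # one chunk then covers a full second of data

# Fake frequency readings standing in for a DAQmx read of one chunk.
freq_samples = np.random.normal(10_000.0, 50.0, SAMPLES_PER_CHUNK)

print(f"chunks per second: {1.0 / (SAMPLES_PER_CHUNK * MEAS_TIME_S):.0f}")
print(f"averaged frequency: {freq_samples.mean():.1f} Hz")
```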
Hope this helps.
07-08-2021 01:26 AM
Hello,
Thank you both for your answers; they have already helped a lot.
cheers,