08-31-2022 08:54 AM - last edited on 08-31-2022 10:45 AM by NI_Community_Su
Dear Sir or Madam,
In our test facility we carried out a test in which pressure and strain (with strain gauges) were measured with two different measurement modules. Both signals arise simultaneously and should reach the measurement system at the same time. The following measurement configuration was used:
Chassis: cRIO-9040
Analogue input module: NI 9220 (16 ch) for pressure and other variables
Strain gauge (DMS) module: NI 9237
Temperature measurement module: NI 9213
Data acquisition is done in LabVIEW using NI-DAQmx (see screenshot).
After recording and evaluating the measurement data, a delay of approx. 8 µs of all strain gauge signals relative to the pressure signals was determined. The sampling rate was 5 kHz.
To examine the problem more closely, the measurement signals (DMS C and pressure C) were interrupted manually via an electrical contactor. Several measurements were carried out at different sampling rates. A manual evaluation of the measurement data in DIAdem (see Figure 1) proved a sampling-rate-dependent signal delay. Table 1 shows the results of this dependence: as the sampling rate increases, the time delay decreases.
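As an aside, this kind of delay determination can also be automated instead of read off manually. The sketch below (plain Python/NumPy on synthetic data, not DIAdem or LabVIEW; all signal names and values are made up for illustration) estimates the lag between two simultaneously sampled channels from the argmax of their cross-correlation:

```python
import numpy as np

def estimate_lag(reference, delayed, fs):
    """Estimate by how many seconds `delayed` lags `reference`.

    Both signals must be sampled at the same rate fs (Hz). The lag is
    taken as the argmax of the full cross-correlation, so it resolves
    to whole samples; means are removed first so DC offsets in real
    sensor data do not dominate the correlation.
    """
    ref = reference - np.mean(reference)
    dly = delayed - np.mean(delayed)
    corr = np.correlate(dly, ref, mode="full")
    lag_samples = np.argmax(corr) - (len(ref) - 1)
    return lag_samples / fs

# Synthetic check: a short pressure pulse seen by both channels,
# with the strain channel delayed by 43 samples at 5 kHz.
fs = 5000.0
n = 2000
pressure = np.zeros(n)
pressure[500:700] = 1.0   # pulse on the reference channel
strain = np.zeros(n)
strain[543:743] = 1.0     # same pulse, 43 samples later

print(estimate_lag(pressure, strain, fs))  # -> 0.0086 s, i.e. 8.6 ms
```

With real data the sub-sample part of the delay is lost this way; interpolating around the correlation peak would recover it if needed.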
The only delay we know of in our measurement chain is that of the pressure transducer used. According to its data sheet: settling time < 1 ms (built-in analogue preamplifier).
Could you please look at the problem from your point of view? It would help us perform our measurement series with maximum accuracy at all sampling rates.
Table 1. Delay of the strain gauge signal measured by the NI 9237 module

| Sampling rate | 5 Hz   | 12.5 kHz | 16.67 kHz | 25 kHz  | 50 kHz  |
| Signal delay  | 8.6 ms | 3.7 ms   | 3.1 ms    | 2.55 ms | 1.45 ms |
Figure 1. a) and b) Determination of the time delay from the signal interruption; c) time-delayed curves of the measured variables
Figure 2. Screenshot of the LabVIEW code
Thanks a lot in advance!
V. Slavov
08-31-2022 09:46 AM
The delay could typically be due to the filter delay of the 24-bit delta-sigma ADC in the NI 9237. In any case, all the delays are in µs, which should be negligible at the supported sampling rates.
08-31-2022 10:10 AM
Thank you for your answer!
I've just seen that I made a mistake:
the delay is in [ms], which is why it cannot be neglected in our case.
08-31-2022 10:11 AM
1. The 9237 uses a Delta-Sigma converter, which inherently includes some signal delay due to the filtering done in the signal path. The spec sheet will show this. The delay is typically dominated by a digital filter stage which contributes a delay based on the # of samples. Consequently, this part of the delay is longer at slower sample rates and shorter at higher ones.
This somewhat matches the trend you show. (Note though that when you *ask for* 5 Hz sampling, you'll instead get the lowest sample rate supported by the 9237, which is ~1.6 kHz.) But the delays you measured don't exactly match the specs, which brings us to another thing:
2. I think you might be overly trusting of the t0 value in your DAQ waveforms. It can't be trusted *that* much. The initial t0 gets assigned by doing a software query of the time-of-day clock when you do your first DAQmx Read. It is NOT hardware-level accurate.
3. If all the tasks are triggered by a common trigger signal, that would probably help the various tasks' t0 values match up better, but still not perfectly.
4. It may help for you to use channel expansion so you can put all the channels from all the modules into a single DAQmx task -- if your modules support it.
I don't have personal experience with it; most of what I know I learned from posts by mcduff. Hopefully he'll see this thread and chime in...
-Kevin P
08-31-2022 10:36 AM
To compare apples to apples, your experiment must be set up correctly: all the tasks must be hardware-synchronized, and the data must be adjusted for any per-device hardware delays.
Please post the code you used for this experiment.
08-31-2022 02:25 PM
cDAQ modules can typically be used in Channel Expansion, provided that you don't need the modules to run at different sample rates.
Your delay is due to having a DSA device. Delta-sigma ADCs use anti-aliasing filters to remove frequency components above the Nyquist frequency; these filters impart some phase distortion to the filtered signal, which typically shows up as a delay or phase shift. The exact amount of delay depends on which device is being used and on the sample rate.
There is a DAQmx property that reports the filter delay. You can then use this value to post-process the data and remove the delay.
Some PXI devices can remove this filter delay, but cDAQ modules do NOT support this property. So, even if you use Channel Expansion, you will have this problem.
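Once that delay value is known, the post-processing itself is simple. A minimal sketch (plain Python/NumPy rather than LabVIEW; the delay value shown is only an assumed example in the right ballpark for a 9237 at 5 kHz): discard the first delay-worth of samples from the delta-sigma channel so it lines up with a channel that has no such delay.

```python
import numpy as np

def remove_filter_delay(samples, fs, filter_delay_s):
    """Compensate a delta-sigma channel for its known filter delay.

    Drops the first round(filter_delay_s * fs) samples so the remaining
    data is time-aligned with an undelayed channel. Any sub-sample
    residue (< 1/fs) is not corrected here.
    """
    shift = int(round(filter_delay_s * fs))
    return samples[shift:]

# Assumed example values: 5 kHz rate, ~8.01 ms filter delay.
fs = 5000.0
delay = 0.00800958                  # seconds, as read from the device
data = np.arange(100, dtype=float)  # stand-in for acquired samples
aligned = remove_filter_delay(data, fs, delay)
print(len(data) - len(aligned))     # -> 40 samples discarded
```

Equivalently, one can leave the samples alone and subtract the delay from the channel's time axis; dropping samples is just the simplest option when both channels must share one time base.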
As @Kevin_Price mentioned, another method is to use a trigger to try to align the signals. As he stated, t0 is a software timestamp.
DAQmx cards are not synchronized to the computer's system clock and do not output timestamp data. The timestamp in the acquired waveform is based on the system clock; however, it is a software time value, because it reflects when the operating system retrieves the data from the DAQ card's buffer after the DAQmx Read VI is called and the samples are available.
For a test, I digitally triggered two PXI cards, each with its own task, and looked at the t0 timestamps. I got t0 time differences between 1-11ms. So using t0 as an absolute time is NOT recommended.
Below are some plots for a PXI system that has a DSA card and a SAR card, similar to what you have in a different form factor. A square wave burst was digitized by both cards, and a common TTL trigger started acquisition for both cards. So, the data should be completely aligned and coincident, however, if you just looked at the plots you could not tell. Here, two tasks are used as Channel Expansion is not supported with this combination of devices. Cards are synchronized by the reference clock in the PXI back-plane. (Relative time sets t0 to a value of 0, and there is no absolute timestamp.)
Plots of the 1 kHz burst versus relative time (left) and absolute time (right). The filter delay (625 µs) in the DSA DAQ is not removed. Thus, in the plot on the left, the burst recorded by the DSA device looks like it lags the other burst, even though they are time-coincident. The plot on the right shows the inaccurate timestamping of the data: 6366 samples at 1 MSa/s and 4499 samples at 102.4 kSa/s.
09-19-2022 03:42 AM
Thank you all very much for the fast and detailed answers!
Let me report our results: first we determined the time delay of the filter for different sampling rates in LabVIEW (thanks to my team leader for the support). The results are listed in the table below.
| Sampling rate [Hz] | Filter delay [s] |
| 1612.9             | 0.0248202        |
| 5000               | 0.00800958       |
| 12500              | 0.00320653       |
| 16666              | 0.00240598       |
| 25000              | 0.00160552       |
| 50000              | 0.000805047      |
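These numbers can be sanity-checked in a few lines (plain Python, values copied from the table above): expressing each delay in samples (delay × rate) shows the filter delay corresponds to a roughly fixed group delay of about 40 samples, which is why the delay in seconds shrinks as the sampling rate goes up.

```python
# Filter delays from the table above, as (sampling rate in Hz, delay in s).
measurements = [
    (1612.9,  0.0248202),
    (5000.0,  0.00800958),
    (12500.0, 0.00320653),
    (16666.0, 0.00240598),
    (25000.0, 0.00160552),
    (50000.0, 0.000805047),
]

for fs, delay in measurements:
    print(f"{fs:>8.1f} Hz -> {delay * fs:6.2f} samples of group delay")
# Every line comes out close to 40 samples (about 40.0 to 40.3).
```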
After correcting the measurement data with these values, we found that the two signals match each other almost exactly! There is a very small residual delay of about 0.2-0.4 ms, which could come from our pressure transducer. We can now correct our data by the known time delays, which is one way of proceeding. The second way would be an integrated online correction in our acquisition software, which of course needs a bit more work 😉
Thank you once again for your helpful answers!
Best Regards,
V.Slavov