

ALERT! Inconsistent analog triggering behavior from X-series devices!

I'm starting this new thread to give more visibility to an issue I believe I've confirmed after some initial skepticism.  The original thread is here.

 

Problem: Analog re-triggering does not behave consistently over the course of many repeated triggering events.  The effect is more noticeable for signals with a relatively low rate of change near the trigger condition (such as sine and triangle).   

 

Method: Code is attached below.  The general idea is to use an X-series device to generate a continuous sine wave (or other shape) with its AO, then capture it with AI on the same device using analog re-triggering and the internal signal path "Dev1/_ao0_vs_aognd".  And of course, me being me, I also put in a counter task on the same device to perform a selectable measurement of the internal signal "Dev1/AnalogComparisonEvent".

    Defaults are a 360 Hz, +/- 1 V sine with a 500 kHz AO update rate, a 500 kHz AI sample rate, and analog falling-edge triggering at 0.75 V with 0.06 V hysteresis.  (Many of these defaults follow the settings from the original thread.)  The default counter measurement is the interval (period) between Analog Comparison Events during all the retriggering.
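(For anyone without LabVIEW handy, here's roughly what the AI portion does, sketched in Python with the nidaqmx package.  The device name "Dev1", record length, and loop count are illustrative assumptions on my part, not taken from the attachment.)

    import nidaqmx
    from nidaqmx.constants import AcquisitionType, Slope

    SAMPLE_RATE = 500_000      # 500 kHz AI sample rate (thread default)
    SAMPS_PER_TRIG = 100       # finite record per re-trigger (illustrative)

    with nidaqmx.Task() as ai:
        # Read the AO signal back through the internal path -- no cabling needed.
        ai.ai_channels.add_ai_voltage_chan("Dev1/_ao0_vs_aognd",
                                           min_val=-1.0, max_val=1.0)
        ai.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                      sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=SAMPS_PER_TRIG)
        # Analog falling-edge trigger at 0.75 V with 0.06 V hysteresis.
        trig = ai.triggers.start_trigger
        trig.cfg_anlg_edge_start_trig("Dev1/_ao0_vs_aognd",
                                      trigger_slope=Slope.FALLING,
                                      trigger_level=0.75)
        trig.anlg_edge_hyst = 0.06
        trig.retriggerable = True  # re-arm after each finite record
        ai.start()
        # Collect the 1st post-trigger sample from each of 100 re-triggerings.
        firsts = [ai.read(number_of_samples_per_channel=SAMPS_PER_TRIG)[0]
                  for _ in range(100)]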

 

 

Example:  Using all default values, the trend of the 1st post-trigger AI sample from each re-triggering of a consistent sine wave looked like the screencap below:

[screencap: Kevin_Price_2-1755477111372.png]

The graph above was a fairly typical result for the 1st 100 re-triggerings.  When run with more re-triggerings (250), the trend seemed to flatten out and behavior became more consistent as seen in screencap below:

[screencap: Kevin_Price_4-1755477478370.png]

You can see that the trend appears to be getting asymptotic.  This odd behavior, whatever its cause, gets re-instituted with each new run.  But within each run it only exerts noticeable effects for a limited # of re-triggerings.

 

 

Example (part 2): This is the part that really sealed the deal for me.  Let's take a look at the counter's period measurements of the AnalogComparisonEvent during a default run in the screencap below:

[screencap: Kevin_Price_6-1755477998714.png]

Whuddya know?  Now *there's* a clear asymptotic trend!   Though the sine wave has a constant frequency, the interval between trigger detection points varies by 100 microsec.  And this same kind of thing keeps happening every time you restart the AI and CTR capture, even while the AO signal keeps running continuously.  The first bunch of periods measure "too low", gradually approaching a steady-state interval measurement.   For completeness, below is the result when capturing across more trigger events:

[screencap: Kevin_Price_7-1755478340537.png]

As expected, about the same # of "too low" period measurements as the interval timing settles into steady-state (and stays there).
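(Again for non-LabVIEW folks, here's roughly what that counter task is doing, sketched in Python/nidaqmx.  The counter name "Dev1/ctr0" and the min/max bounds are my illustrative choices, not from the attached VI.)

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    N_PERIODS = 250  # number of trigger events to time (matches the longer run)

    with nidaqmx.Task() as ctr:
        ch = ctr.ci_channels.add_ci_period_chan("Dev1/ctr0",
                                                min_val=1e-4, max_val=1e-2)
        # Time the interval between successive internal trigger-detection pulses.
        ch.ci_period_term = "/Dev1/AnalogComparisonEvent"
        ctr.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                       samps_per_chan=N_PERIODS)
        ctr.start()
        periods = ctr.read(number_of_samples_per_channel=N_PERIODS)
        # For a constant 360 Hz sine these should all be ~1/360 s; instead the
        # early ones measure "too low" before settling to steady state.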

 

 

I only kept screencaps from runs that used default values for the generation and measurement, changing only the # of triggerings to capture.  The utility VI "test analog trigger consistency.vi" allows lots of other tinkering and exploration.  (For example, notice the timing consistency of the 2-edge separation from the AI start trigger to the AI sample clock, sketched below.  The problem is somewhere *else*!)
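(That 2-edge separation check, in the same hedged Python/nidaqmx style -- counter choice and bounds are again assumptions:)

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as sep:
        ch = sep.ci_channels.add_ci_two_edge_sep_chan("Dev1/ctr1",
                                                      min_val=1e-7, max_val=1e-3)
        # Time from the AI start trigger edge to the next AI sample clock edge.
        ch.ci_two_edge_sep_first_term = "/Dev1/ai/StartTrigger"
        ch.ci_two_edge_sep_second_term = "/Dev1/ai/SampleClock"
        sep.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                       samps_per_chan=100)
        sep.start()
        seps = sep.read(number_of_samples_per_channel=100)
        # These separations come out consistent, which is what points the blame
        # somewhere other than the trigger-to-sample-clock path.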

 

Something janky is going on here and I hope someone from NI will take notice and offer up an explanation.

 

 

-Kevin P

 

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 1 of 5

<bump>    I don't normally bump my own threads, but I think this one deserves a little attention.

 

TLDR: it appears that X-series analog retriggering takes time to "settle in" to a consistent trigger point.  Given an invariant external signal (such as a sine wave), the time between retriggerings shows a concerning and non-constant trend.

 

 

-Kevin P

Message 2 of 5

I don't have an X-series DAQ available to test with, but do you see the same behavior when using an APFI input for triggering?

What happens if you wait one second after 'arming' before generating the signal?

It really looks like some sort of RC settling of the analog trigger comparison voltage...

 

In the good old (but expensive) days, providing schematics of your measuring devices was part of the manual, or at least of the service manual.  (TEK put small easter eggs / comics in the schematics.)  Now you can be happy to find even coarse input impedance information...

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 3 of 5

@Henrik_Volkers wrote:

I don't have an X-series DAQ available to test with, but do you see the same behavior when using an APFI input for triggering?

What happens if you wait one second after 'arming' before generating the signal?

Thanks for getting a little discussion going.  The system that was temporarily available had custom cabling, so I couldn't easily try an APFI input for triggering.  The original thread I linked to suggests that it doesn't make an (appreciable) difference.

 

I never tried arming first, waiting a second, and *then* generating a signal that would satisfy the trigger conditions.   I'll give that a try if / when I get a chance.   Based on the counter measurements of triggering intervals, I expect I'd be well past that "settling in" period and into consistent behavior by then.

 

Anyone out there want to give these things a try?   The code I posted would only need trivial changes...

 

 

-Kevin P

 

Message 4 of 5

Kevin,

 

I'm pretty sure this is what is happening:

  • Internally, DAQmx configures the analog trigger circuit with your desired trigger levels
  • DAQmx waits long enough for that trigger circuit to settle to the published trigger accuracy levels
    • However, the analog trigger accuracy is only +/-1%
    • This is much coarser than the AI accuracy of the product
    • So, the analog trigger circuit is still continuing to settle (with a nice exponential decay, exactly like you measured) for another ~120 ms or so until it settles to 16-bit accuracy
  • Meanwhile, you start your task, which arms the trigger circuit, and then you observe the impact of the fact that the trigger circuit is still not fully settled

Sorry about that.  The workaround for you would be to (see the sketch after this list):

  • Add a "DAQmx Control Task" with action="Commit" to apply the desired analog trigger settings first
  • Then wait an additional 150 ms or so (or however long you think is appropriate based on the detailed settling data you already gathered)
  • Then start your AI task
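For non-LabVIEW readers, the same three steps as a minimal Python/nidaqmx sketch; the "Commit" action corresponds to TaskMode.TASK_COMMIT, and the channel name and timing/trigger configuration here are placeholders:

    import time
    import nidaqmx
    from nidaqmx.constants import TaskMode

    ai = nidaqmx.Task()
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # ... configure sample clock and the analog edge trigger as before ...

    ai.control(TaskMode.TASK_COMMIT)  # programs the analog trigger circuit now
    time.sleep(0.15)                  # give the comparator ~150 ms to settle
    ai.start()                        # arm only after the trigger level settles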

I hope this helps!

Thanks,

Adam Dewhirst

Chief Engineer

National Instruments

 

P.S. Nice debugging on your part! I love the counter measurement data. Pretty clever!

Message 5 of 5