05-05-2020 11:39 AM
I am aware that it is possible to synchronize two counter tasks. However, what I want to do is a bit different. I have a 16 kHz TTL signal from an external device that I have been using as a clock source for an analog output waveform. I now want to synchronize a digitizer to that analog output task, but I do not want it to use this slow clock source (because the digitizer is acquiring at 10-20 MS/s).
So, my idea was to use a counter to read the TTL signal from the external device and generate a copy of it on another counter for pulse generation, but one referenced to the onboard clock of the DAQ board (M-series, USB-6212). Then I can use the counter output as a clock for my AO and the onboard clock as a reference clock for my digitizer. In this way, all my signals are synchronized to the same clock.
How do I generate a counter output (using the onboard clock) synchronized to my counter input so that they have the same TTL frequency and duty cycle? Note that I cannot simply use the onboard clock for my analog output waveform, because I want the output to change at the 16 kHz frequency.
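Roughly, this is what I have in mind, sketched in Python with the nidaqmx package (the device name "Dev1", the PFI0 terminal, and the fixed 50% duty cycle are just placeholders, and I have not verified every call against the 6212):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Counter input: measure the frequency of the external TTL (assumed wired to PFI0).
with nidaqmx.Task() as ci_task:
    ci_chan = ci_task.ci_channels.add_ci_freq_chan("Dev1/ctr0",
                                                   min_val=1_000.0,
                                                   max_val=100_000.0)
    ci_chan.ci_freq_term = "/Dev1/PFI0"
    measured_freq = ci_task.read()      # single frequency measurement, in Hz

# Counter output: regenerate a pulse train at the measured frequency, but timed
# from the board's onboard timebase. (Duty cycle is hard-coded here; it could be
# measured with a semi-period channel instead.)
co_task = nidaqmx.Task()
co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr1",
                                           freq=measured_freq,
                                           duty_cycle=0.5)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
co_task.start()
# "/Dev1/Ctr1InternalOutput" would then be the sample clock source for the AO task.
```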
05-05-2020 01:32 PM
I'm not sure I follow why you're approaching things the way you described.
1. It sounds like your digitizer can't accept the external 16 kHz signal directly as a reference clock to sync its multi-MHz digitizing clock. What are the constraints?
2. Some of NI's MIO boards have a 10 MHz Reference Clock that can potentially be exported, but it doesn't appear that your USB-6212 is one of them. I didn't see anything on the spec sheet, nor under "Device Routes" in MAX when I configured a simulated device.
3. It isn't clear to me that your scheme accomplishes sync properly anyway. The clock inaccuracies that concern you (rightly so) are probably too small for your frequency measurement to resolve, or for your counter pulse train output to compensate for.
What's the importance of the 16 kHz signal? What produces it? What is its meaning in the system, i.e., why is it important that you sync your AO to it?
-Kevin P
05-05-2020 04:55 PM
Thank you for the quick response.
The digitizer can only accept a reference clock in the 1-10 MHz range. The key point is that I want the fast digitizer (~20 MS/s) to trigger off of my 16 kHz TTL source (which originates from a resonant scanning mirror at the turnaround points of its motion). That part is easy enough, but each time a trigger pulse arrives, the timing of the digitizer's acquisition relative to it may drift, since the digitizer's clock is not synchronized to the trigger pulse train. I thought that perhaps I could use the DAQ card to synchronize the two together.
The TTL source acts as a sample clock for my AO task, outputting an incremented voltage each time it sees a rising edge. I thought perhaps I could generate a pulse train from my counter using the onboard clock (or a reference timebase, e.g., 20 MHz), triggered by the TTL source. However, it may very well be the case that the timing errors are too large anyway, meaning that the uncertainty of the rising edge of the TTL pulse train is larger than (or on par with) the timing uncertainty from not having a synchronized clock.
I wrote code along these lines (sketched below), but when I look at the counter output and the AO on my oscilloscope, the two do not appear synchronized.
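Roughly, in Python/nidaqmx terms (this is a sketch of the idea rather than the exact code; the device name, terminals, and the 128-step ramp values are placeholders):

```python
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Counter pulse train at ~16 kHz, timed by the onboard timebase, started by a
# digital edge from the external TTL (assumed to arrive on PFI0).
co_task = nidaqmx.Task()
co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=16_000.0, duty_cycle=0.5)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
co_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")

# AO ramp (128 steps) clocked by the counter output instead of the external TTL.
ao_task = nidaqmx.Task()
ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao_task.timing.cfg_samp_clk_timing(rate=16_000.0,
                                   source="/Dev1/Ctr0InternalOutput",
                                   sample_mode=AcquisitionType.CONTINUOUS)
ao_task.write(np.linspace(0.0, 1.0, 128), auto_start=False)

ao_task.start()   # AO waits for its first sample clock edge from the counter
co_task.start()   # counter is armed and waits for the external TTL trigger
```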
05-05-2020 06:33 PM
I have a number of questions for you. Let's start with the digitizer:
1. Can the digitizer export its own clock signal?
2. Can it handle two input signals?
3. Can it be retriggered repeatedly by the 16 kHz signal, and how much data can it capture per trigger?
Answers to those questions will be helpful for choosing a solution direction.
Also, you define a 128-step ramp of analog output values. Because you programmed the task for continuous sampling, it's going to regenerate indefinitely to create a sawtooth wave. Is that what you intend?
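If it isn't, the regeneration mode is the thing to look at. Assuming you're driving this through nidaqmx (device and terminal names here are placeholders), the relevant knob looks something like:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

ao_task = nidaqmx.Task()
ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao_task.timing.cfg_samp_clk_timing(rate=16_000.0,
                                   source="/Dev1/PFI0",   # external 16 kHz TTL as sample clock
                                   sample_mode=AcquisitionType.CONTINUOUS)

# Default is ALLOW_REGENERATION: the 128-sample ramp repeats forever (sawtooth).
# To output new values only when you explicitly write them, disallow regeneration:
ao_task.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION
```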
Now for a little thought experiment. Let's just suppose you want your digitizer to capture 1 sec of data at 20 MHz. What might be the effect of unsync'ed timing sources?
Well, as a point of reference, a lot of NI's common DAQ devices have timebase accuracy spec'ed at 50 ppm (parts per million). That's 1 part per 20 thousand. Let's just make a big ol' assumption that your 16 kHz signal and digitizer can be off from one another by that same amount.
During 1 second as the digitizer measures it, the 16 kHz clock could *think* there's been 1 second +/- 50 microsec. That's *almost*, but not quite, 1 interval of its 16 kHz signal.
Net summary: It would take a little more than 1 second worth of capture before the assumed cycle # of the 16 kHz clock (and thus the specific AO voltage level) would be off by 1.
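The same back-of-the-envelope numbers, in code form (the 50 ppm figure is still just an assumption, not a spec I looked up for your hardware):

```python
# Back-of-the-envelope drift estimate. 50 ppm is an assumed relative accuracy,
# not a looked-up spec for these particular devices.
ppm_error = 50e-6                 # assumed relative error between the two timebases
clock_hz = 16_000.0               # the external TTL / AO sample clock
period_s = 1.0 / clock_hz         # 62.5 microseconds per cycle

drift_per_second_s = ppm_error * 1.0                   # ~50 microseconds of apparent error per second
seconds_to_slip_one_cycle = period_s / drift_per_second_s
print(f"{drift_per_second_s * 1e6:.1f} us of drift per 1 s of capture")        # 50.0
print(f"~{seconds_to_slip_one_cycle:.2f} s before they disagree by a full cycle")  # ~1.25
```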
Hopefully that gives you some rough order of magnitude for bounding your timing error.
-Kevin P
05-06-2020 09:04 AM
Again, thank you for your thoughtful questions and comments. Here are my answers:
1. The digitizer cannot output its clock signal as far as I know
2. It can indeed handle two input signals
3. It also can be retriggered repeatedly by the 16 kHz signal and capture an unlimited amount of data (by streaming to disk)
Let me explain my concern about the timing. The controller of the scanning mirror produces the 16 kHz signal at the zero-velocity points of the motion. However, the delay between that signal and when the acquisition actually starts is not fixed, because the digitizer clock is not in sync with this 16 kHz TTL. This means that the acquisition may start at slightly different times relative to the mirror position. That may not sound like much (~50 ns of error), but it corresponds to ~1 pixel for imaging applications. It is certainly correctable in software by aligning each row of the image, but there may be other applications down the road where such misalignment is difficult to correct.
Besides triggering, the only thing the 16 kHz TTL signal is used for is to generate the analog voltage. The analog output steps another mirror in the perpendicular direction each time it receives the TTL signal; the easiest way to do this is simply to use the 16 kHz TTL as a sample clock. In other words, the 16 kHz source is only being used as a clock because dynamically changing the AO using any other source is difficult.
The ideal situation would be to have some external clock that both the DAQ and the digitizer share, and to use the 16 kHz TTL simply as a trigger for the AO task. Since the latter is proving difficult, my idea was to use an external clock (or one originating from the DAQ device itself) to generate a 16 kHz signal directly on the DAQ device, but synced with the external 16 kHz TTL. This gets me close to the "ideal situation." But it may very well be that the inherent timing errors in the process of "cloning" the 16 kHz TTL signal are too large.
05-06-2020 11:17 AM
Here are some approaches I can think of, pretty much in order of preference.
1. Configure the digitizer to be re-triggered by the 16 kHz signal. Each time it's triggered, have it collect and save ~60 microsec worth of data (so it can be done before the next trigger event). Continue to use the 16 kHz signal as your AO sample clock.
By retriggering every cycle, you're also re-SYNCING every cycle. The timing error won't accumulate. This should greatly simplify the post-processing as you'll have fixed-size chunks of digitized data for each trigger event.
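For sizing those chunks, something like this (the 20 MS/s rate and the 60 microsec window are assumptions):

```python
# Rough chunk sizing for the retriggered capture (rates and margins are assumptions).
trigger_hz = 16_000.0
digitizer_rate = 20e6                        # ~20 MS/s
period_us = 1e6 / trigger_hz                 # 62.5 us between trigger events
capture_us = 60.0                            # leaves ~2.5 us of margin before the next trigger
samples_per_trigger = int(digitizer_rate * capture_us * 1e-6)
print(period_us, capture_us, samples_per_trigger)   # 62.5  60.0  1200
```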
2. If your digitizer can accept an external clock as an input signal, you could generate a counter pulse train (or possibly route out the internal 20 MHz timebase) to drive it. But I'm not so sure this ends up helping much, unless you set it up as a ~60 microsec retriggerable finite pulse train. Basically, it's just a repeat of #1 above, but doing it a harder way.
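If you did go this route, the counter side might look something like this in Python/nidaqmx (device/terminal names and the 10 MHz rate are placeholders, and I haven't verified the retriggerable setup against the 6212 specifically):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

# Retriggerable finite pulse train: on every 16 kHz trigger edge (assumed to
# arrive on PFI0), put out a burst of clock pulses derived from the onboard timebase.
co_task = nidaqmx.Task()
co_task.co_channels.add_co_pulse_chan_freq("Dev1/ctr0",
                                           freq=10e6,        # example: 10 MHz clock for the digitizer
                                           duty_cycle=0.5)
co_task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                   samps_per_chan=600)       # ~60 us worth of pulses at 10 MHz
co_task.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")
co_task.triggers.start_trigger.retriggerable = True
co_task.start()
# The pulse train appears on the counter's output terminal (a PFI line on the 6212)
# and could drive the digitizer's external clock input.
```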
3. Use the digitizer's 2nd signal input to capture your AO signal. This minimizes the relevance of the timing error. Instead of needing to sync the timing at capture time, you can sync the "system state" and data during post-processing. (You'll know when AO changes and what the value is for each digitized sample because you're capturing it.)
-Kevin P