10-09-2007 07:35 PM
10-10-2007 12:42 AM
10-10-2007 06:08 PM
Hi SuperSnake428,
Let me get some more information so I can help you more fully. What hardware are you using? What version of LabVIEW, operating system, etc. do you have? I want to make sure I address all your needs instead of just patching one part.
In looking at your example code, it appears that counter1 is being triggered off of counter0, which could be one source of synchronization error.
Also, 11 kHz does not divide evenly into a card's timebase (20 MHz or 80 MHz), meaning you don't get a true 11 kHz, whereas 10 kHz divides evenly and really is 10 kHz. That could introduce some error over time.
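To illustrate the quantization, here's a quick check in plain Python (this assumes the card picks the nearest integer divisor of its timebase -- the driver's actual rounding may differ):

```python
# What frequency do you actually get when the card divides its
# timebase by an integer? (Nearest-divisor rounding is an assumption.)
def achievable(timebase_hz, target_hz):
    divisor = round(timebase_hz / target_hz)
    return timebase_hz / divisor

print(achievable(20_000_000, 10_000))  # 10000.0 -- exact
print(achievable(20_000_000, 11_000))  # ~11001.1 Hz -- not a true 11 kHz
```

10 kHz comes out exact because 20 MHz / 2000 = 10 kHz, while the closest divisor for 11 kHz (1818) misses by about 1.1 Hz.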
I will probably be able to formulate a concrete solution once I understand your setup better. I look forward to your response.
10-10-2007 06:15 PM
10-11-2007 08:17 AM
Don't have LV nearby to look at code but as Mark said, the internal timebase prevents you from generating a signal at *exactly* 11 kHz or 110 kHz. If you generate them independently, you can't count on them having an exact 10x relationship because they will quantize to 10.9996 kHz and 110.041 kHz. (Or something like that -- I assumed rounding to nearest possible freq, the actual algorithm might do its rounding differently. Either way, you get small discrepancies due to quantization.)
Your recent comment is exactly correct -- if the 10x relationship is crucial, the easiest implementation is to use the high freq clock as the timebase for the low freq clock. Then you can program the low freq clock to get an exact divide-by-10. It's especially easy here to use units=Ticks and you can just provide integers for low ticks and high ticks that sum to 10.
-Kevin P.
10-11-2007 02:29 PM
10-11-2007 03:06 PM
The faster counter clock acts as the DO sample clock and the slower counter clock acts as the AI sample clock, right?
Sync part 1: Be sure to configure and start the task for the slower clock *before* starting the task for the faster clock. Even though it's "started", the slower clock won't produce any output clock signals until the faster clock starts giving it something to divide down. Once it does, their timing is *necessarily* hardware sync'ed.
Sync part 2: With similar reasoning, make sure you start the DO and AI tasks *before* you start the faster clock. Then you won't need to worry about triggering at all -- the sample clocks themselves will control the sync'ing.
Sync part 3: Phasing. Suppose you set your low and high times at 5 Ticks each. Then the low freq clock will put a rising edge right at the 5th, 15th, 25th, ... rising edges of the high freq clock. If you want to skew it, you could use 8 Low Ticks, 2 High Ticks, and put edges at 8, 18, 28...
Sync part 4: Special-case phasing. The low freq clock counter has another parameter called "Initial Delay" which can be used to make the very first Low time different from all the rest. For example, with Initial Delay=10, Low=5, High=5, you could put edges at 10, 20, 30...
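Parts 3 and 4 can be sketched with a little simulation. The counter model here (output idles low for the initial delay, or for the low time if no delay is given, then repeats high/low) is my reading of the behavior described above, not driver code:

```python
# Where do the slow clock's rising edges land, counted in rising edges
# of the fast clock? (Simulated counter model -- an assumption.)
def rising_edges(low_ticks, high_ticks, count, initial_delay=None):
    first = low_ticks if initial_delay is None else initial_delay
    period = low_ticks + high_ticks
    return [first + k * period for k in range(count)]

print(rising_edges(5, 5, 3))                    # [5, 15, 25]
print(rising_edges(8, 2, 3))                    # [8, 18, 28]
print(rising_edges(5, 5, 3, initial_delay=10))  # [10, 20, 30]
```

Same period in every case; only the placement of the first edge changes.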
-Kevin P.
10-11-2007 03:20 PM
10-12-2007 10:50 AM
I had a brief chance to look at the latest VI you posted. What I saw looked like the clocks ought to work properly -- I couldn't see a reason for the behavior you observed. The only thing I noticed was that the AO task seems not to be sync'ed to all the other ones, since it doesn't update based on one of your counter clocks. Not sure if this is your intent, but I kinda suspect you'd want it sync'ed. Either way, it doesn't affect the sync'ing of the clocks themselves.
My only troubleshooting thoughts are:
1. Add an intermediate sequence frame between the two frames that start the tasks. Put a "Wait (msec)" delay there. Start big (~100 msec) and if you get proper behavior, work your way down until it gets flaky. This really *shouldn't* help because your sequencing should be enough. But when troubleshooting, sometimes the stuff that shouldn't help is the stuff that does help.
2. Wire a non-zero value for the initial delay. The smallest legal value is 2. Your request for 0 should get turned into the smallest legal value of 2 automatically, but this simple change seems worth a try. I'd wire in the same 5 that's used as the Low Ticks setting.
3. Question: when you run this multiple times, *exactly* how many fast clock edges occur before the first slow clock edge? Is it consistent? If not, list the #'s you get on 10 runs to characterize the variation.
Next, try the same experiment with the "fast clock" throttled down by a factor of 3 or so. This probes at whether the timing discrepancy follows real time (maybe software execution related) or #'s of ticks (board hardware and DAQmx related).
-Kevin P.
10-12-2007 10:54 AM