I'm getting some unexpected and undesired behavior from timed loops in LV 8.0.
I run a timed loop with multiple frames, with a loop period of 60 s and a "start time" of 30 s (30000 ms) wired to the output of frame 1. Frame 3 contains a call to a subVI that takes several minutes to complete, but is only called once or twice a day (at designated times of day). I have set the loop to "discard missed periods" and "ignore phase" when periods are missed.
I EXPECT that, when the subVI in frame 3 runs, the loop will take much longer than the loop period; the missed periods should be discarded, and the loop should begin a new iteration, wait 30 s to start frame 2, and so on, as usual.
I FIND that, in the infrequent cases where the loop takes longer than the loop period (because the subVI takes so long to return), the loop correctly discards missed periods, BUT it ignores the requested 30 s start time for frame 2!
To recap:
DESIRED behavior: start loop --> turn process ON --> wait 30s --> turn process OFF --> *wait 30s (remainder of 60s loop period) --> repeat
*check time here, and run subvi if it's time
ACTUAL behavior after the previous loop iteration ran longer than the period (>60 s):
turn process ON --> turn process OFF --> wait 59+ seconds (remainder of the 60 s loop period) --> repeat
Timeout and deadline are set to -1, and offset/phase to 0. Is this a bug, or somehow expected behavior? I suppose I can work around it by running a timed sequence inside an ordinary While Loop, so that the sequence begins anew each iteration and has no knowledge of how long the previous sequence took. Any other suggestions?
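For clarity, here's a scaled-down sketch (in Python, since I can't paste a block diagram as text) of the scheduling logic I'm after with that workaround: each iteration re-anchors its own start time, so an overrun in one iteration can't eat the 30 s frame-2 start time of the next. The timings are shrunk by ~300x and all names are illustrative, not LabVIEW API calls.

```python
import time

PERIOD = 0.2        # stand-in for the 60 s loop period
FRAME2_OFFSET = 0.1  # stand-in for the 30 s "start time" of frame 2

def sleep_until(t):
    """Sleep until monotonic time t; return immediately if t has passed."""
    remaining = t - time.monotonic()
    if remaining > 0:
        time.sleep(remaining)

def iteration(long_work=0.0):
    """One pass of the schedule; returns the measured frame-2 start offset."""
    t0 = time.monotonic()            # re-anchor timing every iteration
    # frame 1: turn process ON (placeholder)
    sleep_until(t0 + FRAME2_OFFSET)  # honor frame 2's start time
    f2_offset = time.monotonic() - t0
    # frame 2: turn process OFF (placeholder)
    if long_work:
        time.sleep(long_work)        # stand-in for the slow subVI in frame 3
    # wait out the remainder of the period; if we've overrun it, this
    # returns immediately, i.e. the missed period is simply discarded
    sleep_until(t0 + PERIOD)
    return f2_offset

# middle iteration overruns the period badly, like my daily subVI call
offsets = [iteration(), iteration(long_work=0.5), iteration()]
```

Every entry in `offsets` stays at roughly `FRAME2_OFFSET`, including the iteration right after the overrun, which is the behavior I expected "discard missed periods" to give me in the timed loop itself.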
I've included an example VI which shows that the "start time" is not properly respected after the previous loop takes too long.