03-13-2022 09:29 PM
hi altenbach, i have been doing something very similar to your diagram. i have a HUGE problem, though: the two timer values that are subtracted do not start at zero. it works OK if you're only interested in seconds; i am not. i am trying to run an analog grab AND know as closely as possible when that happens. i can do it in excel after i write the data. i also need to run the loop for a very controlled amount of time. ideally, the elapsed time VI would do exactly what i want, but using that VI i get a bizarre starting point of about 3.73 seconds. e.g., i'll start a process asking for 4000 ms at 10 kHz and end up with about 1.36 s of data. while this seems repeatable, something isn't right. i've also tried running the elapsed time VI in separate loops. there's got to be a correct way to do this. your diagram shows subtracting two timers, but there's always a problem with it. (i wish i could post my VIs but i am at home. they are not much different than yours. sorry.)
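for reference, the subtract-two-timers pattern i'm trying to get working, sketched in python since i can't post the VI from home (this is only an illustration, not labview; the function and parameter names are mine):

```python
import time

def timed_acquisition(duration_s=4.0, sample_period_s=0.1):
    """Run a loop for a controlled amount of time, reporting elapsed time
    relative to the loop's own start, so it begins near 0.0 instead of at
    some arbitrary free-running timer value."""
    start = time.monotonic()  # latch the timer ONCE, before the loop
    timestamps = []
    while True:
        # subtract the latched start from the current reading every iteration
        elapsed = time.monotonic() - start
        if elapsed >= duration_s:
            break
        timestamps.append(elapsed)
        time.sleep(sample_period_s)  # stand-in for the analog grab
    return timestamps

ts = timed_acquisition(duration_s=0.5, sample_period_s=0.05)
print(ts[0])  # near 0.0, not the raw timer value
```

the key point is that the start value is read once outside the loop; if it is re-read inside the loop (or never latched), the "elapsed" numbers come out as raw timer values like my 1142682.472 instead of starting at 0.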
03-13-2022 10:06 PM
Obviously, there is a glaring mistake somewhere, but there is no way to debug from a description alone. Please post the VI next time you have a chance.
03-14-2022 12:12 PM
here's one that does it. it happens fast, though. here's also a screenshot showing how it doesn't start at 0 seconds. by the way, thanks for replying so fast!!
notice that at the third point it jumps from 1142682.472 to 1142686.332. i asked for 4 seconds and got far less than one second of data.
03-14-2022 12:51 PM - edited 03-14-2022 12:54 PM
Please use "Save for Previous Version" before attaching. I don't have LabVIEW 2021 here.
(Note that my attachment above is basically garbage because it is not my VI, just some simplifications of the original. The entire thing should be rewritten!)
03-14-2022 01:29 PM
ok. thanks!