11-02-2011 06:43 AM
I have, for example, two input signals:
1: 1,0,0,0,1...
2: 1,0,0,1....
What is the easiest way to calculate the time between these two signals? On the first signal I detect a "toggle increment" (a 0-to-1 transition) and start measuring time. Then, on the second signal, I want to stop measuring and save the elapsed time. After that the process repeats: start again on the next 0-to-1 transition on signal 1, and stop again on signal 2.
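In LabVIEW this is usually done with a shift register holding the previous sample and a comparison to detect the 0-to-1 edge. The logic itself can be sketched in Python (function and variable names here are hypothetical, and this assumes both signals are sampled together at a fixed, known period):

```python
from typing import List

def measure_intervals(sig1: List[int], sig2: List[int], dt: float) -> List[float]:
    """Return the elapsed time between each rising edge (0 -> 1) on sig1
    and the next rising edge on sig2, for signals sampled every dt seconds."""
    intervals = []
    start_index = None              # sample index of the pending sig1 edge
    prev1, prev2 = sig1[0], sig2[0]
    for i in range(1, min(len(sig1), len(sig2))):
        # Rising edge on signal 1 starts a new measurement
        if prev1 == 0 and sig1[i] == 1 and start_index is None:
            start_index = i
        # Rising edge on signal 2 ends the current measurement
        elif prev2 == 0 and sig2[i] == 1 and start_index is not None:
            intervals.append((i - start_index) * dt)
            start_index = None      # wait for the next sig1 edge
        prev1, prev2 = sig1[i], sig2[i]
    return intervals
```

The same pattern translates directly to a LabVIEW while loop: the `prev1`/`prev2` values become shift registers, and `start_index` becomes a timestamp captured with a Tick Count or Get Date/Time function.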
thanks
11-02-2011 10:38 AM
What kind of hardware is providing these two signals?
Christian
11-02-2011 04:16 PM
PLC
Maybe there is an example of LabVIEW and a database? Maybe I could just save the time and the signal to some sort of document, and then calculate the time between the two signals from the database...?
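The file-based idea can work: log a timestamp each time an edge occurs on either channel, then compute the differences afterward. A minimal sketch, assuming ISO-format timestamps in a two-column CSV (both function names and the file layout are hypothetical):

```python
import csv
from datetime import datetime

def log_edge(path: str, channel: int) -> None:
    """Append one timestamped edge event (channel 1 or 2) to a CSV log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.now().isoformat(), channel])

def intervals_from_log(path: str) -> list:
    """Read the log back and return the elapsed seconds between each
    channel-1 edge and the channel-2 edge that follows it."""
    result, start = [], None
    with open(path, newline="") as f:
        for stamp, channel in csv.reader(f):
            t = datetime.fromisoformat(stamp)
            if channel == "1" and start is None:
                start = t
            elif channel == "2" and start is not None:
                result.append((t - start).total_seconds())
                start = None
    return result
```

Note that the resolution of this approach is limited by how often the PC polls the PLC; for millisecond-level accuracy the timing would need to happen on the PLC side.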
11-02-2011 04:35 PM
Please stick to your other post. Do not confuse people by asking questions in two different threads:
http://forums.ni.com/t5/LabVIEW/Simulate-quot-signal-quot-from-process/td-p/1757918
11-02-2011 04:41 PM
Well, I'm searching for some help or some example. I know this might be done with DAO, or not?
11-03-2011 09:10 AM
We know you're searching for help. That's why you posted here, right?
Again, please do not create multiple posts or split the discussion across threads. It makes trying to help more difficult.