12-12-2014 08:48 AM - edited 12-12-2014 09:08 AM
I'm having difficulty with a timing issue.
What I need to do is time how long an input Boolean value stays low.
I have tested my logic with the input value and tolerance against the expected result.
My main problem is the timer: although it is set to ms, it is not timing or giving me the values I am looking for. For example, if the input was low for 6000 ms, the value it shows could be something like 17663008. Additionally, there seems to be no consistency; I tried dividing that value before passing it on, but got very inconsistent results.
This is an FPGA function, so I am limited to the FPGA palette, etc.
Please help: two days now messing with this and my brain hurts.
Additional info: this only needs to time once, then reset back to zero ready for the next call.
The image shows roughly what I mean. As I said, when I control the numeric input (expected time), it works fine; I just need to somehow get the timer to give me an accurate value.
Thanks loads, chocolates in the post
12-12-2014 09:31 AM
I mean the numeric input is the actual time, i.e. when I replace the timer with an input, it works fine.
12-12-2014 09:56 AM
The tick count just gives a time for how long the system has been running, not the time since you last ran into that function. What you need to do is use a feedback node to subtract the previous tick count from the new tick count. That will give you an elapsed time.
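In text form, the idea is roughly this (a C-style sketch, not actual LabVIEW; read_tick_count_ms() just stands in for the Tick Count function configured for ms):

```c
#include <stdint.h>

/* Stand-in for the FPGA Tick Count node configured for ms.
 * It returns a free-running counter, NOT the time since the last call. */
extern uint32_t read_tick_count_ms(void);

static uint32_t previous_tick = 0;    /* plays the role of the feedback node */

/* Called each iteration: returns the time elapsed since the last call. */
uint32_t elapsed_since_last_call(void)
{
    uint32_t now = read_tick_count_ms();
    /* Unsigned subtraction also copes with the counter wrapping around. */
    uint32_t elapsed = now - previous_tick;
    previous_tick = now;              /* feed the current value back for next time */
    return elapsed;
}
```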
12-12-2014 10:56 AM
Thanks for the response; however, maybe my explanation was a little poor.
Sorry for the words-only description, but I am away from LabVIEW access at the minute.
The operation required: it only needs to run once, so there will not be a previous value. I need it to be a subVI which starts when triggered by a Boolean value,
then stops when that Boolean value changes. The result of that stop needs to give me the elapsed time that the subVI was running for, and/or the length of time that the Boolean value was in its changed state. I can then run a check on that number (the elapsed time in ms) to see if it qualifies as a pass or fail.
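Since I can't post a proper diagram right now, maybe this rough C-style pseudocode (made-up names, not real code) describes the behaviour better than my words:

```c
#include <stdbool.h>
#include <stdint.h>

extern bool     input_boolean(void);   /* the Boolean input being monitored */
extern uint32_t tick_count_ms(void);   /* a free-running ms tick counter    */

/* Time how long the input stays low, once, then return ready for the next call. */
uint32_t time_boolean_low_ms(void)
{
    while (input_boolean()) {
        /* wait for the input to go low (the trigger) */
    }
    uint32_t start = tick_count_ms();   /* latch the tick when it goes low  */

    while (!input_boolean()) {
        /* stay here while the input is low */
    }
    uint32_t stop = tick_count_ms();    /* latch the tick when it goes high */

    return stop - start;                /* elapsed low time in ms           */
}
/* The result is then compared against the expected time and tolerance
 * to decide pass or fail. */
```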
Hopefully that makes it clearer.
Thanks again.
12-12-2014 11:09 AM
Alternatively, maybe I could use a value to start and stop a loop and then retrieve the value from some kind of elapsed loop time? I'm clutching at straws now... :o(
12-12-2014 12:30 PM
Long shot, because I can't get access to LabVIEW until Monday now (did the pic in good old MS Paint), but would this work somehow?
Thanks all
12-12-2014 01:26 PM - edited 12-12-2014 01:27 PM
Something like this?
12-12-2014 02:17 PM
Looks good!
I can't try it until Monday. Do you know how the numeric output will look, i.e. will it show actual milliseconds? Does it need to be a timed loop, and what is the difference between that and a normal while loop? Sorry for the questions; I just would like to understand the code in case I have to adapt it.
Appreciate your input thus far!
Thanks again
12-12-2014 02:24 PM
@testswan wrote:
Do you know how the numeric output will look, i.e. will it show actual milliseconds? Does it need to be a timed loop, and what is the difference between that and a normal while loop?
The timers were set to be ms, so the output should also be in ms.
Since we are in an FPGA, you will want to save a few gates by using the Single Cycle Timed Loop. It eliminates a few flip flops by forcing everything inside of it to execute in a single clock cycle.
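In case it helps while you are away from LabVIEW, here is roughly what the diagram does on each loop iteration, written as a C-style sketch (not the actual VI; the names are illustrative only):

```c
#include <stdbool.h>
#include <stdint.h>

extern uint32_t tick_count_ms(void);   /* stand-in for Tick Count (ms) */

static bool     was_low    = false;    /* feedback: input state last iteration   */
static uint32_t start_tick = 0;        /* feedback: tick latched at falling edge */
static uint32_t elapsed_ms = 0;        /* output indicator                       */

/* One loop iteration: detects the low period and measures its length. */
void measure_iteration(bool input)
{
    uint32_t now = tick_count_ms();

    if (!input && !was_low) {
        start_tick = now;              /* input just went low: latch the start   */
    }
    if (input && was_low) {
        elapsed_ms = now - start_tick; /* input just went high: report the length */
    }
    was_low = !input;                  /* remember the state for the next iteration */
}
```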
12-12-2014 02:43 PM
That's great. The reason I ask about the output value is that earlier today, even with ms selected, the timer was going crazy: for example, it would output something like 52421196 for 7000 ms, and for 6000 ms maybe something like 64826183. It made no sense at all. But I will give it a go on Monday and keep the thread updated with how it goes. Have you tried it?