LabVIEW


Timing an event

Hi,
 
I am very new to LabVIEW and require some advice/help.
 
I need to check the clock of a component.
 
What I would like to do is reset my equipment and time how long it takes for an output to go high; it should take 30 seconds.
 
1) What is the best way to go about it? I know how to reset my equipment and measure whether the output has gone high; I'm just not sure how to time it.
 
2) How accurate is LabVIEW?
 
I have LabVIEW 7 with a PCI-6509 and a PCI-6014.
 
Thanks in advance.
Message 1 of 5

I don't know the answer to your first question, but for the second I can say that LabVIEW is as accurate as your code, since it is a programming language.

So the results will reflect the correctness of the code you've written. 😉

- Partha ( CLD until Oct 2027 🙂 )
Message 2 of 5
Hi Roger,

I recommend you take a look at some of the DAQmx examples that ship with the driver. You can find them under Help » Find Examples, then Hardware Input and Output » DAQmx » Digital Measurements.

I think the VI "Read Dig Chan - Change Detection" is similar to what you are looking for. You could do something like record the time when you start the program/reset the equipment, take another time measurement when the program has finished, and take the difference.

To record the time, go to the Functions palette on the block diagram, then Timing, and use a function such as "Get Date/Time in Seconds".
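In text form, the two-timestamp approach described above looks something like this (a minimal Python sketch of the same idea, not LabVIEW code; `reset_equipment` and `read_line` are hypothetical placeholders for however you drive the reset line and read the digital output with your DAQ hardware):

```python
import time

def wait_for_output_high(read_line, poll_interval=0.01):
    # Poll the digital line until it reads high.
    # `read_line` is a hypothetical callable returning True when the line is high.
    while not read_line():
        time.sleep(poll_interval)

def time_event(reset_equipment, read_line):
    # Reset the equipment, then measure how long the output takes to go high.
    reset_equipment()                   # drive the reset input
    start = time.time()                 # equivalent of "Get Date/Time in Seconds"
    wait_for_output_high(read_line)
    stop = time.time()
    return stop - start                 # elapsed seconds
```

Note this is software timing: the polling loop and OS scheduling add jitter on the order of the poll interval or worse.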

Hope that helps!
-Sam F, DAQ Marketing Manager
Message 3 of 5

Hi, thanks for your previous advice.

I have written a VI to time my event; please see attached.

What I am trying to do is reset a timing cycle (by activating an input), then time how long it takes for an output to go high. It should be 30 seconds.

My VI seems to be working, but if I run it, say, 10 times, there is up to ±0.2 seconds of variation, e.g. 30.005, 30.2, 29.9, 30.196, etc.

What I am trying to determine is:

1) that my code is not causing the fluctuation;

2) that it is possible with LabVIEW to get an extremely accurate reading.

My device has a 2 Hz clock, and it is critical that it does not drift, and definitely not by as much as ±0.2 s. The device I am trialling is not drifting, so I am sure the variation comes from LabVIEW or my code.

If anyone has some advice/tips your help would be much appreciated.

Thanks

Message 4 of 5
Your results are probably about as good as you are going to get with software timing. You never know when the OS is going to go off and do something which takes time away from your program. If you set up your data acquisition to use hardware timing, the results will be as accurate as the timebase on the board and the variation between the transition of your signal and the sample clock.
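To see how much jitter pure software timing carries even with no hardware in the loop, here is a quick experiment (a Python sketch, just to illustrate the OS-scheduling effect described above; the exact numbers depend on your operating system and load):

```python
import time

def measure_jitter(nominal=0.05, trials=20):
    # Sleep for a nominal interval repeatedly and record how far the
    # actual elapsed time strays from the request (OS scheduling jitter).
    errors = []
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(nominal)
        errors.append(time.perf_counter() - start - nominal)
    return min(errors), max(errors)

lo, hi = measure_jitter()
print(f"jitter range: {lo * 1e3:.2f} ms to {hi * 1e3:.2f} ms")
```

On a desktop OS the overshoot is typically in the millisecond range, and occasionally much worse, which is the same class of error seen in the ±0.2 s spread; hardware timing moves the measurement onto the board's timebase and removes this variation.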

You might look at some of the style guides as well. Try to keep the diagram to the size of one screen. Using dataflow would allow you to eliminate the sequence structure.

Lynn
Message 5 of 5