LabVIEW


highrestimer + usespecialtimer


Hello,

 

After reading quite a number of threads about these microsecond timers, I'm still confused about some points. Some people say that when measuring at microsecond resolution, software alone can't do it - you need hardware to assist. But these high-res timers are counting in microseconds, aren't they? In fact, it seems to me that the file usespecialtimer.vi is counting in nanosecond units?

 

So can these software timers alone (meaning a computer running a LabVIEW program, with no extra hardware timers) be used to measure the delay between two signal points at a microsecond timescale or finer?

 

Please help. Thanks.
Message 1 of 4

Where is this "usespecialtimer.vi" that you are referring to? I looked in the example finder as well as the functions palette and the discussion forums and did not find a VI with that name.

 

Are you referring to measuring with DAQmx devices? What high-res timers are you referring to? Are they on DAQ cards or high speed digitizers?

 

If you are referring to DAQ cards, here is a document on timestamps. I don't know if this answers any of your questions, but it may.
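If it helps make the "hardware to assist" idea concrete, here is a rough C sketch using the NI-DAQmx C API. The device name (Dev1/ai0) and the 1 MS/s rate are placeholders I made up for illustration, not something from your setup; the point is that the 1 microsecond spacing between samples is paced by the board's onboard clock, not by Windows.

    /* Sketch only: hardware-timed acquisition with the NI-DAQmx C API.
       "Dev1/ai0" and the 1 MHz rate are assumed values for illustration.
       With a hardware sample clock, the 1 us spacing between samples comes
       from the DAQ card's timebase, not from the operating system. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 data[1000];
        int32 read = 0;

        DAQmxCreateTask("", &task);
        /* Assumed channel name; adjust to the actual device. */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        /* 1 MS/s finite acquisition: 1000 samples, 1 microsecond apart. */
        DAQmxCfgSampClkTiming(task, "", 1000000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 1000);
        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 1000, &read, NULL);
        printf("Read %d samples, 1 us apart (hardware-timed)\n", (int)read);
        DAQmxClearTask(task);
        return 0;
    }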

 

Regarding your last question, I don't see any way to get microsecond-resolution timing with a Windows operating system and no hardware timers. The best you can expect is millisecond resolution. On a real-time system the best you can expect is microsecond resolution; but again, that depends on the programming and on what else the OS is processing at the time.
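To see why millisecond resolution is the practical floor on Windows, here is a small Win32 C sketch of my own (not anything built into LabVIEW) that watches the operating system's millisecond tick counter advance. The step size depends on the system timer period; on a stock install it is commonly in the 10-16 ms range.

    /* Minimal sketch (Win32): observe the granularity of the millisecond tick.
       Busy-wait until GetTickCount() changes and print the step size.
       The step reflects the system timer period, typically ~10-16 ms by
       default, which is why software-only timing cannot resolve microseconds. */
    #include <stdio.h>
    #include <windows.h>

    int main(void)
    {
        for (int i = 0; i < 10; i++) {
            DWORD start = GetTickCount();
            DWORD now = start;
            while (now == start)          /* spin until the tick advances */
                now = GetTickCount();
            printf("tick step: %lu ms\n", (unsigned long)(now - start));
        }
        return 0;
    }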

Vince M
Applications Engineer
Message 2 of 4

Sorry for not attaching the file.

 

I found this file within this website's LabVIEW forum area by searching for the terms "microsecond delay".

 

For example:  http://forums.ni.com/ni/board/message?board.id=170&message.id=75603&query.id=301319#M75603

 

This thread, and a few other threads, seem to suggest that we can use just the computer's internal clock as hardware to assist LabVIEW in counting at a microsecond rate?

 

 

Message 3 of 4
That's an interesting approach, but the best you'll be able to do with that is get a timestamp with a higher precision (read this article on MSDN).  You won't be able to run a loop that fast.
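For anyone finding this later, here is roughly what that MSDN article describes, as a plain C sketch against the standard Win32 calls QueryPerformanceFrequency and QueryPerformanceCounter: you can read a timestamp with sub-microsecond resolution, but Windows still decides when your code actually runs, so the loop itself is not deterministic at that scale.

    /* Minimal sketch: high-precision timestamps via the Win32 performance
       counter. This measures elapsed time with sub-microsecond resolution,
       but it does not make a loop execute with microsecond determinism. */
    #include <stdio.h>
    #include <windows.h>

    int main(void)
    {
        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);    /* counter ticks per second */
        QueryPerformanceCounter(&t0);

        Sleep(1);                            /* the "work" being timed */

        QueryPerformanceCounter(&t1);
        double us = (double)(t1.QuadPart - t0.QuadPart) * 1e6
                    / (double)freq.QuadPart;
        printf("elapsed: %.1f microseconds\n", us);
        return 0;
    }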
Jim
You're entirely bonkers. But I'll tell you a secret. All the best people are. ~ Alice
For he does not know what will happen; So who can tell him when it will occur? Eccl. 8:7

Message 4 of 4