LabVIEW


tick count precision

I am using the Tick Count function to acquire some data through the
serial port. The data is time critical, and I don't know if I can trust the tick
count function and its precision. Does anyone know how precise the function
is? I do not need more than millisecond precision.
Thanks,
P


Message 1 of 11
If the data is time critical, tick count is almost certainly not accurate
enough. On average it's pretty good (which is why your computer keeps time
OK), but I wouldn't depend on it if you care about the data.

Message 2 of 11
Michael,
Do you have any ideas?
Do you know what kind of timer I can use to get some real accurate time?
Thanks,
Paul


Message 3 of 11
Hello

I think you can use a counter on your card, if it is a multifunction card,
or maybe you can count each sample.


--
Laurent TERRIER
LEGI - Ecoulements Diphasiques
1025 rue de la piscine - BP 53
38041 GRENOBLE cedex 9

Tel : 04 76 82 51 39
Fax : 04 76 82 52 71
Message 4 of 11
paulhogan@my-deja.com wrote in message news:8cr7fn$c0g$1@nnrp1.deja.com...
> Michael,
> Do you have any ideas?
> Do you know what kind of timer I can use to get some real accurate time?

Unless you want to change your DAQ hardware to latch the output from a
crystal based counter on the same trigger pulse as the data is latched, then
read both buffers via the serial port, you're not going to get any more
accuracy than the tick counter.

The tick counter itself should be pretty accurate. The errors come in
because of the variable time between reading the tick counter and reading
the data from the serial port- this is far more significant than the error
in the tick counter itself which, being crystal driven, will be pretty good.

Assuming you want a solution in software, rather than hardware, I'd take a
tick counter reading immediately before the read operation, and one
immediately after. You then have the two bounds between which your real time
value lies and you can give a value and uncertainty from there. Then it's a
case of living with the uncertainty- some of which is going to be present no
matter what you do, even if you adopt a hardware approach.
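To make the bracketing concrete, here's a rough sketch of the same logic in Python (purely illustrative, since the real code is a LabVIEW diagram; the port settings, read size and the pyserial library below are assumptions, not part of your setup):

import time
import serial   # pyserial, assumed only for the sake of the sketch

port = serial.Serial("COM1", 9600, timeout=1)   # hypothetical port settings

def timed_read(n_bytes):
    t_before = time.monotonic()          # "tick count" before the read
    data = port.read(n_bytes)            # the serial read being timestamped
    t_after = time.monotonic()           # "tick count" after the read
    # The true time of the data lies between the two readings: report the
    # midpoint as the estimate and half the spread as the uncertainty.
    estimate = (t_before + t_after) / 2.0
    uncertainty = (t_after - t_before) / 2.0
    return data, estimate, uncertainty

The uncertainty you get back is exactly the bound described above; if it's too large for your purposes, that's the part only hardware can fix.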

It'd be useful if you could give more specific information on the mechanics
of how you get the data- i.e. do you issue a trigger command, then read the
data? If so, you just need the tick count before and after issuing the
trigger command- assuming the trigger latches the instantaneous value (read
the instrument's instruction manual to find out what the relationship
between the trigger time and the acquisition time is).
Message 5 of 11
Well.... sort of.....

The crystal itself is very good, but Windows has control over when LabVIEW
gets to see the output of the crystal ("the tick"). The problem comes
when Windows decides to go off into LALA land for a while, and then either
generates an extra "fast" tick to catch up, or skips a tick so it doesn't
get ahead....

IF Windows is allowing LabVIEW to operate at the instant that the tick
happens, and IF Windows allows LabVIEW to execute the call to the serial
port in the next instant, all is well. Otherwise..... well, don't hold your
breath. In a hardware solution, the call to the port is generated on the
card, and the data is buffered (on the card) until such time as the
operating system gets around to dealing with it. The only time you have a
problem is if the operating system gets so far behind that the buffer
overflows. There's nothing wrong with using the tick count, but I wouldn't
count on it for accuracy of better than a millisecond or two. In general,
it will be right on, but when it's not, you have no way of knowing....
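If you want to see how big the effect is on your own machine, one quick check is to log the actual interval of a software-timed 1 ms loop and look at the spread. A minimal sketch in Python (illustrative only; the same experiment can be built in LabVIEW with a loop, a wait, and two Tick Count readings):

import time

intervals = []
prev = time.monotonic()
for _ in range(1000):
    time.sleep(0.001)                        # ask the OS for a 1 ms wait
    now = time.monotonic()
    intervals.append((now - prev) * 1000.0)  # interval actually delivered, in ms
    prev = now

# The average will be close to 1 ms, but the min/max show how far
# individual iterations drift when the OS wanders off.
print(min(intervals), sum(intervals) / len(intervals), max(intervals))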





Message 6 of 11
I designed the hardware for the data acquisition, and the way it works
is:
The card is reading 8 (12-bit) analog channels as fast as it can.
Meanwhile LabVIEW is doing its thing, but when it needs some data from
the card it sends a string through the serial port and waits for the
serial data. LabVIEW should ask for data at a certain frequency, which is
not higher than 100 Hz.
The timing between the moment LabVIEW asks for the data and the moment
the card responds is constant, so I think the problem is the LabVIEW
timing, which is not constant, or even close to it. It is affected
by the computer's performance.
Is there any way I can make my VI run exactly the same every time,
without being affected by the PC?
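In rough pseudo-Python the loop looks something like this (the command string, baud rate and reply length below are just placeholders, not the real protocol); the timestamp line is where I suspect the jitter gets in:

import time
import serial   # pyserial, only as a stand-in for the VI's serial calls

port = serial.Serial("COM1", 9600, timeout=0.05)   # placeholder settings
PERIOD = 0.01                                      # 100 Hz request rate

next_tick = time.monotonic()
for _ in range(1000):
    port.write(b"READ\r")            # placeholder request string to the card
    reply = port.read(16)            # 8 channels x 2 bytes, assumed framing
    timestamp = time.monotonic()     # software timestamp: OS jitter shows up here
    # ... process (timestamp, reply) ...
    next_tick += PERIOD
    delay = next_tick - time.monotonic()
    if delay > 0:
        time.sleep(delay)            # software-timed wait, so the rate is only nominal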




Message 7 of 11
> Is there any way I can make my VI run exactly the same every time,
> without being affected by the PC?
>

No. You're missing the fundamental concept here, which is that LabVIEW runs
on the PC, and thus is vulnerable to the potentially variable timing of the
PC. You have no control (and no way of getting control) over the timing of
events on your PC.

If you want "exactly the same" timing, you need to do the DAQ in some kind of
hardware which is independent of the PC timing, and then buffer it in some
way so that it can be moved into the PC in a (relatively) asynchronous way.
You use LabVIEW to initiate the data acquisition, to manage the transfer
between the card and the PC, and to process the data once it's in the PC, but the
actual data acquisition is done on the card.....

A question though -- please define what "exactly the same" means for you -- what
level of error is acceptable? 100 ms +/- ????? The big issue in designing
a system like this is what the allowable error is... millisecond?
nanosecond? picosecond? femtosecond???? All are possible; it's just
a question of how much time (and money) you want to spend on solving it.
Message 8 of 11
The NI RT products will solve this problem. They provide a hardware solution
that will make your application run "deterministically". Determinism is a
fancy way to say "operate at a constant delay."

If you do not want to do the RT stuff, you can try this........
Try writing your data to a queue, buffer so many queue elements, and read them
out with a loop that runs on a multiple-of-a-millisecond interval. Or you
can make the queue run off its own thread, and you get better determinism. The key
term is "better". In theory this should give you decent results. I have
yet to do any serious testing with this, but I use queues and timed intervals
on my loops for data displays so I do not eat all the CPU time! 🙂
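A rough sketch of the pattern in Python, since I can't paste a diagram here (the queue size and intervals are arbitrary numbers, and the producer is just a stand-in for the acquisition loop):

import queue
import threading
import time

buf = queue.Queue(maxsize=1000)   # bounded buffer between acquisition and display

def producer():
    # Stand-in for the acquisition side: push (timestamp, sample) pairs as they arrive.
    for i in range(500):
        buf.put((time.monotonic(), i))
        time.sleep(0.002)

threading.Thread(target=producer, daemon=True).start()

# The consumer loop runs on its own interval and drains whatever has piled up,
# so a late iteration doesn't lose data -- it just reads a bigger batch.
for _ in range(100):
    time.sleep(0.01)                  # nominal 10 ms display interval
    batch = []
    while not buf.empty():
        batch.append(buf.get_nowait())
    # ... display or log `batch`; its length varies with OS scheduling ...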

Hope this helps!

"Michael Bus
h" wrote:
>> Is there any way I can make my vi run exactly equal all the times?>> without
being affected by the pc?>>>>No. You're missing the fundamental concept
here, which is that Labview runs>on the PC, and thus is vulnerable to the
potentially variable timing of the>PC. You have no control (and no way of
getting control) over the timing of>events on your PC.>>If you want "exactly
equal" timing, you need to do the daq in some kind of>hardware which is independant
of the PC timing, and then buffer it in some>way so that it can be moved
into the PC in a (relatively) asynchronous way.>You use Labview to initiate
the data acquisition, and to manage the transfer>between the card and the
PC, and to process the data once it's in the PC,>but the actual data acquisition
is done on the card.....>>A question tho-- please define what "exactly equal"
means for you-- what>level of error is acceptable? 100 ms +/- ????? The
big issue in designing>a system like this
is what the allowable error is...
millisecond?>nano-second? pico-second? fempto-second???? All are possible;
it's just>a question of how much time (and money) you want to spend on solving
it.>>
Message 9 of 11
When you say you want 1 ms precision, do you mean you want to cause a DAQ event
at a precise time +/- 1 ms, or do you mean that the DAQ is triggered by
an external event that happens whenever, but when it does you need to
know the time of the event to within +/- 1 ms? Your approach (which in either
case will be a hardware solution) will be different depending on which
one you mean.


Message 10 of 11