Machine Vision


How to determine the accuracy of the frame rate using a Basler scA640-70gm?


Hi,

 

Please go ahead and pull the timestamp from the image data itself, as BlueCheese instructed earlier.  As Bruce noted, it's impossible to measure it accurately from loop to loop, since the latency over the bus is not taken into account.  You're right that everything in the real world has some variance, but I think we can look past that to the solutions posted here.

 

Cheers, 

Marti C
Applications Engineer
National Instruments
NI Medical
Message 11 of 13
Solution
Accepted by topic author ni@ecafa.com

ni@ecafa.com wrote:

Hello,

 

It seems to me that every real-world thing has variance, even an atomic clock.  I am also always asked for a range on that variance.  Statistically, it seems one should be able to determine a range for that variance as well.

 

Could something like the following statement then be made: "The internal timing of the Basler GigE scA640-70gm camera (accessed with the LabVIEW LowLevelGrab.vi) appears to be millions of times more accurate than most applications need, so the fps is modelled as an ideal timing source"?  If so, thanks for the vote of confidence!

 

Maybe Basler has a timing diagram?

 

Thank you,

 

Todd

 


 

Todd,

 

I believe Basler's current line of GigE cameras uses a 125 MHz local clock for timing and time-stamping images. You'd have to consult Basler about how accurate that clock is and how it can drift over time and with environmental changes, as I don't believe they publish this in the user manual.
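As a rough illustration of what a 125 MHz tick clock means in practice, here is a minimal Python sketch for converting raw timestamp ticks to seconds. The tick frequency is an assumption based on the figure above; on a real GigE Vision camera you would query the device (e.g. its timestamp-tick-frequency feature) rather than hard-coding it, and the tick values below are made up for illustration.

```python
# Assumed tick rate from the discussion above; confirm against the
# camera's reported timestamp tick frequency before relying on it.
TICK_FREQUENCY_HZ = 125_000_000  # 125 MHz

def ticks_to_seconds(ticks):
    """Convert a raw camera timestamp (in clock ticks) to seconds."""
    return ticks / TICK_FREQUENCY_HZ

def frame_interval_s(ticks_a, ticks_b):
    """Interval between two frames, from their camera timestamps."""
    return ticks_to_seconds(ticks_b - ticks_a)

# Hypothetical example: two frames 1,785,714 ticks apart on a 125 MHz
# clock are about 14.29 ms apart, i.e. roughly a 70 fps frame period.
interval = frame_interval_s(0, 1_785_714)
```

The point of working in ticks is that the camera's own counter is immune to bus latency; any jitter you see in tick differences is the camera's, not the transport's.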

 

You can also read a timestamp on each image, recorded by the driver when the image finished transferring over the bus. This likely has much more jitter than the camera's timestamp, since it also includes transmission latency plus jitter from the drivers, OS, and CPU. But if you are concerned about _very_ long-term drift (accumulated over a day or longer), it might be more accurate, since the OS clock is usually disciplined by NTP/SNTP or other sources (at least on Windows), while the clock on the camera is simply free-running.
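The trade-off above (camera timestamps: low jitter, possible long-term drift; driver timestamps: more jitter, NTP-disciplined) can be checked empirically. This is a hedged sketch with invented timestamp data, not real measurements: it just shows how you might compare inter-frame interval statistics from the two clocks once you have both sets of timestamps in seconds.

```python
import statistics

def interval_stats(timestamps_s):
    """Mean and standard deviation of inter-frame intervals, in seconds.
    Works on timestamps from either the camera clock or the driver/OS
    clock, as long as both are converted to seconds first."""
    intervals = [b - a for a, b in zip(timestamps_s, timestamps_s[1:])]
    return statistics.mean(intervals), statistics.stdev(intervals)

# Hypothetical data: camera timestamps tick along steadily, while the
# driver timestamps wobble with bus/OS latency around the same period.
camera = [i * 0.014286 for i in range(5)]
driver = [0.0000, 0.0151, 0.0288, 0.0425, 0.0580]

cam_mean, cam_sd = interval_stats(camera)
drv_mean, drv_sd = interval_stats(driver)
```

If the camera's interval standard deviation is far smaller than the driver's, that supports using the camera timestamps for short-term timing and the driver/OS timestamps only for long-term drift checks.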

 

Eric

Message 12 of 13

I have a follow-up question that is very similar to this issue (I posted it here).

 

I did as Bruce suggested: measure the time over a large number of frames (thousands) and divide the number of frames by the total time.

 

I did this, and discovered that my frame rate is constant (as discussed here), but it is somewhat inaccurate.  I expect it to be 30 fps (Format 0), but I am measuring a value that is off by about 20 microseconds.  Over many hours, this tiny error builds up to a considerable time lag.
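To see why such a small error matters, here is a back-of-the-envelope sketch in Python. It reads the post's figure as roughly a 20 µs error in each frame period at a nominal 30 fps; both numbers are taken from the text above, and the function name is made up for illustration.

```python
NOMINAL_FPS = 30.0
ERROR_PER_FRAME_S = 20e-6  # ~20 microsecond error per frame period

def accumulated_lag_s(hours):
    """Total time lag accumulated after running for `hours` hours."""
    frames = NOMINAL_FPS * hours * 3600
    return frames * ERROR_PER_FRAME_S

# After one hour: 30 fps x 3600 s x 20 us = 2.16 seconds of lag,
# which is why the error becomes "considerable" over many hours.
lag_after_one_hour = accumulated_lag_s(1)
```

At that rate the video timeline slips by more than two seconds per hour relative to wall-clock time, which is easily visible when synchronizing against other data sources.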

 

I'd prefer not to have to trigger the exposure using an externally generated pulse train.  But how can I rely on the accuracy of the frame rate?

http://www.medicollector.com
Message 13 of 13