Machine Vision


Cannot get repeatable stereo calibration

Hello all. I am struggling to get a repeatable stereo calibration. Hopefully someone can give me some pointers. A little bit about my setup:

I have a pair of AVT Manta GigE cameras (1292 x 964) paired with Tamron 23FM25SP lenses. The cameras are mounted on a rigid (12mm thick) aluminium plate and are currently set around 895mm apart. The cameras are toed in slightly so that the centres of the images intersect around 4 metres from the cameras. The cameras are securely mounted via adapter plates and bolts; they cannot move.

 

I have a calibration grid along the lines of the NI example grid. Mine is 28 x 20 black dots spaced around 13mm apart (centre to centre), with each dot being around 5mm in diameter. I am aware of the NI guidelines on suitable calibration grids, and mine seems to be well within the recommended bounds. The grid was made by laser printing onto A3 paper and then using spray adhesive to fix it to a rigid carbon fibre panel. It is flat and doesn't deform in use.

 

So, here is my problem: when I use the calibration grid to calibrate the cameras I sometimes get a good calibration and sometimes not. When I get a good calibration and attempt to repeat exactly the same process, I get a different result. What do I mean by a good calibration? When I go on to use the stereo calibration in my system, which tracks a circular feature in 3D space, I get accurate measurements (well sub-mm in the cross-camera axes and ~1mm depth resolution, over a range of 600mm in each axis centred around 3000mm from the cameras). The centres of the circular features in each image lie on the same horizontal image line, as expected in the rectified images for a well-calibrated camera pair. When I get this 'good' calibration, the distance between the cameras returned by the 'IMAQ Get Binocular Stereo Calibration Info 2' VI (the magnitude of the translation vector) is around the correct distance of 895mm.

However, when I perform the calibration many times I get quite a spread of camera separations (up to 20mm either side of correct). When there is a significant error in the camera separation, the accuracy of the system degrades dramatically and the centres of the circular feature lie on progressively further-apart horizontal lines (there is one distance from the cameras at which they are on the same line, and they move apart either side of that distance).
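As an aside, the 'same horizontal line' check described above reduces to a single number that can be scripted. A minimal Python sketch (the `epipolar_row_error` helper and the point lists are made up for illustration, not anything from the NI toolkit):

```python
def epipolar_row_error(left_pts, right_pts):
    """Mean absolute row (y) difference between matched feature centres
    in a rectified image pair; near zero for a good calibration."""
    diffs = [abs(yl - yr) for (_, yl), (_, yr) in zip(left_pts, right_pts)]
    return sum(diffs) / len(diffs)

# Example: three matched circle centres (x, y) in rectified coordinates.
left  = [(310.2, 400.1), (612.7, 399.8), (905.4, 400.3)]
right = [(118.9, 400.0), (421.5, 400.1), (714.1, 400.2)]
err = epipolar_row_error(left, right)   # about 0.17 px for this data
```

A calibration that drifts with distance, as described above, shows up as this error growing as the target moves away from the one distance where the rows coincide.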

 

I have gathered a set of 10 images of the calibration target and set up a VI to use a subset of the images for the calibration process and iterate through permutations to investigate the repeatability. I get a similar spread of results for the inter-camera distance.
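That subset experiment can be organised along these lines. In this sketch `calibrate_pair` is a stand-in stub returning canned, deterministic data so the loop runs; the real code would call the actual stereo-calibration routine for each subset:

```python
import itertools, math

# Stand-in for the real stereo-calibration call: maps an image subset
# to a translation vector (mm). Deterministic fake data for illustration.
def calibrate_pair(subset):
    wobble = 5.0 * (sum(subset) % 7)          # fake baseline variation
    return (880.0 + wobble, 2.0, 3.0)

image_ids = range(10)                          # the 10 captured grid images
baselines = []
for subset in itertools.combinations(image_ids, 8):   # every 8-image subset
    tx, ty, tz = calibrate_pair(subset)
    baselines.append(math.sqrt(tx * tx + ty * ty + tz * tz))

spread = max(baselines) - min(baselines)       # repeatability figure of merit
```

The spread of baseline magnitudes over all subsets is then a direct repeatability measure, comparable to the ~20mm variation reported above.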

 

Does anyone have a feel for whether what I'm trying to do is sensible / achievable? Any tips for repeatable calibration? For instance, should the calibration grid be at a constant distance from the cameras when it is presented at the different angles, or should a range of distances be used? If it should be the same distance, how accurately should this distance be maintained?

 

Thanks, Chris

Regards,
Chris Vann
Certified LabVIEW Architect
Message 1 of 6

You made no mention of what you are using for illumination. Are you using the ambient light of the room? If you rely on ambient light that changes in intensity/position/direction, you need to pre-process your grid images to eliminate variations as much as possible. You can try a Normalize or Equalize filter. Be sure to mask out anything that is not part of the calibration grid (background).
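A minimal sketch of that pre-processing idea, assuming a grayscale image held as a NumPy array and a boolean mask over the grid region (the `normalize_masked` helper is illustrative, not an IMAQ function):

```python
import numpy as np

def normalize_masked(img, mask):
    """Min-max normalise an 8-bit grayscale image to the full 0-255
    range, using only pixels inside the calibration-grid mask; pixels
    outside the mask are zeroed."""
    region = img[mask]
    lo, hi = region.min(), region.max()
    out = np.zeros_like(img)
    if hi > lo:
        scaled = (img.astype(np.float32) - lo) * (255.0 / float(hi - lo))
        out[mask] = np.clip(scaled[mask], 0, 255).astype(img.dtype)
    return out
```

Computing the normalisation statistics only from the masked region is the point: a bright or dark background no longer shifts the contrast of the grid itself.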

Machine Vision, Robotics, Embedded Systems, Surveillance

www.movimed.com - Custom Imaging Solutions
Message 2 of 6

Thanks for the suggestion MoviJohn, but after some more investigation I have made some progress. Using the exact same setup, I have used the Caltech Camera Calibration Toolbox for Matlab, link here. This gives accurate and repeatable calibrations, with a repeatability of ~0.2mm in camera separation - about two orders of magnitude better than the LV toolkit. I can only conclude that I am either using the LV toolkit incorrectly (although I have read EVERYTHING I can find on it), or it is fundamentally flawed. It is most certainly fairly opaque in terms of allowing the user to understand what's going on!
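For anyone reproducing this comparison, the figure being compared is just the magnitude of the translation vector from each toolbox. A small sketch (the numbers below are illustrative, not measured values from either tool):

```python
import math

def baseline_mm(t):
    """Magnitude of a stereo translation vector = camera separation."""
    return math.sqrt(sum(c * c for c in t))

# Illustrative numbers only: a 'good' run vs. a drifted run, compared
# against the mechanically measured 895 mm separation.
good_err    = abs(baseline_mm((894.6, 1.2, 8.5)) - 895.0)
drifted_err = abs(baseline_mm((914.8, 1.1, 8.7)) - 895.0)
```

Having a mechanically known baseline makes this a convenient sanity check on any calibration run, independent of which toolbox produced it.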

 

The only difference in setup is that I used a square grid for the Matlab approach (dots for LV). The grid is printed in the same way, is a comparable size and is mounted to the same rigid board, so I think I have a like-for-like comparison.

 

Hopefully this post will a) help point others in a useful direction and b) prompt NI to take a look at this issue.

 

 

Message 3 of 6

Actually, the best lighting to use for a stereo system is structured lighting, such as:

http://www.effilux.fr/1-36128-EFFI-Lase-Stereo-vision.php

We got good results using a cloud of dots.

 

Did you acquire and calibrate each camera on its own, computing each camera model using the Calibration Training utility, or did you use the live LabVIEW Stereo Vision example that we created to help calibrate a stereo system?

Message 4 of 6

> Any tips for repeatable calibration? For instance, should the calibration grid be at a constant distance from the cameras when it is presented at the different angles, or should a range of distances be used? If it should be the same distance, how accurately should this distance be maintained?

 

The advice for calibrating is to try to get enough images to cover the entire field of view of the cameras. Also, for each exposure the grid should be presented at a different angle so that the algorithm can accurately compute the camera model.
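One way to quantify "cover the entire field of view" is a coarse occupancy grid over the sensor. A hedged sketch (the `fov_coverage` helper and the 8 x 6 cell split are my own choices, not part of the NI example):

```python
def fov_coverage(detections, width, height, nx=8, ny=6):
    """Fraction of an nx-by-ny occupancy grid over the sensor that has
    seen at least one detected calibration-grid point."""
    hit = set()
    for pts in detections:                 # one list of (x, y) per image
        for x, y in pts:
            cx = min(int(x * nx / width), nx - 1)
            cy = min(int(y * ny / height), ny - 1)
            hit.add((cx, cy))
    return len(hit) / (nx * ny)
```

Running this over the accumulated grid detections tells you when to stop collecting images: a coverage fraction well below 1.0 means parts of the lens (typically the corners, where distortion is largest) are unconstrained.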

The live example that I mentioned in my previous post helps you do that by showing the live coverage of the grid. The image on the left suggests the proper grid orientation.

You can find the example at the following location:

C:\Program Files (x86)\National Instruments\LabVIEW 2012\examples\Vision\Stereo Vision\Calibrate Stereo Vision System\Calibrate Stereo Vision System.vi

 

Hope this helps.

 

Christophe

Message 5 of 6

Hi Christophe. Thanks for taking an interest. I am pretty sure that structured light is not relevant to the calibration stage being discussed here. Structured light is a useful technique for introducing detail to otherwise bland areas of an image, to give feature-matching algorithms something to match against, but I don't see how it's relevant to calibrating against a grid of dots. Happy to be corrected, of course...

 

I have been using the NI example "Stereo Vision Example.vi" located in C:\Program Files (x86)\National Instruments\LabVIEW 2012\examples\Vision\3. Applications, and have also created my own system based upon that example. I get the same poor results with both. The path and file you suggested is not present on my machine (I'm running LV2012). Is the example you suggested the same? Maybe I should be trying it. Any ideas where I can get hold of it?

 

I have been using the techniques you suggest of presenting the calibration grid at a variety of angles and ensuring good coverage of the fields of view. I have spent upwards of 20 hours experimenting with different techniques and approaches, and cannot get repeatable results. But with the Matlab-based approach from Caltech using the same techniques I get good results. I am becoming increasingly confident there is an issue in the LV implementation.

 

Thanks,

Chris

Message 6 of 6