LabWindows/CVI

displaying images in vision

I am trying to display an image from the imaqGrab function on a canvas instead of opening a separate window.  How can I do this?
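One possible approach, sketched below, is to copy the acquired image's pixel data into a CVI bitmap and draw that bitmap onto the canvas.  This is only a rough outline: it assumes an 8-bit grayscale image, the imaqImageToArray/imaqDispose calls from the NI Vision C API, and CVI's NewBitmap/CanvasDrawBitmap; the helper name is a placeholder, and exact signatures may differ between Vision versions.

/* Sketch: copy an NI Vision image onto a CVI canvas control instead of an
 * external display window.  Assumes an 8-bit (grayscale) image; the panel and
 * canvas IDs are whatever the .uir file defines. */
#include <userint.h>
#include <nivision.h>

static void DrawImageOnCanvas (int panel, int canvasCtrl, Image *image)
{
    int cols = 0, rows = 0;
    int colorTable[256];
    int bitmapID = -1;
    int i;
    unsigned char *pixels;

    /* Pull the raw pixel data out of the Vision image buffer. */
    pixels = imaqImageToArray (image, IMAQ_NO_RECT, &cols, &rows);
    if (pixels == NULL)
        return;

    /* Build a grayscale palette for the 8-bit indexed bitmap. */
    for (i = 0; i < 256; i++)
        colorTable[i] = MakeColor (i, i, i);

    /* Wrap the pixel data in a CVI bitmap and paint it onto the canvas. */
    if (NewBitmap (-1, 8, cols, rows, colorTable, pixels, NULL, &bitmapID) >= 0)
    {
        CanvasDrawBitmap (panel, canvasCtrl, bitmapID, VAL_ENTIRE_OBJECT);
        DiscardBitmap (bitmapID);
    }

    imaqDispose (pixels);   /* free the buffer that imaqImageToArray allocated */
}

Calling something like this right after each grab should refresh the canvas with the latest frame.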
 
Jon K.
Message 1 of 9
Is there a specific development environment/example you are using?
S. Arves S.
National Instruments
Applications Engineer
Message 2 of 9
I am using the Pattern Matching example found in the CVI Vision examples folder.  I am trying to determine the accuracy, in degrees, of a pan/tilt/zoom (PTZ) camera.  The camera is zoomed all the way in on an object a certain distance away.  The image is displayed and a pattern within the picture is stored.  This preset position is saved in the camera's memory.  The camera is moved 180 degrees away from the initial location.  The preset location is called and the camera moves back to the stored location.  I take another snapshot and perform a search for the pattern.  The pattern is found and the delta x,y coordinates are displayed.  The camera accuracy is listed as .1 degrees.  I know the distance to the target (26 ft).  If I remember my trig correctly, the target should fall within a .04537' (.54") circle centered on the original x,y location.  Any suggestions on how to perform this with the Vision software in CVI?
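For reference, that tolerance comes straight from offset = distance × tan(angle); a minimal check in C, using only the 26 ft distance and the ±.1 degree spec quoted above:

#include <math.h>
#include <stdio.h>

int main (void)
{
    double distanceFt  = 26.0;   /* distance from camera to target        */
    double accuracyDeg = 0.1;    /* camera's specified pointing accuracy  */
    double offsetFt    = distanceFt * tan (accuracyDeg * 3.14159265358979 / 180.0);

    /* Prints roughly 0.0454 ft (0.54 in), matching the figure above. */
    printf ("max offset: %.5f ft (%.2f in)\n", offsetFt, offsetFt * 12.0);
    return 0;
}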
Message 3 of 9
Thanks for the extra info.  The shipping example Simple Calibration Example.prj, found in C:\Program Files\National Instruments\CVI71\samples\Vision\2. Functions\Calibration, is a great place to start with your particular application.  It is almost exactly the same, but let me know if you have any questions on that.
S. Arves S.
National Instruments
Applications Engineer
Message 4 of 9
The program looks like it will do the job if I can modify it for my application.  I am trying to use the Vision Assistant to perform the tasks I need, but am not having much luck.  Any suggestions?
 
 
Message 5 of 9
I attached two (2) images: original position and secondary position.  The PTZ camera I am testing will move to a location.  An image will be stored (original position).  The dome position will be stored in memory.  The dome will move 180 degrees away from the original location and the dome preset will be called.  The dome will return to the original position (+/- .1 degrees).  A second image will be taken (secondary position).  I need to relate the delta x and y to a +/- camera angle.  The distance to the target will not change.  I need to know the following:
1)  What kind of target should I use to get the most accurate results?
2)  How can I calibrate an image that is a certain distance away so I can measure real world distances from the image?
3)  What size pattern should I search for that can be acquired quickly and gives the most accuracy?
4)  How can the specified measurements (the angle or the displacement) be made?
 
 
Jon K.
Message 6 of 9
Hi Jon,

From what I understand, you are trying to measure the change in (x,y) coordinates from Picture A to Picture B.  Calibration is typically used for converting pixel distances to real-world values within the same picture.  If you are just comparing the change in position between two different pictures, you only need to make the trigonometric calculations based on how far away the camera is from the object.

The guidelines for what kind of target and what size pattern (ROI, or Region of Interest) to use in pattern matching are described in the NI Vision Concepts Manual (found in C:\Program Files\National Instruments\Vision\Documentation\Concepts_Manual.pdf).  Some general suggestions include objects with good contrast, distinct patterns, etc.

To learn more about the specific pattern matching functions, you can search the NI Vision for LabWindows/CVI Function Reference Help (C:\Program Files\National Instruments\Vision\Documentation\NIVisionCVI.chm), as well as view the shipping example Pattern Matching.prj.  The pattern matching function returns information about each match, such as its position or centroid, which you can easily use to calculate the shift in location.
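As a rough illustration of that last step, the sketch below turns the pixel shift between the two match centroids into an angular error.  It assumes the match results expose their centroids as PointFloat (x, y) pixel coordinates from nivision.h; the helper name is made up for the example, and the pixels-per-inch scale and working distance are values the caller supplies for this particular setup.

#include <math.h>
#include <nivision.h>   /* PointFloat: the (x, y) centroid reported per match */

/* Sketch: convert the pixel shift between the original and secondary match
 * centroids into an angular error, given the camera-to-target distance.
 * pixelsPerInch is whatever scale factor applies to this lens/distance setup. */
double AngularErrorDeg (PointFloat original, PointFloat secondary,
                        double pixelsPerInch, double distanceToTargetFt)
{
    double dxPix = secondary.x - original.x;
    double dyPix = secondary.y - original.y;
    double shiftInches = sqrt (dxPix * dxPix + dyPix * dyPix) / pixelsPerInch;
    double shiftFt = shiftInches / 12.0;

    /* Angle whose tangent is (lateral shift / distance to target), in degrees. */
    return atan (shiftFt / distanceToTargetFt) * 180.0 / 3.14159265358979;
}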

Hope this helps; let me know if you have any more questions!

Irene Chow
National Instruments
Applications Engineer


Message 7 of 9
I can measure the delta x,y.  I just need to know how to calibrate the image or test pattern at a specified distance.
Message 8 of 9
Hi Jon,

I believe our definitions of calibration may be different.  The spatial calibration functions in NI Vision are only used for converting pixel values into real-world units (such as inches).  For example, if I have a pH sensor and want to correlate voltage values with pH values, I would have to physically test a couple of known pH values, find the corresponding voltage values, and relate the two in a formula.  In the same way, the Vision calibration functions will ask you to specify the number of pixels that equal, say, 3 inches, and from then on they will be able to give you the size of objects in inches based on how many pixels wide they are (for that particular camera set-up and distance from the object).
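In code terms, that kind of simple calibration boils down to a scale factor.  A minimal sketch, reusing the 3-inch reference from the paragraph above; the 96-pixel reference width is an illustrative value, not a real measurement:

/* Sketch: derive an inches-per-pixel scale from a reference feature of known
 * size, then apply it to a later pixel measurement.  The 96-pixel width is an
 * illustrative value only. */
double PixelsToInches (double measuredPixels)
{
    const double referenceInches = 3.0;   /* known real-world size of the reference */
    const double referencePixels = 96.0;  /* same feature measured in the image     */
    double inchesPerPixel = referenceInches / referencePixels;

    return measuredPixels * inchesPerPixel;
}

The same scale factor is what the angular-error sketch earlier in the thread takes as its pixelsPerInch argument.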

Perhaps you are looking to "calibrate" the image by resetting the coordinate system if the picture shifts.  I would suggest looking at the NI Vision Concepts Manual, especially Chapter 5, Performing Machine Vision Tasks.  It has sections called "Locate Objects to Inspect" (how to set a coordinate system if objects in your image shift) and "Convert Pixel Coordinates to Real-World Coordinates."  A great shipping example to look at for setting up a coordinate system is the Battery Clamp Inspection Example.

If this is not what you are looking to do, please do clarify and explain the purpose of your application so that I may best help you.

Cheers,
Irene Chow
National Instruments
Applications Engineer
Message 9 of 9