Machine Vision

How do I track an object using a Unibrain's Fire-i DV camera, IMAQ 1394, and Vision 7.0?

Hello,
Thank you for reviewing my question. I am presently trying to track the motion of a black dot (the size of a quarter) on my wrist using a Unibrain Fire-i digital camera @ 30 fps (1), IMAQ 1394 drivers, Vision 7.0, and LabVIEW 7.1. This is two-dimensional planar motion, and what I would like to do is plot position vs. time, velocity vs. time, and acceleration vs. time on a graph. Does anyone have any example motion-tracking VIs or ideas on how to accomplish such a task?
I ran across the following website, http://synapse.vit.iit.nrc.ca/Nouse/index2.html , which tracks the position of a person’s nose, and would love to create a similar LabVIEW tool. I am a beginner with the LabVIEW vision processing tools (soon to become an expert), but I have used Redlake MotionScope in the past to track the motion of a spool for vibrational/modal analysis. Something similar to that software or MiDAS (2) would work great for my application. Any ideas? Sounds like an excellent subVI for NI engineers to develop.


Thank you for your help and I look forward to your response,
Travis


(1) http://www.unibrain.com/1394_products/fire-i_dig_cam/digital_camera_pc.htm
(2) http://www.fastcamreplay.com/products/midas_ovr.htm
Message 1 of 10
Hi Travis,

I checked out the video that you linked to in your post, and that tool is certainly interesting. The subject of motion capture is definitely one of the more exciting areas in vision. If you are trying to track the black dot on your wrist, a couple of key factors will come into play and should be considered. First, though the dot on your wrist is the size of a quarter, how large is it in the image? The number of pixels per inch will influence how easy or difficult it is to characterize the object you are trying to track (in this case, the dot). If you have an image of just a 1 foot by 1 foot area, I don't think it would be very difficult to track the dot. If you have an image of an entire field, the dot would be nearly impossible to track. I will assume you are working somewhere in the middle. The more limited your field of view, the simpler it will be to track a dot, or a nose in the case of Nouse. In that application, the nose only had about a 1 square foot range.

One other factor that will play an important role in this sort of application is the color of the rest of the image. Will you be using a color camera? If you have a blue dot on your wrist, and are standing against a red background, it is possible that the blue dot will be the only presence of pure blue in the image. This will make tracking the dot *much* easier. When this is considered, we can see why people use backdrops like greenscreens and bluescreens for motion capture.

While these things can be done in LabVIEW, I would guess that it might take a bit of time. If you have no problems tracking the dot, it should be relatively straightforward to calibrate your image and add position vs. time, velocity, and acceleration graphs. LabVIEW has some built-in pattern and shape matching VIs that could prove useful as well.
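Once you have the dot's position in each frame, the velocity and acceleration traces are just numerical derivatives of the calibrated position samples. Purely as an illustration of that math (a Python/NumPy sketch rather than LabVIEW; the trajectory, frame rate, and pixel-to-millimeter calibration below are made-up placeholders):

```python
import numpy as np

# Illustrative values only: 30 fps capture and an assumed calibration.
fps = 30.0
mm_per_pixel = 0.5

# Fake tracked x positions (pixels) for 2 seconds of video.
t = np.arange(0, 2.0, 1.0 / fps)
x_px = 320 + 40 * np.sin(2 * np.pi * 1.5 * t)
x_mm = x_px * mm_per_pixel

# Central-difference derivatives give velocity and acceleration.
v = np.gradient(x_mm, t)   # mm/s
a = np.gradient(v, t)      # mm/s^2

# x_mm, v, and a can now be plotted against t.
```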

Good luck with your application!

Robert
Message 2 of 10
Thanks Robert, your advice and comments were definitely helpful. The DV camera I am using is color, has a resolution of 640 x 480, and acquires RGB frames at 30 fps. My actual application is tracking the general movements of an infant. More information on my research, with sample videos, can be found at the following website:
http://scholar.lib.vt.edu/theses/available/etd-09232003-153717/ . Note that these videos do not show the devices or background I will be working with, but they illustrate the overall idea of what I will be working with.

So basically I am attaching a 0.5” white Velcro band to the wrist of an infant; the band carries a 1.25” black wireless accelerometer (aka the “black dot”). The infant will be wearing a jumper of his or her choice, but I have never seen an infant wearing a black jumper 🙂 Thus, I believe the contrast should be adequate. I plan to take video directly above the infant lying on a white bed, and also video from the side, which I plan to process with object tracking. I have full control of the background and image size. I plan on zooming in on the infant so that the length of the infant fits into the width of the video frame. Now I just want to track the black dot with respect to time. I can reduce the size of the black dot if necessary so that LabVIEW’s tracking mechanism doesn’t wander within this 1.25” area.

I was hoping this would be a simple LabVIEW Express VI, since this is such a common application in general. I envision a tool where you drag a little box around the area you want to track and a larger box around the area you want to search. LabVIEW then grabs the pixel pattern inside the little box and, on each frame, searches within the larger box for that pattern. Once found, it appends the coordinates of the center of the little box to an array. Someone is bound to have tried this in LabVIEW.
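To make the idea concrete, here is a rough sketch of that search loop in Python/OpenCV rather than LabVIEW (nothing here is an existing VI; the function and variable names are just illustrative):

```python
import cv2

def track_dot(frame_gray, template, search_box):
    """Find the small template inside the larger search box and return
    the center of the best match in full-frame pixel coordinates.

    frame_gray : grayscale video frame (numpy array)
    template   : small grayscale patch of the dot, grabbed once
    search_box : (x, y, w, h) region of the frame to search
    """
    x, y, w, h = search_box
    roi = frame_gray[y:y + h, x:x + w]
    # Normalized cross-correlation of the template over the ROI.
    scores = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, top_left = cv2.minMaxLoc(scores)
    th, tw = template.shape[:2]
    center = (x + top_left[0] + tw // 2, y + top_left[1] + th // 2)
    return center, best_score

# Per-frame loop: append each center to build the position-vs-time array.
# centers = [track_dot(f, template, search_box)[0] for f in frames]
```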

I understand that I am trying to model 6 degrees of freedom with 2 degrees; however, this is the necessary first step in developing a motion tracking system. In fact, at this particular age the infant doesn’t move out of plane that often. It appears the Nouse StereoTracker (http://www.cv.iit.nrc.ca/~dmitry/stereotracking/StereoTracker.html) can capture 3 degrees with two cameras…this would be an excellent fit. Has anyone ever used LabVIEW as a motion tracking system, or know of any sample code that is similar to this application?

Thank you for your help!
-Travis
Message 3 of 10
Hi again,

It sounds like you are working on a pretty interesting application.

There aren't any pre-built VIs that take care of this entire operation. That would certainly be a powerful VI! On the other hand, Vision does provide all of the tools needed to create your own.

As far as examples go, I would recommend looking at the following link: Examples. There are several VIs on that page, and I would recommend the one called Grab_and_Pattern_Match.VI. Because the VI is older, it might give you a little trouble when opening, but it should work; most likely it will search for some ROI functions that have since been moved. That aside, the example uses pattern matching to track an object's position. In your particular scenario, I would suggest some sort of pre-processing to reduce the other distractions within the image. Afterwards, you should be able to pattern match with minimal additional logic to track the item's movement.
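To give a feel for the kind of pre-processing I mean, here is a rough sketch in Python/OpenCV rather than Vision (the blur size and threshold value are assumptions) that isolates a dark dot against a light background before any matching is done:

```python
import cv2

def isolate_dark_dot(frame_bgr, dark_thresh=60):
    """Keep only very dark pixels so the black dot stands out, then
    return the binary mask and the centroid of the largest dark blob."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    # Pixels darker than dark_thresh become white (255) in the mask.
    _, mask = cv2.threshold(gray, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return mask, None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return mask, None
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return mask, centroid
```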

Take a look at the example, and play around with the functions a little and you will begin to see how this can shape up!

Have a great day,

Robert
Message 4 of 10
I wrote an application for a customer that tracks an object similar to what you described. We used a small ball attached to a glove so that we had a perfect circle pattern from any angle. We acquired images at 100 fps and it tracked quite nicely. You are right that you need to define the object to track and the area to search. I found it helpful to predict the maximum change in position per frame and only search the area surrounding the previous location.

Bruce
Bruce Ammons
Ammons Engineering
Message 5 of 10
Hi Bruce,

I agree, defining the change per frame is a very good way to speed up the application. Sounds cool!

Robert
Message 6 of 10
Bruce,

I appreciate that limiting the search area will increase the speed of processing and allow for higher frame rate cameras. However, it is unclear to me how that is achieved in practice.

For example, using IMAQ Match Pattern, one can limit the search angle for rotation to something like +/- 2 deg in IMAQ Setup Match Pattern 2 for a rotation-invariant problem. That is great, and the time spent searching really drops. But if your object rotates more than 2 deg, the VI will max out at 2 deg and not report the true rotation angle. Unfortunately, one cannot simply provide the most recent image as the new template image to Match Pattern... the template needs to be relearned, and that seriously slows down your VI.

Is there a way around this??

Thanks,

RB
Message 7 of 10
One of the inputs for the pattern matching search is the ROI. For each image, create a ROI centered on the previous location and large enough to handle the maximum movement. This really depends on your scale and max rate of movement. Don't forget to include space for the entire pattern inside the ROI.
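The bookkeeping is simple; as a sketch (plain Python here rather than LabVIEW, names illustrative only), the per-frame ROI calculation is just:

```python
def roi_around(prev_center, template_size, max_move, frame_size):
    """Search ROI centered on the previous match, sized to cover the
    maximum expected per-frame movement plus the template itself.

    prev_center   : (cx, cy) match center from the previous frame
    template_size : (tw, th) of the trained pattern, in pixels
    max_move      : predicted maximum displacement per frame, in pixels
    frame_size    : (width, height) of the image
    """
    cx, cy = prev_center
    tw, th = template_size
    fw, fh = frame_size
    half_w = max_move + tw // 2
    half_h = max_move + th // 2
    x0, y0 = max(0, cx - half_w), max(0, cy - half_h)
    x1, y1 = min(fw, cx + half_w), min(fh, cy + half_h)
    return (x0, y0, x1 - x0, y1 - y0)   # (x, y, w, h)
```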

The rotation invariant problem is one reason we used a ball for the target. It is a circle from any direction or rotation, so we were able to use shift invariant pattern matching.

Make sure you train a good template to begin with, and don't retrain it. You don't have time for something like that.

Bruce
Bruce Ammons
Ammons Engineering
Message 8 of 10
Bruce-

The ROI is a good suggestion, as it will also speed processing by limiting the search area, and I do implement that. However, my object (an eye) is already constrained in X-Y. Since I am interested in rotation about the Z-axis, I wonder if I could periodically rotate my template image? Hmmm, I am thinking this may not work, as I would need to rotate the training information that gets stored along with the template.

To circumvent that, one might take the initial image, copy and rotate it several times (say at 10 deg intervals), and then create several trained templates at these known rotation angles. Then, as the object rotated, one could cycle through the templates to find the closest rotation, later adjusting to determine the true object rotation.
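As a sketch of what I mean (Python/OpenCV rather than Vision, with an arbitrary 10 deg step and illustrative names):

```python
import cv2

def build_template_bank(template, step_deg=10):
    """Rotate the trained template once, up front, at fixed intervals
    so nothing has to be relearned at run time."""
    h, w = template.shape[:2]
    center = (w / 2.0, h / 2.0)
    bank = {}
    for angle in range(0, 360, step_deg):
        rot = cv2.getRotationMatrix2D(center, angle, 1.0)
        bank[angle] = cv2.warpAffine(template, rot, (w, h))
    return bank

def best_rotation(roi, bank):
    """Match the ROI against every rotated template; return the angle
    and score of the best match as a coarse rotation estimate."""
    best_angle, best_score = None, -1.0
    for angle, tmpl in bank.items():
        scores = cv2.matchTemplate(roi, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, _ = cv2.minMaxLoc(scores)
        if score > best_score:
            best_angle, best_score = angle, score
    return best_angle, best_score
```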

Could there be an easier way?

RB
Message 9 of 10
I was thinking of the same possibility, training multiple images at 10 degree intervals, but it looks like you can just run IMAQ Setup Match Pattern 2 with different angle ranges. Each one will generate Match Pattern Setup Data. Store these in an array and index out the one you need.

Bruce
Bruce Ammons
Ammons Engineering
Message 10 of 10