03-06-2014 04:08 PM
Hi all, I wrote a simple script to read in a set of images, do some basic image processing on them, and display the results. My problem occurs when I try to use the IMAQ Local Threshold block: the program runs and displays the images fine without it, but once I route the image into the IMAQ Local Threshold block, it outputs an entirely black image.
I know this shouldn't be the case, because I literally copied and pasted the same blocks generated by the Vision Assistant into my code, and I've confirmed that running the same image through the same set of blocks in a simpler script results in a thresholded image that isn't entirely black.
Does anyone have any idea what could be causing this? Could it have to do with my LabVIEW version being 2011 and my LabVIEW Vision version being for 2013? I'm a newbie to LabVIEW, so I feel like I must be missing something obvious...
While I'm at it, my end goal is to be able to track a whale in a video. Does anybody have any suggestions for where to begin on that hefty problem? I've started to look into the motion estimation and object tracking VIs; of course, any advice on how to implement a robust solution would be welcome. Do I want to use a bunch of case statements to switch how I'm processing the image before I do any motion analysis?
Thanks in advance for the help! I've attached my script below.
Solved!
03-07-2014 12:39 AM
Or, the fastest way: wire a value of 255 to the "Replace Value" input of Local Threshold.vi.
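For anyone else who hits the same black-image symptom: the local threshold replaces passing pixels with the "Replace Value", and a value of 1 is visually indistinguishable from 0 on an 8-bit display even though the binary image itself is correct; 255 renders the same mask as white. Since LabVIEW block diagrams can't be pasted as text, here is a rough pure-Python sketch of the idea (illustrative names and a simple mean-based rule, not NI's actual implementation):

```python
# Illustrative sketch of a local (adaptive) threshold: each pixel is compared
# against the mean of its neighborhood, and pixels that pass are set to
# `replace_value`. With replace_value = 1 the output is a valid binary image,
# but on an 8-bit display a 1 looks black; replace_value = 255 looks white.

def local_threshold(image, window=3, replace_value=255):
    """Local mean threshold on a 2D list of 8-bit pixel values."""
    h, w = len(image), len(image[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the neighborhood, clamped at the image borders.
            neighbors = [
                image[yy][xx]
                for yy in range(max(0, y - r), min(h, y + r + 1))
                for xx in range(max(0, x - r), min(w, x + r + 1))
            ]
            mean = sum(neighbors) / len(neighbors)
            if image[y][x] > mean:
                out[y][x] = replace_value
    return out

img = [
    [10, 10, 10, 10],
    [10, 200, 200, 10],
    [10, 200, 200, 10],
    [10, 10, 10, 10],
]

dark = local_threshold(img, replace_value=1)     # correct mask, displays black
bright = local_threshold(img, replace_value=255)  # same mask, displays white
```

Both calls produce the same binary mask; only the stored pixel value, and therefore the on-screen brightness, differs.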
Can you share an example image of the whale? Or maybe a couple to show how the image contrast/illumination, etc... changes.
You could for example find strong keypoints on your whale (edge, corner detection) and use optical flow to track these points from one frame to another...
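To make the "track these points from one frame to another" idea concrete outside of LabVIEW, here is a hedged pure-Python sketch using crude block matching as a stand-in for real optical flow (such as Lucas-Kanade); all names are illustrative, and a production version would use a vision library rather than nested loops:

```python
# Illustrative block-matching tracker: given a keypoint in the previous frame,
# search a small window in the next frame for the patch that best matches it,
# scored by sum of squared differences (SSD). Real optical flow solves this
# sub-pixel and much faster, but the principle is the same.

def track_point(prev_frame, next_frame, x, y, patch=1, search=2):
    """Return the (x, y) in next_frame whose surrounding patch best matches
    the patch around (x, y) in prev_frame."""
    def get_patch(frame, cx, cy):
        return [
            frame[cy + dy][cx + dx]
            for dy in range(-patch, patch + 1)
            for dx in range(-patch, patch + 1)
        ]

    ref = get_patch(prev_frame, x, y)
    best, best_ssd = (x, y), float("inf")
    h, w = len(next_frame), len(next_frame[0])
    for cy in range(max(patch, y - search), min(h - patch, y + search + 1)):
        for cx in range(max(patch, x - search), min(w - patch, x + search + 1)):
            cand = get_patch(next_frame, cx, cy)
            ssd = sum((a - b) ** 2 for a, b in zip(ref, cand))
            if ssd < best_ssd:
                best, best_ssd = (cx, cy), ssd
    return best

# Toy example: a bright pixel moves one step to the right between frames.
frame1 = [[0] * 6 for _ in range(6)]
frame1[2][2] = 255
frame2 = [[0] * 6 for _ in range(6)]
frame2[2][3] = 255
print(track_point(frame1, frame2, 2, 2))  # -> (3, 2)
```

Run this for each strong keypoint on the whale every frame, and the set of displacements gives you the motion.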
Best regards,
K
03-07-2014 03:13 PM
Hi, thanks for your quick and helpful response, Klemen! That was silly of me to miss that...
I'm currently processing different youtube videos of whale footage, but here's an example of one that I'm running some tests on:
http://www.youtube.com/watch?v=crwZPPYXn6g
Any suggestions and advice welcome! For my first goal, I just want to be able to track the whale when it's above the surface... tracking it when it's under the water would be nice as well. Thank you for taking the time and effort to help me out! I've been working with LabVIEW for quite a while and do my best to Google problems and read the Vision manual, but I still run across silly things that absolutely stump me.
03-10-2014 03:39 PM
Hi DC445,
We do have image tracking examples made by some of our users
https://decibel.ni.com/content/docs/DOC-23399
Since the object's appearance is continuously changing, you may have to adapt the code and pattern match against multiple template images.
This forum post also gives you a good idea of the fundamentals:
http://forums.ni.com/t5/Machine-Vision/object-tracking-and-recognition/m-p/2512168
03-11-2014 07:26 AM
Hello,
I have tried the video link you provided and I can say that you are going to have a hard time tracking the whale. The main reasons in my opinion are:
1. poor contrast between the background and the whale (the whale is not discernible), and
2. bad image quality.
You could probably pull off surface tracking of the whale, but underwater would be more difficult.
For example, using optical flow for tracking the boat is fairly simple and returns good results (please see the attachment).
Maybe another option is to train a larger set of sample images using classification tools. I know OpenCV has some good examples (Haar or LBP, check here: http://docs.opencv.org/doc/user_guide/ug_traincascade.html).
Another possibility is adaptive template correlation: this way you can auto-update the template when the correlation (or other measures affected by scale, occlusion, ...) falls below a certain threshold. I don't know if this is a good option, since the auto-updated template can quickly drift away from your object of interest; some additional constraints would need to be implemented.
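One such constraint can be sketched in a few lines (a hedged, illustrative Python sketch, with made-up threshold names): only update the template when the match score is in a middle band, high enough to trust the match but low enough to suggest the appearance really changed, and blend rather than replace so a single bad match can't hijack the template.

```python
# Illustrative drift guard for adaptive template correlation. `score` is
# assumed to be a correlation-style match quality in [0, 1]; the two
# thresholds below are hypothetical values, to be tuned on real footage.

UPDATE_LOW = 0.6   # below this the match is too weak to trust: don't update
UPDATE_HIGH = 0.9  # above this the template still fits: no need to update

def maybe_update_template(template, matched_patch, score,
                          low=UPDATE_LOW, high=UPDATE_HIGH):
    """Return the template to use for the next frame."""
    if low <= score < high:
        # Blend rather than replace outright, to slow drift further.
        return [(t + p) / 2 for t, p in zip(template, matched_patch)]
    return template
```

Updating on every frame regardless of score is exactly what lets the template walk off the object, hence the two-threshold band.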
To encourage you: even professionals have problems detecting/tracking whales, and they are using state-of-the-art hardware/software, so... 🙂
http://mashable.com/2014/02/14/space-satellites-whale-tracking/
Anyway, I think your first step should be to concentrate on obtaining better images. Let's see if others have something to add.
Best regards,
K