How to determine whether two edge images should match


I am using background subtraction to see if something has entered my camera's field of view. Sometimes this is reliable, but other times certain changes in lighting cause the background subtractor to fail. I'm trying to filter out these cases, and one idea I had was to look at the edge images.

Essentially, I want to compare the edge image in a region of interest within the stored background image against the same region in the current image. If the cause of the background subtraction failure is a lighting change, the edges should be roughly preserved. In contrast, a person in front of the camera would substantially change the edge image, so it would match poorly against the stored background.

Here are the resulting edge images in two cases:

[image: edge images for the two cases, with the XOR of each pair shown below]

There are two scenarios. On the left are the edge image for a person in front of the camera and the edge image of the stored background with no person. Below them is the XOR of the two images, which is a big mess (as expected). This should be a poor match, so that result is fine.

On the right is the edge image of a scene where a lighting change occurred, and beside it is the edge image of the stored background. These edge images look very similar, so they should be a good match. However, the XOR image below them still contains many set pixels, presumably because the edges do not line up pixel-for-pixel. This makes a per-pixel XOR a poor choice for matching edge images in my case.

I've already blurred the images with a Gaussian and still get these results. How can I compare these images to determine whether the edge image of a given frame is similar to the edge image of the stored background?
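For reference, the comparison I'm currently doing looks roughly like this (a minimal sketch assuming OpenCV; the blur kernel size and Canny thresholds are placeholder values, and `background_roi`/`current_roi` are grayscale crops of the same region):

```python
import cv2


def edge_mismatch(background_roi, current_roi):
    """Blur, extract Canny edges, and measure per-pixel XOR disagreement."""
    bg_blur = cv2.GaussianBlur(background_roi, (5, 5), 0)
    cur_blur = cv2.GaussianBlur(current_roi, (5, 5), 0)

    # Placeholder Canny thresholds; tuned per camera in practice.
    bg_edges = cv2.Canny(bg_blur, 50, 150)
    cur_edges = cv2.Canny(cur_blur, 50, 150)

    # XOR keeps only pixels where exactly one image has an edge.
    diff = cv2.bitwise_xor(bg_edges, cur_edges)
    return cv2.countNonZero(diff) / float(diff.size)
```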

Thanks for any advice here!


Answers:


Have you tried the Hausdorff distance? More information can be found here. There is also a duplicate of this suggestion here.
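As a rough illustration of how it could be applied to binary edge images (a sketch assuming NumPy and SciPy; the function name is just for illustration):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff


def hausdorff_edge_distance(edges_a, edges_b):
    """Symmetric Hausdorff distance between two binary edge images.

    Unlike a per-pixel XOR, this tolerates edges that are shifted by a
    few pixels: the distance stays small as long as every edge point in
    one image has a nearby edge point in the other.
    """
    # (row, col) coordinates of the edge pixels in each image.
    pts_a = np.column_stack(np.nonzero(edges_a))
    pts_b = np.column_stack(np.nonzero(edges_b))

    if len(pts_a) == 0 or len(pts_b) == 0:
        return np.inf  # One image has no edges at all.

    d_ab = directed_hausdorff(pts_a, pts_b)[0]
    d_ba = directed_hausdorff(pts_b, pts_a)[0]
    return max(d_ab, d_ba)
```

A small distance then suggests the edges were roughly preserved (lighting change), while a large one suggests new structure in the scene; the threshold has to be tuned for your setup. A modified/partial Hausdorff distance (averaging or taking a percentile of the point-to-set distances instead of the maximum) is often more robust to a few stray edge pixels.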