From: Alienware on
Hi everyone,
I am doing a project on image comparison. I am using SIFT to find the keypoints of the images and then nearest-neighbour search (NNS) to compare the two images.
My problem is that when I compare an image with its flipped (mirrored) image, no keypoints match at all. But when I compare an image with a rotated version of itself, about 95% of the keypoints match.

Why is this so? Is the problem in SIFT or in NNS?
From: ImageAnalyst on
"Alienware " <rajesh89.ni...(a)gmail.com> :
Why do you think an image should be matched to its flipped mirror
image version? Do you think that a template image of a right hand
should produce hits in an image composed of a bunch of left hands?
Maybe that's your definition of a match, but it might not be
everyone's.
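From: (follow-up)
To add the technical side: SIFT descriptors are built from histograms of local gradient orientations. Rotating the image only circularly shifts those orientations, and SIFT's dominant-orientation normalization undoes the shift, so rotated keypoints still match. A mirror flip, by contrast, reflects the orientations, which reverses the bin order of the histogram; no circular shift can undo a reversal, so the descriptors of flipped keypoints no longer agree. Here is a simplified toy sketch of that idea in plain Python (an 8-bin histogram standing in for a real SIFT descriptor; modelling the flip as a plain bin reversal is itself a simplification):

```python
# Toy model of a SIFT-like orientation histogram with 8 bins.
# Rotation = circular shift of the bins, which dominant-orientation
# normalization undoes. A mirror flip = reflection of the bin order,
# which no circular shift can undo.

def circular_shift(h, k):
    """Shift histogram h right by k bins (wrapping around)."""
    n = len(h)
    return [h[(i - k) % n] for i in range(n)]

def normalize(h):
    """Rotate histogram so its dominant bin comes first (SIFT-style)."""
    k = h.index(max(h))
    return circular_shift(h, -k)

hist = [9, 1, 2, 7, 3, 0, 5, 4]       # toy keypoint histogram
rotated = circular_shift(hist, 3)     # image rotation shifts the bins
flipped = list(reversed(hist))        # flip reflects the bin order (toy)

print(normalize(hist) == normalize(rotated))  # True: rotation survives
print(normalize(hist) == normalize(flipped))  # False: the flip does not
```

So the behaviour you see is expected: it is a property of the SIFT descriptor itself, not a bug in your NNS step. If you need flip matches, one common workaround is to also extract SIFT keypoints from a mirrored copy of one image and match against both sets.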