Java Question

OpenCV - Java - No match with 2 opposite images using DescriptorMatcher

I'm trying to match 2 opposite images using OpenCV's DescriptorMatcher with no luck.
The images are: http://i61.tinypic.com/28whu0g.jpg (left to right) and http://i61.tinypic.com/x35vte.jpg (right to left).

My code is pretty much like a lot of the examples I've seen on Stack Overflow and around the web, but I still always get no matches.

String firstImageSourcePath = "RTL_IMAGE_PATH";
String secondImageSourcePath = "LTR_IMAGE_PATH";

Mat firstImageSrcImgMat = Highgui.imread(firstImageSourcePath);
Mat secondImageSrcImgMat = Highgui.imread(secondImageSourcePath);

if (firstImageSrcImgMat.empty() || secondImageSrcImgMat.empty()) {
System.out.println("Failed to load images");
return;
}

System.out.println("Loaded image at " + firstImageSourcePath + " and " + secondImageSourcePath);

FeatureDetector featureDetector = FeatureDetector.create(FeatureDetector.BRISK);

MatOfKeyPoint firstImgMatOfKeyPoints = new MatOfKeyPoint();
MatOfKeyPoint secondImgMatOfKeyPoints = new MatOfKeyPoint();

featureDetector.detect(firstImageSrcImgMat, firstImgMatOfKeyPoints);
featureDetector.detect(secondImageSrcImgMat, secondImgMatOfKeyPoints);

System.out.println("Detected " + firstImgMatOfKeyPoints.size() + " and " + secondImgMatOfKeyPoints + " blobs in the images");

List<KeyPoint> firstImgKeyPoints = firstImgMatOfKeyPoints.toList();
List<KeyPoint> secondImgKeyPoints = secondImgMatOfKeyPoints.toList();

System.out.println("First Image key points: " + firstImgKeyPoints);
System.out.println("Second Image key points: " + secondImgKeyPoints);

Mat firstImgDescriptors = new Mat();
Mat secondImgDescriptors = new Mat();

DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.BRISK);
extractor.compute(firstImageSrcImgMat, firstImgMatOfKeyPoints, firstImgDescriptors);
extractor.compute(secondImageSrcImgMat, secondImgMatOfKeyPoints, secondImgDescriptors);

System.out.println("descriptorsA.size() : " + firstImgDescriptors.size());
System.out.println("descriptorsB.size() : " + secondImgDescriptors.size());

MatOfDMatch matches = new MatOfDMatch();

DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT); // BRUTEFORCE_HAMMINGLUT
matcher.match(firstImgDescriptors, secondImgDescriptors, matches);

System.out.println("matches.size() : " + matches.size());
System.out.println("matches : " + matches);

MatOfDMatch matchesFiltered = new MatOfDMatch();

List<DMatch> matchesList = matches.toList();
List<DMatch> bestMatches = new ArrayList<DMatch>();

Double max_dist = 0.0;
Double min_dist = 100.0;

for (int i = 0; i < matchesList.size(); i++) {
Double dist = (double) matchesList.get(i).distance;

if (dist > 0)
System.out.println("dist : " + dist);

if (dist < min_dist && dist != 0) {
min_dist = dist;
}

if (dist > max_dist) {
max_dist = dist;
}

}

System.out.println("max_dist : " + max_dist);
System.out.println("min_dist : " + min_dist);

if (min_dist > 50) {
System.out.println("No match found, min_dist under minimum value");
return;
}

double threshold = 3 * min_dist;
double threshold2 = 2 * min_dist;

if (threshold > 75) {
threshold = 75;
} else if (threshold2 >= max_dist) {
threshold = min_dist * 1.1;
} else if (threshold >= max_dist) {
threshold = threshold2 * 1.4;
}

System.out.println("Threshold : " + threshold);

for (int i = 0; i < matchesList.size(); i++) {
Double dist = (double) matchesList.get(i).distance;

if (dist < threshold) {
bestMatches.add(matchesList.get(i));
System.out.println(String.format(i + " best match added : %s", dist));
}
}

matchesFiltered.fromList(bestMatches);

System.out.println("matchesFiltered.size() : " + matchesFiltered.size());

if (matchesFiltered.rows() >= 1) {
System.out.println("match found");
} else {
System.out.println("match not found");
}


Any hint on what I'm doing wrong?

Answer

As @Iwillnotexist-Idonotexist stated, the first problem is the threshold you are applying. A threshold based on the absolute distance between descriptors does not perform well, since some descriptors are much more discriminative than others; use a criterion that does not depend on it and you should get better results. I advise you to use the ratio test proposed by D. Lowe in the SIFT paper; have a look at section 7.1: http://cs.ubc.ca/~lowe/papers/ijcv04.pdf
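For reference, a minimal Java sketch of that ratio test, assuming the OpenCV Java bindings and the firstImgDescriptors / secondImgDescriptors Mats from your code above (the 0.8 ratio is the value suggested in Lowe's paper and can be tuned):

// Brute-force Hamming matcher is appropriate for binary descriptors such as BRISK
DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

// For each query descriptor, keep the 2 nearest neighbours so the ratio can be computed
List<MatOfDMatch> knnMatches = new ArrayList<MatOfDMatch>();
matcher.knnMatch(firstImgDescriptors, secondImgDescriptors, knnMatches, 2);

// Lowe's ratio test: accept a match only if it is clearly better than the second-best candidate
List<DMatch> goodMatches = new ArrayList<DMatch>();
for (MatOfDMatch knn : knnMatches) {
    DMatch[] pair = knn.toArray();
    if (pair.length >= 2 && pair[0].distance < 0.8f * pair[1].distance) {
        goodMatches.add(pair[0]);
    }
}

System.out.println("good matches after ratio test: " + goodMatches.size());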

The second problem is that you are using BRISK to detect the features in your images. This OpenCV implementation has bugs (you can check here: http://code.opencv.org/issues/3976), so try another FeatureDetector such as FAST, ORB, etc. (the descriptor is fine, so you can keep using it).
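Swapping only the detector while keeping the BRISK descriptor is a two-line change in your code; a sketch, assuming the same OpenCV Java constants you are already using:

// Use ORB (or FAST) to detect keypoints, but keep BRISK for the descriptors
FeatureDetector featureDetector = FeatureDetector.create(FeatureDetector.ORB);
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.BRISK);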

I ended up testing on your pictures and managed to get some results with different detector/descriptor combinations (keypoints with no match are shown in yellow):

BRISK detector and descriptor: BRISK/BRISK

  • left image keypoints: 74
  • right image keypoints: 86
  • matches: 3 (even with the broken detector I got matches)

ORB detector with BRISK as descriptor: ORB/BRISK

  • left image keypoints: 499
  • right image keypoints: 500
  • matches: 26

ORB detector and descriptor: ORB/ORB

  • left image keypoints: 841
  • right image keypoints: 907
  • matches: 43

All results were obtained using the ratio test to remove false matches. I hope this helps!

EDIT:

BruteForceMatcher<Hamming> matcher;
vector< vector<DMatch> > matches;
vector<DMatch> goodMatches;
// 2-NN match: keep the two nearest neighbours for each query descriptor
matcher.knnMatch(imgDescriptors1, imgDescriptors2, matches, 2);
// Ratio Test
for (unsigned int matchIdx = 0; matchIdx < matches.size(); ++matchIdx) 
{
    const float ratio = 0.8; // As in Lowe's paper (can be tuned)
    if (matches[matchIdx][0].distance < ratio * matches[matchIdx][1].distance)
    {
        goodMatches.push_back(matches[matchIdx][0]);
    }
}