fbitterlich - 2 months ago
Objective-C Question

Working CIAreaHistogram / CIHistogramDisplayFilter example for NSImage

Somehow I cannot figure out how to get an actual, meaningful histogram image from an NSImage input using the CIAreaHistogram and CIHistogramDisplayFilter filters.

I read Apple's "Core Image Filter Reference" and the relevant posts here on SO, but whatever I try I get no meaningful output.

Here's my code so far:

- (void)testHist3:(NSImage *)image {
    CIContext *context = [[NSGraphicsContext currentContext] CIContext];

    NSBitmapImageRep *rep = [image bitmapImageRepresentation];
    CIImage *ciImage = [[CIImage alloc] initWithBitmapImageRep:rep];

    ciImage = [CIFilter filterWithName:@"CIAreaHistogram" keysAndValues:
                   kCIInputImageKey, ciImage,
                   @"inputExtent", ciImage.extent,
                   @"inputScale", [NSNumber numberWithFloat:1.0],
                   @"inputCount", [NSNumber numberWithFloat:256.0],
                   nil].outputImage;
    ciImage = [CIFilter filterWithName:@"CIHistogramDisplayFilter" keysAndValues:
                   kCIInputImageKey, ciImage,
                   @"inputHeight", [NSNumber numberWithFloat:100.0],
                   @"inputHighLimit", [NSNumber numberWithFloat:1.0],
                   @"inputLowLimit", [NSNumber numberWithFloat:0.0],
                   nil].outputImage;

    CGImageRef cgImage2 = [context createCGImage:ciImage fromRect:ciImage.extent];
    NSImage *img2 = [[NSImage alloc] initWithCGImage:cgImage2 size:ciImage.extent.size];

    NSLog(@"Histogram image: %@", img2);
    self.histImage = img2;
}

What I get is a 64x100 image with zero representations (=invisible). If I create the CI context with

CIContext *context = [[CIContext alloc] init];

then the resulting image is grey, but at least it does have a representation:

Histogram image: <NSImage 0x6100002612c0 Size={64, 100} Reps=(
"<NSCGImageSnapshotRep:0x6100002620c0 cgImage=<CGImage 0x6100001a1880>>" )>

The input image is a 1024x768 JPEG image.

I have little experience with Core Image or Core Graphics, so the mistake might be with the conversion back to NSImage... any ideas?

Edit 2016-10-26: With rickster's very comprehensive answer I was able to make a lot of progress.

Indeed it was the inputExtent parameter that was messing up my result. Supplying a CIVector there solved the problem. I found that you cannot leave it at the default either; I don't know what the default value is, but it is not the input image's full extent. (I found that out by running an image and a mirrored version of it through the filter; I got different histograms.)
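For reference, a minimal sketch of the fix (the vectorWithCGRect: conversion is the key part; ciImage here is assumed to be the input CIImage from the code above):

```objc
// Wrap the extent in a CIVector instead of passing the raw CGRect
CIFilter *histFilter = [CIFilter filterWithName:@"CIAreaHistogram"];
[histFilter setValue:ciImage forKey:kCIInputImageKey];
[histFilter setValue:[CIVector vectorWithCGRect:ciImage.extent]
              forKey:@"inputExtent"];
[histFilter setValue:@256 forKey:@"inputCount"];
[histFilter setValue:@1.0 forKey:kCIInputScaleKey];
CIImage *histImage = histFilter.outputImage;  // 256x1 image of histogram data
```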

Edit 2016-10-28:

So, I've got a working, displayable histogram now; my next step will be to figure out how the "intermediate" histogram (the 256x1 pixel image coming out of the filter) can contain the actual histogram information even though all but the last pixel are always (0, 0, 0, 0).
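(For anyone hitting the same wall: the bin values in that 256x1 image are floating-point data, which mostly rounds down to zero when read back as 8-bit RGBA. One way to inspect the actual values, sketched here and untested, is to render the histogram image into a float buffer; histImage is assumed to be the CIAreaHistogram output:)

```objc
// Render the 256x1 histogram image into a float RGBA buffer
float bins[256 * 4];
CIContext *ctx = [CIContext contextWithOptions:nil];
[ctx render:histImage
   toBitmap:bins
   rowBytes:256 * 4 * sizeof(float)
     bounds:CGRectMake(0, 0, 256, 1)
     format:kCIFormatRGBAf
 colorSpace:nil];
// bins[i*4 + 0..2] now hold the R, G, B values for bin i (scaled by inputScale)
```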


I presume the [image bitmapImageRepresentation] in your code is a local category method that's roughly equivalent to (NSBitmapImageRep *)image.representations[0]? Otherwise, first make sure that you're getting the right input.

Next, it looks like you're passing the raw output of ciImage.extent into your filter parameters; since the inputExtent parameter expects a CIVector object and not a CGRect struct, you're probably borking the input to your filter at run time. You can get more useful diagnostics for such problems by using the dictionary-based filter methods filterWithName:withInputParameters: or imageByApplyingFilter:withInputParameters:. That way, passing something that isn't an object (like a bare CGRect) is a compile-time error, and passing nil for a filter key raises a runtime exception instead of failing silently. The latter method also gives you an easy way to go straight from input image to output image, or to chain filters, without creating intermediary CIFilter objects and setting the input image on each.

A related tip: most of the parameters you're passing are the default values for those filters, so you can pass only the values you need:

CIImage *hist = [inputImage imageByApplyingFilter:@"CIAreaHistogram"
                              withInputParameters:@{ @"inputCount": @256 }];
CIImage *outputImage = [hist imageByApplyingFilter:@"CIHistogramDisplayFilter"
                                withInputParameters:nil];

Finally, you might still get an almost-all-gray image out of CIHistogramDisplayFilter depending on what your input image looks like, because all of the histogram bins may have very small bars. I get the following for Lenna:

[screenshot: histogram display with very small bars]

Increasing the value for kCIInputScaleKey can help with that.
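For example (the scale value here is just a starting point to experiment with, not a recommendation):

```objc
// A larger inputScale makes the bars taller in the display filter
CIImage *hist = [inputImage imageByApplyingFilter:@"CIAreaHistogram"
                              withInputParameters:@{ @"inputCount": @256,
                                                     kCIInputScaleKey: @10 }];
```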

Also, you don't need to go through CGImage to get from CIImage to NSImage — create an NSCIImageRep instead and AppKit will automatically manage a CIContext behind the scenes when it comes time to render the image for display/output.

// input from NSImage
NSBitmapImageRep *inRep = [nsImage bitmapImageRepresentation];
CIImage *inputImage = [[CIImage alloc] initWithBitmapImageRep:inRep];

CIImage *outputImage = // filter, rinse, repeat

// output to NSImage
NSCIImageRep *outRep = [NSCIImageRep imageRepWithCIImage:outputImage];
NSImage *outNSImage = [[NSImage alloc] init];
[outNSImage addRepresentation:outRep];