
Pixellating a UIImage returns UIImage with a different size

I'm using an extension to pixellate my images like the following:

func pixellated(scale: Int = 8) -> UIImage? {
    guard let ciImage = CIImage(image: self), let filter = CIFilter(name: "CIPixellate") else { return nil }

    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)

    guard let output = filter.outputImage else { return nil }

    return UIImage(ciImage: output)
}


The problem is that the image represented by self here does not have the same size as the one I create with UIImage(ciImage: output).

For example, using that code:

print("image.size BEFORE : \(image.size)")
if let imagePixellated = image.pixellated(scale: 48) {
image = imagePixellated
print("image.size AFTER : \(image.size)")
}


will print:

image.size BEFORE : (400.0, 298.0)
image.size AFTER : (848.0, 644.0)


Not the same size and not the same ratio.

Any idea why?

EDIT:

I added some prints in the extension as follows:

func pixellated(scale: Int = 8) -> UIImage? {
    guard let ciImage = CIImage(image: self), let filter = CIFilter(name: "CIPixellate") else { return nil }

    print("UIIMAGE : \(self.size)")
    print("ciImage.extent.size : \(ciImage.extent.size)")

    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)

    guard let output = filter.outputImage else { return nil }

    print("output : \(output.extent.size)")

    return UIImage(ciImage: output)
}


And here are the outputs:

UIIMAGE : (250.0, 166.5)
ciImage.extent.size : (500.0, 333.0)
output : (548.0, 381.0)

Answer

You have two problems:

  1. self.size is measured in points. self's size in pixels is actually self.size multiplied by self.scale (see the short snippet after this list).
  2. The CIPixellate filter changes the bounds of its image.
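
To make point one concrete, here is a minimal illustration (the asset name "photo" and the constant sampleImage are hypothetical) of how points, scale, and pixels relate for a @2x image like the one in your debug output:

import UIKit

// Illustration only: points vs. pixels for a @2x UIImage.
// "photo" is assumed to be a 500x333-pixel asset shipped as @2x.
if let sampleImage = UIImage(named: "photo"),
   let sampleCIImage = CIImage(image: sampleImage) {
    print(sampleImage.size)           // (250.0, 166.5) -- points
    print(sampleImage.scale)          // 2.0
    print(sampleImage.size.width * sampleImage.scale,
          sampleImage.size.height * sampleImage.scale)  // 500.0 333.0 -- pixels
    print(sampleCIImage.extent.size)  // (500.0, 333.0) -- CIImage works in pixel coordinates
}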

To fix problem one, you can simply set the scale property of the returned UIImage to be the same as self.scale:

return UIImage(ciImage: output, scale: self.scale, orientation: imageOrientation)

But you'll find this still isn't quite right. That's because of problem two. For problem two, the simplest solution is to crop the output CIImage:

// Must use self.scale, to disambiguate from the scale parameter
let floatScale = CGFloat(self.scale)
let pixelSize = CGSize(width: size.width * floatScale, height: size.height * floatScale)
let cropRect = CGRect(origin: CGPoint.zero, size: pixelSize)
guard let output = filter.outputImage?.cropping(to: cropRect) else { return nil }

This will give you an image of the size you want.
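
For reference, a sketch of the whole extension with both fixes applied might look like this (it keeps the Swift 3 cropping(to:) spelling used above; in Swift 4 and later the method is named cropped(to:)):

import UIKit
import CoreImage

extension UIImage {
    // Sketch: pixellate self and return an image with the same point size.
    func pixellated(scale: Int = 8) -> UIImage? {
        guard let ciImage = CIImage(image: self), let filter = CIFilter(name: "CIPixellate") else { return nil }

        filter.setValue(ciImage, forKey: kCIInputImageKey)
        filter.setValue(scale, forKey: kCIInputScaleKey)

        // Fix two: crop the output back to the input's pixel bounds,
        // because CIPixellate changes the extent of its image.
        // Must use self.scale, to disambiguate from the scale parameter.
        let floatScale = CGFloat(self.scale)
        let pixelSize = CGSize(width: size.width * floatScale, height: size.height * floatScale)
        let cropRect = CGRect(origin: CGPoint.zero, size: pixelSize)
        guard let output = filter.outputImage?.cropping(to: cropRect) else { return nil }

        // Fix one: pass self's scale (and orientation) through,
        // so the returned UIImage reports the same point size as self.
        return UIImage(ciImage: output, scale: self.scale, orientation: imageOrientation)
    }
}

With this version, the BEFORE and AFTER prints from the question should report the same size and ratio.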

Now, your next question may be, "why is there a thin, dark border around my pixellated images?" Good question! But ask a new question for that.
