gizmodo - 1 year ago
iOS Question

CIGaussianBlur Quality: Banding & Artifacts

I am trying to blur the AVCaptureVideoPreviewLayer on applicationWillResignActive just as in the stock iOS Camera App.

Since rasterizing the AVCaptureVideoPreviewLayer and blurring that produces an empty frame, I took a different approach: I keep a CIImage from captureOutput(_:didOutputSampleBuffer:fromConnection:), and on applicationWillResignActive I apply a CIGaussianBlur CIFilter to that CIImage, add a UIImageView over the AVCaptureVideoPreviewLayer, and set the UIImageView's image to the UIImage version of the blurred output.

This seems to be working okay so far. However, it produces quite a bit of banding and other quality issues in the Gaussian blur:

[Screenshot: visible banding in the blurred camera preview]

Grabbing the Frame:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmap = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
    let context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmap.rawValue)
    let quartzImage = CGBitmapContextCreateImage(context!)
    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    self.cameraBufferImage = CIImage(CGImage: quartzImage!)
}
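(As a side note, and purely my suggestion rather than part of the original question: the intermediate bitmap context and CGImage round-trip can likely be skipped, since CIImage can wrap the pixel buffer directly. A sketch, untested against the original project:)

```swift
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Wrap the camera's pixel buffer directly; no lock, bitmap context,
    // or CGImage copy needed.
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    self.cameraBufferImage = CIImage(CVPixelBuffer: imageBuffer)
}
```

One caveat: the CIImage retains the pixel buffer, so holding many of these across frames could starve the capture pipeline's buffer pool; keeping only the latest one, as here, should be fine.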

Applying the CIGaussianBlur:

func blurPreviewWindow() {
    let previewLayer = self.previewView.layer as? AVCaptureVideoPreviewLayer
    previewLayer?.connection.enabled = false

    let context = CIContext(options: nil)
    let inputImage = self.cameraBufferImage!.imageByApplyingOrientation(6)

    // Clamp first so the blur doesn't pull transparent pixels in at the edges.
    let clampFilter = CIFilter(name: "CIAffineClamp")
    clampFilter!.setValue(inputImage, forKey: kCIInputImageKey)

    if let currentFilter = CIFilter(name: "CIGaussianBlur") {
        currentFilter.setValue(clampFilter!.outputImage, forKey: kCIInputImageKey)
        currentFilter.setValue(50.0, forKey: kCIInputRadiusKey)

        if let output = currentFilter.outputImage {
            if let cgimg = context.createCGImage(output, fromRect: inputImage.extent) {
                let processedImage = UIImage(CGImage: cgimg)
                self.previewBlurImageView = UIImageView(frame: self.previewView.bounds)
                self.previewBlurImageView?.alpha = 0
                self.previewBlurImageView?.image = processedImage
                self.previewBlurImageView?.contentMode = .ScaleToFill
                // Note: assigning CIFilters to layer.filters is a no-op on iOS,
                // so that line has been dropped; the view is added instead.
                self.previewView.addSubview(self.previewBlurImageView!)

                UIView.animateWithDuration(0.2, delay: 0.0, options: [.BeginFromCurrentState], animations: {
                    self.previewBlurImageView?.alpha = 1
                }, completion: nil)
            }
        }
    }
}

Maybe there's a whole different approach to this in the iOS 10 era?


Could this be a color-space issue? The test device is an iPhone 7, which has a wide-color display.

Answer

It was the CIContext. Replacing this:

let context = CIContext(options: nil)

with this:

let context = CIContext(options: [kCIContextWorkingColorSpace: CGColorSpaceCreateDeviceRGB(),
                                  kCIContextOutputColorSpace: CGColorSpaceCreateDeviceRGB(),
                                  kCIContextUseSoftwareRenderer: false])

fixed it. No more banding or artifacts! Pinning both the working and output color spaces to device RGB sidesteps whatever conversion Core Image's default working space was performing on the wide-color device.
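For reference, a minimal sketch of how the fix can slot in. The property name `blurContext` is my own invention, not from the original code; the only idea it adds is creating the context once and reusing it, since CIContext creation is relatively expensive:

```swift
// Created once and reused for every render. Pinning the working and
// output color spaces to device RGB avoids the banding seen with the
// default working space on wide-color devices like the iPhone 7.
// (`blurContext` is an illustrative name, not from the original post.)
let blurContext = CIContext(options: [kCIContextWorkingColorSpace: CGColorSpaceCreateDeviceRGB(),
                                      kCIContextOutputColorSpace: CGColorSpaceCreateDeviceRGB(),
                                      kCIContextUseSoftwareRenderer: false])
```

blurPreviewWindow() would then call self.blurContext.createCGImage(output, fromRect: inputImage.extent) instead of building a fresh context on each invocation.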
