swiftcode - 4 months ago
iOS Question

Crop image to a square according to the size of a UIView/CGRect

I have an implementation of AVCaptureSession and my goal is for the user to take a photo and only save the part of the image within the red square border, as shown below:

UIViewController with an implementation of AVCaptureSession


The camera preview spans from the top left of the screen down to the bottom of my camera controls bar (the bar just above the view that contains the shutter button). My navigation bar and controls bar are semi-transparent, so the camera can show through.

I'm using
[captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
to ensure that the original image saved to the camera roll is full resolution, like the photos Apple's Camera app produces.

The user will be able to take the photo in portrait, landscape left and right, so the cropping method must take this into account.

So far, I've tried to crop the original image using this code:

DDLogVerbose(@"%@: Image crop rect: (%f, %f, %f, %f)", THIS_FILE, self.imageCropRect.origin.x, self.imageCropRect.origin.y, self.imageCropRect.size.width, self.imageCropRect.size.height);

// Create new image context (retina safe)
UIGraphicsBeginImageContextWithOptions(CGSizeMake(self.imageCropRect.size.width, self.imageCropRect.size.width), NO, 0.0);

// Create rect for image
CGRect rect = self.imageCropRect;

// Draw the image into the rect
[self.captureManager.stillImage drawInRect:rect];

// Get the cropped image, then end the image context
UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

However, when I look at the cropped image in the camera roll, it seems it has just squashed the original image rather than discarding the top and bottom parts as I'd like. It also leaves 53 pixels of white space at the top of the "cropped" image, likely because of the y position of my crop rect.

This is my logging output for the crop rect:

Image crop rect: (0.000000, 53.000000, 320.000000, 322.000000)

This also describes the frame of the red bordered view in the superview.

Is there something crucial I'm overlooking?

P.S. The original image size (taken with a camera in portrait mode) is:

Original image size: (2448.000000, 3264.000000)

Rob

You can crop images with CGImageCreateWithImageInRect. Note that the rect is in the bitmap's pixel coordinate space, so you must first scale your view-space crop rect up to the image's dimensions. Also release the CGImageRef when done, since Core Foundation objects are not managed by ARC:

CGImageRef imageRef = CGImageCreateWithImageInRect([uncroppedImage CGImage], bounds);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);