EBDOKUM - 5 months ago
Swift Question

UIImage to UIColor array of pixel colors

I'm sorry to ask this, but I can't figure out how to represent a UIImage as an array of UIColor values, one per pixel. I've tried my best with converting via UIImagePNGRepresentation/UIImageJPEGRepresentation, but couldn't get the desired result.

Answer

This is simply a Swift translation of Olie's answer to the same question in Objective-C. Make sure you give him an upvote as well.

extension UIImage {
    func colorArray() -> [UIColor] {
        guard let cgImage = self.cgImage else { return [] }

        let width = cgImage.width
        let height = cgImage.height
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        let bytesPerPixel = 4
        let bytesPerRow = bytesPerPixel * width
        let bitsPerComponent = 8
        var rawData = [UInt8](repeating: 0, count: width * height * bytesPerPixel)

        // Draw the image into an RGBA8888 bitmap so we can read the pixels back.
        let bitmapInfo = CGImageAlphaInfo.premultipliedLast.rawValue | CGBitmapInfo.byteOrder32Big.rawValue
        guard let context = CGContext(data: &rawData,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: bitsPerComponent,
                                      bytesPerRow: bytesPerRow,
                                      space: colorSpace,
                                      bitmapInfo: bitmapInfo) else { return [] }

        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: CGFloat(width), height: CGFloat(height)))

        var result: [UIColor] = []
        result.reserveCapacity(width * height)

        for y in 0..<height {
            for x in 0..<width {
                // Row-major: each row is bytesPerRow bytes, each pixel 4 bytes (RGBA).
                let byteIndex = (bytesPerRow * y) + (x * bytesPerPixel)

                let red   = CGFloat(rawData[byteIndex])     / 255.0
                let green = CGFloat(rawData[byteIndex + 1]) / 255.0
                let blue  = CGFloat(rawData[byteIndex + 2]) / 255.0
                let alpha = CGFloat(rawData[byteIndex + 3]) / 255.0

                result.append(UIColor(red: red, green: green, blue: blue, alpha: alpha))
            }
        }

        return result
    }
}
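Because the outer loop runs over rows, the returned array is laid out row by row: pixel (x, y) lands at index `y * width + x`. A minimal sketch of that arithmetic (the `pixelIndex` helper is hypothetical, not part of the answer above):

```swift
// Maps a 2-D pixel coordinate to its index in the flattened,
// row-major [UIColor] array returned by colorArray().
func pixelIndex(x: Int, y: Int, width: Int) -> Int {
    return y * width + x
}

// For a 640-pixel-wide image, the pixel at column 10, row 2:
let i = pixelIndex(x: 10, y: 2, width: 640)
print(i)  // 1290
```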

Note that this runs rather slowly: decoding a 15 MP image takes about 35 seconds in the simulator, even on a quad-core i7.
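Much of that cost is allocating millions of UIColor objects. If you only need per-pixel arithmetic rather than UIColor values, it is usually far faster to operate on the raw RGBA bytes directly. A minimal sketch, assuming the same RGBA8888 byte layout as the bitmap context above (`averageLuminance` is a hypothetical helper, not part of the answer):

```swift
// Computes average luminance straight from a raw RGBA byte buffer,
// skipping the per-pixel UIColor allocation entirely.
func averageLuminance(rgba: [UInt8]) -> Double {
    let pixelCount = rgba.count / 4
    guard pixelCount > 0 else { return 0.0 }
    var total = 0.0
    for p in 0..<pixelCount {
        let i = p * 4
        // Rec. 601 luma weights applied to the 0-255 channel values.
        total += 0.299 * Double(rgba[i])
               + 0.587 * Double(rgba[i + 1])
               + 0.114 * Double(rgba[i + 2])
    }
    return total / Double(pixelCount)
}

// Two pixels: pure white and pure black, both fully opaque.
let buffer: [UInt8] = [255, 255, 255, 255,  0, 0, 0, 255]
print(averageLuminance(rgba: buffer))
```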