freek - 4 months ago
Swift Question

Consistent binary data from images in Swift

For a small project, I'm making an iOS app which should do two things:


  1. take a picture

  2. take a hash from the picture data (and print it to the Xcode console)



Then, I want to export the picture to my laptop and confirm the hash. I tried exporting via AirDrop, Photos.app, email and iCloud (Photos.app compresses the photo and iCloud transforms it into a .png).

Problem is, I can't reproduce the hash. This means that the exported picture differs from the picture in the app. There are some variables I tried to rule out one by one. To get NSData from a picture, one can use the UIImagePNGRepresentation and UIImageJPEGRepresentation functions, which force the image into a format representation before extracting the data. To be honest, I'm not completely sure what these functions do (other than converting to NSData), but they clearly do something different, because they give different results compared to each other and compared to the exported data (which is .jpg).
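A minimal sketch of this mismatch (assuming `image` is a UIImage already loaded elsewhere): both functions re-encode the pixel data, so neither returns the original camera file's bytes, and the two encodings also differ from each other.

```swift
// Sketch (Swift 2): the two representation functions re-encode the image,
// so neither returns the bytes of the original camera file.
// `image` is assumed to be a UIImage loaded elsewhere in the app.
let pngData = UIImagePNGRepresentation(image)!       // lossless re-encode, no metadata
let jpegData = UIImageJPEGRepresentation(image, 1.0)! // lossy re-encode, no metadata

// The two encodings differ in both length and content:
print(pngData.length, jpegData.length)
print(pngData.isEqualToData(jpegData))  // false
```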

It is also unclear to me what Swift/Apple is doing to my (picture) data upon exporting. I read in several places that Apple transforms (or deletes) the EXIF data, but it is unclear to me which part. I tried to anticipate this by explicitly removing the EXIF data myself before hashing, both in the app (via the function ImageHelper.removeExifData, found here) and on the command line (via exiftool), but to no avail.

I also tried hashing an existing photo on my phone. I had a photo sent to me by mail, but hashing it in my app and on the command line gave different results. Hashing a string gave identical results in the app and on the command line, so the hash functions themselves are not the problem.
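That sanity check can be reproduced with the sha256_bin helper from the code below; a sketch (the digest shown is the well-known SHA-256 of "hello"; note the helper prints uppercase hex):

```swift
// Sanity check (sketch): hash a fixed string in the app...
let data = "hello".dataUsingEncoding(NSUTF8StringEncoding)!
print(sha256_bin(data))
// 2CF24DBA5FB0A30E26E83B2AC5B9E29E1B161E5C1FA7425E73043362938B9824

// ...and compare with the command line:
//   echo -n "hello" | shasum -a 256
```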

So my questions are:


  1. Is there a way to prevent transformation when exporting a photo?

  2. Are there alternatives to the UIImagePNGRepresentation / UIImageJPEGRepresentation functions?



(3. Is this at all possible, or is iOS/Apple too much of a black box?)

Any help or pointers to more documentation is greatly appreciated!




Here is my code:

//
// ViewController.swift
// camera test

import UIKit
import ImageIO
import MessageUI // needed for MFMailComposeViewControllerDelegate
// Note: CC_SHA256 comes from CommonCrypto, which in Swift 2 must be exposed
// via an Objective-C bridging header: #import <CommonCrypto/CommonCrypto.h>

// extension on NSData to enable conversion to a hex String
extension NSData {
    func toHexString() -> String {
        var hexString = ""
        let dataBytes = UnsafePointer<CUnsignedChar>(self.bytes)
        for i in 0..<self.length {
            hexString += String(format: "%02X", dataBytes[i])
        }
        return hexString
    }
}

// helper to remove EXIF data from image data
class ImageHelper {
    static func removeExifData(data: NSData) -> NSData? {
        guard let source = CGImageSourceCreateWithData(data, nil) else {
            return nil
        }
        guard let type = CGImageSourceGetType(source) else {
            return nil
        }
        let count = CGImageSourceGetCount(source)
        let mutableData = NSMutableData(data: data)
        guard let destination = CGImageDestinationCreateWithData(mutableData, type, count, nil) else {
            return nil
        }
        // Check the keys for what you need to remove.
        // As per the documentation, assign kCFNull to a key to remove it.
        let removeExifProperties: CFDictionary = [
            String(kCGImagePropertyExifDictionary): kCFNull,
            String(kCGImagePropertyOrientation): kCFNull
        ]

        for i in 0..<count {
            CGImageDestinationAddImageFromSource(destination, source, i, removeExifProperties)
        }

        guard CGImageDestinationFinalize(destination) else {
            return nil
        }

        return mutableData
    }
}



class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate, MFMailComposeViewControllerDelegate {

    @IBOutlet weak var imageView: UIImageView!

    // image picker used to take the picture
    var imagePicker: UIImagePickerController!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    // presents the camera
    @IBAction func cameraAction(sender: UIButton) {
        imagePicker = UIImagePickerController()
        imagePicker.delegate = self
        imagePicker.sourceType = .Camera

        presentViewController(imagePicker, animated: true, completion: nil)
    }

    // receives the picture taken with the camera
    func imagePickerController(picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : AnyObject]) {
        imagePicker.dismissViewControllerAnimated(true, completion: nil)
        imageView.image = info[UIImagePickerControllerOriginalImage] as? UIImage
    }

    // calls photoHash() when the hash button is pressed
    @IBAction func hashAction(sender: AnyObject) {
        photoHash()
    }

    // converts the latest picture to binary data, hashes it with SHA-256
    // and prints the hex digests to the console
    func photoHash() {
        let img = ImageHelper.removeExifData(UIImagePNGRepresentation(imageView.image!)!)
        let img2 = ImageHelper.removeExifData(UIImageJPEGRepresentation(imageView.image!, 1.0)!)
        let imgHash = sha256_bin(img!)
        let imgHash2 = sha256_bin(img2!)
        print(imgHash)
        print(imgHash2)

        // write image to photo library
        UIImageWriteToSavedPhotosAlbum(imageView.image!, nil, nil, nil)
    }

    // digests binary data into a SHA-256 hash, output: hex string
    func sha256_bin(data: NSData) -> String {
        var hash = [UInt8](count: Int(CC_SHA256_DIGEST_LENGTH), repeatedValue: 0)
        CC_SHA256(data.bytes, CC_LONG(data.length), &hash)
        let res = NSData(bytes: hash, length: Int(CC_SHA256_DIGEST_LENGTH))

        return res.toHexString()
    }
}


Specifications:

MacBook Pro Retina 2013, OS X 10.11.5

Xcode 7.3.1

Swift 2

iPhone 5S

hash on the command line via shasum -a 256 filename.jpg


Answer

Since posting my question last week, I have learned that Apple separates the image data from the metadata (the image data is stored in the UIImage object), so hashing the UIImage object will never produce the same hash as one digested on the command line (or in Python, or wherever). This is because on the command line the metadata is still present (even with a tool like exiftool, the EXIF data is standardized but still there), whereas in the app environment the EXIF data is simply not part of the UIImage.

Although there are ways to access the EXIF data (or metadata in general) associated with a UIImage, it is not easy. This is a deliberate limitation, partly to protect the privacy of the user.
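For completeness: while a UIImage itself carries no EXIF, metadata can still be inspected from the raw file bytes with ImageIO. A sketch, assuming `fileData` is an NSData holding the original file contents:

```swift
import ImageIO

// Sketch (Swift 2): inspect the metadata of raw image bytes.
// `fileData` is assumed to be an NSData with the original file contents.
guard let source = CGImageSourceCreateWithData(fileData, nil) else { return }
if let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as NSDictionary? {
    // The EXIF dictionary is only present if the bytes still contain it;
    // data produced by UIImagePNGRepresentation will not have it.
    print(properties[String(kCGImagePropertyExifDictionary)])
}
```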

Solution

I have found a solution to my specific problem via a different route: it turns out that iOS does save all the image data and metadata together in one place on disk for each photo. Using the Photos API, I can access them with this call (I found this in an answer on SO, but I don't remember how I ended up there; if you recognise this snippet, please let me know):

import Photos

func getLastPhoto() {
    let fetchOptions = PHFetchOptions()
    fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]

    let fetchResult = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)

    if let lastAsset = fetchResult.lastObject as? PHAsset {
        let manager = PHImageManager.defaultManager()
        let imageRequestOptions = PHImageRequestOptions()

        manager.requestImageDataForAsset(lastAsset, options: imageRequestOptions) {
            (imageData: NSData?, dataUTI: String?,
             orientation: UIImageOrientation,
             info: [NSObject : AnyObject]?) -> Void in

            // Do something with the NSData in imageData
        }
    }
}

Because the assets are sorted by creation date in ascending order, the last entry is the most recent photo. And as long as I don't load it into an imageView, I can do with the data what I want (sending it to a hash function, in this case).
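Applied to the hashing problem, the body of the closure above can simply feed the untouched on-disk bytes to the existing helper; a sketch, reusing sha256_bin from the question:

```swift
// Sketch (Swift 2): hash the raw on-disk bytes, metadata included.
manager.requestImageDataForAsset(lastAsset, options: imageRequestOptions) {
    (imageData: NSData?, dataUTI: String?,
     orientation: UIImageOrientation,
     info: [NSObject : AnyObject]?) -> Void in

    if let data = imageData {
        // Same bytes as the file exported via AirDrop,
        // so this should match `shasum -a 256 filename.jpg`
        print(self.sha256_bin(data))
    }
}
```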

So the flow is as follows: the user takes a photo, the photo is saved to the library and shown in the imageView. The user then presses the hash button, upon which the most recently added photo (the one in the imageView) is fetched from disk, metadata and all. I can then export the photo from the library via AirDrop (for now; an HTTPS request at a later stage) and reproduce the hash on my laptop.
