Paul Cezanne - 10 months ago
iOS Question

Reading the GPS data from the image returned by the camera in iOS (iPhone)

I need to get the GPS coordinates of an image taken with the iOS device's camera. I do not care about the Camera Roll images, just the image taken with UIImagePickerControllerSourceTypeCamera.

I've read many Stack Overflow answers, like Get Exif data from UIImage - UIImagePickerController, which either assume you are using the AssetsLibrary framework (which doesn't seem to work on camera images) or use CoreLocation to get the latitude/longitude from the app itself, not from the image.

Using CoreLocation is not an option. It will not give me the coordinates at the moment the shutter button was pressed. (With the CoreLocation-based solutions, you have to record the coordinates either before you bring up the camera view or after it is dismissed, and of course if the device is moving the coordinates will be wrong. This method should work even with a stationary device.)

I am iOS5 only, so I don't need to support older devices. This is also for a commercial product so I cannot use

So, what are my options for reading the GPS data from the image returned by the camera in iOS5? All I can think of right now is to save the image to Camera Roll and then use the AssetsLibrary, but that seems hokey.
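For reference, the "hokey" fallback the question mentions would look roughly like this. It is an untested sketch, not code from the post: write the capture to the Camera Roll with ALAssetsLibrary, then read the asset back and ask for its location property. (Note that for a plain camera capture there may still be no location to read, since the system does not geotag images handed to third-party apps.)

```objc
#import <AssetsLibrary/AssetsLibrary.h>

// Assumes `image` and `info` come from the picker's
// imagePickerController:didFinishPickingMediaWithInfo: callback.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:image.CGImage
                             metadata:[info objectForKey:UIImagePickerControllerMediaMetadata]
                      completionBlock:^(NSURL *assetURL, NSError *error) {
    [library assetForURL:assetURL
             resultBlock:^(ALAsset *asset) {
                 // ALAssetPropertyLocation yields a CLLocation, or nil if
                 // the asset carries no GPS metadata.
                 CLLocation *location = [asset valueForProperty:ALAssetPropertyLocation];
                 NSLog(@"Asset location: %@", location);
             }
            failureBlock:^(NSError *err) {
                 NSLog(@"Could not load asset: %@", err);
             }];
}];
```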


Here's the code I wrote based on Caleb's answer.

UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

// Re-encode the UIImage as a JPEG and hand it to ImageIO to read its properties.
NSData *jpeg = UIImageJPEGRepresentation(image, 1.0);
CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpeg, NULL);

// __bridge_transfer hands ownership of the copied dictionary to ARC;
// the image source itself still needs an explicit CFRelease.
NSDictionary *metadataNew = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
CFRelease(source);
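For what it's worth, a shorter route (not from the original post, so treat it as an untested sketch) is to skip the re-encoding entirely and read the metadata dictionary the picker already hands back in its info dictionary — though for camera captures it still contains no GPS block:

```objc
// Sketch: read the capture metadata straight from the picker's info
// dictionary (UIImagePickerControllerMediaMetadata, available since iOS 4.1).
NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
NSDictionary *gps = [metadata objectForKey:(NSString *)kCGImagePropertyGPSDictionary];
NSLog(@"metadata: %@", metadata); // Exif/TIFF blocks are present here...
NSLog(@"GPS: %@", gps);           // ...but this is nil for camera captures
```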


and my Console shows:

2012-04-26 14:15:37.137 ferret[2060:1799] {
    ColorModel = RGB;
    Depth = 8;
    Orientation = 6;
    PixelHeight = 1936;
    PixelWidth = 2592;
    "{Exif}" = {
        ColorSpace = 1;
        PixelXDimension = 2592;
        PixelYDimension = 1936;
    };
    "{JFIF}" = {
        DensityUnit = 0;
        JFIFVersion = (
            ...
        );
        XDensity = 1;
        YDensity = 1;
    };
    "{TIFF}" = {
        Orientation = 6;
    };
}
No latitude/longitude. (That is expected: a UIImage holds only pixel data, so UIImageJPEGRepresentation produces a fresh JPEG with no EXIF/GPS from the capture.)

Answer Source

One possibility is to leave CoreLocation running while the camera is visible. Record each CLLocation into an array along with the time of the sample. When the photo comes back, find its capture time, then pick the closest CLLocation from the array.

Sounds kludgy but it will work.
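The approach above could be sketched like this (class and method names are illustrative, not from the answer; this targets the iOS 5 delegate API the question requires):

```objc
#import <CoreLocation/CoreLocation.h>
#import <float.h>
#import <math.h>

// Records every location fix while the camera is up, then matches a
// photo's capture time against the recorded samples.
@interface LocationRecorder : NSObject <CLLocationManagerDelegate>
@property (nonatomic, strong) NSMutableArray *samples; // of CLLocation
@end

@implementation LocationRecorder

// iOS 5-era delegate callback (deprecated in iOS 6 in favor of
// locationManager:didUpdateLocations:).
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    // Each CLLocation already carries its own timestamp.
    [self.samples addObject:newLocation];
}

// Return the recorded fix whose timestamp is nearest the capture time.
- (CLLocation *)locationClosestToDate:(NSDate *)captureDate
{
    CLLocation *best = nil;
    NSTimeInterval bestDelta = DBL_MAX;
    for (CLLocation *sample in self.samples) {
        NSTimeInterval delta = fabs([sample.timestamp timeIntervalSinceDate:captureDate]);
        if (delta < bestDelta) {
            bestDelta = delta;
            best = sample;
        }
    }
    return best;
}

@end
```

If the device is stationary, any recorded fix will do; the timestamp matching only matters when the device moves between bringing up the camera and pressing the shutter.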