Humbertda - 2 months ago
iOS Question

How to calculate FOV?

Initial Context

I am developing a location-based augmented reality application and I need to get the field of view (FOV). (I only update the value when the orientation changes, so I am looking for a method that returns this value when I call it.)

The goal is to make a "degree ruler" that matches reality, like the following:
Degree Ruler - AR App

I am already using AVCaptureVideoPreviewLayer to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This works pretty well, but now I have to use the field-of-view value to place my elements in the right place (to choose the right spacing between 160° and 170°, for example!).
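To make that concrete, here is a minimal sketch of the kind of mapping I mean, converting a compass bearing into an x position on screen once the horizontal FOV is known (the helper name and its parameters are hypothetical):

#import <UIKit/UIKit.h>

// Hypothetical helper: maps a compass bearing to an x position on screen.
// `heading` is the direction the camera points; `fovDegrees` is the
// horizontal field of view; both are in degrees.
static CGFloat xForBearing(CGFloat bearing, CGFloat heading, CGFloat fovDegrees, CGFloat screenWidth)
{
    CGFloat offset = bearing - heading;          // degrees away from the screen center
    while (offset > 180.0)  { offset -= 360.0; } // normalize to [-180, 180]
    while (offset < -180.0) { offset += 360.0; }
    return screenWidth / 2.0 + (offset / fovDegrees) * screenWidth;
}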

At the moment I am hardcoding these values, based on sources I found (special thanks to @hotpaw2!), but I am not sure they are fully precise, and they do not handle the iPhone 5, etc. I was unable to obtain values from official sources (Apple!), but there is a link showing values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iphone 5s camera improvements.

Note: after personal tests and some more research online, I am pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I am running on, so I can initialize my FOV manually... and I have to verify my values for every supported device.

I would prefer a "code solution" !

After reading this post (iPhone: Real-time video color info, focal length, aperture?), I am trying to get the EXIF data from AVCaptureStillImageOutput as suggested. From that I should be able to read the focal length from the EXIF data, and then calculate the horizontal and vertical field of view via a formula! (Or maybe obtain the FOV directly, as shown here; note: after a certain number of updates, it seems that we can't get the field of view directly from EXIF!)

Current Status

Sources: Modified EXIF data doesn't save properly, among others.

Here is the code I am using:

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera != nil)
{
    captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];
    [captureSession addInput:newVideoInput];

    captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    captureLayer.frame = overlayCamera.bounds;
    [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
    [overlayCamera.layer addSublayer:captureLayer];
    [captureSession startRunning];

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    // Find the video connection of the still-image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

        // __bridge (not __bridge_retained): ARC keeps ownership of imageNSData.
        CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);

        // __bridge_transfer: we own the copied properties dictionary.
        NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);

        NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];

        NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
        if (EXIFDictionary == nil)
            EXIFDictionary = [NSMutableDictionary dictionary];

        [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

        NSLog(@"%@", EXIFDictionary); // prints the output shown below

        CFRelease(imgSource);
    }];
}
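For reference, once the EXIF dictionary is in hand, the two values relevant to the formula below can be read via the ImageIO key constants (a minimal sketch; the variable names are mine):

NSNumber *focalLength = EXIFDictionary[(NSString *)kCGImagePropertyExifFocalLength];      // mm, e.g. 4.28
NSNumber *focal35mm = EXIFDictionary[(NSString *)kCGImagePropertyExifFocalLenIn35mmFilm]; // 35mm equivalent, e.g. 40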


Here is the output:

ApertureValue = "2.52606882168926";
BrightnessValue = "0.5019629837352776";
ColorSpace = 1;
ComponentsConfiguration = (
ExifVersion = (
ExposureMode = 0;
ExposureProgram = 2;
ExposureTime = "0.008333333333333333";
FNumber = "2.4";
Flash = 16;
FlashPixVersion = (
FocalLenIn35mmFilm = 40;
FocalLength = "4.28";
ISOSpeedRatings = (
LensMake = Apple;
LensModel = "iPhone 4S back camera 4.28mm f/2.4";
LensSpecification = (
MeteringMode = 5;
PixelXDimension = 1920;
PixelYDimension = 1080;
SceneCaptureType = 0;
SceneType = 1;
SensingMethod = 2;
ShutterSpeedValue = "6.906947890818858";
SubjectDistance = "69.999";
UserComment = "[S.D.] kCGImagePropertyExifUserComment";
WhiteBalance = 0;

I think I have everything I need to calculate the FOV. But are these the right values? After reading lots of different websites giving different focal-length values, I am a bit confused! Also, my PixelXDimension/PixelYDimension values seem to be wrong!

This is the formula I planned to use:

FOV = (IN_DEGREES( 2*atan( (d) / (2 * f) ) ));
// d = sensor dimensions (mm)
// f = focal length (mm)
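As a sanity check against the EXIF output above, the physical sensor size can be avoided entirely by plugging the 35mm-equivalent focal length and the 36mm full-frame width into that same formula (a rough sketch; strictly speaking the 35mm equivalent is usually defined via the frame diagonal, so this is only an approximation):

#include <math.h>

// Approximate horizontal FOV from the 35mm-equivalent focal length,
// treating it as if it were measured against the 36mm full-frame width.
static double horizontalFOVDegrees(double focalLenIn35mmFilm)
{
    return 2.0 * atan(36.0 / (2.0 * focalLenIn35mmFilm)) * 180.0 / M_PI;
}

// With FocalLenIn35mmFilm = 40 (from the output above):
// 2 * atan(36 / 80) ≈ 48.5 degrees horizontal.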

My Question

Do my method and my formula look right? And if so, which values should I pass to the function?


  • FOV is what I think I need to use; but if you have any other suggestion for how the ruler can match reality, I would accept that answer!

  • Zoom is disabled in the augmented reality view controller, so my field of view is fixed when the camera is initialized and can't change until the user rotates the phone!

Also sorry for my English mistakes, I'm French...


In iOS 7 and above you can do something along these lines:

float FOV = camera.activeFormat.videoFieldOfView;

where camera is your AVCaptureDevice. Depending on what preset you choose for the video session, this can change even on the same device. It's the horizontal field-of-view (in degrees), so you'll need to calculate the vertical field-of-view from the display dimensions.
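For the vertical value, here is a minimal sketch of the usual pinhole-model conversion, assuming the preview aspect ratio matches the video format (captureLayer stands in for the preview layer from the question; note that AVLayerVideoGravityResizeAspectFill crops the image, so this is approximate):

float hFOV = camera.activeFormat.videoFieldOfView;   // horizontal, in degrees

CGSize size = captureLayer.bounds.size;
double hRad = hFOV * M_PI / 180.0;
// tan(vFOV/2) = tan(hFOV/2) * height/width for the same focal length
double vRad = 2.0 * atan(tan(hRad / 2.0) * (size.height / size.width));
float vFOV = (float)(vRad * 180.0 / M_PI);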

Here's Apple's reference material.