frangulyan - 5 months ago
Objective-C Question

iOS: AVPlayer - getting a snapshot of the current frame of a video

I have spent the whole day and went through a lot of SO answers, Apple references, documentations, etc, but no success.

I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.

My video is an m3u8 file located on the internet; it plays normally in the AVPlayerLayer without any problems.

What have I tried:

  1. AVAssetImageGenerator
    It is not working: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to an answer here, it doesn't work for streaming videos.

  2. Taking a snapshot of the player view. I tried renderInContext: first, but then I realized that it does not render this kind of "special" layer. Then I found drawViewHierarchyInRect:afterScreenUpdates:, a method introduced in iOS 7 which should be able to render the special layers too, but no luck: I still got the UI snapshot with a blank black area where the video is shown.

  3. AVPlayerItemVideoOutput
    I have added a video output to my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.

  4. AVAssetReader
    I was thinking of trying it, but decided not to lose time after finding a related question here.

So isn't there any way to get a snapshot of something I am already seeing on the screen right now? I can't believe it.


AVPlayerItemVideoOutput works fine for me from an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime: and simply call copyPixelBufferForItemTime:itemTimeForDisplay:? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to convert one to the other.

This answer is mostly cribbed from here.

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}

- (IBAction)grabFrame {
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    if (buffer) {
        CVBufferRelease(buffer);    // copyPixelBufferForItemTime: returns an owned buffer, so release it
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL *someUrl = [NSURL URLWithString:@""];    // put your m3u8 URL here
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

        NSError *error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        } else {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end
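Since grabFrame above only logs the CVPixelBufferRef, here is one common way to turn it into a UIImage via Core Image. This is a sketch, not part of the original answer; the method name imageFromPixelBuffer: is my own, but the CIImage/CIContext/CoreVideo calls are standard API:

#import <CoreImage/CoreImage.h>

- (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)buffer {
    // Wrap the pixel buffer in a CIImage, then render it to a CGImage.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(buffer),
                             CVPixelBufferGetHeight(buffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);    // createCGImage:fromRect: returns an owned reference
    return image;
}

You could call this from grabFrame in place of the NSLog, remembering to still release the pixel buffer afterwards.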