pasawaya - 4 months ago
Objective-C Question

Processing a video file with OpenCV on iOS (no live feed)

There are a bunch of existing tutorials and SO questions on setting up OpenCV on iOS to process data from the iPhone camera. They all involve using CvVideoCameraDelegate and its processImage: method. I'm not interested in that. I'm interested in loading a video from the camera roll and then processing and displaying that video in real time for the user.

Loading from the camera roll is trivial and covered here. I'm more concerned with the actual display of the video feed from the cv::VideoCapture object. Does anyone have any idea how to feed the cv::VideoCapture output into a UIImageView?

Answer

So it turns out it's not too difficult to do. The basic outline is:

  1. Create a cv::VideoCapture to read from a file
  2. Create a CALayer to receive and display each frame.
  3. Run a method at a given rate that reads and processes each frame.
  4. Once done processing, convert each cv::Mat to a CGImageRef and display it on the CALayer.

The actual implementation is as follows:

Step 1: Create cv::VideoCapture

// `capture` is a cv::VideoCapture instance variable on this class.
std::string filename = "/Path/To/Video/File";
capture = cv::VideoCapture(filename);
if (!capture.isOpened()) NSLog(@"Could not open %s", filename.c_str());

Step 2: Create the Output CALayer

self.previewLayer = [CALayer layer];
self.previewLayer.frame = CGRectMake(0, 0, width, height);
[self.view.layer addSublayer:self.previewLayer];

Step 3: Create Processing Loop w/ GCD

int kFPS = 30;

dispatch_queue_t queue = dispatch_queue_create("timer", DISPATCH_QUEUE_SERIAL);
self.timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);
// Note: 1.0 / kFPS, not 1 / kFPS — integer division would truncate the interval to 0.
dispatch_source_set_timer(self.timer, dispatch_walltime(NULL, 0), (1.0 / kFPS) * NSEC_PER_SEC, (0.5 / kFPS) * NSEC_PER_SEC);

dispatch_source_set_event_handler(self.timer, ^{
    dispatch_async(dispatch_get_main_queue(), ^{
        [self processNextFrame];
    });
});

dispatch_resume(self.timer);

Step 4: Processing Method

-(void)processNextFrame {
    /* Read — read() returns false once the video runs out of frames,
       so stop the timer rather than handing an empty Mat to cvtColor. */
    cv::Mat frame;
    if (!capture.read(frame)) {
        dispatch_source_cancel(self.timer);
        return;
    }

    /* Process */
    ...

    /* Convert and Output to CALayer */
    cvtColor(frame, frame, CV_BGR2RGB);  // OpenCV decodes frames as BGR; Core Graphics expects RGB
    NSData *data = [NSData dataWithBytes:frame.data
                                  length:frame.elemSize() * frame.total()];

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = (frame.elemSize() == 3) ? kCGImageAlphaNone : kCGImageAlphaNoneSkipFirst;
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef) data);

    CGImageRef imageRef = CGImageCreate(frame.cols,
                                        frame.rows,
                                        8,
                                        8 * frame.elemSize(),
                                        frame.step[0],
                                        colorSpace,
                                        bitmapInfo,
                                        provider,
                                        NULL,
                                        false,
                                        kCGRenderingIntentDefault);

    self.previewLayer.contents = (__bridge id)imageRef;

    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);  // CGImageCreate retains the provider; release our reference
    CGColorSpaceRelease(colorSpace);
}