Marcin Kapusta - 1 month ago
iOS Question

iOS: correct way to update the recording time since the beginning of the recording

How do I update a time label during recording when using this capture pipeline in AVFoundation:

Mic -> AVCaptureDeviceInput -> AVCaptureSession -> AVCaptureAudioDataOutput


AVCaptureAudioDataOutput has a delegate, and an AVAssetWriter writes the sample buffers to the output file. In this setup I would like to display the recording time. I can record timestamps when the user taps the Record/Pause/Resume/Stop buttons, but is there an official way from Apple to update a UILabel showing 00:00:00 from the beginning of the recording? I know I can use an NSTimer, but when I worked with AVAudioRecorder I had this property:

var currentTime: TimeInterval
// The time, in seconds, since the beginning of the recording.


The property also worked correctly with the pause/continue buttons.

Now, in this new pipeline, I don't have this property, and I wonder if I'm missing something, because I cannot find it anywhere. I also noticed that AVCaptureSession has a CMClock property named masterClock, but I'm not sure how to use it to calculate the time since the beginning of the recording.

I don't understand the whole idea of clocks and functions like:

CMClockGetHostTimeClock()
CMSyncConvertTime


How do I obtain the number of seconds since the beginning of the recording in an AVCaptureAudioDataOutputSampleBufferDelegate method, based on the sampleBuffer?
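To make the question concrete, here is roughly where I want that value. This is only a sketch: the delegate method is the current AVFoundation signature, and the startTime bookkeeping is my own guess, not something from Apple's docs:

```swift
import AVFoundation

final class AudioCaptureDelegate: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    // Presentation timestamp of the first buffer; nil until the first callback.
    private var startTime: CMTime?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if startTime == nil { startTime = pts }
        let elapsed = CMTimeGetSeconds(CMTimeSubtract(pts, startTime!))
        // elapsed = seconds since the first buffer -- is this the right approach?
    }
}
```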

UPDATE

I discovered a class called CMTimebase, and I think this is exactly what I was looking for. I don't yet know how to use it properly, but here is what I found after some experimentation in a playground.

import CoreMedia

let masterClock = CMClockGetHostTimeClock() // This masterclock should be from captureSession
var timebase: CMTimebase? = nil
CMTimebaseCreateWithMasterClock(kCFAllocatorDefault, masterClock, &timebase)

print(CMTimeGetSeconds(CMTimebaseGetTime(timebase!)))

CMTimebaseSetRate(timebase!, 1.0) // Start/Resume recording

sleep(1)
print(CMTimeGetSeconds(CMTimebaseGetTime(timebase!)))

sleep(1)
CMTimebaseSetRate(timebase!, 0.0) // Pause Recording
print(CMTimeGetSeconds(CMTimebaseGetTime(timebase!)))


sleep(1)
print(CMTimeGetSeconds(CMTimebaseGetTime(timebase!)))


So this object is really useful. When I set the rate to 1.0, it starts measuring time in sync with the master clock. When I set the rate to 0.0, it stops measuring time. I also noticed that I can attach a timer to this object. Unfortunately, there is no good tutorial on this Core Media material at all, just the API documentation on Apple's site. Does anybody know of a good tutorial or WWDC video on the Core Media framework?
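Based on that experimentation, here is a little pause-aware stopwatch I sketched around CMTimebase. The class name and its API are my own invention, and in a real app the master clock should come from captureSession.masterClock rather than the host clock:

```swift
import CoreMedia

// Hypothetical pause-aware stopwatch built on CMTimebase (names are mine).
final class RecordingStopwatch {
    private var timebase: CMTimebase?

    init(masterClock: CMClock) {
        // The timebase ticks in sync with the given master clock.
        CMTimebaseCreateWithMasterClock(kCFAllocatorDefault, masterClock, &timebase)
    }

    func start() { if let tb = timebase { CMTimebaseSetRate(tb, 1.0) } } // begin/resume
    func pause() { if let tb = timebase { CMTimebaseSetRate(tb, 0.0) } }
    func reset() { if let tb = timebase { CMTimebaseSetTime(tb, kCMTimeZero) } }

    // Seconds accumulated while the rate was 1.0.
    var seconds: Double {
        guard let tb = timebase else { return 0 }
        return CMTimeGetSeconds(CMTimebaseGetTime(tb))
    }
}
```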

Answer

When you're dealing with AVCaptureAudioDataOutput, you're working with sample buffers. Regardless of whether it's video or audio, you need two critical pieces of information: the sample data, and the time at which that sample data was recorded.

This is the "Clock" https://developer.apple.com/reference/coremedia/cmclock

A clock represents a source of time information: generally, a piece of hardware that measures the passage of time.

It's highly accurate, and that's really the point. It's read-only: you can only grab a reference from it; you can't set it. All sorts of things can happen to buffers: they get queued, they can be forwarded over a network. But if you have that CMTime reference, they can be reconstructed in the right order and at the right time.

CMSync is a set of functions that deal with clock drift (two different clocks reporting slightly different values). You don't need to worry about these at all here.

Actually, doing what you want is very simple. Here's an example you can cut and paste into a playground:

import AVFoundation

let masterClock : CMClock = CMClockGetHostTimeClock()
let startTime : CMTime = CMClockGetTime(masterClock)
sleep(1)
let endTime : CMTime = CMClockGetTime(masterClock)
let difference : CMTime = CMTimeSubtract(endTime, startTime)
let seconds = CMTimeGetSeconds(difference)

That should give you a seconds value slightly greater than 1.

When the user taps the record button in your application, grab the AVCaptureSession clock and capture the start time.

When you want to update the UI, just check the AVCaptureSession clock again, subtract the two CMTimes, and convert to seconds.

Just make sure that you don't update the UI on every sample buffer; that would be far too frequent. Do it only every 14,000 samples or so (or whatever works for the audio sample rate you're using). Also remember that the delegate callback arrives on a background queue, so dispatch the actual label update to the main thread.
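A minimal sketch of that throttling, assuming a few names that aren't in your code (timeLabel, startTime) and an arbitrary 0.25-second refresh interval in place of a sample count:

```swift
import AVFoundation
import UIKit

final class RecordingTimerDelegate: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    let timeLabel = UILabel()   // assumed: wire this to your actual label
    var startTime: CMTime?      // assumed: set from the session clock when recording starts
    private var lastUpdate = kCMTimeZero

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let start = startTime else { return }
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Refresh the label at most ~4 times per second.
        guard CMTimeGetSeconds(CMTimeSubtract(pts, lastUpdate)) >= 0.25 else { return }
        lastUpdate = pts
        let s = Int(CMTimeGetSeconds(CMTimeSubtract(pts, start)))
        DispatchQueue.main.async {
            // UI work must happen on the main thread.
            self.timeLabel.text = String(format: "%02d:%02d:%02d",
                                         s / 3600, (s / 60) % 60, s % 60)
        }
    }
}
```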

Hope this helps and explains a little bit on the Clocks. Good luck!
