HariKarthick - 18 days ago
iOS Question

How can I reduce the size of a merged audio file?

I am merging two audio files in Swift 2.3.

The merge works fine, but the resulting file is 7 MB, so I am not able to send it to the server.

How can I reduce the audio file size so that I can easily send it to the server?
I have attached the code below; it may be useful for anyone else trying to merge audio files:

func mixAudio() {
    let currentTime = CFAbsoluteTimeGetCurrent()
    let composition = AVMutableComposition()

    // First audio track
    let compositionAudioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    compositionAudioTrack.preferredVolume = 0.8
    let avAsset = AVURLAsset.init(URL: soundFileURL, options: nil)
    print("\(avAsset)")
    let tracks = avAsset.tracksWithMediaType(AVMediaTypeAudio)
    let clipAudioTrack = tracks[0]
    do {
        try compositionAudioTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, avAsset.duration), ofTrack: clipAudioTrack, atTime: kCMTimeZero)
    } catch _ {
    }

    // Second audio track (set preferredVolume on compositionAudioTrack1, not on the first track again)
    let compositionAudioTrack1 = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    compositionAudioTrack1.preferredVolume = 0.8
    let avAsset1 = AVURLAsset.init(URL: soundFileURL1)
    print(avAsset1)
    let tracks1 = avAsset1.tracksWithMediaType(AVMediaTypeAudio)
    let clipAudioTrack1 = tracks1[0]
    do {
        try compositionAudioTrack1.insertTimeRange(CMTimeRangeMake(kCMTimeZero, avAsset1.duration), ofTrack: clipAudioTrack1, atTime: kCMTimeZero)
    } catch _ {
    }

    // Build the output path in the Library directory
    let paths = NSSearchPathForDirectoriesInDomains(.LibraryDirectory, .UserDomainMask, true)
    let libraryDirectory = paths[0]
    let requiredOutputPath = libraryDirectory.stringByAppendingString("/newone.m4a")
    print("requiredOutputPath is \n\(requiredOutputPath)")

    let audioFileOutput = NSURL.fileURLWithPath(requiredOutputPath)
    print("output URL is \n\(audioFileOutput)")

    // Remove any file left over from a previous export
    do {
        try NSFileManager.defaultManager().removeItemAtURL(audioFileOutput)
    } catch _ {
    }

    // Export the composition as an M4A file
    let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)
    exporter!.outputURL = audioFileOutput
    exporter!.outputFileType = AVFileTypeAppleM4A

    exporter!.exportAsynchronouslyWithCompletionHandler { () -> Void in
        print("export complete: \(CFAbsoluteTimeGetCurrent() - currentTime)")
        do {
            // Play the merged file to verify the export
            self.wasteplayer = try AVAudioPlayer(contentsOfURL: audioFileOutput)
            self.wasteplayer.numberOfLoops = 0
            self.wasteplayer.play()
        } catch _ {
        }
    }
}

Answer

You should provide an audio settings dictionary to your writer. Here are the formats you can look into, with their relative output sizes for the same recording (smaller is better):

kAudioFormatMPEG4AAC : 164 (lowest)

kAudioFormatAppleLossless : 430

kAudioFormatAppleIMA4 : 475

kAudioFormatULaw : 889

kAudioFormatALaw : 889

Use kAudioFormatMPEG4AAC; it is the smallest of them all.

var channelLayout = AudioChannelLayout()
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono

let audioOutputSettings: [String : AnyObject] = [
    AVFormatIDKey : Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey : 44100,
    AVNumberOfChannelsKey : 1,
    AVChannelLayoutKey : NSData(bytes: &channelLayout, length: sizeof(AudioChannelLayout))
]

These are the settings in Swift. Note that AVAssetExportSession, which your code uses, does not take an audio settings dictionary; the dictionary is meant for an AVAssetWriterInput, as in the sketch below.
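For completeness, here is a minimal sketch (Swift 2.3) of re-encoding the composition through AVAssetReader/AVAssetWriter with the settings above. It assumes the composition and output NSURL from your question; the function name compressComposition and the 64 kbps AVEncoderBitRateKey value are my own illustrative choices, not something from your original code.

import AVFoundation

// Minimal sketch: decode the merged composition and re-encode it as AAC
// with an explicit settings dictionary, which is where the size reduction comes from.
func compressComposition(composition: AVAsset, toURL audioFileOutput: NSURL) throws {
    var channelLayout = AudioChannelLayout()
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono

    let audioOutputSettings: [String : AnyObject] = [
        AVFormatIDKey : Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey : 44100,
        AVNumberOfChannelsKey : 1,
        AVEncoderBitRateKey : 64000, // assumption: 64 kbps is enough for voice recordings
        AVChannelLayoutKey : NSData(bytes: &channelLayout, length: sizeof(AudioChannelLayout))
    ]

    // Read decoded audio from the composition...
    let reader = try AVAssetReader(asset: composition)
    let readerOutput = AVAssetReaderAudioMixOutput(
        audioTracks: composition.tracksWithMediaType(AVMediaTypeAudio),
        audioSettings: nil)
    reader.addOutput(readerOutput)

    // ...and write it back out as AAC with the settings above.
    let writer = try AVAssetWriter(URL: audioFileOutput, fileType: AVFileTypeAppleM4A)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioOutputSettings)
    writer.addInput(writerInput)

    writer.startWriting()
    reader.startReading()
    writer.startSessionAtSourceTime(kCMTimeZero)

    let queue = dispatch_queue_create("audio.compress.queue", DISPATCH_QUEUE_SERIAL)
    writerInput.requestMediaDataWhenReadyOnQueue(queue) {
        while writerInput.readyForMoreMediaData {
            if let buffer = readerOutput.copyNextSampleBuffer() {
                writerInput.appendSampleBuffer(buffer)
            } else {
                // No more samples: finish the file.
                writerInput.markAsFinished()
                writer.finishWritingWithCompletionHandler {
                    print("compressed file written to \(audioFileOutput)")
                }
                break
            }
        }
    }
}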

At this line:

let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetAppleM4A)

can you try this instead:

let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetLowQuality)
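One caveat: the low-quality preset may not list AVFileTypeAppleM4A among its supported output types, so it is worth checking exporter.supportedFileTypes before assigning outputFileType. A small sketch of that check, assuming the same composition and audioFileOutput as in your code:

// Sketch: pick an output file type the preset actually supports.
// The low-quality presets typically write QuickTime/MPEG-4 containers rather
// than .m4a -- this is an assumption worth verifying on a device.
if let exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetLowQuality) {
    print("supported types: \(exporter.supportedFileTypes)")
    exporter.outputURL = audioFileOutput
    exporter.outputFileType = exporter.supportedFileTypes.first // e.g. AVFileTypeQuickTimeMovie
    exporter.exportAsynchronouslyWithCompletionHandler {
        print("export status: \(exporter.status.rawValue), error: \(exporter.error)")
    }
}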