
Use renderedBuffer as HTML5 Audio tag

So I've used the Web Audio API to create music from code. I used an OfflineAudioContext to render the music, and its oncomplete handler looks something like this:

function(e) {
    // Create a regular (realtime) audio context and play the rendered buffer through it
    var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
    var song = audioCtx.createBufferSource();
    song.buffer = e.renderedBuffer;
    song.connect(audioCtx.destination);
    song.start();
}
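
For reference, the offline context is set up roughly like this (the channel count, length and sample rate here are just example values, not the real ones):

var offaudioctx = new OfflineAudioContext(2, 44100 * 30, 44100); // 2 channels, 30 s at 44.1 kHz (example values)
// ...create source nodes and connect them to offaudioctx.destination...
offaudioctx.oncomplete = function(e) { /* the handler above */ };
offaudioctx.startRendering(); // fires oncomplete with e.renderedBuffer when rendering is done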


This plays the sound, and it works. But I would like to store it in an <audio> element instead, because that makes it easier to play, loop, pause and stop, and I need to reuse the song.

Is this possible? I've been googling for days, but I can't find how!

The idea was to use var song = new Audio() and then somehow copy e.renderedBuffer into it.
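
In other words, something like this (copyBufferToAudio is made up; finding the real way to do that step is exactly what I'm missing):

var song = new Audio();
copyBufferToAudio(song, e.renderedBuffer); // made-up function, no such API exists -- this is the missing step
song.loop = true;
song.play();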

Answer

Ok, so I found this code floating around: http://codedbot.com/questions-/911767/web-audio-api-output. I've also created a copy here: http://pastebin.com/rE9a1PaX.

I've managed to use this code to create and store an audio element on the fly, using the functions provided in that link.

offaudioctx.oncomplete = function(e) {
    var buffer   = e.renderedBuffer;
    var UintWave = createWaveFileData(buffer);    // encode the AudioBuffer as a WAV file (Uint8Array)
    var base64   = btoa(uint8ToString(UintWave)); // base64-encode the raw bytes

    songsarr.push(document.createElement('audio'));
    songsarr[songsarr.length - 1].src = "data:audio/wav;base64," + base64;
    console.log("completed!");
};
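
The two helpers come from the linked code. For reference, here is roughly what they do; this is a reconstruction of a standard 16-bit PCM WAV writer, so the details may not match the original exactly:

// Converts a Uint8Array to a binary string so btoa() can base64-encode it.
function uint8ToString(buf) {
    var str = '';
    for (var i = 0; i < buf.length; i++) {
        str += String.fromCharCode(buf[i]);
    }
    return str;
}

// Builds a 16-bit PCM WAV file (44-byte header + interleaved samples) from an AudioBuffer.
function createWaveFileData(audioBuffer) {
    var numChannels    = audioBuffer.numberOfChannels;
    var sampleRate     = audioBuffer.sampleRate;
    var numFrames      = audioBuffer.length;
    var bytesPerSample = 2;
    var dataSize       = numFrames * numChannels * bytesPerSample;
    var buffer         = new ArrayBuffer(44 + dataSize);
    var view           = new DataView(buffer);

    function writeString(offset, s) {
        for (var i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
    }

    // RIFF/WAVE header
    writeString(0, 'RIFF');
    view.setUint32(4, 36 + dataSize, true);
    writeString(8, 'WAVE');
    writeString(12, 'fmt ');
    view.setUint32(16, 16, true);                                        // fmt chunk size
    view.setUint16(20, 1, true);                                         // audio format: PCM
    view.setUint16(22, numChannels, true);
    view.setUint32(24, sampleRate, true);
    view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
    view.setUint16(32, numChannels * bytesPerSample, true);              // block align
    view.setUint16(34, 16, true);                                        // bits per sample
    writeString(36, 'data');
    view.setUint32(40, dataSize, true);

    // Interleave channels and clamp floats to signed 16-bit integers.
    var offset = 44;
    for (var frame = 0; frame < numFrames; frame++) {
        for (var ch = 0; ch < numChannels; ch++) {
            var sample = Math.max(-1, Math.min(1, audioBuffer.getChannelData(ch)[frame]));
            view.setInt16(offset, sample < 0 ? sample * 0x8000 : sample * 0x7FFF, true);
            offset += 2;
        }
    }
    return new Uint8Array(buffer);
}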

It's not pretty, but it works. I'm leaving everything here in case someone finds an easier way.
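
For completeness, reusing the stored element afterwards is just the regular <audio> element API (using songsarr from the snippet above):

var song = songsarr[songsarr.length - 1];
song.loop = true;     // loop the rendered track
song.play();
// ...later...
song.pause();
song.currentTime = 0; // rewind to "stop"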