We need to stream live audio (from a medical device) to web browsers with no more than 3-5 s of end-to-end delay (assume network latency of 200 ms or less). Today we use a browser plugin (NPAPI) for decoding, filtering (high-pass, low-pass, band-pass), and playback of the audio stream (delivered via WebSockets).
We want to replace the plugin.
I have been looking at various Web Audio API demos, and most of our required functionality (playback, gain control, filtering) appears to be available. However, it is not clear to me whether the Web Audio API can be used for streamed sources, as most of the demos work with short sounds and/or pre-loaded audio clips.
Can Web Audio API be used to play live streamed audio?
After a bit more research and local prototyping, I am not sure live audio streaming with the Web Audio API is possible: decodeAudioData isn't really designed to handle arbitrary chunks of audio data (in our case delivered via WebSockets); it appears to need the whole 'file' in order to process it correctly.
Yes, the Web Audio API (along with AJAX or WebSockets) can be used for streaming.
Basically, you pull down (or receive, in the case of WebSockets) chunks of length n. Then you decode each one with the Web Audio API and queue the resulting buffers to be played, one after the other.
Because the Web Audio API has high-precision timing, you won't hear any "seams" between the playback of each buffer if you do the scheduling correctly.
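A minimal sketch of that queueing approach, assuming each WebSocket message carries an independently decodable chunk (e.g. a complete short WAV segment) and using a placeholder URL:

```javascript
// Sketch, not production code: decode each incoming chunk and schedule
// it to start exactly when the previous chunk ends.
function startStreaming(ctx, socket) {
  let nextStartTime = 0; // when the next chunk should begin, in AudioContext time

  socket.binaryType = "arraybuffer";
  socket.onmessage = (event) => {
    // Assumes event.data is a self-contained encoded chunk;
    // decodeAudioData cannot decode arbitrary slices of a stream.
    ctx.decodeAudioData(event.data, (audioBuffer) => {
      const source = ctx.createBufferSource();
      source.buffer = audioBuffer;
      source.connect(ctx.destination);
      // Start at the later of "now" and the end of the previous chunk,
      // so consecutive buffers butt up against each other without seams.
      nextStartTime = Math.max(nextStartTime, ctx.currentTime);
      source.start(nextStartTime);
      nextStartTime += audioBuffer.duration;
    });
  };
}

// In a browser (URL is a placeholder):
// startStreaming(new AudioContext(), new WebSocket("wss://example.com/audio"));
```

If decoding occasionally takes longer than a chunk's duration, the `Math.max` clamp lets playback resume cleanly at the current time instead of scheduling a source in the past.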