The oncomplete event handler of the OfflineAudioContext interface is called when audio processing finishes, that is, when the complete event (of type OfflineAudioCompletionEvent) is raised.


Syntax

var offlineAudioCtx = new OfflineAudioContext(2, 44100 * 40, 44100);
offlineAudioCtx.oncomplete = function() { ... }
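
As a minimal sketch of the event-based form (the channel count, length, and sample rate here are arbitrary example values), the rendered audio arrives on the event's renderedBuffer property:

var ctx = new OfflineAudioContext(2, 44100 * 40, 44100);

ctx.oncomplete = function(event) {
  // event is an OfflineAudioCompletionEvent; the rendered audio
  // is available as event.renderedBuffer
  console.log('Rendered ' + event.renderedBuffer.duration + ' seconds of audio');
};

// oncomplete only fires once rendering has been started and has finished
ctx.startRendering();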


Example

In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR and decode it (AudioContext.decodeAudioData), then feed the decoded buffer into an AudioBufferSourceNode inside the OfflineAudioContext and play the track through. Once the offline audio graph is set up, we render it to an AudioBuffer using OfflineAudioContext.startRendering().

When the startRendering() promise resolves, rendering has completed and the rendered AudioBuffer is the value the promise resolves with.
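
In other words (a minimal sketch, assuming offlineCtx is an OfflineAudioContext whose graph is already connected), the rendered buffer can be awaited directly:

async function render() {
  // resolves once the offline graph has been rendered in full
  var renderedBuffer = await offlineCtx.startRendering();
  console.log('Rendered ' + renderedBuffer.length + ' sample-frames');
  return renderedBuffer;
}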

At this point we create another audio context, create an AudioBufferSourceNode inside it, and set its buffer to the rendered AudioBuffer. This is then played as part of a simple standard audio graph.

Note: For a working example, see our offline-audio-context-promise GitHub repo (see the source code too).

// define online and offline audio context

var audioCtx = new AudioContext();
var offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);

var source = offlineCtx.createBufferSource();

// use XHR to load an audio track, decodeAudioData to decode it,
// and OfflineAudioContext to render it

function getData() {
  var request = new XMLHttpRequest();

  request.open('GET', 'viper.ogg', true);
  request.responseType = 'arraybuffer';

  request.onload = function() {
    var audioData = request.response;

    audioCtx.decodeAudioData(audioData, function(buffer) {
      // wire the decoded buffer into the offline graph and start playback
      source.buffer = buffer;
      source.connect(offlineCtx.destination);
      source.start();
      //source.loop = true;

      offlineCtx.startRendering().then(function(renderedBuffer) {
        console.log('Rendering completed successfully');

        // play the rendered buffer through a separate online context
        // (window.webkitAudioContext covers older Safari)
        var playbackCtx = new (window.AudioContext || window.webkitAudioContext)();
        var song = playbackCtx.createBufferSource();
        song.buffer = renderedBuffer;
        song.connect(playbackCtx.destination);

        // 'play' is the demo page's play button
        play.onclick = function() {
          song.start();
        };
      }).catch(function(err) {
        console.log('Rendering failed: ' + err);
        // Note: The promise should reject when startRendering is called a second time on an OfflineAudioContext
      });
    });
  };

  request.send();
}

// Run getData to start the process off

getData();

Specifications

Specification                                      Status          Comment
Web Audio API (the definition of 'oncomplete')     Working Draft

Browser compatibility

Desktop

Feature         Chrome   Edge   Firefox   Internet Explorer   Opera   Safari
Basic support   14       Yes    25        No                  15      6

Mobile

Feature         Android webview   Chrome for Android   Edge mobile   Firefox for Android   IE mobile   Opera Android   iOS Safari
Basic support   Yes               14                   Yes           26                    No          15              ?

See also

Using the Web Audio API