The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing, or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.


Constructor

AudioContext()
Creates and returns a new AudioContext object.


Properties

Also inherits properties from its parent interface, BaseAudioContext.

AudioContext.baseLatency Read only
Returns the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.
AudioContext.outputLatency Read only
Returns an estimation of the output latency of the current audio context.
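
For illustration, a minimal sketch of reading both latency values from a freshly created context (the feature check is an assumption for engines that do not yet expose outputLatency):

var audioCtx = new AudioContext();

// Seconds of processing latency between the AudioDestinationNode and the audio subsystem.
console.log('Base latency: ' + audioCtx.baseLatency);

// Estimated seconds until the rendered audio actually reaches the output device.
if ('outputLatency' in audioCtx) {
  console.log('Output latency: ' + audioCtx.outputLatency);
}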


Methods

Also inherits methods from its parent interface, BaseAudioContext.

AudioContext.close()
Closes the audio context, releasing any system audio resources that it uses.
AudioContext.createMediaElementSource()
Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
AudioContext.createMediaStreamSource()
Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
AudioContext.createMediaStreamDestination()
Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.
AudioContext.createMediaStreamTrackSource()
Creates a MediaStreamTrackAudioSourceNode associated with a MediaStream representing a media stream track.
AudioContext.getOutputTimestamp()
Returns a new AudioTimestamp object containing two correlated audio stream position values for the context.
AudioContext.resume()
Resumes the progression of time in an audio context that has previously been suspended.
AudioContext.suspend()
Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
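
As a rough sketch of the context lifecycle, the suspend(), resume(), and close() methods might be used like this (the helper function names are illustrative, not part of the API):

var audioCtx = new AudioContext();

// Pause or restart audio processing depending on the context's current state.
function toggleAudio() {
  if (audioCtx.state === 'running') {
    return audioCtx.suspend();   // halts audio hardware access, saving CPU/battery
  }
  if (audioCtx.state === 'suspended') {
    return audioCtx.resume();    // restarts the progression of time in the context
  }
}

// When the context is no longer needed, release its system audio resources for good.
function shutdownAudio() {
  return audioCtx.close();
}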


Examples

Basic audio context declaration:

var audioCtx = new AudioContext();

Cross browser variant:

// Older WebKit-based browsers only expose the prefixed webkitAudioContext constructor.
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
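
Continuing the snippet above, one plausible way to wire these nodes together and produce sound (the gain value and the two-second stop time are illustrative choices):

oscillatorNode.connect(gainNode);
gainNode.connect(finish);

gainNode.gain.value = 0.5;                      // illustrative volume
oscillatorNode.start();                         // begin producing the tone
oscillatorNode.stop(audioCtx.currentTime + 2);  // stop two seconds from now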


Specifications

Specification: Web Audio API (the definition of 'AudioContext' in that specification)
Status: Working Draft

Browser compatibility

Feature | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari
Basic support | 14 — 57 -webkit- | Yes | 25 | No | 15 — 44 -webkit- | 6 -webkit-
AudioContext() constructor | 55 | Yes | 25 | No | 42 | Yes -webkit-
baseLatency | 60 | ? | No | No | 47 | No
outputLatency | Yes | ? | No | No | Yes | No
close | 43 | ? | 40 | No | Yes | ?
createMediaElementSource | 14 | Yes | 25 | No | 15 | 6
createMediaStreamSource | 14 | Yes | 25 | No | 15 | 6
createMediaStreamDestination | 14 | Yes | 25 | No | 15 | 6
createMediaStreamTrackSource | ? | ? | No | No | ? | No
getOutputTimestamp | 57 | ? | No | No | 44 | No
suspend | 43 | ? | 40 | No | Yes | ?
Feature | Android webview | Chrome for Android | Edge mobile | Firefox for Android | Opera Android | iOS Safari | Samsung Internet
Basic support | Yes | 14 — 57 -webkit- | ? | ? | 15 — 44 -webkit- | ? | ?
AudioContext() constructor | 55 | 55 | ? | 25 | 42 | ? | ?
baseLatency | 60 | 60 | ? | No | 47 | No | ?
outputLatency | Yes | Yes | ? | No | Yes | ? | ?
close | 43 | 43 | ? | 40 | Yes | ? | ?
createMediaElementSource | Yes | 14 | Yes | 26 | 15 | ? | ?
createMediaStreamSource | Yes | 14 | Yes | 26 | 15 | ? | ?
createMediaStreamDestination | Yes | 14 | Yes | 26 | 15 | ? | ?
createMediaStreamTrackSource | ? | ? | ? | No | ? | No | ?
getOutputTimestamp | 57 | 57 | ? | No | 44 | No | ?
suspend | 43 | 43 | ? | 40 | Yes | ? | ?

See also