AudioProcessingEvent

Deprecated: This feature is no longer recommended. Though some browsers might still support it, it may have already been removed from the relevant web standards, may be in the process of being dropped, or may only be kept for compatibility purposes. Avoid using it, and update existing code if possible; see the compatibility table at the bottom of this page to guide your decision. Be aware that this feature may cease to work at any time.

The Web Audio API AudioProcessingEvent interface represents events that occur when a ScriptProcessorNode input buffer is ready to be processed.

Note: As of the August 29, 2014 Web Audio API spec publication, this feature has been marked as deprecated, and is soon to be replaced by AudioWorklet.
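For comparison, the AudioWorklet replacement moves processing off the main thread into an AudioWorkletGlobalScope. The sketch below is illustrative, not taken from the spec text: the processor name "gain-processor", the GAIN value, and the separate renderQuantum() helper are all assumptions. A real deployment would place this code in its own module file and load it with audioCtx.audioWorklet.addModule().

```javascript
// Illustrative gain value, not part of any API.
const GAIN = 0.5;

// Pure per-render-quantum processing: one input's channel array in,
// one output's channel array filled. Kept separate from the worklet
// plumbing so the DSP logic is easy to reason about (and to test).
function renderQuantum(inputChannels, outputChannels) {
  for (let ch = 0; ch < outputChannels.length; ch++) {
    // A missing input channel is treated as silence.
    const src = inputChannels[ch] || new Float32Array(outputChannels[ch].length);
    for (let i = 0; i < outputChannels[ch].length; i++) {
      outputChannels[ch][i] = src[i] * GAIN;
    }
  }
}

// This part only runs inside an AudioWorkletGlobalScope, i.e. after the
// module has been loaded via audioCtx.audioWorklet.addModule(...).
if (typeof AudioWorkletProcessor !== "undefined") {
  class GainProcessor extends AudioWorkletProcessor {
    process(inputs, outputs) {
      renderQuantum(inputs[0], outputs[0]);
      return true; // keep the processor alive
    }
  }
  registerProcessor("gain-processor", GainProcessor);
}
```

Unlike an audioprocess handler, process() is called with plain arrays of Float32Array channels (one per input/output) rather than AudioBuffer objects.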


Properties

Also inherits the properties of its parent, Event.

playbackTime Read only

A double representing the time when the audio will be played, as defined by AudioContext.currentTime.

inputBuffer Read only

An AudioBuffer containing the input audio data to be processed. The number of channels is defined as a parameter, numberOfInputChannels, of the factory method AudioContext.createScriptProcessor(). Note that the returned AudioBuffer is only valid in the scope of the event handler.

outputBuffer Read only

An AudioBuffer where the output audio data should be written. The number of channels is defined as a parameter, numberOfOutputChannels, of the factory method AudioContext.createScriptProcessor(). Note that the returned AudioBuffer is only valid in the scope of the event handler.
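The two buffers are typically used together in an audioprocess handler: read samples from inputBuffer, write processed samples to outputBuffer before the handler returns (since both are only valid inside it). A minimal sketch, assuming a ScriptProcessorNode created with matching input and output channel counts (e.g. audioCtx.createScriptProcessor(4096, 2, 2)); the GAIN value and the applyGain() helper are illustrative, not part of the API:

```javascript
// Illustrative gain value, not part of any API.
const GAIN = 0.5;

// Copy one channel of samples from input to output, scaled by GAIN.
function applyGain(inputData, outputData) {
  for (let i = 0; i < inputData.length; i++) {
    outputData[i] = inputData[i] * GAIN;
  }
}

// An audioprocess handler: both AudioBuffers are only valid while
// this function is running, so all work happens synchronously here.
function handleAudioProcess(event) {
  const { inputBuffer, outputBuffer } = event;
  for (let ch = 0; ch < outputBuffer.numberOfChannels; ch++) {
    applyGain(inputBuffer.getChannelData(ch), outputBuffer.getChannelData(ch));
  }
}

// Wiring (assumed setup): scriptNode.addEventListener("audioprocess", handleAudioProcess);
```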

Example

See BaseAudioContext.createScriptProcessor() for example code that uses an AudioProcessingEvent.

Browser compatibility

