BaseAudioContext
The BaseAudioContext interface acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively. You wouldn't use BaseAudioContext directly; you'd use its features via one of these two inheriting interfaces.
A BaseAudioContext can be a target of events, therefore it implements the EventTarget interface.
Properties
BaseAudioContext.baseLatency Read only
Returns a double representing the number of seconds of processing latency incurred by the AudioContext passing the audio from the AudioDestinationNode to the audio subsystem.
BaseAudioContext.currentTime Read only
Returns a double representing an ever-increasing hardware time in seconds used for scheduling. It starts at 0.
BaseAudioContext.destination Read only
Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.
BaseAudioContext.listener Read only
Returns the AudioListener object, used for 3D spatialization.
BaseAudioContext.sampleRate Read only
Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an AudioContext cannot be changed.
BaseAudioContext.state Read only
Returns the current state of the AudioContext.
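As a quick sketch of how these read-only properties might be inspected (assuming a browser environment where AudioContext is available; the logged values are device-dependent):

```javascript
// Create an audio context; BaseAudioContext is never constructed directly.
const audioCtx = new AudioContext();

// All of these properties are read-only.
console.log(audioCtx.sampleRate);  // e.g. 44100 or 48000, depending on the device
console.log(audioCtx.currentTime); // seconds since the context started; begins at 0
console.log(audioCtx.state);       // "suspended", "running", or "closed"
console.log(audioCtx.destination); // the AudioDestinationNode for this context
```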
Event handlers
BaseAudioContext.onstatechange
An event handler that runs when an event of type statechange has fired. This occurs when the AudioContext's state changes, due to the calling of one of the state change methods (AudioContext.suspend, AudioContext.resume, or AudioContext.close).
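A minimal sketch of wiring up this handler, assuming a browser environment:

```javascript
const audioCtx = new AudioContext();

// Runs whenever suspend(), resume(), or close() changes the context's state.
audioCtx.onstatechange = () => {
  console.log(`State changed to: ${audioCtx.state}`);
};

// Suspending and then resuming will each fire a statechange event.
audioCtx.suspend().then(() => audioCtx.resume());
```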
Methods
Also implements methods from the interface EventTarget
.
BaseAudioContext.createBuffer()
Creates a new, empty AudioBuffer object, which can then be populated by data and played via an AudioBufferSourceNode.
BaseAudioContext.createConstantSource()
Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.
BaseAudioContext.createBufferSource()
Creates an AudioBufferSourceNode, which can be used to play and manipulate audio data contained within an AudioBuffer object. AudioBuffers are created using AudioContext.createBuffer or returned by AudioContext.decodeAudioData when it successfully decodes an audio track.
BaseAudioContext.createScriptProcessor()
Creates a ScriptProcessorNode, which can be used for direct audio processing via JavaScript.
BaseAudioContext.createStereoPanner()
Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.
BaseAudioContext.createAnalyser()
Creates an AnalyserNode, which can be used to expose audio time and frequency data and, for example, to create data visualisations.
BaseAudioContext.createBiquadFilter()
Creates a BiquadFilterNode, which represents a second-order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.
BaseAudioContext.createChannelMerger()
Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.
BaseAudioContext.createChannelSplitter()
Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.
BaseAudioContext.createConvolver()
Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.
BaseAudioContext.createDelay()
Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.
BaseAudioContext.createDynamicsCompressor()
Creates a DynamicsCompressorNode, which can be used to apply acoustic compression to an audio signal.
BaseAudioContext.createGain()
Creates a GainNode, which can be used to control the overall volume of the audio graph.
BaseAudioContext.createIIRFilter()
Creates an IIRFilterNode, which represents a second-order filter configurable as several different common filter types.
BaseAudioContext.createOscillator()
Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.
BaseAudioContext.createPanner()
Creates a PannerNode, which is used to spatialise an incoming audio stream in 3D space.
BaseAudioContext.createPeriodicWave()
Creates a PeriodicWave, used to define a periodic waveform that can be used to determine the output of an OscillatorNode.
BaseAudioContext.createWaveShaper()
Creates a WaveShaperNode, which is used to implement non-linear distortion effects.
BaseAudioContext.decodeAudioData()
Asynchronously decodes audio file data contained in an ArrayBuffer. In this case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files.
BaseAudioContext.resume()
Resumes the progression of time in an audio context that has previously been suspended/paused.
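As an illustrative sketch of decodeAudioData() in use, here fetch() stands in for the XMLHttpRequest approach described above (either way yields an ArrayBuffer), and "track.ogg" is a placeholder URL:

```javascript
const audioCtx = new AudioContext();

// "track.ogg" is a hypothetical audio file; fetch() returns a Response
// whose body can be read as an ArrayBuffer.
fetch("track.ogg")
  .then((response) => response.arrayBuffer())
  .then((arrayBuffer) => audioCtx.decodeAudioData(arrayBuffer))
  .then((audioBuffer) => {
    // Play the decoded audio through an AudioBufferSourceNode.
    const source = audioCtx.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(audioCtx.destination);
    source.start();
  });
```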
Examples
Basic audio context declaration:
const audioCtx = new AudioContext();
Cross browser variant:
const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioCtx = new AudioContext();
const oscillatorNode = audioCtx.createOscillator();
const gainNode = audioCtx.createGain();
const finish = audioCtx.destination;
// etc.
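One plausible way to finish the graph sketched above is to connect the nodes in a chain and start the oscillator; the frequency, gain value, and one-second duration here are arbitrary choices for illustration:

```javascript
// Wire the graph: oscillator -> gain -> destination.
oscillatorNode.connect(gainNode);
gainNode.connect(finish);

// Arbitrary illustrative settings: a quiet 440 Hz tone.
oscillatorNode.frequency.value = 440;
gainNode.gain.value = 0.25;

// Play for one second of context time.
oscillatorNode.start();
oscillatorNode.stop(audioCtx.currentTime + 1);
```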
Specifications
Specification |
---|
Web Audio API # BaseAudioContext |
Browser compatibility