
AudioContext


The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an AudioNode. An audio context controls both the creation of the nodes it contains and the execution of the audio processing or decoding. You need to create an AudioContext before you do anything else, as everything happens inside a context.

An AudioContext can be a target of events, therefore it implements the EventTarget interface.

Constructor

AudioContext()
Creates and returns a new AudioContext object.

Properties

AudioContext.currentTime Read only
Returns a double representing an ever-increasing time in seconds, used for scheduling. It starts at 0.
AudioContext.destination Read only
Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.
AudioContext.listener Read only
Returns the AudioListener object, used for 3D spatialisation.
AudioContext.sampleRate Read only
Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an AudioContext cannot be changed.
AudioContext.state Read only
Returns the current state of the AudioContext.
AudioContext.mozAudioChannelType Read only
Used to return the audio channel that the sound playing in an AudioContext will play in, on a Firefox OS device.
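These properties can be read directly off a context instance. As a minimal sketch (the `describeContext` helper below is hypothetical, for illustration only, and not part of the Web Audio API):

```javascript
// Collects a few read-only properties of an AudioContext-like object.
// `describeContext` is a hypothetical helper, not part of the Web Audio API.
function describeContext(ctx) {
  return {
    sampleRate: ctx.sampleRate,   // e.g. 44100 or 48000 samples per second
    currentTime: ctx.currentTime, // seconds elapsed since the context started
    state: ctx.state              // "suspended", "running" or "closed"
  };
}
```

In a browser you would pass a real context, e.g. `describeContext(new AudioContext())`.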

Event handlers

AudioContext.onstatechange
An event handler that runs when an event of type statechange has fired. This occurs when the AudioContext's state changes, due to the calling of one of the state change methods (AudioContext.suspend, AudioContext.resume, or AudioContext.close).
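A handler assigned to onstatechange fires whenever the context's state property changes. A minimal sketch (the `trackStates` helper and the `log` array are hypothetical, for illustration only):

```javascript
// Records each state an AudioContext-like object passes through by
// assigning its onstatechange handler. Hypothetical helper, sketch only.
function trackStates(ctx, log) {
  ctx.onstatechange = function () {
    log.push(ctx.state);
  };
}
```

In a browser, calling ctx.suspend() or ctx.resume() on a tracked context would then append "suspended" or "running" to the log as each statechange event fires.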

Methods

Also implements methods from the interface EventTarget.

AudioContext.close()
Closes the audio context, releasing any system audio resources that it uses.
AudioContext.createBuffer()
Creates a new, empty AudioBuffer object, which can then be populated by data and played via an AudioBufferSourceNode.
AudioContext.createConstantSource()
Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.
AudioContext.createBufferSource()
Creates an AudioBufferSourceNode, which can be used to play and manipulate audio data contained within an AudioBuffer object. AudioBuffers are created using AudioContext.createBuffer or returned by AudioContext.decodeAudioData when it successfully decodes an audio track.
AudioContext.createMediaElementSource()
Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement. This can be used to play and manipulate audio from <video> or <audio> elements.
AudioContext.createMediaStreamSource()
Creates a MediaStreamAudioSourceNode associated with a MediaStream representing an audio stream which may come from the local computer microphone or other sources.
AudioContext.createMediaStreamDestination()
Creates a MediaStreamAudioDestinationNode associated with a MediaStream representing an audio stream which may be stored in a local file or sent to another computer.
AudioContext.createScriptProcessor()
Creates a ScriptProcessorNode, which can be used for direct audio processing via JavaScript.
AudioContext.createStereoPanner()
Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.
AudioContext.createAnalyser()
Creates an AnalyserNode, which can be used to expose audio time and frequency data and for example to create data visualisations.
AudioContext.createBiquadFilter()
Creates a BiquadFilterNode, which represents a second order filter configurable as several different common filter types: high-pass, low-pass, band-pass, etc.
AudioContext.createChannelMerger()
Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.
AudioContext.createChannelSplitter()
Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.
AudioContext.createConvolver()
Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.
AudioContext.createDelay()
Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.
AudioContext.createDynamicsCompressor()
Creates a DynamicsCompressorNode, which can be used to apply acoustic compression to an audio signal.
AudioContext.createGain()
Creates a GainNode, which can be used to control the overall volume of the audio graph.
AudioContext.createIIRFilter()
Creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter that can be configured to model several different common filter types.
AudioContext.createOscillator()
Creates an OscillatorNode, a source representing a periodic waveform. It basically generates a tone.
AudioContext.createPanner()
Creates a PannerNode, which is used to spatialise an incoming audio stream in 3D space.
AudioContext.createPeriodicWave()
Creates a PeriodicWave, used to define a periodic waveform that can be used to determine the output of an OscillatorNode.
AudioContext.createWaveShaper()
Creates a WaveShaperNode, which is used to implement non-linear distortion effects.
AudioContext.createAudioWorker()
Creates an AudioWorkerNode, which can interact with a web worker thread to generate, process, or analyse audio directly. This was added to the spec on August 29 2014, and is not implemented in any browser yet.
AudioContext.decodeAudioData()
Asynchronously decodes audio file data contained in an ArrayBuffer. In this case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response attribute after setting the responseType to arraybuffer. This method only works on complete files, not fragments of audio files.
AudioContext.getOutputTimestamp()
Returns a new AudioTimestamp containing two correlated context's audio stream position values: the AudioTimestamp.contextTime member contains the time of the sample frame which is currently being rendered by the audio output device (i.e., output audio stream position), in the same units and origin as context's AudioContext.currentTime; the AudioTimestamp.performanceTime member contains the time estimating the moment when the sample frame corresponding to the stored contextTime value was rendered by the audio output device, in the same units and origin as performance.now().
AudioContext.resume()
Resumes the progression of time in an audio context that has previously been suspended.
AudioContext.suspend()
Suspends the progression of time in the audio context, temporarily halting audio hardware access and reducing CPU/battery usage in the process.
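suspend() and resume() can be paired into a simple play/pause toggle. A sketch assuming the standard state values; `togglePlayback` is a hypothetical helper, not part of the API:

```javascript
// Suspends a running AudioContext-like object, or resumes a suspended one.
// Both suspend() and resume() return a Promise that resolves once the
// state change has taken effect. Hypothetical helper, sketch only.
function togglePlayback(ctx) {
  if (ctx.state === "running") {
    return ctx.suspend();
  }
  return ctx.resume();
}
```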

Obsolete methods

AudioContext.createJavaScriptNode()
Creates a JavaScriptNode, used for direct audio processing via JavaScript. This method is obsolete, and has been replaced by AudioContext.createScriptProcessor().
AudioContext.createWaveTable()
Creates a WaveTableNode, used to define a periodic waveform. This method is obsolete, and has been replaced by AudioContext.createPeriodicWave().

Examples

Basic audio context declaration:

var audioCtx = new AudioContext();

A cross-browser variant, for browsers that still ship a prefixed implementation:

var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillatorNode = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();
var finish = audioCtx.destination;
// etc.
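The nodes created above can be wired into a small graph that plays a tone. A sketch using only methods listed in this article; `playTone` is a hypothetical helper, and the frequency and volume values are arbitrary:

```javascript
// Connects oscillator -> gain -> destination and plays a short tone.
// Browser-only as written; `playTone` is a hypothetical helper.
function playTone(audioCtx, frequency, seconds) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = frequency; // tone pitch in Hz, e.g. 440
  gain.gain.value = 0.5;           // half volume
  osc.connect(gain);
  gain.connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + seconds);
  return osc;
}
```

For example, playTone(audioCtx, 440, 2) would play an A4 for two seconds.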

Specifications

Specification | Status | Comment
Web Audio API — the definition of 'AudioContext' in that specification | Working Draft |

Browser compatibility

Feature | Chrome | Edge | Firefox (Gecko) | Internet Explorer | Opera | Safari (WebKit)
Basic support | 10.0 webkit, 35 | (Yes) | 25.0 (25.0) | No support | 15.0 webkit, 22 | 6.0 webkit
createStereoPanner() | 42.0 | (Yes) | 37.0 (37.0) | No support | No support | No support
onstatechange, state, suspend(), resume() | (Yes) | (Yes) | 40.0 (40.0) | No support | No support | No support
createConstantSource() | 56.0 | No support | 52 (52) | No support | 43 | No support
Unprefixed | (Yes) | (Yes) | | | |

Feature | Android Webview | Edge | Firefox Mobile (Gecko) | Firefox OS | IE Mobile | Opera Mobile | Safari Mobile | Chrome for Android
Basic support | (Yes) | (Yes) | 37.0 (37.0) | 2.2 | No support | (Yes) | No support | (Yes)
createStereoPanner() | 42.0 | (Yes) | (Yes) | (Yes) | No support | No support | No support | 42.0
onstatechange, state, suspend(), resume() | (Yes) | (Yes) | (Yes) | (Yes) | No support | No support | No support | (Yes)
createConstantSource() | 56.0 | No support | 52.0 (52) | No support | No support | No support | No support | 56.0
Unprefixed | (Yes) | (Yes) | ? | ? | ? | 43 | ? | (Yes)

See also

Document authors and tags

 Contributors to this page: drm404
 Last updated by: drm404