    OfflineAudioContext

    This document needs a technical review.

    The OfflineAudioContext interface is an AudioContext interface representing an audio-processing graph built from AudioNodes linked together. In contrast with a standard AudioContext, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an AudioBuffer.

    It is important to note that, whereas you can create a new AudioContext using the new AudioContext() constructor with no arguments, the new OfflineAudioContext() constructor requires three arguments:

    new OfflineAudioContext(numOfChannels,length,sampleRate);

    This works in exactly the same way as when you create a new AudioBuffer with the AudioContext.createBuffer method. For more detail, read Audio buffers: frames, samples and channels from our Basic concepts guide. The arguments are:

    numOfChannels
    An integer representing the number of channels this buffer should have. Implementations must support a minimum of 32 channels.
    length
    An integer representing the size of the buffer in sample-frames.
    sampleRate
    The sample-rate of the linear audio data in sample-frames per second. An implementation must support sample-rates in at least the range 22050 to 96000, with 44100 being the most commonly used.
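
    For instance, a stereo context holding 40 seconds of audio at 44100 Hz could be constructed as follows (an illustrative sketch, not from the original article; note that the length argument is seconds multiplied by the sample rate):

```javascript
// Illustrative sketch: OfflineAudioContext is a browser-only API,
// so guard the construction with feature detection.
if (typeof OfflineAudioContext !== 'undefined') {
  var sampleRate = 44100;
  var seconds = 40;

  // 2 channels, length in sample-frames = seconds * sampleRate
  var offlineCtx = new OfflineAudioContext(2, seconds * sampleRate, sampleRate);

  console.log(offlineCtx.length);     // 1764000 sample-frames
  console.log(offlineCtx.sampleRate); // 44100
}
```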

    Note: Like a regular AudioContext, an OfflineAudioContext can be the target of events, therefore it implements the EventTarget interface.
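
    As a minimal sketch of that (not part of the original article), the complete event can be observed with addEventListener just like on any other EventTarget:

```javascript
// Illustrative sketch: browser-only API, so feature-detect first.
if (typeof OfflineAudioContext !== 'undefined') {
  // 1 channel, 1 second of audio at 44100 Hz
  var offlineCtx = new OfflineAudioContext(1, 44100, 44100);

  // OfflineAudioContext implements EventTarget, so the complete event
  // can be handled with addEventListener as well as with oncomplete.
  offlineCtx.addEventListener('complete', function(e) {
    // e is an OfflineAudioCompletionEvent; the result is e.renderedBuffer
    console.log('Rendered ' + e.renderedBuffer.length + ' sample-frames');
  });

  offlineCtx.startRendering();
}
```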

    Properties

    Implements properties from its parent, AudioContext.

    Event handlers

    OfflineAudioContext.oncomplete
    Is an EventHandler called when processing has finished, that is, when the complete event (of type OfflineAudioCompletionEvent) is raised.

    Methods

    Also implements methods from its parent, AudioContext, and from EventTarget.

    OfflineAudioContext.startRendering()
    Starts rendering the audio, taking into account the current connections and the current scheduled changes. This is the event-based version.
    OfflineAudioContext.startRendering() (promise)
    Starts rendering the audio, taking into account the current connections and the current scheduled changes. This is the newer promise version.
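
    The two variants can be sketched side by side (an illustrative snippet, not from the original article; the check on the return value is an assumption for older browsers, where startRendering() returned undefined rather than a promise):

```javascript
// Illustrative sketch: browser-only API, so feature-detect first.
if (typeof OfflineAudioContext !== 'undefined') {
  var offlineCtx = new OfflineAudioContext(2, 44100, 44100);

  // Event-based version: assign oncomplete before starting to render.
  offlineCtx.oncomplete = function(e) {
    console.log('complete event: ' + e.renderedBuffer.length + ' sample-frames');
  };

  // Promise-based version: newer browsers return a promise that
  // resolves with the rendered AudioBuffer.
  var result = offlineCtx.startRendering();
  if (result && typeof result.then === 'function') {
    result.then(function(renderedBuffer) {
      console.log('promise resolved: ' + renderedBuffer.length + ' sample-frames');
    });
  }
}
```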

    Example

    In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR and decode it (AudioContext.decodeAudioData), then hook the decoded audio up to an AudioBufferSourceNode inside the OfflineAudioContext. After the offline audio graph is set up, we render it to an AudioBuffer using OfflineAudioContext.startRendering().

    When the startRendering() promise resolves, rendering has completed and the promise is fulfilled with the output AudioBuffer.

    At this point we create another audio context, create an AudioBufferSourceNode inside it, and set its buffer to the rendered AudioBuffer. This is then played as part of a simple standard audio graph.

    Note: For a working example, see our offline-audio-context-promise GitHub repo (see the source code too).

    // define online and offline audio context
    
    var audioCtx = new AudioContext();
    var offlineCtx = new OfflineAudioContext(2,44100*40,44100);
    
    var source = offlineCtx.createBufferSource();
    
    // use XHR to load an audio track, and
    // decodeAudioData to decode it and OfflineAudioContext to render it
    
    function getData() {
      var request = new XMLHttpRequest();
    
      request.open('GET', 'viper.ogg', true);
    
      request.responseType = 'arraybuffer';
    
      request.onload = function() {
        var audioData = request.response;
    
        audioCtx.decodeAudioData(audioData, function(buffer) {
          var myBuffer = buffer;
          source.buffer = myBuffer;
          source.connect(offlineCtx.destination);
          source.start();
          //source.loop = true;
          offlineCtx.startRendering().then(function(renderedBuffer) {
            console.log('Rendering completed successfully');
            var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
            var song = audioCtx.createBufferSource();
            song.buffer = renderedBuffer;
    
            song.connect(audioCtx.destination);
    
            // "play" is assumed to reference a <button> element in the page
            play.onclick = function() {
              song.start();
            }
          }).catch(function(err) {
              console.log('Rendering failed: ' + err);
              // Note: The promise should reject when startRendering is called a second time on an OfflineAudioContext
          });
        });
      }
    
      request.send();
    }
    
    // Run getData to start the process off
    
    getData();

    Specifications

    Specification                                                                  Status         Comment
    Web Audio API: The definition of 'OfflineAudioContext' in that specification.  Working Draft

    Browser compatibility

    Feature                         Chrome      Firefox (Gecko)  Internet Explorer  Opera                        Safari (WebKit)
    Basic support                   10.0webkit  25.0 (25.0)      Not supported      15.0webkit, 22 (unprefixed)  6.0webkit
    Promise-based startRendering()  42.0        37.0 (37.0)      ?                  ?                            ?

    Feature                         Chrome for Android  Firefox Mobile (Gecko)  Firefox OS  IE Mobile  Opera Mobile  Safari Mobile
    Basic support                   33.0                26.0                    1.2         ?          ?             ?
    Promise-based startRendering()  42.0                37.0                    2.2         ?          ?             ?

    See also

    Document Tags and Contributors

    Contributors to this page: kscarfone, tregagnon, chrisdavidmills, fscholz, teoli
    Last changed by: fscholz