AudioBuffer
Baseline Widely available
This feature is well established and works across many devices and browser versions. It’s been available across browsers since April 2021.
The AudioBuffer interface represents a short audio asset residing in memory, created from an audio file using the AudioContext.decodeAudioData() method, or from raw data using the AudioContext.createBuffer() method. Once loaded into an AudioBuffer, the audio can be played by passing it to an AudioBufferSourceNode.
Objects of this type are designed to hold small audio snippets, typically less than 45 seconds. For longer sounds, objects implementing MediaElementAudioSourceNode are more suitable. The buffer contains data in the following format: non-interleaved IEEE754 32-bit linear PCM with a nominal range between -1 and +1, that is, a 32-bit floating point buffer, with each sample between -1.0 and 1.0. If the AudioBuffer has multiple channels, they are stored in separate buffers.
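As a minimal sketch of the decoding path described above (the URL "sound.ogg" is only a placeholder for a real audio asset you would serve):
// Sketch: decode an audio file into an AudioBuffer and play it.
// "sound.ogg" is a placeholder URL, not a real asset.
const audioCtx = new AudioContext();

async function playSound(url) {
  const response = await fetch(url);
  const encoded = await response.arrayBuffer();
  // decodeAudioData turns the encoded file into an AudioBuffer
  const audioBuffer = await audioCtx.decodeAudioData(encoded);
  // An AudioBufferSourceNode plays the buffer back
  const source = audioCtx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(audioCtx.destination);
  source.start();
}

playSound("sound.ogg");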
Constructor
AudioBuffer()
- Creates and returns a new AudioBuffer object instance.
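For illustration, the constructor takes an options object; the channel count, length, and sample rate below are arbitrary example values:
// Construct a one-second stereo buffer directly, without an AudioContext.
// The numbers here are arbitrary example values.
const buffer = new AudioBuffer({
  numberOfChannels: 2,
  length: 48000, // sample-frames
  sampleRate: 48000, // Hz
});
console.log(buffer.duration); // 1 (second)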
Properties
AudioBuffer.sampleRate Read only
- Returns a float representing the sample rate, in samples per second, of the PCM data stored in the buffer.
AudioBuffer.length Read only
- Returns an integer representing the length, in sample-frames, of the PCM data stored in the buffer.
AudioBuffer.duration Read only
- Returns a double representing the duration, in seconds, of the PCM data stored in the buffer.
AudioBuffer.numberOfChannels Read only
- Returns an integer representing the number of discrete audio channels described by the PCM data stored in the buffer.
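All four properties are read-only and are fixed when the buffer is created. A small sketch, assuming an AudioContext is available and using arbitrary example sizes:
// Inspect the read-only properties of a freshly created buffer.
const ctx = new AudioContext();
const buf = ctx.createBuffer(2, ctx.sampleRate * 0.5, ctx.sampleRate);

console.log(buf.numberOfChannels); // 2
console.log(buf.length); // ctx.sampleRate * 0.5 sample-frames
console.log(buf.sampleRate); // same as ctx.sampleRate
console.log(buf.duration); // 0.5 (seconds)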
Methods
AudioBuffer.getChannelData()
- Returns a Float32Array containing the PCM data associated with the channel, defined by the channel parameter (with 0 representing the first channel).
AudioBuffer.copyFromChannel()
- Copies the samples from the specified channel of the AudioBuffer to the destination array.
AudioBuffer.copyToChannel()
- Copies the samples to the specified channel of the AudioBuffer, from the source array.
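A short sketch of how these three methods relate; the one-second mono buffer below is an arbitrary example:
// Sketch: reading and writing channel data.
const ctx = new AudioContext();
const buf = ctx.createBuffer(1, ctx.sampleRate, ctx.sampleRate); // 1 s, mono

// getChannelData() returns a live Float32Array view of channel 0
const channel0 = buf.getChannelData(0);
channel0[0] = 0.5; // writes directly into the buffer

// copyFromChannel() copies channel 0 into a separate array
const snapshot = new Float32Array(buf.length);
buf.copyFromChannel(snapshot, 0);

// copyToChannel() copies an array back into channel 0
buf.copyToChannel(snapshot, 0);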
Example
The following example shows how to create an AudioBuffer and fill it with random white noise. You can find the full source code in our webaudio-examples repository; a version is also available running live.
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();

// Create an empty three-second stereo buffer at the sample rate of the AudioContext
const myArrayBuffer = audioCtx.createBuffer(
  2,
  audioCtx.sampleRate * 3,
  audioCtx.sampleRate,
);

// Fill the buffer with white noise;
// just random values between -1.0 and 1.0
for (let channel = 0; channel < myArrayBuffer.numberOfChannels; channel++) {
  // This gives us the actual array that contains the data
  const nowBuffering = myArrayBuffer.getChannelData(channel);
  for (let i = 0; i < myArrayBuffer.length; i++) {
    // Math.random() is in [0; 1.0]
    // audio needs to be in [-1.0; 1.0]
    nowBuffering[i] = Math.random() * 2 - 1;
  }
}

// Get an AudioBufferSourceNode.
// This is the AudioNode to use when we want to play an AudioBuffer
const source = audioCtx.createBufferSource();

// Set the buffer in the AudioBufferSourceNode
source.buffer = myArrayBuffer;

// Connect the AudioBufferSourceNode to the
// destination so we can hear the sound
source.connect(audioCtx.destination);

// Start the source playing
source.start();
Specifications
| Specification |
| --- |
| Web Audio API # AudioBuffer |
Browser compatibility