The Audio API extension augments the HTML5 <audio> and <video> media elements by exposing audio metadata and raw audio data. This enables developers to visualize audio data, to process it, and to create new audio streams.
The 'loadedmetadata' Event
When the metadata of the media element becomes available, the element fires a 'loadedmetadata' event. This event exposes the following attributes:
- mozChannels: Number of audio channels (e.g., 2 for stereo)
- mozSampleRate: Sample rate, in samples per second (Hz)
- mozFrameBufferLength: Total number of samples in each framebuffer, counting all channels
This information is needed later to interpret the raw audio data stream. The following example extracts the data from an audio element:
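A minimal sketch, assuming an audio element already in the page; the helper name readAudioMetadata and the element id "audio-element" are illustrative, not part of the API:

```javascript
// Hypothetical helper: pull the Audio Data API metadata off a media element.
function readAudioMetadata(elem) {
  return {
    channels: elem.mozChannels,               // e.g. 2 for stereo
    sampleRate: elem.mozSampleRate,           // e.g. 44100 Hz
    frameBufferLength: elem.mozFrameBufferLength // e.g. 2048 (1024 frames x 2 channels)
  };
}

// Wire the helper to the 'loadedmetadata' event (element id is an assumption).
function setup() {
  var audio = document.getElementById("audio-element");
  audio.addEventListener("loadedmetadata", function () {
    var meta = readAudioMetadata(audio);
    console.log(meta.channels + " channel(s) at " + meta.sampleRate + " Hz");
  }, false);
}
```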
The 'MozAudioAvailable' Event
As the audio plays, decoded sample data is made available to the audio layer and fills the audio buffer (whose size is given by mozFrameBufferLength). Each time the buffer fills, a MozAudioAvailable event fires, carrying the raw samples for that period of time. These samples may or may not have been played yet when the event fires, and they have not been adjusted for the media element's mute/volume settings. Playing, pausing, and seeking the media also affect the streaming of this raw audio data.
The MozAudioAvailable event has two attributes:
- frameBuffer: A framebuffer (an array of 32-bit floats) containing the decoded audio samples
- time: The time of these samples, in seconds, measured from the start of the stream
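As a sketch, a handler might compute a simple loudness level from each framebuffer; rootMeanSquare and attachVisualizer are hypothetical helpers, not part of the API:

```javascript
// Hypothetical helper: root-mean-square level of one framebuffer of samples.
function rootMeanSquare(frameBuffer) {
  var sum = 0;
  for (var i = 0; i < frameBuffer.length; i++) {
    sum += frameBuffer[i] * frameBuffer[i];
  }
  return Math.sqrt(sum / frameBuffer.length);
}

// Consume raw samples as they stream out of the element.
function attachVisualizer(audio) {
  audio.addEventListener("MozAudioAvailable", function (event) {
    // event.frameBuffer holds the decoded samples;
    // event.time is the offset of these samples in seconds.
    var level = rootMeanSquare(event.frameBuffer);
    console.log("t=" + event.time.toFixed(2) + "s  level=" + level.toFixed(3));
  }, false);
}
```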
Writing an Audio Stream
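A minimal sketch of writing an audio stream, assuming the API's mozSetup() and mozWriteAudio() methods on a script-created element; the sine-wave helper makeSineSamples is illustrative, not part of the API:

```javascript
// Hypothetical helper: generate `count` samples of a sine tone.
function makeSineSamples(frequency, sampleRate, count) {
  var samples = new Float32Array(count);
  for (var i = 0; i < count; i++) {
    samples[i] = Math.sin(2 * Math.PI * frequency * i / sampleRate);
  }
  return samples;
}

function playTone() {
  var output = new Audio();   // a fresh element, no src attribute
  output.mozSetup(1, 44100);  // 1 channel at 44100 Hz
  var samples = makeSineSamples(440, 44100, 44100); // one second of A4
  // mozWriteAudio returns how many samples were accepted; any remainder
  // would have to be offered again once the buffer drains.
  var written = output.mozWriteAudio(samples);
}
```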
Processing an Audio Stream
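One way to process a stream is to chain two elements: read decoded samples from a source element via MozAudioAvailable, transform them, and write the result to a second, script-created element. A sketch under that assumption (halveVolume and chainElements are illustrative names):

```javascript
// Hypothetical transform: scale every sample to half amplitude.
function halveVolume(frameBuffer) {
  var out = new Float32Array(frameBuffer.length);
  for (var i = 0; i < frameBuffer.length; i++) {
    out[i] = frameBuffer[i] * 0.5;
  }
  return out;
}

function chainElements(input) {
  var output = new Audio();
  input.addEventListener("loadedmetadata", function () {
    // Mirror the source's channel count and sample rate on the sink.
    output.mozSetup(input.mozChannels, input.mozSampleRate);
  }, false);
  input.addEventListener("MozAudioAvailable", function (event) {
    output.mozWriteAudio(halveVolume(event.frameBuffer));
  }, false);
  input.volume = 0; // mute the source so only the processed stream is heard
}
```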
- Wikipedia Article on Digital Audio: http://en.wikipedia.org/wiki/Digital_audio