Introducing the Audio API extension

  • Revision slug: Introducing_the_Audio_API_Extension
  • Revision title: Introducing the Audio API extension
  • Revision id: 39345
  • Created:
  • Creator: sebmozilla
  • Is current revision? No
  • Comment page created, 203 words added

Revision Content

The Audio API extension extends the HTML5 specification of the <audio> and <video> media elements by exposing audio metadata and raw audio data. This enables developers to visualize audio data, to process it, and to generate new audio data.

Getting the Information of the Audio Stream with the 'loadedmetadata' Event

When the metadata of the media element is available, a 'loadedmetadata' event fires. The following attributes are then available on the media element:

  • mozChannels: the number of audio channels
  • mozSampleRate: the sample rate, in samples per second
  • mozFrameBufferLength: the number of samples in each frame buffer (frames × number of channels)

This information is needed later to interpret the raw audio data stream.
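The metadata attributes above can be read as soon as the event fires. A minimal sketch (the element id "audio-element" and the helper name are hypothetical, not part of the API):

```javascript
// Collect the Mozilla-prefixed metadata attributes from a media element.
// These attributes become readable once 'loadedmetadata' has fired.
function readAudioMetadata(media) {
  return {
    channels: media.mozChannels,                   // e.g. 2 for stereo
    sampleRate: media.mozSampleRate,               // e.g. 44100 samples/second
    frameBufferLength: media.mozFrameBufferLength, // channels * frames per buffer
  };
}

// In a Firefox page that supports the Audio Data API:
// var audio = document.getElementById("audio-element");
// audio.addEventListener("loadedmetadata", function () {
//   var meta = readAudioMetadata(audio);
//   console.log(meta.channels, meta.sampleRate, meta.frameBufferLength);
// }, false);
```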

Getting the Raw Data of the Audio Stream with the 'MozAudioAvailable' Event

As the audio is played, sample data is made available to the audio layer and MozAudioAvailable events are triggered. These events contain raw samples. Those samples may or may not have been played yet at the time of the event, and have not been adjusted for the mute/volume settings on the media element. Playing, pausing, and seeking the audio also affect the streaming of this raw audio data.
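Each event carries the raw samples in its frameBuffer attribute (a Float32Array) along with a time attribute. As a sketch of consuming them, the helper below (a hypothetical name, not part of the API) computes the RMS level of one frame buffer, which is a typical first step for a visualization:

```javascript
// Compute the root-mean-square level of one buffer of raw samples.
function rmsLevel(samples) {
  var sum = 0;
  for (var i = 0; i < samples.length; i++) {
    sum += samples[i] * samples[i];
  }
  return Math.sqrt(sum / samples.length);
}

// In a Firefox page that supports the Audio Data API:
// audio.addEventListener("MozAudioAvailable", function (event) {
//   console.log("t =", event.time, "rms =", rmsLevel(event.frameBuffer));
// }, false);
```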

Writing an Audio Stream

An audio stream can also be generated from script: call mozSetup() on an audio element to configure the number of channels and the sample rate, then pass Float32Array sample data to mozWriteAudio().
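A minimal sketch of this, generating a sine tone (the helper function name and the chosen frequency/rate are illustrative):

```javascript
// Generate sampleCount samples of a sine tone at the given frequency.
function makeSineSamples(frequency, sampleRate, sampleCount) {
  var samples = new Float32Array(sampleCount);
  for (var i = 0; i < sampleCount; i++) {
    samples[i] = Math.sin(2 * Math.PI * frequency * i / sampleRate);
  }
  return samples;
}

// In Firefox:
// var output = new Audio();
// output.mozSetup(1, 44100); // 1 channel, 44100 Hz
// output.mozWriteAudio(makeSineSamples(440, 44100, 44100)); // one second of 440 Hz
```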

Processing an Audio Stream

To process an audio stream, read the raw samples delivered by one media element's MozAudioAvailable events, transform them in script, and write the result to a second audio element configured with mozSetup().

See also:

  • Wikipedia Article on Digital Audio: http://en.wikipedia.org/wiki/Digital_audio

Revision Source

<p>The Audio API extension extends the HTML5 specification of the &lt;audio&gt; and &lt;video&gt; media elements by exposing audio metadata and raw audio data. This enables developers to visualize audio data, to process it, and to generate new audio data.</p>
<h2>Getting the Information of the Audio Stream with the <strong>loadedmetadata</strong> Event</h2>
<p>When the metadata of the media element is available, a 'loadedmetadata' event fires. The following attributes are then available on the media element:</p>
<ul> <li>mozChannels: the number of audio channels</li> <li>mozSampleRate: the sample rate, in samples per second</li> <li>mozFrameBufferLength: the number of samples in each frame buffer (frames × number of channels)</li>
</ul>
<p>This information is needed later to interpret the raw audio data stream.</p>
<h2>Getting the Raw Data of the Audio Stream with the <strong>MozAudioAvailable</strong> Event</h2>
<p>As the audio is played, sample data is made available to the audio layer and <strong>MozAudioAvailable</strong> events are triggered. These events contain raw samples. Those samples may or may not have been played yet at the time of the event, and have not been adjusted for the mute/volume settings on the media element. Playing, pausing, and seeking the audio also affect the streaming of this raw audio data.</p>
<h2>Writing an Audio Stream</h2>
<p>An audio stream can also be generated from script: call <code>mozSetup()</code> on an audio element to configure the number of channels and the sample rate, then pass Float32Array sample data to <code>mozWriteAudio()</code>.</p>
<h2>Processing an Audio Stream</h2>
<p>To process an audio stream, read the raw samples delivered by one media element's <strong>MozAudioAvailable</strong> events, transform them in script, and write the result to a second audio element configured with <code>mozSetup()</code>.</p>
<p>See also:</p>
<ul> <li>Wikipedia Article on Digital Audio: <a class=" external" href="http://en.wikipedia.org/wiki/Digital_audio" rel="freelink">http://en.wikipedia.org/wiki/Digital_audio</a></li>
</ul>