One of the biggest driving forces behind the growth of the Internet has been the insatiable demand from users for ever more multimedia in the form of audio and video. Initially, bandwidth was so precious that there was no such thing as live streaming, and it could take minutes or even hours to download an audio track, let alone a video.
The high cost of bandwidth and limited availability of fast modems drove the development of faster and more efficient compression algorithms, such as MP3 audio and MPEG video, but even then the only way to download files in any reasonable length of time was to drastically reduce their quality.
One of my earlier Internet projects, back in 1997, was the UK’s first online radio station licensed by the music authorities. Actually, it was more of a podcast (before the term was coined) because we made a daily half-hour show and then compressed it down to 8-bit, 11 kHz mono using an algorithm originally developed for telephony, and it sounded like phone quality, or worse. Still, we quickly gained thousands of listeners who would download the show and then listen to it as they surfed to the sites discussed in it, by means of a pop-up browser window containing a plug-in.
Thankfully for us, and everyone publishing multimedia, it soon became possible to offer greater audio and video quality, but still only by asking the user to download and install a plug-in player. Flash became the most popular of these players, after beating rivals such as RealAudio, but it gained a bad reputation as the cause of many a browser crash, and constantly required upgrading when new versions were released.
So it was generally agreed that the way ahead was to come up with some web standards for supporting multimedia directly within the browser. Of course, browser developers such as Microsoft and Google had differing visions of what these standards should look like, but after the dust had settled, they had agreed on a subset of file types that all browsers should play natively, and these were introduced into the HTML5 specification.
Finally, it is possible (as long as you encode your audio and video in a few different formats) to upload multimedia to a web server, place a couple of HTML tags in a web page, and have the media play on any major desktop browser, smartphone, or tablet device, without the user having to download a plug-in or make any other changes.
There are still a lot of older browsers out there, so Flash remains important for supporting them. In this chapter, I show you how to add code to use Flash as a backup to HTML5 audio and video, to cover as many hardware and software combinations as possible.
The term codec stands for encoder/decoder. It describes the functionality provided by software that encodes and decodes media such as audio and video. In HTML5 there are a number of different sets of codecs available, depending on the browser used.
Following are the codecs supported by the HTML5 <audio> tag (and also when audio is attached to HTML5 video):

AAC: The Advanced Audio Coding codec; its MIME type is audio/aac.
MP3: The MPEG Audio Layer 3 codec; its MIME type is audio/mpeg.
PCM: Uncompressed Pulse Code Modulation audio, as used in WAV files; its MIME type is audio/wav, but you may also see audio/wave.
Vorbis: The royalty-free Ogg Vorbis codec; its MIME type is audio/ogg, or sometimes audio/oga.

The following list summarizes the major operating systems and browsers, along with the audio types their latest versions support:
Apple iOS: AAC, MP3, PCM
Apple Safari: AAC, MP3, PCM
Google Android 2.3+: AAC, MP3, Vorbis
Google Chrome: AAC, MP3, Vorbis
Microsoft Internet Explorer: AAC, MP3
Mozilla Firefox: MP3, PCM, Vorbis
Opera: PCM, Vorbis
The outcome of these different levels of codec support is that you always need at least two versions of each audio file to ensure it will play on all platforms. One of these should be Vorbis to support Opera, but for the second you have a choice of either AAC or MP3.
To cater to all platforms, you need to record or convert your content using multiple codecs and then list them all within <audio> and </audio> tags, as in Example 24-1. The nested <source> tags then contain the various media you wish to offer to a browser. Because the controls attribute is supplied, the result looks like Figure 24-1.
<audio controls>
  <source src='audio.m4a' type='audio/aac'>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
</audio>
In this example I included three different audio types, because that’s perfectly acceptable, and can be a good idea if you wish to ensure that each browser can locate its preferred format rather than just one it knows how to handle. However, the example will still play on all platforms if either the MP3 or the AAC file (but not both) is dropped.
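To illustrate, here is a minimal sketch of that two-file approach, keeping only the MP3 and Vorbis versions (the filenames are the same placeholders used in Example 24-1):

<!-- Two formats are enough for current browsers: MP3 plus Ogg Vorbis.
     The text at the end is displayed only by non-HTML5 browsers. -->
<audio controls>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
  Sorry, your browser does not support HTML5 audio.
</audio>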
The <audio> element and its partner <source> tag support several attributes (a brief usage sketch follows this list):

autoplay: Starts playing the audio as soon as it is ready.
controls: Displays the browser's built-in playback and volume controls.
loop: Plays the audio over and over.
preload: Tells the browser whether (and how much of) the audio to load before the user chooses to play it.
src: Specifies the source location of the audio file.
type: Specifies the codec (MIME type) of the audio file.
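As a brief sketch (using the same placeholder filenames as before, and noting that many mobile browsers ignore autoplay), here is how some of these attributes can be combined so that the audio preloads, starts automatically, and repeats:

<!-- Audio that preloads fully, starts by itself, and loops -->
<audio controls autoplay loop preload='auto'>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
</audio>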
If you don’t supply the controls attribute to the <audio> tag, and don’t use the autoplay attribute either, the sound will not play and there won’t be a Play button for the user to click to start playback. This would leave you no option other than to offer this functionality in JavaScript, as in Example 24-2 (with the additional code required highlighted in bold), which provides the ability to play and pause the audio, as shown in Figure 24-2.
<!DOCTYPE html>
<html>
  <head>
    <title>Playing Audio with JavaScript</title>
    <script src='OSC.js'></script>
  </head>
  <body>
    <audio id='myaudio'>
      <source src='audio.m4a' type='audio/aac'>
      <source src='audio.mp3' type='audio/mp3'>
      <source src='audio.ogg' type='audio/ogg'>
    </audio>

    <button onclick='playaudio()'>Play Audio</button>
    <button onclick='pauseaudio()'>Pause Audio</button>

    <script>
      function playaudio()
      {
        O('myaudio').play()
      }

      function pauseaudio()
      {
        O('myaudio').pause()
      }
    </script>
  </body>
</html>
This works by calling the play or pause methods of the myaudio element when the buttons are clicked.
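Because play and pause are standard methods of every HTML5 media element, you could equally well use a single toggle button by testing the element's paused property. Here is a short sketch along those lines, assuming the myaudio element from Example 24-2 and the O function from the OSC.js file:

<button onclick='toggleaudio()'>Play / Pause</button>

<script>
  function toggleaudio()
  {
    var audio = O('myaudio')

    // paused is true before playback has started and after a pause
    if (audio.paused) audio.play()
    else              audio.pause()
  }
</script>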
It will probably be necessary to support older browsers for the foreseeable future by providing a fallback to Flash. Example 24-3 shows how you can do this using a Flash plug-in saved as audioplayer.swf (available, along with all the examples, in the free download at the http://lpmj.net companion website). The code to add is highlighted in bold.
<audio controls>
  <object type="application/x-shockwave-flash" data="audioplayer.swf"
    width="300" height="30">
    <param name="FlashVars"
      value="mp3=audio.mp3&showstop=1&showvolume=1">
  </object>
  <source src='audio.m4a' type='audio/aac'>
  <source src='audio.mp3' type='audio/mp3'>
  <source src='audio.ogg' type='audio/ogg'>
</audio>
Here we take advantage of the fact that on non-HTML5 browsers, anything inside the <audio> tag (other than the <source> elements, which are ignored) will be acted on by the browser. Therefore, by placing an <object> element there that calls up a Flash player, we ensure that any non-HTML5 browsers will at least have a chance of playing the audio, as long as they have Flash installed, as shown in Figure 24-3.
The particular audio player used in this example, audioplayer.swf, takes the following arguments and values in the FlashVars attribute of the <param> element:

mp3: The filename or URL of the MP3 file to play.
showstop: If 1, shows the Stop button; otherwise, it is not displayed.
showvolume: If 1, shows the volume bar; otherwise, it is not displayed.

As with many elements, you can easily resize the object to (for example) 300×30 pixels by providing these values to its width and height attributes.
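If you would like to know from JavaScript whether a browser supports HTML5 audio at all (and therefore whether the Flash fallback is likely to be used), you can call the standard canPlayType method on a dummy element, as in this rough sketch:

<script>
  // Create an element purely to test for HTML5 audio support
  var test = document.createElement('audio')

  if (!test.canPlayType)
    alert('No HTML5 audio - the Flash fallback will be needed')
  else if (test.canPlayType('audio/mpeg') != '')
    alert('This browser reports it can probably or maybe play MP3')
</script>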
Playing video in HTML5 is quite similar to audio; you just use the <video> tag and provide <source> elements for the media you are offering. Example 24-4 shows how you do this with three different video codec types, as displayed in Figure 24-4.
<video width='560' height='320' controls>
  <source src='movie.mp4'  type='video/mp4'>
  <source src='movie.webm' type='video/webm'>
  <source src='movie.ogv'  type='video/ogg'>
</video>
As with audio, there are a number of video codecs available, with differing support across multiple browsers. These codecs come in different containers, as follows:

MP4: The MPEG-4 container; its MIME type is video/mp4.
Ogg: The Ogg container; its MIME type is video/ogg, or sometimes video/ogv.
WebM: The WebM container; its MIME type is video/webm.

These containers may then hold one of the following video codecs: H.264, Theora, VP8, or VP9.
The following list details the major operating systems and browsers, along with the video types their latest versions support:
Apple iOS: MP4/H.264
Apple Safari: MP4/H.264
Google Android: MP4, OGG, WebM/H.264, Theora, VP8
Google Chrome: MP4, OGG, WebM/H.264, Theora, VP8, VP9
Internet Explorer: MP4/H.264
Mozilla Firefox: MP4, OGG, WebM/H.264, Theora, VP8, VP9
Opera: OGG, WebM/Theora, VP8
Looking at this list, it’s clear that MP4/H.264 is almost unanimously supported, except for the Opera browser. So if you’re prepared to ignore the 1 percent or so of users this comprises (and hope that Opera will soon have to adopt the format anyway), you need to supply your video using only one file type: MP4/H.264. But for maximum coverage, you really ought to encode in OGG/Theora or WebM/VP8 as well (but not VP9, as it’s not yet been adopted by Opera).
Therefore, the movie.webm file in Example 24-4 isn’t strictly needed, but shows how you can add all the different file types you like, to give browsers the opportunity to play back the formats they prefer.
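The same canPlayType method mentioned earlier works for video too, so you can also ask from JavaScript which of the formats in Example 24-4 a browser thinks it can handle. The following is only a rough sketch, since browsers may answer 'maybe' rather than 'probably':

<script>
  // Query the browser about each container type in order of preference;
  // an empty string from canPlayType means "cannot play"
  var test = document.createElement('video'), format = 'none of them'

  if (test.canPlayType)
  {
    if      (test.canPlayType('video/mp4')  != '') format = 'MP4'
    else if (test.canPlayType('video/webm') != '') format = 'WebM'
    else if (test.canPlayType('video/ogg')  != '') format = 'Ogg'
  }

  alert('This browser should be able to play: ' + format)
</script>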
The <video> element and accompanying <source> tag support the following attributes (a short example follows this list):

autoplay: Starts playing the video as soon as it is ready.
controls: Displays the browser's built-in playback controls.
height: Specifies the height at which to display the video.
loop: Plays the video over and over.
muted: Mutes the audio output initially.
poster: Specifies an image to display in place of the video until the user starts playing it.
preload: Tells the browser whether (and how much of) the video to load before the user chooses to play it.
src: Specifies the source location of the video file.
type: Specifies the codec (MIME type) of the video file.
width: Specifies the width at which to display the video.
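Here is a short sketch showing some of these attributes together (the poster.jpg filename is an assumption; the video files are those from Example 24-4), so the video displays a placeholder image, loads only its metadata up front, and plays muted in a loop once started:

<!-- A muted, looping video with a placeholder image and minimal preloading -->
<video width='560' height='320' controls muted loop
  poster='poster.jpg' preload='metadata'>
  <source src='movie.mp4'  type='video/mp4'>
  <source src='movie.webm' type='video/webm'>
  <source src='movie.ogv'  type='video/ogg'>
</video>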
If you wish to control video playback from JavaScript, you can do so using code such as that in Example 24-5 (with the additional code required highlighted in bold), and shown in Figure 24-5.
<!DOCTYPE html>
<html>
  <head>
    <title>Playing Video with JavaScript</title>
    <script src='OSC.js'></script>
  </head>
  <body>
    <video id='myvideo' width='560' height='320'>
      <source src='movie.mp4'  type='video/mp4'>
      <source src='movie.webm' type='video/webm'>
      <source src='movie.ogv'  type='video/ogg'>
    </video><br>

    <button onclick='playvideo()'>Play Video</button>
    <button onclick='pausevideo()'>Pause Video</button>

    <script>
      function playvideo()
      {
        O('myvideo').play()
      }

      function pausevideo()
      {
        O('myvideo').pause()
      }
    </script>
  </body>
</html>
This code is just like controlling audio from JavaScript. Simply call the play and/or pause methods of the myvideo object to play and pause the video.
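The play and pause methods are only part of the standard media interface; properties such as currentTime and events such as ended are also available. The following sketch (again assuming the myvideo element and the O function from OSC.js) adds a button that skips back 10 seconds and reports when playback finishes:

<button onclick='rewindvideo()'>Back 10 Seconds</button>

<script>
  function rewindvideo()
  {
    var video = O('myvideo')

    // currentTime is the playback position in seconds
    video.currentTime = Math.max(0, video.currentTime - 10)
  }

  // The ended event fires when the video reaches the end
  O('myvideo').addEventListener('ended', function()
  {
    alert('Playback finished')
  }, false)
</script>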
Also as with audio, older versions of browsers will still be in general use for a while to come, so it makes sense to offer a Flash video fallback to people with non-HTML5 browsers. Example 24-6 shows you how to do this (highlighted in bold) using the flowplayer.swf file (available in the free download at http://lpmj.net), and Figure 24-6 shows how it displays in a browser that doesn’t support HTML5 video.
<video width='560' height='320' controls>
  <object width='560' height='320'
    type='application/x-shockwave-flash'
    data='flowplayer.swf'>
    <param name='movie' value='flowplayer.swf'>
    <param name='flashvars'
      value='config={"clip": {
        "url": "http://tinyurl.com/html5video-mp4",
        "autoPlay":false,
        "autoBuffering":true}}'>
  </object>
  <source src='movie.mp4'  type='video/mp4'>
  <source src='movie.webm' type='video/webm'>
  <source src='movie.ogv'  type='video/ogg'>
</video>
This Flash video player is particular about security: it won’t play videos from a local filesystem, only from a web server. So I have supplied a file on the Web (at tinyurl.com/html5video-mp4) for this example to play.
Here are the arguments to supply to the flashvars attribute of the <param> element:

url: The URL of the video file to play.
autoPlay: If true, plays automatically; otherwise, waits for the Play button to be clicked.
autoBuffering: If true, before it starts playing, the video will be preloaded sufficiently for the available bandwidth, in order to minimize buffering later on with slow connections.

For more information on the Flash flowplayer program (and an HTML5 version), check out http://flowplayer.org.
Using the information in this chapter, you will be able to embed any audio and video on almost all browsers and platforms without worrying about whether users may or may not be able to play it.
In the following chapter, I’ll demonstrate the use of a number of other HTML5 features, including geolocation and local storage.
See Chapter 24 Answers in Appendix A for the answers to these questions.