Whether it comes in the form of simple beeps or in the form
of symphonies (or gangster rap or whatever), Android applications
often need to play audio. A few things in Android can play audio
automatically, such as a Notification. However, once you get past
those, you are on your own.
Fortunately for you, Android offers support for audio playback, and we will examine some of the options in this chapter.
Understanding this chapter requires that you have read the core chapters of this book.
In Android, you have a few different places you can pull media clips from — one of these will hopefully fit your needs:

- Package the clips as raw resources (res/raw/ in your
project), so they are bundled with your application. The benefit is
that you’re guaranteed the clips will be there; the downside is that
they cannot be replaced without upgrading the application.
- Package the clips as assets (assets/ in your project)
and reference them via file:///android_asset/ URLs in a Uri. The
benefit over raw resources is that this location works with APIs that
expect Uri parameters instead of resource IDs. The downside —
assets are only replaceable when the application is upgraded —
remains. On the whole, the audio APIs tend to favor raw resources
over assets.

If you want to play back music, particularly material in MP3 format,
you will want to use the MediaPlayer
class. With it, you can feed
it an audio clip, start/stop/pause playback, and get notified on key
events, such as when the clip is ready to be played or is done
playing.
You have three ways to set up a MediaPlayer and tell it what audio
clip to play:

1. Call the static create() factory method on MediaPlayer.create()
and provide the resource ID of the clip.
2. If you have a Uri to the clip, use the Uri-flavored version of
MediaPlayer.create().
3. Create a MediaPlayer using the default constructor, then call
setDataSource() with the path to the clip. However, in this case,
you also need to call prepare() or prepareAsync(). Both will set
up the clip to be ready to play, such as fetching the first few
seconds off the file or stream. The prepare() method is
synchronous; as soon as it returns, the clip is ready to play. The
prepareAsync() method is asynchronous.

Once the clip is prepared, start()
begins playback, pause()
pauses playback (with start()
picking up playback where pause()
paused), and stop()
ends playback. One caveat: you cannot simply
call start()
again on the MediaPlayer
once you have called
stop()
— we’ll cover a workaround a bit later in this section.
To see this in action, take a look at the
Media/Audio
sample project. It contains a single activity, MainActivity,
that offers playback of a Creative Commons-licensed audio clip, stored
as an Ogg Vorbis file as a raw resource (R.raw.clip).
In onCreate()
, we use the static create()
factory method on MediaPlayer
to set up a MediaPlayer
for our audio clip:
@Override
public void onCreate(Bundle state) {
  super.onCreate(state);

  try {
    mp=MediaPlayer.create(this, R.raw.clip);
    mp.setOnCompletionListener(this);
  }
  catch (Exception e) {
    goBlooey(e);
  }
}
We also register the activity itself as the OnCompletionListener, so
that we find out when the audio clip has played through to the end.
Under the covers, create()
not only creates an instance of MediaPlayer
and sets up the data source, but it also calls prepare()
, so the
MediaPlayer
is ready for use once create()
returns. However, this also
means that there might be an exception, such as providing an invalid
resource ID (e.g., one pointing to a Photoshop file instead of an audio
file). goBlooey()
simply logs the exception and shows a Toast
as a
crude form of error handling:
private void goBlooey(Exception e) {
  Log.e(getClass().getSimpleName(), getString(R.string.msg_error), e);

  Toast
    .makeText(this, R.string.msg_error_toast, Toast.LENGTH_LONG)
    .show();
}
Our UI is purely a set of action bar items:
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android">

  <item
    android:id="@+id/play"
    android:icon="@drawable/ic_play_arrow"
    android:showAsAction="ifRoom"
    android:title="@string/menu_play" />
  <item
    android:id="@+id/pause"
    android:icon="@drawable/ic_pause"
    android:showAsAction="ifRoom"
    android:title="@string/menu_pause"
    android:visible="false" />
  <item
    android:id="@+id/stop"
    android:icon="@drawable/ic_stop"
    android:showAsAction="ifRoom"
    android:title="@string/menu_stop"
    android:visible="false" />

</menu>
Note that only the play
action bar item is visible at the outset.
We will toggle the visibility of the action bar items based on the status
of the playback of the clip.
We populate the action bar in onCreateOptionsMenu()
, also retrieving
the three MenuItem
objects for our three action bar items:
@Override
public boolean onCreateOptionsMenu(Menu menu) {
  getMenuInflater().inflate(R.menu.actions, menu);

  play=menu.findItem(R.id.play);
  pause=menu.findItem(R.id.pause);
  stop=menu.findItem(R.id.stop);

  return(super.onCreateOptionsMenu(menu));
}
onOptionsItemSelected()
merely delegates the three action items to
three similarly-named methods: play()
, pause()
, and stop()
:
@Override
public boolean onOptionsItemSelected(MenuItem item) {
  switch (item.getItemId()) {
    case R.id.play:
      play();
      return(true);

    case R.id.pause:
      pause();
      return(true);

    case R.id.stop:
      stop();
      return(true);
  }

  return(super.onOptionsItemSelected(item));
}
The play()
method calls start()
to cause the MediaPlayer
to
begin playing back the audio clip. It also toggles the visibility of
the action bar items, so the pause
and stop
ones are now visible:
private void play() {
  mp.start();

  play.setVisible(false);
  pause.setVisible(true);
  stop.setVisible(true);
}
play()
is asynchronous, as the audio clip plays back on another
system-supplied thread. We are not tying up the main application thread
by playing back this clip.
pause()
is similar: it calls pause()
to cause the MediaPlayer
to pause playback of the audio clip. It also toggles the visibility of
the action bar items, so the play
and stop
ones are now visible:
private void pause() {
  mp.pause();

  play.setVisible(true);
  pause.setVisible(false);
  stop.setVisible(true);
}
Where things get a bit complicated is in the stop()
method. There,
we not only want to stop playback, but also set up the MediaPlayer
to
be able to play back from the beginning of the clip:
private void stop() {
  mp.stop();
  pause.setVisible(false);
  stop.setVisible(false);

  findViewById(android.R.id.content).postDelayed(new Runnable() {
    @Override
    public void run() {
      try {
        mp.prepare();
        mp.seekTo(0);
        play.setVisible(true);
      }
      catch (Exception e) {
        goBlooey(e);
      }
    }
  }, 100);
}
Stopping the playback is merely a matter of calling stop()
on the
MediaPlayer
. After that, we hide all the action bar items.
To reset the MediaPlayer
to play the clip again, you need to re-prepare
the player, via prepare()
or prepareAsync()
. You also need to call
seekTo(0)
. seekTo()
positions playback at a certain number of
milliseconds from the beginning of the audio, so seekTo(0)
repositions
the player back to the beginning.
However, if we try doing this work right away, we get odd results, owing
to the asynchronous nature of the media playback. We need to let the
stop()
complete its work before we prepare()
and seekTo()
. Unfortunately,
there is no listener interface for the stop-completed event. So, we fake
it, delaying the “rewind” of the clip by 100 milliseconds.
Also, once the MediaPlayer
is ready again, we re-enable the play
action bar item, allowing playback to begin from the start of the
clip, if the user wants.
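Instead of the fixed 100-millisecond delay, one could re-prepare the player asynchronously and react when it is actually ready. This is a sketch of that alternative, not what the sample project does:

```java
// Alternative sketch: re-prepare asynchronously instead of guessing
// with postDelayed(); the "rewind" runs exactly when the player is ready.
mp.stop();
pause.setVisible(false);
stop.setVisible(false);

mp.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
  @Override
  public void onPrepared(MediaPlayer player) {
    player.seekTo(0);
    play.setVisible(true);
  }
});
mp.prepareAsync();
```

The trade-off is that the OnPreparedListener is then shared by every prepareAsync() call on this MediaPlayer, so it needs to behave sensibly whenever it fires.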
Finally, in onDestroy()
, we stop()
the MediaPlayer
if it returns
true
from isPlaying()
, so the playback does not continue after
the activity is destroyed:
@Override
public void onDestroy() {
  super.onDestroy();

  if (mp.isPlaying()) {
    mp.stop();
  }
}
While MediaPlayer
is the primary audio playback option,
particularly for content along the lines of MP3 files, there are
other alternatives if you are looking to build other sorts of
applications, notably games and custom forms of streaming audio.
The SoundPool
class’s claim to fame is the ability to overlay
multiple sounds, and do so in a prioritized fashion, so your
application can just ask for sounds to be played and SoundPool
deals with each sound starting, stopping, and blending while playing.
This may make more sense with an example.
Suppose you are creating a first-person shooter. Such a game may have several sounds going on at any one time:

- the crash of the surf and the howl of the wind
- the footsteps and panting of the player’s character
- orders barked by a commanding officer
- the gunfire of an ongoing battle

And so on.
In principle, SoundPool
can blend all of those together into a
single audio stream for output. Your game might set up the wind and
surf as constant background sounds, toggle the feet and panting on
and off based on the character’s movement, randomly add the barked
orders, and tie the gunfire based on actual game play.
In reality, your average smartphone will lack the CPU power to handle
all of that audio without harming the frame rate of the game. So, to
keep the frame rate up, you tell SoundPool
to play at most two
streams at once. This means that when nothing else is happening in
the game, you will hear the wind and surf, but during the actual
battle, those sounds get dropped out — the user might never
even miss them — so the game speed remains good.
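On API Level 21 and higher, a minimal SoundPool set up along these lines might look like the following sketch, capping playback at two simultaneous streams. The resource ID and the code's placement inside an Activity are assumptions for illustration:

```java
// Sketch: a SoundPool limited to two simultaneous streams
SoundPool pool=new SoundPool.Builder()
  .setMaxStreams(2)
  .setAudioAttributes(new AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_GAME)
    .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
    .build())
  .build();

// load() is asynchronous; wait for the callback before playing.
// R.raw.surf is a hypothetical clip; "this" is an Activity context.
final int surfId=pool.load(this, R.raw.surf, 1);

pool.setOnLoadCompleteListener(new SoundPool.OnLoadCompleteListener() {
  @Override
  public void onLoadComplete(SoundPool p, int sampleId, int status) {
    if (status==0) {
      // left/right volume, priority, loop forever (-1), normal rate
      p.play(sampleId, 1.0f, 1.0f, 1, -1, 1.0f);
    }
  }
});
```

With maxStreams set to 2, requests to play a third sound are resolved by priority, matching the frame-rate trade-off described above.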
The lowest-level Java API for playing back audio is AudioTrack. It
has two main roles:

- Supporting streaming audio in formats and protocols beyond what
MediaPlayer handles. While
MediaPlayer can handle RTSP, for example, it does not offer SIP. If
you want to create a SIP client (perhaps for a VOIP or Web
conferencing application), you will need to convert the incoming data
stream to PCM format, then hand the stream off to an AudioTrack
instance for playback.
- Playing back audio with minimal overhead. If you convert your clips
to PCM ahead of time and hand them to an AudioTrack
for playback, you will use the least amount of overhead,
minimizing CPU impact on game play and on battery life.

If you want your phone to sound like… well… a phone, you can use
ToneGenerator
to have it play back
dual-tone multi-frequency (DTMF) tones. In
other words, you can simulate the sounds played by a regular
“touch-tone” phone in response to button presses. This is used by the
Android dialer, for example, to play back the tones when users dial
the phone using the on-screen keypad, as an audio reinforcement.
Note that these will play through the phone’s earpiece, speaker, or
attached headset. They do not play through the outbound call stream.
In principle, you might be able to get ToneGenerator
to play tones
through the speaker loud enough to be picked up by the microphone,
but this probably is not a recommended practice.
You can create a ToneGenerator
through its constructor. This
takes two parameters:

- the audio stream to play the tones on, such as AudioManager.STREAM_NOTIFICATION
- the volume for the tones, from 0 to 100

For example:

final private ToneGenerator beeper=
  new ToneGenerator(AudioManager.STREAM_NOTIFICATION, 100);
The stream indication helps with muting; if the user has muted this
particular stream, then ToneGenerator
will wind up not generating a tone.
Then, when you need a beep, you can call startTone()
, with
the identifier of one of the many DTMF tones listed on the ToneGenerator
class:
beeper.startTone(ToneGenerator.TONE_PROP_NACK);
Many of the tones have a fixed duration. In that case, startTone()
will play that tone to completion, or until you call stopTone()
to
interrupt it.
Some tones will repeat indefinitely. There is a second startTone()
variant that takes a duration in milliseconds, which you can use to
automatically stop the tone after a particular period of time. Or, you
can use stopTone() to stop the tone yourself.