An iOS device can be used for the same purpose as the original iPod — to hold and play music, podcasts, and audiobooks. These items constitute the device’s iPod library, or music library. iOS provides the programmer with various forms of access to the device’s music library: you can query the library, play its items, and present a standard interface for letting the user choose an item. These abilities are provided by the Media Player framework; you’ll need to @import MediaPlayer.
Everything in the music library, as seen by your code, is an MPMediaEntity. This is an abstract class that endows its subclasses with the ability to describe themselves through key–value pairs called properties. (This use of the word “properties” has nothing to do with Objective-C language properties; these properties are more like entries in an NSDictionary.)
MPMediaEntity has two concrete subclasses, MPMediaItem and MPMediaCollection. An MPMediaItem is a single item (a “song”). An MPMediaCollection is an ordered list of MPMediaItems, rather like an array; it has a count, and its items property is an array.
An MPMediaItem has a type, according to the value of its MPMediaItemPropertyMediaType property: it might, for example, be music, a podcast, an audiobook, or a video. A media item’s properties will be intuitively familiar from your use of iTunes: it has a title, an album title, a track number, an artist, a composer, and so on. Different types of item have slightly different properties; for example, a podcast, in addition to its normal title, has a podcast title.
A playlist is an MPMediaPlaylist, a subclass of MPMediaCollection. Its properties include a title, a flag indicating whether it is a “smart” playlist, and so on.
The property keys have names like MPMediaItemPropertyTitle. To fetch a property’s value, call valueForProperty: with its key. You can fetch multiple properties with enumerateValuesForProperties:usingBlock:.
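For example, here’s a sketch of both techniques; the item variable stands for any MPMediaItem you’ve already obtained:

MPMediaItem* item = // ... some media item ...
// fetch a single property value
NSString* title = [item valueForProperty:MPMediaItemPropertyTitle];
// fetch several property values at once
[item enumerateValuesForProperties:
        [NSSet setWithObjects:
            MPMediaItemPropertyTitle, MPMediaItemPropertyArtist, nil]
    usingBlock:^(NSString* property, id value, BOOL* stop) {
        NSLog(@"%@: %@", property, value);
    }];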
An item’s artwork image is an instance of the MPMediaItemArtwork class, from which you are supposed to be able to get the image itself scaled to a specified size by calling imageWithSize:; my experience is that in reality you’ll receive an image of any old size the system cares to give you, so you may have to scale it further yourself. This, for example, is what my Albumen app does:
MPMediaItemArtwork* art = // ...
UIImage* im = [art imageWithSize:CGSizeMake(36,36)];
// but it probably *isn't* 36 by 36; scale it so that it is
if (im) {
    CGFloat scalew = 36.0/im.size.width;
    CGFloat scaleh = 36.0/im.size.height;
    CGFloat scale = (scalew < scaleh) ? scalew : scaleh;
    CGSize sz = CGSizeMake(im.size.width*scale, im.size.height*scale);
    UIGraphicsBeginImageContextWithOptions(sz, NO, 0);
    [im drawInRect:CGRectMake(0,0,sz.width,sz.height)];
    im = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
Obtaining actual information from the music library requires a query, an MPMediaQuery. First, you form the query. There are two main ways to do this:
MPMediaQuery provides several class methods that form a query ready to ask the music library for all of its songs, or all of its podcasts, and so on. Here’s the complete list:
songsQuery
podcastsQuery
audiobooksQuery
playlistsQuery
albumsQuery
artistsQuery
composersQuery
genresQuery
compilationsQuery
You can attach to the query one or more MPMediaPropertyPredicate instances, forming a set (NSSet) of predicates. These predicates filter the music library according to criteria you specify; to be included in the result, a media item must successfully pass through all the filters (in other words, the predicates are combined using logical-and). A predicate is a simple comparison. It has three aspects:
A property
The key name of the property to be compared against. Not every property can be used in a filter predicate; you can check in code with canFilterByProperty:.
A value
The value that the property is to be compared against.
A comparison type
To pass through the filter, a media item’s property value can either match the value you provide (MPMediaPredicateComparisonEqualTo, the default) or contain the value you provide (MPMediaPredicateComparisonContains).
These two ways of forming a query are actually the same; a convenience constructor is just a quick way of obtaining a query already endowed with a filter predicate.
A query also groups its results, according to its groupingType. Your choices are:
MPMediaGroupingTitle
MPMediaGroupingAlbum
MPMediaGroupingArtist
MPMediaGroupingAlbumArtist
MPMediaGroupingComposer
MPMediaGroupingGenre
MPMediaGroupingPlaylist
MPMediaGroupingPodcastTitle
The query convenience constructors all supply a groupingType in addition to a filter predicate. Indeed, the grouping is often the salient aspect of the query. For example, an albumsQuery is in fact merely a songsQuery with the added feature that its results are grouped by album.
The groups resulting from a query are collections; that is, each is an MPMediaItemCollection. This class, you will recall, is an MPMediaEntity subclass, so a collection has properties. In addition, it has items and a count. It also has a representativeItem property, which gives you just one item from the collection. The reason you need this is that properties of a collection are often embodied in its items rather than in the collection itself. For example, an album has no title; rather, its items have album titles that are all the same. So to learn the title of an album, you ask for the album title of a representative item.
After you form the query, you perform the query. You do this simply by asking for the query’s results. You can ask either for its collections (if you care about the groups returned from the query) or for its items. Here, I’ll discover the titles of all the albums:
MPMediaQuery* query = [MPMediaQuery albumsQuery];
NSArray* result = [query collections];
// prove we've performed the query, by logging the album titles
for (MPMediaItemCollection* album in result)
    NSLog(@"%@", [album.representativeItem
        valueForProperty:MPMediaItemPropertyAlbumTitle]);
/* Output starts like this on my device:
Beethoven Canons
Beethoven Dances
Beethoven Piano Duet
Beethoven Piano Other
Brahms Lieder
...
*/
Now let’s make our query more elaborate; we’ll get the titles of all the albums whose name contains “Beethoven”. Observe that what we really do is to ask for all songs whose album title contains “Beethoven”, grouped by album; then we learn the album title of a representative item from each resulting collection:
MPMediaQuery* query = [MPMediaQuery albumsQuery];
MPMediaPropertyPredicate* hasBeethoven =
    [MPMediaPropertyPredicate predicateWithValue:@"Beethoven"
        forProperty:MPMediaItemPropertyAlbumTitle
        comparisonType:MPMediaPredicateComparisonContains];
[query addFilterPredicate:hasBeethoven];
NSArray* result = [query collections];
for (MPMediaItemCollection* album in result)
    NSLog(@"%@", [album.representativeItem
        valueForProperty:MPMediaItemPropertyAlbumTitle]);
/* Output on my device:
Beethoven Canons
Beethoven Dances
Beethoven Piano Duet
Beethoven Piano Other
*/
Similarly, we can get the titles of all the albums containing any songs whose name contains “Sonata”. To do so, we ask for all songs whose title contains “Sonata”, grouped by album; then, as before, we learn the album title of a representative item from each resulting collection:
MPMediaQuery* query = [MPMediaQuery albumsQuery];
MPMediaPropertyPredicate* hasSonata =
    [MPMediaPropertyPredicate predicateWithValue:@"Sonata"
        forProperty:MPMediaItemPropertyTitle
        comparisonType:MPMediaPredicateComparisonContains];
[query addFilterPredicate:hasSonata];
NSArray* result = [query collections];
for (MPMediaItemCollection* album in result)
    NSLog(@"%@", [album.representativeItem
        valueForProperty:MPMediaItemPropertyAlbumTitle]);
/* Output on my device:
Beethoven Piano Duet
Beethoven Piano Other
Scarlatti Complete Sonatas, Vol. I
*/
An interesting complication is that the Scarlatti album listed in the results of that example is not actually present on my device. The user’s music library can include purchases and iTunes Match songs that are actually off in “the cloud”. The user can prevent such songs from appearing in the Music app (in the Settings app, Music → Show All Music → Off), but they are still present in the library, and therefore in the results of our queries.
I’ll modify the previous example to list only albums containing “Sonata” songs that are present on the device. The concept “present on the device” is embodied by the MPMediaItemPropertyIsCloudItem property. All we have to do is add a second predicate:
MPMediaQuery* query = [MPMediaQuery albumsQuery];
MPMediaPropertyPredicate* hasSonata =
    [MPMediaPropertyPredicate predicateWithValue:@"Sonata"
        forProperty:MPMediaItemPropertyTitle
        comparisonType:MPMediaPredicateComparisonContains];
[query addFilterPredicate:hasSonata];
MPMediaPropertyPredicate* isPresent =
    [MPMediaPropertyPredicate predicateWithValue:@NO
        forProperty:MPMediaItemPropertyIsCloudItem
        comparisonType:MPMediaPredicateComparisonEqualTo];
[query addFilterPredicate:isPresent];
NSArray* result = [query collections];
for (MPMediaItemCollection* album in result)
    NSLog(@"%@", [album.representativeItem
        valueForProperty:MPMediaItemPropertyAlbumTitle]);
/* Output on my device:
Beethoven Piano Duet
Beethoven Piano Other
*/
The results of an albumsQuery are actually songs (MPMediaItems). That means we can immediately access any song in any of those albums. Let’s modify the output from our previous query to print the titles of all the matching songs in the first album returned, which happens to be the Beethoven Piano Duet album. We don’t have to change our query, so I’ll start at the point where we perform it; result is the array of collections returned from our query:
// ... same as before ...
MPMediaItemCollection* album = result[0];
for (MPMediaItem* song in album.items)
    NSLog(@"%@", [song valueForProperty:MPMediaItemPropertyTitle]);
/* Output on my device:
Sonata for piano 4-hands in D major Op. 6 - 1. Allegro molto
Sonata for piano 4-hands in D major Op. 6 - 2. Rondo
*/
One of the properties of an MPMediaEntity is its persistent ID, which uniquely identifies this song (MPMediaItemPropertyPersistentID) or playlist (MPMediaPlaylistPropertyPersistentID). No other means of identification is guaranteed unique; two songs or two playlists can have the same title, for example. Using the persistent ID, you can retrieve again at a later time the same song or playlist you retrieved earlier, even across launches of your app. All sorts of things have persistent IDs — entities in general (MPMediaEntityPropertyPersistentID), albums, artists, composers, and more.
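So, for instance, having saved a song’s persistent ID earlier, you could retrieve that same song later with a filter predicate. Here’s a sketch, where savedID stands for an NSNumber previously obtained by calling valueForProperty: with MPMediaItemPropertyPersistentID:

NSNumber* savedID = // ... persistent ID saved earlier ...
MPMediaQuery* query = [MPMediaQuery songsQuery];
[query addFilterPredicate:
    [MPMediaPropertyPredicate predicateWithValue:savedID
        forProperty:MPMediaItemPropertyPersistentID
        comparisonType:MPMediaPredicateComparisonEqualTo]];
MPMediaItem* song =
    [query.items count] ? query.items[0] : nil; // nil if no longer in the library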
While you are maintaining the results of a search, the contents of the music library may themselves change. For example, the user might connect the device to a computer and add or delete music with iTunes. This can put your results out of date. For this reason, the library’s own modified date is available through the MPMediaLibrary class. Call the class method defaultMediaLibrary to get the actual library instance; now you can ask it for its lastModifiedDate. You can also register to receive a notification, MPMediaLibraryDidChangeNotification, when the music library is modified. This notification is not emitted unless you first send the library beginGeneratingLibraryChangeNotifications; you should eventually balance this with endGeneratingLibraryChangeNotifications.
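Registering might look like this (a sketch; libraryChanged: is a hypothetical handler method of your own):

MPMediaLibrary* lib = [MPMediaLibrary defaultMediaLibrary];
[lib beginGeneratingLibraryChangeNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
    selector:@selector(libraryChanged:)
    name:MPMediaLibraryDidChangeNotification
    object:lib];
// ... and eventually, to tear down:
// [[NSNotificationCenter defaultCenter] removeObserver:self];
// [lib endGeneratingLibraryChangeNotifications];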
The user can play a song that lives in the cloud without explicitly downloading it. In iOS 7, this can cause MPMediaLibraryDidChangeNotification to be triggered even though there is no change in the library — the library still consists of the same songs, and this song is still in the cloud (its MPMediaItemPropertyIsCloudItem is still @YES). Alternatively, the user can explicitly download the song; this causes the song to be no longer a cloud item, which is correct, but it can also cause MPMediaLibraryDidChangeNotification to be triggered twice in quick succession. Finally, if the user deletes the song from the device (so that it returns to being a cloud item), MPMediaLibraryDidChangeNotification is not triggered. Also, if the user so much as looks at the Radio tab in the Music app, MPMediaLibraryDidChangeNotification may be triggered many times in quick succession. I regard all of this as a bug.
The Media Player framework class for playing an MPMediaItem is MPMusicPlayerController. It comes in two flavors, depending on which class method you use to get an instance:
iPodMusicPlayer
The global music player — the player used by the built-in Music app. Its queue and playback state are shared with the Music app, and it can go on playing when your app is not in the foreground.
applicationMusicPlayer
Plays an MPMediaItem from the music library within your application. The song being played by the applicationMusicPlayer can be different from the Music app’s current song. This player stops when your app is not in the foreground.
An applicationMusicPlayer is not really inside your app; it is actually the global music player behaving differently. It has its own audio session. You cannot play its audio when your app is in the background. You cannot make it the target of remote control events. If these limitations prove troublesome, use the iPodMusicPlayer (or some other means of playing the song, as discussed later in this chapter).
A music player doesn’t merely play an item; it plays from a queue of items. This behavior is familiar from iTunes and the Music app. For example, in iTunes, when you switch to a playlist and double-click the first song to start playing, when iTunes comes to the end of that song, it proceeds by default to the next song in the playlist. So at that moment, its queue is the totality of songs in the playlist. The music player behaves the same way; when it reaches the end of a song, it proceeds to the next song in its queue.
Your methods for controlling playback also reflect this queue-based orientation. In addition to the expected play, pause, and stop commands, there are skipToNextItem and skipToPreviousItem commands. Anyone who has ever used iTunes or the Music app (or, for that matter, an old-fashioned iPod) will have an intuitive grasp of this and everything else a music player does. For example, you can also set a music player’s repeatMode and shuffleMode, just as in iTunes.
You provide a music player with its queue in one of two ways:
Give it a query (setQueueWithQuery:). The query’s items are the items of the queue.
Give it a media item collection (setQueueWithItemCollection:). You can use a collection obtained from a query, or form an MPMediaItemCollection yourself from an array of items by calling collectionWithItems: or initWithItems:.
In this example, we collect all songs actually present in the library shorter than 30 seconds into a queue and set the queue playing in random order using the application-internal music player:
MPMediaQuery* query = [MPMediaQuery songsQuery];
MPMediaPropertyPredicate* isPresent =
    [MPMediaPropertyPredicate predicateWithValue:@NO
        forProperty:MPMediaItemPropertyIsCloudItem
        comparisonType:MPMediaPredicateComparisonEqualTo];
[query addFilterPredicate:isPresent];
NSMutableArray* marr = [NSMutableArray array];
MPMediaItemCollection* queue = nil;
for (MPMediaItem* song in query.items) {
    NSNumber* dur =
        [song valueForProperty:MPMediaItemPropertyPlaybackDuration];
    if ([dur floatValue] < 30)
        [marr addObject: song];
}
if ([marr count] == 0)
    NSLog(@"No songs that short!");
else
    queue = [MPMediaItemCollection collectionWithItems:marr];
if (queue) {
    MPMusicPlayerController* player =
        [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithItemCollection:queue];
    player.shuffleMode = MPMusicShuffleModeSongs;
    [player play];
}
If a music player is currently playing, setting its queue will stop it; restarting play is up to you.
You can ask a music player for its nowPlayingItem, and since this is an MPMediaItem, you can learn all about it through its properties. Unfortunately, you can’t query a music player as to its queue, but you can keep your own pointer to the MPMediaItemCollection constituting the queue when you hand it to the music player, and you can ask the music player which song within the queue is currently playing (indexOfNowPlayingItem). The user can completely change the queue of an iPodMusicPlayer, so if control over the queue is important to you, use the applicationMusicPlayer.
A music player has a playbackState that you can query to learn what it’s doing (whether it is playing, paused, stopped, or seeking). It also emits notifications so you can hear about changes in its state:
MPMusicPlayerControllerPlaybackStateDidChangeNotification
MPMusicPlayerControllerNowPlayingItemDidChangeNotification
MPMusicPlayerControllerVolumeDidChangeNotification
These notifications are not emitted until you tell the music player to beginGeneratingPlaybackNotifications. This is an instance method, so you can arrange to receive notifications from just one of the two possible music players. If you do receive notifications from both, you can distinguish them by examining the NSNotification’s object and comparing it to each player. You should eventually balance this call with endGeneratingPlaybackNotifications.
To illustrate, I’ll extend the previous example to set a UILabel in our interface every time a different song starts playing. Before we start the player playing, we insert these lines to generate the notifications:
[player beginGeneratingPlaybackNotifications];
[[NSNotificationCenter defaultCenter] addObserver:self
    selector:@selector(changed:)
    name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
    object:player];
self.q = queue; // retain a pointer to the queue
And here’s how we respond to those notifications:
- (void) changed: (NSNotification*) n {
    MPMusicPlayerController* player =
        [MPMusicPlayerController applicationMusicPlayer];
    if ([n object] == player) { // just playing safe
        NSString* title =
            [player.nowPlayingItem valueForProperty:MPMediaItemPropertyTitle];
        NSUInteger ix = player.indexOfNowPlayingItem;
        if (NSNotFound == ix)
            self.label.text = @"";
        else
            self.label.text = [NSString stringWithFormat:@"%lu of %lu: %@",
                (unsigned long)(ix+1), (unsigned long)[self.q count], title];
    }
}
There’s no periodic notification as a song plays and the current playhead position advances. To get this information, you’ll have to resort to polling. This is not objectionable as long as your polling interval is reasonably sparse; your display may occasionally fall a little behind reality, but this won’t usually matter. To illustrate, let’s add to our existing example a UIProgressView (self.prog) showing how much of the current song has been played. I’ll use an NSTimer to poll the state of the player every second:
self.timer = [NSTimer scheduledTimerWithTimeInterval:1
    target:self selector:@selector(timerFired:)
    userInfo:nil repeats:YES];
self.timer.tolerance = 0.1;
When the timer fires, the progress view displays the state of the currently playing item:
- (void) timerFired: (id) dummy {
    MPMusicPlayerController* mp =
        [MPMusicPlayerController applicationMusicPlayer];
    MPMediaItem* item = mp.nowPlayingItem;
    if (!item || mp.playbackState == MPMusicPlaybackStateStopped) {
        self.prog.hidden = YES;
        return;
    }
    self.prog.hidden = NO;
    NSTimeInterval current = mp.currentPlaybackTime;
    NSTimeInterval total =
        [[item valueForProperty:MPMediaItemPropertyPlaybackDuration]
            doubleValue];
    self.prog.progress = current / total;
}
The applicationMusicPlayer has no user interface, unless you count the remote playback controls (Figure 14-1); if you want the user to have controls for playing and stopping a song, you’ll have to create them yourself. The iPodMusicPlayer has its own natural interface — the Music app.
The Media Player framework offers a slider for letting the user set the system output volume, along with an AirPlay route button if appropriate; this is an MPVolumeView. An MPVolumeView works only on a device — not in the Simulator. It is customizable similarly to a UISlider; you can set the images for the two halves of the track, the thumb, and even the AirPlay route button, for both the normal and the highlighted state (while the user is touching the thumb). New in iOS 7, you can also customize the image (volumeWarningSliderImage) that flashes in the right half of the track when the user tries to exceed the volume limit (set in the Settings app, Music → Volume Limit). In this example, we make the left half of the track black and the right half red, with flashing orange if the volume limit is exceeded, and we provide a custom thumb image:
CGSize sz = CGSizeMake(20,20);
UIGraphicsBeginImageContextWithOptions(
    CGSizeMake(sz.height,sz.height), NO, 0);
[[UIColor blackColor] setFill];
[[UIBezierPath bezierPathWithOvalInRect:
    CGRectMake(0,0,sz.height,sz.height)] fill];
UIImage* im1 = UIGraphicsGetImageFromCurrentImageContext();
[[UIColor redColor] setFill];
[[UIBezierPath bezierPathWithOvalInRect:
    CGRectMake(0,0,sz.height,sz.height)] fill];
UIImage* im2 = UIGraphicsGetImageFromCurrentImageContext();
[[UIColor orangeColor] setFill];
[[UIBezierPath bezierPathWithOvalInRect:
    CGRectMake(0,0,sz.height,sz.height)] fill];
UIImage* im3 = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self.vv setMinimumVolumeSliderImage:
    [im1 resizableImageWithCapInsets:UIEdgeInsetsMake(9,9,9,9)
        resizingMode:UIImageResizingModeStretch]
    forState:UIControlStateNormal];
[self.vv setMaximumVolumeSliderImage:
    [im2 resizableImageWithCapInsets:UIEdgeInsetsMake(9,9,9,9)
        resizingMode:UIImageResizingModeStretch]
    forState:UIControlStateNormal];
[self.vv setVolumeWarningSliderImage:
    [im3 resizableImageWithCapInsets:UIEdgeInsetsMake(9,9,9,9)
        resizingMode:UIImageResizingModeStretch]];
UIImage* thumb = [UIImage imageNamed:@"SmileyRound.png"];
sz = CGSizeMake(40,40);
UIGraphicsBeginImageContextWithOptions(sz, NO, 0);
[thumb drawInRect:CGRectMake(0,0,sz.width,sz.height)];
thumb = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[self.vv setVolumeThumbImage:thumb forState:UIControlStateNormal];
In my testing, the orange warning flash never appeared unless the EU Volume Limit setting was also switched to On (Developer → EU Volume Limit in the Settings app). Presumably this feature works on devices destined for the European Union market, but on my device, the MPVolumeView ignores the Volume Limit from the Settings app.
For further customization, you can subclass MPVolumeView and override volumeSliderRectForBounds:. (An additional overridable method is documented, volumeThumbRectForBounds:volumeSliderRect:value:, but in my testing it is never called; I regard this as a bug.)
New in iOS 7, you can register for notifications when a wireless route (Bluetooth or AirPlay) appears or disappears (MPVolumeViewWirelessRoutesAvailableDidChangeNotification) and when a wireless route becomes active or inactive (MPVolumeViewWirelessRouteActiveDidChangeNotification).
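Registering for these is like registering for any notification. This sketch assumes an MPVolumeView stored in a property self.vv and a hypothetical handler method routesChanged: of your own:

[[NSNotificationCenter defaultCenter] addObserver:self
    selector:@selector(routesChanged:)
    name:MPVolumeViewWirelessRoutesAvailableDidChangeNotification
    object:self.vv];
// in the handler, consult the volume view's
// wirelessRoutesAvailable or wirelessRouteActive property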
MPMusicPlayerController is convenient and simple, but it’s also simpleminded. Its audio session isn’t your audio session; the music player doesn’t really belong to you. An MPMediaItem, however, has an MPMediaItemPropertyAssetURL key whose value is a URL. Now everything from Chapter 14 and Chapter 15 comes into play.
So, for example, having obtained an MPMediaItem’s MPMediaItemPropertyAssetURL, you could use that URL in any of the following ways:
Initialize an AVAudioPlayer (initWithContentsOfURL:error: or initWithContentsOfURL:fileTypeHint:error:).
Use it as the contentURL of an MPMoviePlayerController.
Initialize an AVPlayer (initWithURL: or playerWithURL:).
Form an AVAsset (assetWithURL:).
Each of these ways of playing an MPMediaItem has its advantages. For example, AVAudioPlayer is easy to use, and lets you loop a sound, poll the power value of its channels, and so forth. MPMoviePlayerController gives you a built-in play/pause button and playhead slider. AVAsset gives you the full power of the AV Foundation framework, letting you edit the sound, assemble multiple sounds, perform a fadeout effect, and even attach the sound to a video (and then play it with an AVPlayer).
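For instance, here’s a minimal sketch of the AVAudioPlayer approach; note that the asset URL can be nil (for a cloud item, for example), so we check for that, and self.player is a hypothetical property that retains the player:

MPMediaItem* item = // ... some media item ...
NSURL* url = [item valueForProperty:MPMediaItemPropertyAssetURL];
if (url) {
    NSError* err = nil;
    AVAudioPlayer* p =
        [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&err];
    if (p) {
        self.player = p; // the player must be retained while it plays
        p.numberOfLoops = -1; // loop indefinitely
        [p play];
    }
}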
To demonstrate the use of AVAsset and AVPlayer, I’ll use AVQueuePlayer (an AVPlayer subclass) to play a sequence of MPMediaItems, just as MPMusicPlayerController does:
NSArray* arr = // array of MPMediaItem
NSMutableArray* assets = [NSMutableArray array];
for (MPMediaItem* item in arr) {
    AVPlayerItem* pi = [[AVPlayerItem alloc] initWithURL:
        [item valueForProperty:MPMediaItemPropertyAssetURL]];
    [assets addObject:pi];
}
self.qp = [AVQueuePlayer queuePlayerWithItems:assets];
[self.qp play];
That works, but I have the impression, based on something said in one of the WWDC 2011 videos, that instead of adding a whole batch of AVPlayerItems to an AVQueuePlayer all at once, you’re supposed to add just a few AVPlayerItems to start with and then add each additional AVPlayerItem when an item finishes playing. So I’ll start out by adding just three AVPlayerItems, and use key–value observing to watch for changes in the AVQueuePlayer’s currentItem:
NSArray* arr = // array of MPMediaItem
self.assets = [NSMutableArray array];
for (MPMediaItem* item in arr) {
    AVPlayerItem* pi = [[AVPlayerItem alloc] initWithURL:
        [item valueForProperty:MPMediaItemPropertyAssetURL]];
    [self.assets addObject:pi];
}
self->_curnum = 0; // we'll need this later
self->_total = [self.assets count]; // ditto
self.qp = [AVQueuePlayer queuePlayerWithItems:
    [self.assets objectsAtIndexes:
        [NSIndexSet indexSetWithIndexesInRange:NSMakeRange(0,3)]]];
[self.assets removeObjectsAtIndexes:
    [NSIndexSet indexSetWithIndexesInRange:NSMakeRange(0,3)]];
[self.qp addObserver:self forKeyPath:@"currentItem"
    options:0 context:nil];
[self.qp play];
The implementation of observeValueForKeyPath:... looks like this:
AVPlayerItem* item = self.qp.currentItem;
NSArray* arr = item.asset.commonMetadata;
arr = [AVMetadataItem metadataItemsFromArray:arr
    withKey:AVMetadataCommonKeyTitle
    keySpace:AVMetadataKeySpaceCommon];
AVMetadataItem* met = arr[0];
[met loadValuesAsynchronouslyForKeys:@[@"value"] completionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        self.label.text = [NSString stringWithFormat:@"%d of %d: %@",
            ++self->_curnum, self->_total, met.value];
    });
}];
if (![self.assets count])
    return;
AVPlayerItem* newItem = self.assets[0];
[self.qp insertItem:newItem afterItem:[self.qp.items lastObject]];
[self.assets removeObjectAtIndex:0];
That code illustrates how to extract metadata from an AVAsset by way of an AVMetadataItem; in this case, we fetch the AVMetadataCommonKeyTitle and get its value property, as the equivalent of fetching an MPMediaItem’s MPMediaItemPropertyTitle property in our earlier code. As with everything else in the AV Foundation world, it can take time for the value property to become available, so we call loadValuesAsynchronouslyForKeys:completionHandler: to run our completion handler when it is available. There are no guarantees about what thread the completion handler will be called on, so to set the label’s text, I step out to the main thread (Chapter 25).
In the last three lines, we pull an AVPlayerItem off the front of our assets mutable array and add it to the end of the AVQueuePlayer’s queue. The AVQueuePlayer itself deletes an item from the start of its queue after playing it, so this way the queue never exceeds three items in length.
Just as in the previous example, where we updated a progress view in response to the firing of a timer to reflect an MPMusicPlayerController’s current item’s time and duration, we can do the same thing with the currently playing AVPlayerItem. Again, we can’t be certain when the duration property will become available, so we call loadValuesAsynchronouslyForKeys:completionHandler:, and again, to update the progress view, we step out to the main thread:
if (!self.qp.currentItem) { // finished!
    self.prog.hidden = YES;
    [self.timer invalidate];
} else {
    AVPlayerItem* item = self.qp.currentItem;
    AVAsset* asset = item.asset;
    [asset loadValuesAsynchronouslyForKeys:@[@"duration"]
            completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            CMTime cur = self.qp.currentTime;
            CMTime dur = asset.duration;
            self.prog.progress =
                CMTimeGetSeconds(cur)/CMTimeGetSeconds(dur);
            self.prog.hidden = NO;
        });
    }];
}
The music picker (MPMediaPickerController), supplied by the Media Player framework, is a view controller (UIViewController) whose view is a self-contained navigation interface in which the user can select a media item. This interface looks very much like the iPhone Music app. You have no access to the actual view; you are expected to present the view controller (presentViewController:animated:completion:).
You can limit the type of media items displayed by creating the controller using initWithMediaTypes:. You can make a prompt appear at the top of the navigation bar (prompt). And you can govern whether the user can choose multiple media items or just one, with the allowsPickingMultipleItems property. You can filter out items stored in the cloud by setting showsCloudItems to NO.
While the view is showing, you learn what the user is doing through two delegate methods (MPMediaPickerControllerDelegate); the presented view controller is not automatically dismissed, so it is up to you to dismiss it in these delegate methods:
mediaPicker:didPickMediaItems:
mediaPickerDidCancel:
The behavior of the delegate methods depends on the value of the controller’s allowsPickingMultipleItems:
allowsPickingMultipleItems is NO (the default)
When the user taps a media item, your mediaPicker:didPickMediaItems: is called, handing you an MPMediaItemCollection consisting of that item; you are likely to dismiss the presented view controller at this point. When the user taps Cancel, your mediaPickerDidCancel: is called.
allowsPickingMultipleItems is YES
When the user taps Done, your mediaPicker:didPickMediaItems: is called, handing you an MPMediaItemCollection consisting of all items the user tapped. Your mediaPickerDidCancel: is never called.
In this example, we put up the music picker; we then play the user’s chosen media item with the application’s music player. The example works equally well whether allowsPickingMultipleItems is YES or NO:
- (void) presentPicker {
    MPMediaPickerController* picker = [MPMediaPickerController new];
    picker.delegate = self;
    // picker.allowsPickingMultipleItems = YES;
    [self presentViewController:picker animated:YES completion:nil];
}
- (void) mediaPicker: (MPMediaPickerController*) mediaPicker
        didPickMediaItems: (MPMediaItemCollection*) mediaItemCollection {
    MPMusicPlayerController* player =
        [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithItemCollection:mediaItemCollection];
    [player play];
    [self dismissViewControllerAnimated:YES completion:nil];
}
- (void) mediaPickerDidCancel: (MPMediaPickerController*) mediaPicker {
    [self dismissViewControllerAnimated:YES completion:nil];
}
On the iPad, the music picker can be displayed as a presented view, and I think it looks best that way. But it also works reasonably well in a popover, especially if we increase its popoverContentSize; so I’ll use this opportunity to provide a complete example (Example 16-1) of managing a single view controller as either a presented view on the iPhone or a popover on the iPad.
The presentPicker method is now a button’s control event action handler, so that we can point the popover’s arrow to the button. How we summon the picker depends on the device; we use UI_USER_INTERFACE_IDIOM to distinguish the two cases. If it’s an iPad, we create a popover and set an instance variable to retain it (as discussed in Chapter 9). Two methods dismiss the picker, so that operation is factored out into a utility method (dismissPicker:) that does one thing if there’s a popover and another if there’s a presented view controller.
- (void) presentPicker: (id) sender {
    MPMediaPickerController* picker = [MPMediaPickerController new];
    picker.delegate = self;
    // picker.allowsPickingMultipleItems = YES;
    if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone)
        [self presentViewController:picker animated:YES completion:nil];
    else {
        UIPopoverController* pop = [[UIPopoverController alloc]
            initWithContentViewController:picker];
        self.currentPop = pop;
        pop.popoverContentSize = CGSizeMake(500,600);
        [pop presentPopoverFromRect:[sender bounds] inView:sender
            permittedArrowDirections:UIPopoverArrowDirectionAny
            animated:YES];
        pop.passthroughViews = nil;
    }
}
- (void) dismissPicker: (MPMediaPickerController*) mediaPicker {
    if (self.currentPop && self.currentPop.popoverVisible) {
        [self.currentPop dismissPopoverAnimated:YES];
        self.currentPop = nil;
    } else {
        [self dismissViewControllerAnimated:YES completion:nil];
    }
}
- (void) mediaPicker: (MPMediaPickerController*) mediaPicker
        didPickMediaItems: (MPMediaItemCollection*) mediaItemCollection {
    MPMusicPlayerController* player =
        [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithItemCollection:mediaItemCollection];
    [player play];
    [self dismissPicker: mediaPicker];
}
- (void) mediaPickerDidCancel: (MPMediaPickerController*) mediaPicker {
    [self dismissPicker: mediaPicker];
}
- (void) popoverControllerDidDismissPopover:
        (UIPopoverController*) popoverController {
    self.currentPop = nil;
}