Multiple Video Playback on iOS

01/11/2012


As usual let me start by telling you a quick story on how this post came to be…

Once upon a time..NOT. So, I was working on a project for Hyundai at nKey when we got to a screen that required two videos playing at the same time, so the user could see how a car behaves with and without a given feature (like Electronic Stability Control). As an experienced developer I immediately told the customer we should merge both videos into one so that we could play “both at the same time” on iOS. I explained to him that to play videos on iOS, Apple released the MediaPlayer.framework a long time ago, and that according to the docs (and real life :P) it isn’t able to handle more than one video playback at a time (although you can have two MPMoviePlayerController instances).

He said OK and that was what we did. However, the requirements changed and we had to add a background video that plays in a loop..aaand I started to have problems coordinating all these video playbacks so that only one was playing at a time and the user wouldn’t notice it.

Fortunately nKey sent me to the iOS Tech Talk that was happening in São Paulo/BR this very Monday, and there I attended a talk where the Media Technologies Evangelist, Eryk Vershen, discussed the AVFoundation.framework and how it is used by the MediaPlayer.framework (aka MPMoviePlayerController) for video playback. After the talk, during Wine and Cheese, I got to speak to Eryk about my issue and explained to him how I was thinking about handling the problem. His answer was something like “Sure! Go for it! iOS surely is capable of playing multiple videos at the same time!…Oh…and I think something around 4 is the limit”. That answer made me happy and curious, so I asked him why the MediaPlayer.framework is incapable of handling multiple video playback if that isn’t a library limitation…he told me the MPMoviePlayerController was created early on to present cut-scenes in games, that this is why previous iOS versions only allowed fullscreen playback, and that this limitation is a matter of legacy.

When I got back to my notebook I worked on this very basic version of a video player using the AVFoundation.framework (which, obviously, I made more robust when I got back to the office so that we could use it in the project).

Okay, story told. Let’s get back to work!

The AVFoundation framework provides the AVPlayer object to implement controllers and user interfaces for single- or multiple-item playback. The visual results generated by an AVPlayer object can be displayed in a Core Animation layer of class AVPlayerLayer. In AVFoundation, timed audiovisual media such as videos and sounds are represented by an AVAsset object. According to the docs, each asset contains a collection of tracks that are intended to be presented or processed together, each of a uniform media type, including but not limited to audio, video, text, closed captions, and subtitles. Due to the nature of timed audiovisual media, upon successful initialization of an asset some or all of the values for its keys may not be immediately available. To avoid blocking the main thread, you can register your interest in particular keys and be notified when their values become available.

With this in mind, subclass UIViewController and name the class VideoPlayerViewController. Just like MPMoviePlayerController, let’s add an NSURL property that tells us where we should grab our video from. As described above, add the following code to load the AVAsset once the URL is set.
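Before the snippets below will compile, the class needs an interface and a few constants. I’m leaving those to the full project, but here is a minimal sketch; the property names and observation contexts are assumptions of mine that simply match the code that follows:

// VideoPlayerViewController.h — a minimal sketch (names are assumptions)
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@class VideoPlayerView;

@interface VideoPlayerViewController : UIViewController {
    NSURL *_URL;
}

@property (nonatomic, copy) NSURL *URL;
@property (nonatomic, retain) AVPlayer *player;
@property (nonatomic, retain) AVPlayerItem *playerItem;
@property (nonatomic, retain) VideoPlayerView *playerView;

@end

// At the top of VideoPlayerViewController.m — the keys we load/observe
// and the KVO contexts used to tell the observations apart:
static NSString * const kTracksKey = @"tracks";
static NSString * const kPlayableKey = @"playable";
static NSString * const kStatusKey = @"status";
static NSString * const kCurrentItemKey = @"currentItem";

static void *AVPlayerDemoPlaybackViewControllerStatusObservationContext =
    &AVPlayerDemoPlaybackViewControllerStatusObservationContext;
static void *AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext =
    &AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext;

With that in place, the setter looks like this: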

#pragma mark - Public methods

- (void)setURL:(NSURL *)URL {
    [_URL release];
    _URL = [URL copy];

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:_URL options:nil];
    NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];

    // Load the values for the asset keys asynchronously so we don't block
    // the main thread, then continue setup back on the main queue.
    [asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self prepareToPlayAsset:asset withKeys:requestedKeys];
        });
    }];
}

- (NSURL *)URL {
    return _URL;
}

So, once the URL for the video is set, we create an asset to inspect the resource referenced by the URL and asynchronously load the values for the asset keys “tracks” and “playable”. When loading completes, we operate on the AVPlayer on the main queue (the main queue is used to naturally ensure safe access to a player’s nonatomic properties while dynamic changes in playback state may be reported).

#pragma mark - Private methods

- (void)prepareToPlayAsset:(AVURLAsset *)asset withKeys:(NSArray *)requestedKeys {
    // Make sure the value of each requested key loaded successfully.
    for (NSString *thisKey in requestedKeys) {
        NSError *error = nil;
        AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
        if (keyStatus == AVKeyValueStatusFailed) {
            return;
        }
    }

    if (!asset.playable) {
        return;
    }

    // Stop observing a previously loaded player item, if there was one.
    if (self.playerItem) {
        [self.playerItem removeObserver:self forKeyPath:kStatusKey];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:self.playerItem];
    }

    // Create a new player item and observe its status so we know when it is ready to play.
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addObserver:self
                      forKeyPath:kStatusKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

    // Create the player on first use and observe its current item so we know
    // when to attach it to the view's AVPlayerLayer.
    if (![self player]) {
        [self setPlayer:[AVPlayer playerWithPlayerItem:self.playerItem]];
        [self.player addObserver:self
                      forKeyPath:kCurrentItemKey
                         options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                         context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];
    }

    if (self.player.currentItem != self.playerItem) {
        [[self player] replaceCurrentItemWithPlayerItem:self.playerItem];
    }
}

Once the values for all the keys we require have loaded, we check whether loading was successful and whether the asset is playable. If so, we set up an AVPlayerItem (a representation of the presentation state of an asset that’s played by an AVPlayer object) and an AVPlayer to play the asset. Note that I didn’t add any error handling at this point. Here we should probably create a delegate and let the view controller, or whoever is using your player, decide the best way to handle the possible errors.
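For example, a minimal delegate protocol might look like this (the protocol and method names are my own, not part of the project above):

@protocol VideoPlayerViewControllerDelegate <NSObject>
// Called instead of the bare returns above when loading or playback fails.
- (void)videoPlayerViewController:(VideoPlayerViewController *)controller
                 didFailWithError:(NSError *)error;
@end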

We also added some key-value observers so that we are notified when our view should be tied to the player and when the AVPlayerItem is ready to play.

#pragma mark - Key-Value Observing

- (void)observeValueForKeyPath:(NSString *)path
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == AVPlayerDemoPlaybackViewControllerStatusObservationContext) {
        // The player item's status changed; start playback once it is ready.
        AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] integerValue];
        if (status == AVPlayerStatusReadyToPlay) {
            [self.player play];
        }
    } else if (context == AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext) {
        // The player's current item changed; tie the player to our view.
        AVPlayerItem *newPlayerItem = [change objectForKey:NSKeyValueChangeNewKey];
        if (newPlayerItem) {
            [self.playerView setPlayer:self.player];
            [self.playerView setVideoFillMode:AVLayerVideoGravityResizeAspect];
        }
    } else {
        [super observeValueForKeyPath:path ofObject:object change:change context:context];
    }
}

Once the AVPlayerItem is in place we are free to attach the AVPlayer to the player layer that displays its visual output. We also make sure to preserve the video’s aspect ratio and fit the video within the layer’s bounds.

As soon as the AVPlayer is ready, we command it to play! iOS does the hard work 🙂

As I mentioned earlier, to play the visual component of an asset, you need a view containing an AVPlayerLayer layer to which the output of an AVPlayer object can be directed. This is how you subclass a UIView to meet the requirements:
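The view’s header isn’t shown here, so first a matching interface (an assumption of mine, just enough to make the implementation compile):

// VideoPlayerView.h — assumed interface matching the implementation below
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface VideoPlayerView : UIView

- (AVPlayer *)player;
- (void)setPlayer:(AVPlayer *)player;
- (void)setVideoFillMode:(NSString *)fillMode;

@end

And the implementation itself: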

@implementation VideoPlayerView

// Back this view with an AVPlayerLayer instead of a plain CALayer.
+ (Class)layerClass {
	return [AVPlayerLayer class];
}

- (AVPlayer *)player {
	return [(AVPlayerLayer *)[self layer] player];
}

// Direct the player's visual output to this view's layer.
- (void)setPlayer:(AVPlayer *)player {
	[(AVPlayerLayer *)[self layer] setPlayer:player];
}

// e.g. AVLayerVideoGravityResizeAspect fits the video within the layer's
// bounds while preserving its aspect ratio.
- (void)setVideoFillMode:(NSString *)fillMode {
	AVPlayerLayer *playerLayer = (AVPlayerLayer *)[self layer];
	playerLayer.videoGravity = fillMode;
}

@end
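To tie it back to the original problem, here is a hypothetical usage from a parent view controller, playing two videos side by side (the file names and frames are made up for illustration):

// Two independent players on screen at once — the whole point of this post.
VideoPlayerViewController *leftPlayer = [[VideoPlayerViewController alloc] init];
VideoPlayerViewController *rightPlayer = [[VideoPlayerViewController alloc] init];

leftPlayer.view.frame = CGRectMake(0.0f, 0.0f, 160.0f, 240.0f);
rightPlayer.view.frame = CGRectMake(160.0f, 0.0f, 160.0f, 240.0f);

[self.view addSubview:leftPlayer.view];
[self.view addSubview:rightPlayer.view];

// Setting the URL kicks off the asynchronous loading above; each player
// starts on its own once its AVPlayerItem reports AVPlayerStatusReadyToPlay.
leftPlayer.URL = [[NSBundle mainBundle] URLForResource:@"with_esc" withExtension:@"mp4"];
rightPlayer.URL = [[NSBundle mainBundle] URLForResource:@"without_esc" withExtension:@"mp4"];

// (Under manual reference counting, keep these controllers retained —
// e.g. in ivars — and release them when you are done.)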

And this is it!

Of course I didn’t add all the necessary code for building and running the project but I wouldn’t let you down! Go to GitHub and download the full source code!
