iOS Development: AVPlayer In Depth

Author: SomeBoy | Published 2015-12-26 10:29 | 36,361 reads

First, an introductory article on AVPlayer:
http://www.cnblogs.com/mzds/p/3711867.html
Below I'll go through the problems I ran into with it in a real project.

1. "Then observe the playerItem's status and loadedTimeRanges properties; status has three states" ==> That sentence is quoted from the article above, but the status values it goes on to list are actually AVPlayer's (presumably a slip by that author). In fact both AVPlayerItem and AVPlayer have a status property, and both can be observed with KVO.
The documentation declares AVPlayerItem's status enum as:

typedef NS_ENUM(NSInteger, AVPlayerItemStatus) {
 AVPlayerItemStatusUnknown,
 AVPlayerItemStatusReadyToPlay,
 AVPlayerItemStatusFailed
};  

And AVPlayer's status enum as:

typedef NS_ENUM(NSInteger, AVPlayerStatus) {
 AVPlayerStatusUnknown,
 AVPlayerStatusReadyToPlay,
 AVPlayerStatusFailed
};

Look closely!!! Note the difference between these two. The point I want to make here:
AVPlayerItemStatus describes the state of the resource currently being played (the item), i.e. whether this URL or video file can actually be played.
AVPlayerStatus describes the state of the player itself.

The problem I ran into while coding was that AVPlayer's status was
AVPlayerStatusReadyToPlay, yet the video simply would not play. After switching the KVO observation to the AVPlayerItem, it reported AVPlayerItemStatusFailed.

When coding, it's better to observe the item's status; it is the more reliable indicator.
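For reference, a minimal sketch of that observation (assuming an _item/_player pair like the one created in VideoView.m further down):

// Observe the item rather than the player: a bad URL or file is reported on the item.
[_item addObserver:self forKeyPath:@"status"
           options:NSKeyValueObservingOptionNew context:nil];

// Then, inside -observeValueForKeyPath:ofObject:change:context:
if (_item.status == AVPlayerItemStatusReadyToPlay) {
    [_player play];
} else if (_item.status == AVPlayerItemStatusFailed) {
    NSLog(@"item failed: %@", _item.error); // in my experience the player's own status may still look fine here
}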
2. addPeriodicTimeObserverForInterval
Adding a time observer to AVPlayer is handy for tracking playback progress,
but once added it must be removed. Strictly speaking the app won't crash if you forget, but the observer is never released and will keep eating memory. (When I first hit this it cost me an entire morning; honestly I had no idea where the problem was, because at the time I barely understood AVPlayer.)
The note in Apple's documentation:

@result
An object conforming to the NSObject protocol. You must retain this returned value as long as you want the time observer to be invoked by the player.
Pass this object to -removeTimeObserver: to cancel time observation.
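A minimal sketch of that add/remove pairing (the timeObserverToken property name here is only illustrative; the full VideoView.m below stores the token in _timeObser):

// Keep a strong reference to the token the player hands back...
self.timeObserverToken =                             // illustrative property
    [_player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1)
                                          queue:NULL
                                     usingBlock:^(CMTime time) {
        NSLog(@"played %.0f s", CMTimeGetSeconds(time));
    }];

// ...and pass that same token back (e.g. in dealloc) when you are done,
// otherwise the player keeps the block alive and keeps calling it.
[_player removeTimeObserver:self.timeObserverToken];
self.timeObserverToken = nil;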

3. The CMTime struct
The linked tutorial passes CMTimeMake(1, 1), which simply means the block fires once per second. More generally, CMTimeMake(a, b) stands for a/b seconds, and that ratio is the callback interval.
A helpful post on this struct:
https://zwo28.wordpress.com/2015/03/06/%E8%A7%86%E9%A2%91%E5%90%88%E6%88%90%E4%B8%ADcmtime%E7%9A%84%E7%90%86%E8%A7%A3%EF%BC%8C%E4%BB%A5%E5%8F%8A%E5%88%A9%E7%94%A8cmtime%E5%AE%9E%E7%8E%B0%E8%BF%87%E6%B8%A1%E6%95%88%E6%9E%9C/
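A quick illustration of the value/timescale idea (CMTimeGetSeconds converts a CMTime back to seconds):

CMTime oneSecond  = CMTimeMake(1, 1);     // 1/1    = 1.0 s
CMTime halfSecond = CMTimeMake(1, 2);     // 1/2    = 0.5 s
CMTime tenSeconds = CMTimeMake(600, 60);  // 600/60 = 10.0 s, at a timescale of 60
NSLog(@"%.1f %.1f %.1f",
      CMTimeGetSeconds(oneSecond),
      CMTimeGetSeconds(halfSecond),
      CMTimeGetSeconds(tenSeconds));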

4. To jump around in the video when the slider is dragged, use AVPlayer's seekToTime: method.
The simplest example: suppose a video is 20 s long and you want to jump to the 10 s mark (_player is the AVPlayer object):
[_player seekToTime:CMTimeMake(10, 1)]; Pass 1 as the second argument and the number of seconds you want to jump to as the first. That's what my experiments showed; work through the CMTime definition yourself to see why.
5. How do you get back to the beginning once playback reaches the end?
[_player seekToTime:kCMTimeZero];
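Combining point 5 with the AVPlayerItemDidPlayToEndTimeNotification used in VideoView.m below, a handler that loops back to the start could look like this (a sketch; the article's own movieToEnd: only logs):

- (void)movieToEnd:(NSNotification *)notic {
    // Playback reached the end: rewind to the start and, if you want looping, play again.
    [_player seekToTime:kCMTimeZero];
    [_player play];
}

For seeks finer than one second (a question raised in the comments below), the tolerance-taking variant seekToTime:toleranceBefore:toleranceAfter: with kCMTimeZero tolerances is worth trying.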

Below is the code I wrote; corrections are welcome, thanks.

AVViewController.h

#import <UIKit/UIKit.h> 
@interface AVViewController : UIViewController 
@end

AVViewController.m

#import "AVViewController.h"
#import "VideoView.h"

@interface AVViewController () <VideoSomeDelegate>

@property (nonatomic ,strong) VideoView *videoView;

@property (nonatomic ,strong) NSMutableArray<NSLayoutConstraint *> *array;

@property (nonatomic ,strong) UISlider *videoSlider;

@property (nonatomic ,strong) NSMutableArray<NSLayoutConstraint *> *sliderArray;

@end

@implementation AVViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.view setBackgroundColor:[UIColor whiteColor]];
    [self initVideoView];
    
}

- (void)initVideoView {
    
    //NSString *path = [[NSBundle mainBundle] pathForResource:@"some" ofType:@"mp4"];// for playing a local file; local playback also needs changes in VideoView.m
    NSString *path = @"http://static.tripbe.com/videofiles/20121214/9533522808.f4v.mp4";
    _videoView = [[VideoView alloc] initWithUrl:path delegate:self];
    _videoView.someDelegate = self;
    [_videoView setTranslatesAutoresizingMaskIntoConstraints:NO];
    [self.view addSubview:_videoView];
    [self initVideoSlider];
    
    if (self.traitCollection.verticalSizeClass == UIUserInterfaceSizeClassCompact) {
        [self installLandspace];
    } else {
        [self installVertical];
    }
}
- (void)installVertical {
    if (_array != nil) {
        [self.view removeConstraints:_array];
        [_array removeAllObjects];
        [self.view removeConstraints:_sliderArray];
        [_sliderArray removeAllObjects];
    } else {
        _array = [NSMutableArray array];
        _sliderArray = [NSMutableArray array];
    }
    id topGuide = self.topLayoutGuide;
    NSDictionary *dic = @{@"top":@100,@"height":@180,@"edge":@20,@"space":@80};
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|-(edge)-[_videoSlider]-(edge)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[topGuide]-(top)-[_videoView(==height)]-(space)-[_videoSlider]" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoView,topGuide,_videoSlider)]];
    [self.view addConstraints:_array];
    
    
    
}
- (void)installLandspace {
    if (_array != nil) {
        
        [self.view removeConstraints:_array];
        [_array removeAllObjects];
        
        [self.view removeConstraints:_sliderArray];
        [_sliderArray removeAllObjects];
    } else {
        
        _array = [NSMutableArray array];
        _sliderArray = [NSMutableArray array];
    }
    
    id topGuide = self.topLayoutGuide;
    NSDictionary *dic = @{@"edge":@20,@"space":@30};
    
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView)]];
    [_array addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[topGuide][_videoView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_videoView,topGuide)]];
    [self.view addConstraints:_array];
    
    [_sliderArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|-(edge)-[_videoSlider]-(edge)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [_sliderArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:[_videoSlider]-(space)-|" options:0 metrics:dic views:NSDictionaryOfVariableBindings(_videoSlider)]];
    [self.view addConstraints:_sliderArray];
}
- (void)initVideoSlider {
    
    _videoSlider = [[UISlider alloc] init];
    [_videoSlider setTranslatesAutoresizingMaskIntoConstraints:NO];
    [_videoSlider setThumbImage:[UIImage imageNamed:@"sliderButton"] forState:UIControlStateNormal];
    [self.view addSubview:_videoSlider];
    
}
- (void)willTransitionToTraitCollection:(UITraitCollection *)newCollection withTransitionCoordinator:(id <UIViewControllerTransitionCoordinator>)coordinator {

    [super willTransitionToTraitCollection:newCollection withTransitionCoordinator:coordinator];
    [coordinator animateAlongsideTransition:^(id <UIViewControllerTransitionCoordinatorContext> context) {
        
        if (newCollection.verticalSizeClass == UIUserInterfaceSizeClassCompact) {
            [self installLandspace];
        } else {
            [self installVertical];
        }
        [self.view setNeedsLayout];
    } completion:nil];

}
- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
}
#pragma mark -
- (void)flushCurrentTime:(NSString *)timeString sliderValue:(float)sliderValue {
    _videoSlider.value = sliderValue;
}
/*
#pragma mark - Navigation

// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    // Get the new view controller using [segue destinationViewController].
    // Pass the selected object to the new view controller.
}
*/

@end

VideoView.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@protocol VideoSomeDelegate <NSObject>

@required

- (void)flushCurrentTime:(NSString *)timeString sliderValue:(float)sliderValue;

//- (void)flushVideoLength:(float)videoLength;




@end

@interface VideoView : UIView

@property (nonatomic ,strong) NSString *playerUrl;


@property (nonatomic ,readonly) AVPlayerItem *item;

@property (nonatomic ,readonly) AVPlayerLayer *playerLayer;

@property (nonatomic ,readonly) AVPlayer *player;

@property (nonatomic ,weak) id <VideoSomeDelegate> someDelegate;

- (id)initWithUrl:(NSString *)path delegate:(id<VideoSomeDelegate>)delegate;


@end

@interface VideoView  (Guester)

- (void)addSwipeView;

@end

VideoView.m

#import "VideoView.h"
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MPVolumeView.h>    
typedef enum  {
    ChangeNone,
    ChangeVoice,
    ChangeLigth,
    ChangeCMTime
}Change;


@interface VideoView ()

@property (nonatomic ,readwrite) AVPlayerItem *item;

@property (nonatomic ,readwrite) AVPlayerLayer *playerLayer;

@property (nonatomic ,readwrite) AVPlayer *player;

@property (nonatomic ,strong)  id timeObser;

@property (nonatomic ,assign) float videoLength;

@property (nonatomic ,assign) Change changeKind;

@property (nonatomic ,assign) CGPoint lastPoint;

//Gesture
@property (nonatomic ,strong) UIPanGestureRecognizer *panGesture;
@property (nonatomic ,strong) MPVolumeView *volumeView;
@property (nonatomic ,weak) UISlider *volumeSlider;
@property (nonatomic ,strong) UIView *darkView;
@end

@implementation VideoView

- (id)initWithUrl:(NSString *)path delegate:(id<VideoSomeDelegate>)delegate {
    if (self = [super init]) {
        _playerUrl = path;
        _someDelegate = delegate;
        [self setBackgroundColor:[UIColor blackColor]];
        [self setUpPlayer];
        [self addSwipeView];
    
    }
    return self;
}
- (void)setUpPlayer {
    // Local video:
    //NSURL *url = [NSURL fileURLWithPath:_playerUrl];

    NSURL *url = [NSURL URLWithString:_playerUrl];
    NSLog(@"%@",url);
     
    _item = [[AVPlayerItem alloc] initWithURL:url];
    _player = [AVPlayer playerWithPlayerItem:_item];
    _playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
    _playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.layer addSublayer:_playerLayer];
    
    [self addVideoKVO];
    [self addVideoTimerObserver];
    [self addVideoNotic];
}


#pragma mark - KVO
- (void)addVideoKVO
{
    //KVO
    [_item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
    [_item addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];
    [_item addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
}
- (void)removeVideoKVO {
    [_item removeObserver:self forKeyPath:@"status"];
    [_item removeObserver:self forKeyPath:@"loadedTimeRanges"];
    [_item removeObserver:self forKeyPath:@"playbackBufferEmpty"];
}
- (void)observeValueForKeyPath:(nullable NSString *)keyPath ofObject:(nullable id)object change:(nullable NSDictionary<NSString*, id> *)change context:(nullable void *)context {

    if ([keyPath isEqualToString:@"status"]) {
        AVPlayerItemStatus status = _item.status;
        switch (status) {
            case AVPlayerItemStatusReadyToPlay:
            {
                NSLog(@"AVPlayerItemStatusReadyToPlay");
                [_player play];
                _videoLength = floor(_item.asset.duration.value * 1.0/ _item.asset.duration.timescale);
            }
                break;
            case AVPlayerItemStatusUnknown:
            {
                NSLog(@"AVPlayerItemStatusUnknown");
            }
                break;
            case AVPlayerItemStatusFailed:
            {
                NSLog(@"AVPlayerItemStatusFailed");
                NSLog(@"%@",_item.error);
            }
                break;
                
            default:
                break;
        }
    } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
    
    } else if ([keyPath isEqualToString:@"playbackBufferEmpty"]) {
        
    }
}
#pragma mark - Notic
- (void)addVideoNotic {
    
    //Notification (pass _item as the object so we only receive callbacks for our own item)
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieToEnd:) name:AVPlayerItemDidPlayToEndTimeNotification object:_item];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieJumped:) name:AVPlayerItemTimeJumpedNotification object:_item];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(movieStalle:) name:AVPlayerItemPlaybackStalledNotification object:_item];
    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(backGroundPauseMoive) name:UIApplicationDidEnterBackgroundNotification object:nil];
    
}
- (void)removeVideoNotic {
    //
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemPlaybackStalledNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemTimeJumpedNotification object:nil];
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)movieToEnd:(NSNotification *)notic {
    NSLog(@"%@",NSStringFromSelector(_cmd));
}
- (void)movieJumped:(NSNotification *)notic {
    NSLog(@"%@",NSStringFromSelector(_cmd));
}
- (void)movieStalle:(NSNotification *)notic {
    NSLog(@"%@",NSStringFromSelector(_cmd));
}
- (void)backGroundPauseMoive {
    NSLog(@"%@",NSStringFromSelector(_cmd));
}

#pragma mark - TimerObserver
- (void)addVideoTimerObserver {
    __weak typeof (self)self_ = self;
    _timeObser = [_player addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:NULL usingBlock:^(CMTime time) {
        float currentTimeValue = time.value*1.0/time.timescale/self_.videoLength;
        NSString *currentString = [self_ getStringFromCMTime:time];

        if ([self_.someDelegate respondsToSelector:@selector(flushCurrentTime:sliderValue:)]) {
            [self_.someDelegate flushCurrentTime:currentString sliderValue:currentTimeValue];
        } else {
            NSLog(@"no response");
        }
        NSLog(@"%@",self_.someDelegate);
    }];
}
- (void)removeVideoTimerObserver {
    NSLog(@"%@",NSStringFromSelector(_cmd));
    [_player removeTimeObserver:_timeObser];
}


#pragma mark - Utils
- (NSString *)getStringFromCMTime:(CMTime)time
{
    float currentTimeValue = (CGFloat)time.value/time.timescale; // current playback time in seconds
    
    NSDate * currentDate = [NSDate dateWithTimeIntervalSince1970:currentTimeValue];
    NSCalendar *calendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierGregorian];
    NSInteger unitFlags = NSCalendarUnitHour | NSCalendarUnitMinute | NSCalendarUnitSecond ;
    NSDateComponents *components = [calendar components:unitFlags fromDate:currentDate];
    
    if (currentTimeValue >= 3600 )
    {
        return [NSString stringWithFormat:@"%ld:%ld:%ld",(long)components.hour,(long)components.minute,(long)components.second];
    }
    else
    {
        return [NSString stringWithFormat:@"%ld:%ld",(long)components.minute,(long)components.second];
    }
}

- (NSString *)getVideoLengthFromTimeLength:(float)timeLength
{
    NSDate * date = [NSDate dateWithTimeIntervalSince1970:timeLength];
    NSCalendar *calendar = [[NSCalendar alloc] initWithCalendarIdentifier:NSCalendarIdentifierGregorian];
    NSInteger unitFlags = NSCalendarUnitHour | NSCalendarUnitMinute | NSCalendarUnitSecond ;
    NSDateComponents *components = [calendar components:unitFlags fromDate:date];
    
    if (timeLength >= 3600 )
    {
        return [NSString stringWithFormat:@"%ld:%ld:%ld",components.hour,components.minute,components.second];
    }
    else
    {
        return [NSString stringWithFormat:@"%ld:%ld",components.minute,components.second];
    }
}

- (void)layoutSubviews {
    [super layoutSubviews];
    _playerLayer.frame = self.bounds;
}

#pragma mark - release 
- (void)dealloc {
    NSLog(@"%@",NSStringFromSelector(_cmd));
    [self removeVideoTimerObserver];
    [self removeVideoNotic];
    [self removeVideoKVO];
}

@end

#pragma mark - VideoView (Guester)

@implementation VideoView (Guester)

- (void)addSwipeView {
    _panGesture = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(swipeAction:)];
    [self addGestureRecognizer:_panGesture];
    [self setUpDarkView];
}
- (void)setUpDarkView {
    _darkView = [[UIView alloc] init];
    [_darkView setTranslatesAutoresizingMaskIntoConstraints:NO];
    [_darkView setBackgroundColor:[UIColor blackColor]];
    _darkView.alpha = 0.0;
    [self addSubview:_darkView];
    
    NSMutableArray *darkArray = [NSMutableArray array];
    [darkArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[_darkView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_darkView)]];
    [darkArray addObjectsFromArray:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[_darkView]|" options:0 metrics:nil views:NSDictionaryOfVariableBindings(_darkView)]];
    [self addConstraints:darkArray];
}

- (void)swipeAction:(UIPanGestureRecognizer *)gesture {
    
    switch (gesture.state) {
        case UIGestureRecognizerStateBegan:
        {
            _changeKind = ChangeNone;
            _lastPoint = [gesture locationInView:self];
        }
            break;
        case  UIGestureRecognizerStateChanged:
        {
            [self getChangeKindValue:[gesture locationInView:self]];
            
        }
            break;
        case UIGestureRecognizerStateEnded:
        {
            if (_changeKind == ChangeCMTime) {
                [self changeEndForCMTime:[gesture locationInView:self]];
            }
            _changeKind = ChangeNone;
            _lastPoint = CGPointZero;
        }
            break;
        default:
            break;
    }
    
}
- (void)getChangeKindValue:(CGPoint)pointNow {
    
    switch (_changeKind) {
            
        case ChangeNone:
        {
            [self changeForNone:pointNow];
        }
            break;
        case ChangeCMTime:
        {
            [self changeForCMTime:pointNow];
        }
            break;
        case ChangeLigth:
        {
            [self changeForLigth:pointNow];
        }
            break;
        case ChangeVoice:
        {
            [self changeForVoice:pointNow];
        }
            break;
            
        default:
            break;
    }
}
- (void)changeForNone:(CGPoint) pointNow {
    if (fabs(pointNow.x - _lastPoint.x) > fabs(pointNow.y - _lastPoint.y)) {
        _changeKind = ChangeCMTime;
    } else {
        float halfWight = self.bounds.size.width / 2;
        if (_lastPoint.x < halfWight) {
            _changeKind =  ChangeLigth;
        } else {
            _changeKind =   ChangeVoice;
        }
        _lastPoint = pointNow;
    }
}
- (void)changeForCMTime:(CGPoint) pointNow {
    float number = fabs(pointNow.x - _lastPoint.x);
    if (pointNow.x > _lastPoint.x && number > 10) {
        // CMTimeGetSeconds avoids the truncation of integer value/timescale division
        float currentTime = CMTimeGetSeconds(_player.currentTime);
        float tobeTime = currentTime + number*0.5;
        NSLog(@"forward to time:%f",tobeTime);
    } else if (pointNow.x < _lastPoint.x && number > 10) {
        float currentTime = CMTimeGetSeconds(_player.currentTime);
        float tobeTime = currentTime - number*0.5;
        NSLog(@"back to time:%f",tobeTime);
    }
}
- (void)changeForLigth:(CGPoint) pointNow {
    float number = fabs(pointNow.y - _lastPoint.y);
    if (pointNow.y > _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self minLigth];
        
    } else if (pointNow.y < _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self upperLigth];
    }
}
- (void)changeForVoice:(CGPoint)pointNow {
    float number = fabs(pointNow.y - _lastPoint.y);
    if (pointNow.y > _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self minVolume];
    } else if (pointNow.y < _lastPoint.y && number > 10) {
        _lastPoint = pointNow;
        [self upperVolume];
    }
}
- (void)changeEndForCMTime:(CGPoint)pointNow {
    if (pointNow.x > _lastPoint.x ) {
        NSLog(@"end for CMTime Upper");
        float length = fabs(pointNow.x - _lastPoint.x);
        [self upperCMTime:length];
    } else {
        NSLog(@"end for CMTime min");
        float length = fabs(pointNow.x - _lastPoint.x);
        [self mineCMTime:length];
    }
}
- (void)upperLigth {
    
    if (_darkView.alpha >= 0.1) {
        _darkView.alpha =  _darkView.alpha - 0.1;
    }
    
}
- (void)minLigth {
    if (_darkView.alpha <= 1.0) {
        _darkView.alpha =  _darkView.alpha + 0.1;
    }
}

- (void)upperVolume {
    if (self.volumeSlider.value <= 1.0) {
        self.volumeSlider.value =  self.volumeSlider.value + 0.1 ;
    }
    
}
- (void)minVolume {
    if (self.volumeSlider.value >= 0.0) {
        self.volumeSlider.value =  self.volumeSlider.value - 0.1 ;
    }
}
#pragma mark - CMTime
- (void)upperCMTime:(float)length {
    // use CMTimeGetSeconds rather than integer value/timescale division
    float currentTime = CMTimeGetSeconds(_player.currentTime);
    float tobeTime = currentTime + length*0.5;
    if (tobeTime > _videoLength) {
        [_player seekToTime:_item.asset.duration];
    } else {
        [_player seekToTime:CMTimeMake(tobeTime, 1)];
    }
}
- (void)mineCMTime:(float)length {
    float currentTime = CMTimeGetSeconds(_player.currentTime);
    float tobeTime = currentTime - length*0.5;
    if (tobeTime <= 0) {
        [_player seekToTime:kCMTimeZero];
    } else {
        [_player seekToTime:CMTimeMake(tobeTime, 1)];
    }
}

- (MPVolumeView *)volumeView {
    
    if (_volumeView == nil) {
        _volumeView = [[MPVolumeView alloc] init];
        _volumeView.hidden = YES;
        [self addSubview:_volumeView];
    }
    return _volumeView;
}

- (UISlider *)volumeSlider {
    if (_volumeSlider== nil) {
        NSLog(@"%@",[self.volumeView subviews]);
        for (UIView  *subView in [self.volumeView subviews]) {
            if ([subView.class.description isEqualToString:@"MPVolumeSlider"]) {
                _volumeSlider = (UISlider*)subView;
                break;
            }
        }
    }
    return _volumeSlider;
}

@end

Reader comments

  • Mister志伟: With AVPlayer's
    -(void)seekToTime
    method, when the time difference is less than one second, AVPlayer doesn't actually perform the seek (forward or backward) and stays at the original position.
    In AVPlayerDemo, when dragging the slider, a seek is likewise only performed once more than one second has passed.
    How can I make AVPlayer seek forward/backward by less than one second?
  • Lifg: I'm using AVPlayer to play a remote video and it just won't play; observing the item's status says it is ready to play, and Safari can't play it either. Have you run into this?
    SomeBoy: @XF1994 NSLocalizedFailureReason=The server is not correctly configured. That seems to mean the server is misconfigured, so it's probably not on your side. Download the VLC app from the App Store and see whether it can play the file, and ask your server-side developers instead of grinding away on your own.
    Lifg: @SomeBoy Thanks a lot for the reply. I found a URL online and it plays fine, but when I record a video on my phone and have the backend upload it to the server, it still fails with:
    Error Domain=AVFoundationErrorDomain Code=-11850 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x156ceac0 {Error Domain=NSOSStatusErrorDomain Code=-12939 "(null)"}, NSLocalizedFailureReason=The server is not correctly configured.
    I couldn't find anything useful on Baidu or Google; could you take a look?
    SomeBoy: @XF1994 I'd suggest downloading the VLC app to see whether it can play your video; if it can, the video itself is fine. Or find a video URL online in a format AVPlayer supports (e.g. MP4) and play it with your own code to confirm the code is OK. If both check out, AVPlayer probably just doesn't support your video's format.
  • Sias_Orange: How can I play just a segment of a video, say seconds 2 through 5 of a 10-second clip, and have AVPlayer's playback-finished notification fire when that segment ends?
  • 不管you多苦: AVPlayer fails to load WMV videos.
  • 75281188d37c: Hi, how can I change the playback resolution of a video?
    SomeBoy: @75281188d37c That's probably a decoding matter; I don't know the specifics.
  • 纸质书签: Thanks, a problem I'd been chasing for days was solved here.
  • 泰好笑勒: Does the dealloc method actually get called?
  • hhgvg: When playing network audio with AVPlayer without downloading it locally, can I still get the audio's cover art?
  • ed5f813c1894: I'd like to force the video to play through to the end without entering an editing state.
  • 2c315d65e1e0: Can AVPlayer control the display height of multi-line subtitles?
  • 愚人船ios: Why does my seekToTime do nothing? It always jumps back to 0, even with a hard-coded value.
  • 大大大浣熊: Exactly the material I was looking for. Thanks for sharing; I learned a lot.
  • 空转风: Removing it doesn't seem to help either.
    *** First throw call stack:
    (0x2f624ecb 0x39dbfce7 0x2f6287f7 0x2f6270f7 0x2f576058 0x7c521 0x7c449 0x31e5aa33 0x31e5a7f1 0x31fe6bf3 0x31f0446f 0x31f04279 0x31f04211 0x31e562e5 0x31ad231b 0x31acdb3f 0x31acd9d1 0x31acd3e5 0x31acd1f7 0x31e5999f 0x2f5effaf 0x2f5ef477 0x2f5edc67 0x2f558729 0x2f55850b 0x344c76d3 0x31eb9871 0x7c399 0x3a2bdab7)
    libc++abi.dylib: terminating with uncaught exception of type NSException
  • 空转风: Does this support iOS 7? It crashes on a real device...
    SomeBoy: @年光逝也被僵尸号占了 What does the log say? It may be the rotation code; try removing it and see.
  • e777fd17af2a: Fast-forward doesn't run for me, although normal playback works :cold_sweat:
    252f815e8f6d: Indeed, dragging the slider doesn't seek.
    SomeBoy: @一颗小葱花 I tested it; it does run for me.
  • aeb7ae7458c4: Can AVPlayer play several videos back to back, i.e. play one URL and then immediately play a second one?
    63217a85f027: @SomeBoy Use AVQueuePlayer.
    63217a85f027: @SomeBoy Yes, it can.
    SomeBoy: @Aldtck It should work, but I haven't tested it; give it a try.
  • Shumin_Wu: A question about the seeking part of the article. The way I understand it: slider.maxValue should be item.duration converted to seconds, i.e. value / timeScale; then when the slider is dragged, the CMTime you build is CMTimeMake(frame count, frame rate). As a formula:
    // self.playProgress.maxValue = value / timeScale
    // value = progress.value * timeScale
    // CMTimeMake(value, timeScale) = (value / timeScale, timeScale / timeScale) = (progress.value, 1.0)

    Nice article, let's keep in touch. The thing I took away is that the time observer has to be removed.
  • FengxinLi: Can AVPlayer play audio from a URL?
    FengxinLi: @SomeBoy Thanks.
    SomeBoy: @Fengxinliju Yes, AVPlayer handles both audio and video.
  • SomeBoy: http://www.jianshu.com/p/93ce1748ea57
    http://vombat.tumblr.com/post/86294492874/caching-audio-streamed-using-avplayer
    Two articles I'd recommend, both with source code, that should cover what you need; the second one may not be reachable from mainland China without a VPN.
  • c946e16a6cdc: How do I implement listen-while-downloading, so a single pass of bandwidth both streams the audio and saves it locally?
  • sea7reen: Could I take a look at your source code?
    SomeBoy: @sea7reen https://github.com/demoYang/aVide.git
  • sea7reen: We've hit a problem with live streaming; could I ask you about it?
