iOS: Adding a Watermark, Animation, and Background Music to a Video


Author: 丶若若 | Published 2017-04-18 10:42

Video processing mainly relies on the following classes:

AVMutableComposition, AVMutableVideoComposition, AVMutableAudioMix, AVMutableVideoCompositionInstruction, AVMutableVideoCompositionLayerInstruction, AVAssetExportSession, and so on. AVMutableComposition combines audio and video tracks; AVMutableVideoComposition describes how the video is composed; AVMutableAudioMix mixes audio into the video; AVMutableVideoCompositionInstruction and AVMutableVideoCompositionLayerInstruction are generally used together, for example to add a watermark or to rotate the video; and AVAssetExportSession performs the final export. One thing worth noting: once the app enters the background, iOS restricts code that uses the GPU and will crash it, and most of this video processing does use the GPU, so you need to guard against that.
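For example, one simple guard is to cancel an in-flight export when the app is about to move to the background. This is a minimal sketch, assuming an exportSession property like the one used later in this article:

    // Cancel a running export before the app is backgrounded, to avoid the GPU restriction crash
    [[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidEnterBackgroundNotification
                                                      object:nil
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        if (self.exportSession.status == AVAssetExportSessionStatusExporting) {
            [self.exportSession cancelExport];
        }
    }];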

Throughout this article I use Apple's official sample "AVSimpleEditoriOS" as the running example. The sample organizes its code with the Command design pattern; the base class AVSECommand holds the properties shared by the concrete Command subclasses. This article gives a brief overview of the video-related operations and highlights the key code. I hope it serves as a starting point that gives you a first impression of video editing, which you can then adapt from the official demo. You can download the Apple sample and run it to see the results for yourself.

Section 1: Adding a Watermark and a Background Border to a Video

This first section covers how to add a border and an animation to a video. To be clear up front: this kind of border or animation does not modify individual video frames. It is more like placing a CALayer on top of the video and animating that layer, because editing the pixels of every frame is not something an iPhone is meant to do, nor could it do it in real time.

Let's first look at the layer hierarchy used to add an animation on top of a video.

You will see a layer called videoLayer: this is the layer that actually displays the video. At the same level sits a layer called animationLayer, and that is the one we control and can play with, because we create it ourselves.

It is actually quite simple. The layer at the same level as videoLayer is animationLayer (the background), and they share a common parent, parentLayer. To add a border, you give animationLayer a border image, place it underneath videoLayer, and crop videoLayer so that the four edges of animationLayer remain visible around it, which produces the framed effect. Put simply: the cropped videoLayer is placed on top of the background, and both are added to parentLayer.

Steps for adding a watermark and a background border

1. Obtain the video and audio assets.

2. Create an AVMutableComposition.

3. Add the video track to the AVMutableComposition, setting its time range and insertion point.

4. Add the audio track to the AVMutableComposition, setting its time range and insertion point.

5. Create an AVMutableVideoComposition and set its frame duration and render size.

6. Create the video composition instruction and set the time range it applies to.

7. Create the video composition layer instruction and set the range it applies to.

8. Put the layer instruction into the composition instruction.

9. Put the composition instruction into the video composition.

10. Create the watermark CALayer, set its frame and position, and attach it to the video composition.

Concrete implementation:

    - (void)performWithAsset:(AVAsset*)asset withImageNamed:(NSString*)imgName withColorName:(NSString*)color withMusicName:(NSString *)musicName with:(CGRect)photoSize {
        CGSize videoSize;

        // Grab the video track and the audio track from the asset
        AVAssetTrack *assetVideoTrack = nil;
        AVAssetTrack *assetAudioTrack = nil;
        // Check if the asset contains video and audio tracks
        if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
            assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
        }
        if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
            assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
        }

        CMTime insertionPoint = kCMTimeZero;
        NSError *error = nil;

        // Step 1
        // Create a composition with the given asset and insert its audio and video tracks
        if (!self.mutableComposition) {
            // Check if a composition already exists, else create one using the input asset
            self.mutableComposition = [AVMutableComposition composition];

            // Insert the video and audio tracks from the AVAsset
            if (assetVideoTrack != nil) {
                AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:insertionPoint error:&error];
            }
            if (assetAudioTrack != nil) {
                AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:insertionPoint error:&error];
            }
        }

        // Step 2: build a video composition whose render size matches the cropped video
        if ([[self.mutableComposition tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
            if (!self.mutableVideoComposition) {
                self.mutableVideoComposition = [AVMutableVideoComposition videoComposition];

                AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
                AVAssetTrack *videoTrack = [self.mutableComposition tracksWithMediaType:AVMediaTypeVideo][0];
                AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

                // Crop the video; here it is cropped to a square (equal width and height)
                CGFloat rate;
                CGSize renderSize = CGSizeMake(0, 0);
                renderSize.width = MAX(renderSize.width, videoTrack.naturalSize.height);
                renderSize.height = MAX(renderSize.height, videoTrack.naturalSize.width);
                CGFloat renderW = MIN(renderSize.width, renderSize.height);
                // Scale factor between the render size and the source video (not used further in this excerpt)
                rate = renderW / MIN(videoTrack.naturalSize.width, videoTrack.naturalSize.height);

                // Correct the video orientation
                CGAffineTransform translateToCenter;
                CGAffineTransform mixedTransform;
                NSInteger degrees = [self degressFromVideoFileWithURL:asset];
                if (degrees == 0) {
                    if (videoTrack.naturalSize.width == videoTrack.naturalSize.height) {
                        translateToCenter = CGAffineTransformMakeTranslation(0.0, 0.0);
                    } else {
                        // This offset is specific to the author's crop; adjust it for your own source material
                        translateToCenter = CGAffineTransformMakeTranslation(-140.0, 0.0);
                    }
                    mixedTransform = CGAffineTransformRotate(translateToCenter, 0);
                } else {
                    if (degrees == 90) {
                        // Rotated 90 degrees clockwise, home button on the left
                        NSLog(@"Video rotated 90 degrees, home button on the left");
                        translateToCenter = CGAffineTransformMakeTranslation(videoTrack.naturalSize.height, -240);
                        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2);
                    } else if (degrees == 180) {
                        // Rotated 180 degrees clockwise, home button on top
                        NSLog(@"Video rotated 180 degrees, home button on top");
                        translateToCenter = CGAffineTransformMakeTranslation(videoTrack.naturalSize.width, videoTrack.naturalSize.height);
                        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI);
                    } else if (degrees == 270) {
                        // Rotated 270 degrees clockwise, home button on the right
                        NSLog(@"Video rotated 270 degrees, home button on the right");
                        translateToCenter = CGAffineTransformMakeTranslation(0.0, videoTrack.naturalSize.width);
                        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2 * 3.0);
                    }
                }

                // Apply the corrected transform to the AVMutableVideoCompositionLayerInstruction
                [passThroughLayer setTransform:mixedTransform atTime:kCMTimeZero];
                [passThroughLayer setOpacity:0.0 atTime:[asset duration]];

                // Then hand the layer instruction to the AVMutableVideoCompositionInstruction
                passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
                passThroughInstruction.layerInstructions = @[passThroughLayer];

                // Configure the video composition
                self.mutableVideoComposition.instructions = @[passThroughInstruction];
                self.mutableVideoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
                self.mutableVideoComposition.renderSize = CGSizeMake(renderW, renderW);
            }

            videoSize = self.mutableVideoComposition.renderSize;
            // Create the background layer
            self.watermarkLayer = [self watermarkLayerForSize:videoSize withImageNamed:imgName withColorName:color];
        }

        // Step 3: notify the interested view controller
        // Notify AVSEViewController about the add-watermark operation completion
        [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self];
    }
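The degressFromVideoFileWithURL: helper called above is not listed in the post. A common way to recover the rotation is to inspect the video track's preferredTransform; the following is a minimal sketch under that assumption (only the method name and the asset argument are taken from the call site above):

    - (NSInteger)degressFromVideoFileWithURL:(AVAsset *)asset {
        NSInteger degrees = 0;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        if (tracks.count > 0) {
            AVAssetTrack *videoTrack = tracks[0];
            CGAffineTransform t = videoTrack.preferredTransform;
            // Map the preferred transform back to a rotation in degrees
            if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
                degrees = 90;   // portrait
            } else if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
                degrees = 270;  // portrait, upside down
            } else if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
                degrees = 0;    // landscape
            } else if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
                degrees = 180;  // landscape, rotated
            }
        }
        return degrees;
    }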

The following method builds the layer that serves as the background border / watermark:

    - (CALayer *)watermarkLayerForSize:(CGSize)videoSize withImageNamed:(NSString *)imgName withColorName:(NSString *)color {
        // Create a layer for the background image
        CALayer *titleLayer = [CALayer layer];
        titleLayer.bounds = CGRectMake(0, 0, videoSize.width, videoSize.height); // the size of the rendered video
        titleLayer.masksToBounds = true;
        UIImage *image = [UIImage imageNamed:imgName];
        titleLayer.contents = (id)image.CGImage;
        titleLayer.position = CGPointMake(videoSize.width / 2, videoSize.height / 2);
        // You can also give the background a style and a color
        // do something...
        return titleLayer;
    }
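The "do something..." placeholder is left as-is above. Purely as an illustration, the color name that gets passed around could be mapped to a background color and border, for example with a helper like this (the helper name and the color map are assumptions, not part of the original demo):

    // Hypothetical helper, not part of the original demo: style the background layer from its color name
    - (void)applyBackgroundColorNamed:(NSString *)color toLayer:(CALayer *)layer {
        NSDictionary *colorMap = @{ @"white" : [UIColor whiteColor],
                                    @"black" : [UIColor blackColor],
                                    @"red"   : [UIColor redColor] };
        UIColor *backgroundColor = colorMap[color] ?: [UIColor whiteColor];
        layer.backgroundColor = backgroundColor.CGColor;
        layer.borderColor = backgroundColor.CGColor;
        layer.borderWidth = 10.0; // border thickness in points
    }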

Just before exporting, the layer hierarchy is assembled; in the Apple demo this happens in the view controller's exportWillBegin:

    - (void)exportWillBegin {
        CALayer *parentLayer = [CALayer layer];
        CALayer *videoLayer = [CALayer layer];
        if (self.backGroundLayer) {
            // Inset the video by 20 points so the background shows through as a border
            videoLayer.frame = CGRectMake(20, 20, self.videoComposition.renderSize.width - 40, self.videoComposition.renderSize.height - 40);
        } else {
            videoLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
        }
        parentLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
        // The background layer is added first and videoLayer second, so the video sits on top of the background.
        // A watermark works the other way round: the overlay layer is added above videoLayer, so the watermark always shows on top of the video.
        [parentLayer addSublayer:self.backGroundLayer];
        [parentLayer addSublayer:videoLayer];
        // Hand the layer tree to the video composition
        self.videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }

After that, all that is left is to export the video.

Section 2: Adding an Animation to a Video

Adding an animation works on the same principle as adding a watermark.

    - (CALayer *)watermarkLayerForSize:(CGSize)videoSize withImageNamed:(NSString *)imgName withColorName:(NSString *)color {
        // Create a layer holding the image that will be animated
        UIImage *animationImage = [UIImage imageNamed:imgName];
        CALayer *overlayLayer1 = [CALayer layer];
        [overlayLayer1 setContents:(id)[animationImage CGImage]];
        overlayLayer1.frame = CGRectMake(videoSize.width / 2 - 64, videoSize.height / 2 + 200, 128, 128);
        [overlayLayer1 setMasksToBounds:YES];

        CABasicAnimation *animation = [CABasicAnimation animationWithKeyPath:@"transform.rotation"];
        animation.duration = 2.0;
        animation.repeatCount = 5;
        animation.autoreverses = YES;
        // rotate from 0 to 360 degrees
        animation.fromValue = [NSNumber numberWithFloat:0.0];
        animation.toValue = [NSNumber numberWithFloat:(2.0 * M_PI)];
        // The begin time must be set, otherwise the animation will not show up in the exported video
        animation.beginTime = AVCoreAnimationBeginTimeAtZero;
        [overlayLayer1 addAnimation:animation forKey:@"rotation"];
        return overlayLayer1;
    }
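This only builds the animated layer; to get it into the exported video it is added to the same parentLayer used in exportWillBegin, above videoLayer. A minimal sketch under the same property names as above (the image name @"star.png" is just a placeholder):

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
    videoLayer.frame = parentLayer.frame;

    // The overlay is added after videoLayer, so the animated image spins on top of the video
    CALayer *overlayLayer = [self watermarkLayerForSize:self.videoComposition.renderSize withImageNamed:@"star.png" withColorName:nil];
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];

    self.videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];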

Section 3: Adding Audio to a Video

1. Obtain the video and audio assets.

2. Create an AVMutableComposition.

3. Add the video track to the AVMutableComposition, setting its time range and insertion point.

4. Add the original audio track to the AVMutableComposition, setting its time range and insertion point.

5. Add the extra audio track to the AVMutableComposition, setting its time range, insertion point, and mix parameters.

    - (void)performWithAsset:(AVAsset*)asset withImageNamed:(NSString*)imgName withMusicName:(NSString *)musicName
    {
        AVAssetTrack *assetVideoTrack = nil;
        AVAssetTrack *assetAudioTrack = nil;
        // Check if the asset contains video and audio tracks
        // Obtain the video and audio tracks
        if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
            assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
        }
        if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
            assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
        }
        NSError *error = nil;

        // Step 1: load the background music from the bundle
        NSArray *Arr = [musicName componentsSeparatedByString:@"."];
        NSString *audioURL = [[NSBundle mainBundle] pathForResource:[Arr firstObject] ofType:[Arr lastObject]];
        AVAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioURL] options:nil];
        AVAssetTrack *newAudioTrack = [audioAsset tracksWithMediaType:AVMediaTypeAudio][0];

        // Step 2: create the AVMutableComposition
        if (!self.mutableComposition) {
            // Check whether a composition has already been created, i.e. some other tool has already been applied
            self.mutableComposition = [AVMutableComposition composition];
            // Add tracks to the composition from the input video asset
            if (assetVideoTrack != nil) {
                // Add the video track, setting its time range and insertion point
                AVMutableCompositionTrack *compositionVideoTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
                [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetVideoTrack atTime:kCMTimeZero error:&error];
            }
            if (assetAudioTrack != nil) {
                // Add the original audio track, setting its time range and insertion point
                AVMutableCompositionTrack *compositionAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
                [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:assetAudioTrack atTime:kCMTimeZero error:&error];
            }
        }

        // Step 3: add the background music track to the composition
        AVMutableCompositionTrack *customAudioTrack = [self.mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [customAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.mutableComposition duration]) ofTrack:newAudioTrack atTime:kCMTimeZero error:&error];

        // Step 4: set up the mix parameters for the added track so it blends with the video's original audio
        // (here the music is ramped from full volume down to silence over the whole duration)
        AVMutableAudioMixInputParameters *mixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:customAudioTrack];
        [mixParameters setVolumeRampFromStartVolume:1 toEndVolume:0 timeRange:CMTimeRangeMake(kCMTimeZero, self.mutableComposition.duration)];
        self.mutableAudioMix = [AVMutableAudioMix audioMix];
        self.mutableAudioMix.inputParameters = @[mixParameters];

        // Step 5: notify the interested view controller; after this you can export directly,
        // since the music is now attached to the composition
        [[NSNotificationCenter defaultCenter] postNotificationName:AVSEEditCommandCompletionNotification object:self];
    }
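The ramp above fades the music from full volume down to silence over the whole clip. If you would rather keep the music at a fixed lower level and just duck the original audio a little, a sketch along these lines also works (the 0.3 / 0.8 volumes are arbitrary values chosen here, not from the original post):

    // Background music at a constant, lower volume
    AVMutableAudioMixInputParameters *musicParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:customAudioTrack];
    [musicParameters setVolume:0.3 atTime:kCMTimeZero];

    // The first audio track of the composition is the video's original audio (it was added before the music track)
    AVMutableCompositionTrack *originalAudioTrack = [self.mutableComposition tracksWithMediaType:AVMediaTypeAudio][0];
    AVMutableAudioMixInputParameters *originalParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:originalAudioTrack];
    [originalParameters setVolume:0.8 atTime:kCMTimeZero];

    self.mutableAudioMix = [AVMutableAudioMix audioMix];
    self.mutableAudioMix.inputParameters = @[musicParameters, originalParameters];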

One thing to note: if the video is already being displayed in your UI, you have to refresh the player after the music is added. I use AVPlayer for playback here.

    // Set up the video player view
    - (void)video:(NSString *)videoPath {
        NSURL *url = [NSURL fileURLWithPath:videoPath];
        self.player = [AVPlayer playerWithURL:url];
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        playerLayer.bounds = self.photo.bounds;
        playerLayer.position = CGPointMake(self.photo.bounds.size.width / 2, self.photo.bounds.size.height / 2);
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // video fill mode
        [self.photo.layer addSublayer:playerLayer];
        [self.player play];
        [self reloadPlayerViewWithMusic];
    }

    - (void)playbackFinished:(NSNotification *)n {
        // The notification delivers the AVPlayerItem that finished playing
        AVPlayerItem *playerItem = [n object];
        // Key lines: seek back to the beginning, then play again
        [playerItem seekToTime:kCMTimeZero];
        [self.player play];
        NSLog(@"Replaying");
    }
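For playbackFinished: to fire, the controller has to be registered for AVPlayerItemDidPlayToEndTimeNotification. That registration is not shown in the post, but it typically looks like this (added, say, right after the player item is created):

    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playbackFinished:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:self.player.currentItem];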

    // Refresh the video player after the music has been added
    - (void)reloadPlayerViewWithMusic {
        // self.composition / self.videoComposition / self.audioMix are assumed to have been
        // copied over from the edit command when the completion notification fired
        AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
        // Only swap in the new item if a video composition and an audio mix exist
        if (self.videoComposition && self.audioMix) {
            playerItem.videoComposition = self.videoComposition;
            playerItem.audioMix = self.audioMix;
            [self.player replaceCurrentItemWithPlayerItem:playerItem];
        }
    }


Section 5: Exporting the Video

1. Create the output path.

2. Create an AVAssetExportSession from the AVMutableComposition.

3. Set the export session's AVMutableVideoComposition, AVMutableAudioMix, output URL, and output file type.

4. Export asynchronously and handle the result.

    - (void)performWithAsset:(AVAsset*)asset withImageNamed:(NSString*)imgName withColorName:(NSString*)color withMusicName:(NSString *)musicName with:(CGRect)photoSize {
        // Step 1: build the output path
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *outputURL = [documentsDirectory stringByAppendingPathComponent:
                               [NSString stringWithFormat:@"FinalVideo-%d.m4v", arc4random() % 1000]];

        // Step 2: create the export session
        self.exportSession = [[AVAssetExportSession alloc] initWithAsset:self.mutableComposition presetName:AVAssetExportPresetHighestQuality];
        // Video composition
        self.exportSession.videoComposition = self.mutableVideoComposition;
        // Audio mix
        self.exportSession.audioMix = self.mutableAudioMix;
        self.exportSession.shouldOptimizeForNetworkUse = YES;
        self.exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
        // Output file type (AVFileTypeQuickTimeMovie is a .mov container; use AVFileTypeAppleM4V if you actually want an .m4v file)
        self.exportSession.outputFileType = AVFileTypeQuickTimeMovie;

        [self.exportSession exportAsynchronouslyWithCompletionHandler:^(void){
            switch (self.exportSession.status) {
                case AVAssetExportSessionStatusCompleted: {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        // Export finished: save the file to the photo library
                        [self writeVideoToPhotoLibrary:self.exportSession.outputURL];
                    });
                    // Step 3: notify the interested view controller
                    [[NSNotificationCenter defaultCenter] postNotificationName:AVSEExportCommandCompletionNotification
                                                                        object:self];
                }
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Failed: %@", self.exportSession.error);
                    [[NSNotificationCenter defaultCenter] postNotificationName:@"ExportCommandFaild" object:nil];
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Canceled: %@", self.exportSession.error);
                    break;
                default:
                    break;
            }
        }];
    }

    - (void)writeVideoToPhotoLibrary:(NSURL *)url
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        // Save the video into a named album.
        // Note: saveVideo:toAlbum:completion:failure: is not part of the stock ALAssetsLibrary API;
        // it comes from a custom-album category (e.g. ALAssetsLibrary+CustomPhotoAlbum) included in the project.
        [library saveVideo:url toAlbum:PHOTO_ALBUM_NAME completion:^(NSURL *assetURL, NSError *error) {
        } failure:^(NSError *error) {
        }];
    }
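If a named album is not required, the stock ALAssetsLibrary API is sufficient. A minimal sketch using only calls from the framework itself:

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url]) {
        [library writeVideoAtPathToSavedPhotosAlbum:url
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Saving to the photo library failed: %@", error);
            }
        }];
    }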
