AVFoundation: Video Transition Effects


Author: MonKey_Money | Published 2020-09-04 13:41

Introduction

As the video timeline advances, one scene often cuts straight to the next with no effect in between. We see this kind of plain cut all the time, but it is not the only way to move between scenes. When the timeline changes noticeably, such as a shift in plot or location, some kind of animated transition is usually placed between the two scenes. Transitions such as fades, dissolves, and wipes are widely used for creative effect.



Key Classes

AVVideoComposition

Gives an overall description of how two or more video tracks should be composited together. It consists of a set of time ranges and instructions that describe the compositing behavior at any point in time within the composed asset.

Besides the information describing how the input video layers are composed, it also provides properties for configuring the video composition's render size, render scale, and frame duration. The video composition configuration determines how the AVComposition is presented by the client object that processes it, such as an AVPlayer or AVAssetImageGenerator.
AVVideoComposition is not a subclass of AVComposition, and the two are not directly related. AVVideoComposition is used to control the compositing behavior of an asset's video tracks during playback, export, or other processing.
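As a minimal sketch of what configuring these properties looks like (the variable asset is assumed to be an AVAsset you have already loaded; the full example later in this article builds the composition's instructions by hand):

    // A minimal sketch: start from a video composition whose defaults are derived
    // from an existing asset, then adjust its render properties.
    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
    videoComposition.renderSize = CGSizeMake(1280.0f, 720.0f); // output pixel dimensions
    videoComposition.frameDuration = CMTimeMake(1, 30);        // one frame every 1/30 s
    videoComposition.renderScale = 1.0f;                       // no additional scaling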

AVVideoCompositionInstruction

An AVVideoComposition is made up of a set of instructions in the form of AVVideoCompositionInstruction objects. The most important piece of data each instruction provides is the time range within the composition's timeline during which a particular compositing arrangement applies. The compositing to perform over that range is defined by the instruction's layerInstructions collection.

AVVideoCompositionLayerInstruction

Used to define the opacity, transform, and crop effects applied to a given video track.

It provides methods for setting these values at a specific point in time or ramping them over a time range. Ramping these values over a period of time lets you create dynamic transition effects such as dissolves and fades.
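For example, a dissolve can be expressed as an opacity ramp on the outgoing clip's layer instruction. A sketch, where fromLayerInstruction and transitionRange stand in for the layer instruction and transition time range built in the full example below:

    // Sketch of a cross-dissolve: fade the outgoing layer from fully opaque to
    // fully transparent across the transition's time range.
    [fromLayerInstruction setOpacityRampFromStartOpacity:1.0
                                            toEndOpacity:0.0
                                               timeRange:transitionRange];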

As with all of AVFoundation's media-editing classes, the video composition API comes in immutable and mutable forms. The immutable superclass forms are what you work with through client objects such as AVAssetExportSession, but when building your own video compositions you should use the mutable subclasses.

AVVideoComposition is not directly related to AVComposition. Instead, these objects are associated with clients such as AVPlayerItem, which use them when playing back a composition or performing other processing.

Not coupling the AVComposition tightly to its output behavior gives you more flexibility in deciding how that behavior is applied when the video is played, exported, or otherwise processed.

Example

To implement video transitions, we need to lay the clips out across two video tracks.

Video Layout

We can lay out the AVAsset resources as shown in the figure below.


[Figure: vedio.png, the clips laid out in a staggered fashion across two video tracks]

Assigning the Assets to Tracks

    self.composition = [AVMutableComposition composition]; // the combined (composed) asset
    AVAsset *firstVideoAsset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"01_nebula" withExtension:@"mp4"]];
    AVAsset *secondVideoAsset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"03_nebula" withExtension:@"mp4"]];
    AVAsset *thirdVideoAsset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"04_quasar" withExtension:@"mp4"]];
    AVAsset *firstAudioAsset = [AVAsset assetWithURL:[[NSBundle mainBundle] URLForResource:@"02 Keep Going" withExtension:@"m4a"]];

    // Video track 1
    AVMutableCompositionTrack *videoTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // Video track 2
    AVMutableCompositionTrack *videoTrack2 = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // Audio track
    AVMutableCompositionTrack *audioTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    NSArray *videoAssets = @[firstVideoAsset, secondVideoAsset, thirdVideoAsset];
    NSArray *videoTracks = @[videoTrack, videoTrack2];
    CMTime cursorTime = kCMTimeZero;

    // Insert the clips, alternating between the two video tracks.
    // After each clip, move the cursor back by 2 seconds so that adjacent
    // clips overlap; the overlap is where the transition will happen.
    for (int i = 0; i < videoAssets.count; i++) {
        NSUInteger trackIndex = i % 2;
        AVMutableCompositionTrack *currentTrack = videoTracks[trackIndex];
        AVAsset *asset = videoAssets[i];
        AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        [currentTrack insertTimeRange:timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
        cursorTime = CMTimeAdd(cursorTime, timeRange.duration);
        cursorTime = CMTimeSubtract(cursorTime, CMTimeMake(2, 1));
    }

    // Insert the audio into the audio track.
    NSError *error;
    AVAssetTrack *assetTrack = [[firstAudioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(26, 1)) ofTrack:assetTrack atTime:kCMTimeZero error:&error];

    // Fade the audio in and out with volume ramps on an audio mix.
    AVMutableAudioMixInputParameters *audioMixInputParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    [audioMixInputParameters setVolumeRampFromStartVolume:0.0 toEndVolume:0.5 timeRange:CMTimeRangeMake(CMTimeMake(0, 1), CMTimeMake(5, 1))];
    [audioMixInputParameters setVolumeRampFromStartVolume:0.5 toEndVolume:0.0 timeRange:CMTimeRangeMake(CMTimeMake(5, 1), CMTimeMake(25, 1))];

    self.myMutableAudioMix = [AVMutableAudioMix audioMix];
    self.myMutableAudioMix.inputParameters = @[audioMixInputParameters];

Computing the Pass-Through and Transition Time Ranges

    // `assets` is the same array of video assets inserted above (firstVideoAsset, etc.).
    CMTime cursorTime = kCMTimeZero;
    CMTime transDuration = CMTimeMake(2, 1); // 2-second transitions, matching the overlap above
    NSMutableArray *passThroughTimeRanges = [NSMutableArray array];
    NSMutableArray *transitionTimeRanges = [NSMutableArray array];
    NSInteger videoCount = assets.count;
    for (int i = 0; i < videoCount; i++) {
        AVAsset *asset = assets[i];
        CMTimeRange timeRange = CMTimeRangeMake(cursorTime, asset.duration);
        // Trim the leading transition off every clip except the first...
        if (i > 0) {
            timeRange.start = CMTimeAdd(timeRange.start, transDuration);
            timeRange.duration = CMTimeSubtract(timeRange.duration, transDuration);
        }
        // ...and the trailing transition off every clip except the last.
        if (i + 1 < videoCount) {
            timeRange.duration = CMTimeSubtract(timeRange.duration, transDuration);
        }
        [passThroughTimeRanges addObject:[NSValue valueWithCMTimeRange:timeRange]];
        cursorTime = CMTimeAdd(cursorTime, asset.duration);
        cursorTime = CMTimeSubtract(cursorTime, transDuration);
        // The 2-second overlap between this clip and the next is a transition range.
        if (i + 1 < videoCount) {
            timeRange = CMTimeRangeMake(cursorTime, transDuration);
            [transitionTimeRanges addObject:[NSValue valueWithCMTimeRange:timeRange]];
        }
    }
    self.passThroughTimeRanges = passThroughTimeRanges;
    self.transitionTimeRanges = transitionTimeRanges;
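To make the arithmetic concrete, assume for illustration that each of the three clips is 10 seconds long and the transition duration is the 2 seconds used above. The pass-through ranges then come out as [0 s, 8 s), [10 s, 16 s), and [18 s, 26 s), and the transition ranges as [8 s, 10 s) and [16 s, 18 s), for a total timeline of 26 seconds, which is why 26 seconds of audio were inserted earlier.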

Creating the AVVideoCompositionInstructions

    NSMutableArray *compositionInstructions = [NSMutableArray array];

    // Look up all of the video tracks in the composition.
    NSArray *tracks = [self.composition tracksWithMediaType:AVMediaTypeVideo];

    for (NSUInteger i = 0; i < self.passThroughTimeRanges.count; i++) {         // 1

        // Calculate the track index to operate upon: 0, 1, 0, 1, etc.
        NSUInteger trackIndex = i % 2;

        AVMutableCompositionTrack *currentTrack = tracks[trackIndex];

        AVMutableVideoCompositionInstruction *instruction =                     // 2
            [AVMutableVideoCompositionInstruction videoCompositionInstruction];

        instruction.timeRange =                                                 // 3
            [self.passThroughTimeRanges[i] CMTimeRangeValue];

        AVMutableVideoCompositionLayerInstruction *layerInstruction =           // 4
            [AVMutableVideoCompositionLayerInstruction
                videoCompositionLayerInstructionWithAssetTrack:currentTrack];

        instruction.layerInstructions = @[layerInstruction];

        [compositionInstructions addObject:instruction];

        if (i < self.transitionTimeRanges.count) {

            AVCompositionTrack *foregroundTrack = tracks[trackIndex];           // 5
            AVCompositionTrack *backgroundTrack = tracks[1 - trackIndex];

            AVMutableVideoCompositionInstruction *instruction =                 // 6
                [AVMutableVideoCompositionInstruction videoCompositionInstruction];

            CMTimeRange timeRange = [self.transitionTimeRanges[i] CMTimeRangeValue];
            instruction.timeRange = timeRange;

            AVMutableVideoCompositionLayerInstruction *fromLayerInstruction =   // 7
                [AVMutableVideoCompositionLayerInstruction
                    videoCompositionLayerInstructionWithAssetTrack:foregroundTrack];

            // Dissolve transition (disabled here): fade the outgoing clip out over the transition range.
//            [fromLayerInstruction setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:timeRange];

            AVMutableVideoCompositionLayerInstruction *toLayerInstruction =
                [AVMutableVideoCompositionLayerInstruction
                    videoCompositionLayerInstructionWithAssetTrack:backgroundTrack];

            if (i % 2 == 0) {
                // Wipe transition: shrink the outgoing clip's crop rectangle to nothing.
                CGRect startRect = CGRectMake(0, 0, 1280.0f, 720.0f);
                CGRect endRect = CGRectMake(0, 720, 1280.0f, 0.0f);
                [fromLayerInstruction setCropRectangleRampFromStartCropRectangle:startRect
                                                             toEndCropRectangle:endRect
                                                                       timeRange:timeRange];
            } else {
                // Push transition: slide the outgoing clip off to the left while
                // the incoming clip slides in from the right.
                CGAffineTransform identityTransform = CGAffineTransformIdentity;

                CGAffineTransform fromDestTransform =
                    CGAffineTransformMakeTranslation(-1280.0f, 0.0);

                CGAffineTransform toStartTransform =
                    CGAffineTransformMakeTranslation(1280.0f, 0.0);

                [fromLayerInstruction setTransformRampFromStartTransform:identityTransform
                                                          toEndTransform:fromDestTransform
                                                               timeRange:timeRange];

                [toLayerInstruction setTransformRampFromStartTransform:toStartTransform
                                                        toEndTransform:identityTransform
                                                             timeRange:timeRange];
            }

            instruction.layerInstructions = @[fromLayerInstruction,             // 8
                                              toLayerInstruction];

            [compositionInstructions addObject:instruction];
        }
    }

Creating the AVMutableVideoComposition

    AVMutableVideoComposition *videoComposition =
        [AVMutableVideoComposition videoComposition];

    videoComposition.instructions = compositionInstructions;
    videoComposition.renderSize = CGSizeMake(1280.0f, 720.0f);
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0f;

A video composition is applied the same way as an audio mix: it only takes effect when assigned to the videoComposition property of an AVAssetExportSession or an AVPlayerItem.
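A minimal sketch of wiring everything together, assuming the composition, videoComposition, and audio mix built above (outputURL is a placeholder you would supply yourself):

    // Playback: apply the transition instructions and the audio fades to a player item.
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.composition];
    playerItem.videoComposition = videoComposition;
    playerItem.audioMix = self.myMutableAudioMix;
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    [player play];

    // Export: the same two properties exist on AVAssetExportSession.
    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:[self.composition copy]
                                          presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition;
    exportSession.audioMix = self.myMutableAudioMix;
    exportSession.outputURL = outputURL;                      // destination file URL
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // Inspect exportSession.status and exportSession.error here.
    }];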
