iOS: Recording Video, and Merging Video with Audio

Author: PersonChen_QJ | Published 2018-06-11 16:10

    I. Introduction

    • Project requirement: record a silent short video, then record an audio track over it.

    II. Overall Steps

      1. Record a short video without sound (sound can be made optional)
      2. Play back the recorded video
      3. Record audio, with the recording length equal to the video length
      4. Merge the audio into the video. (If the original video has sound, decide whether to keep it, i.e. the added audio acts as background music.)

    III. Recording Video

    Steps

      1. Create the capture session
      2. Set up the video input
      3. Set up the audio input
      4. Configure the output. Video and audio data are merged into a single output, though the delegate methods can still access video or audio data separately. Give AVCaptureMovieFileOutput an output path; once recording starts, data is written to that path
      5. Add the video preview layer
      6. Start the session to begin capturing. No data is written yet; writing starts when the user taps record
    Step 1: Create the capture session
    self.session = [[AVCaptureSession alloc] init];
    if ([_session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        // Set the resolution
        _session.sessionPreset = AVCaptureSessionPreset640x480;
    }
    
    Step 2: Set up the video input
    - (void)setUpVideo
    {
        // 1.1 Get the video input device (the back camera)
        AVCaptureDevice *videoCaptureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
        
        // Video HDR (high dynamic range)
        // videoCaptureDevice.videoHDREnabled = YES;
        // Set the minimum frame duration (caps the maximum frame rate)
        // videoCaptureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 60);
        // 1.2 Create the video input
        NSError *error = nil;
        self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoCaptureDevice error:&error];
        // 1.3 Add the video input to the session
        if ([self.session canAddInput:self.videoInput]) {
            [self.session addInput:self.videoInput];
        }
    }
    
    Step 3: Set up the audio input
        // 2.1 Get the audio input device (the microphone)
        AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        NSError *error = nil;
        // 2.2 Create the audio input
        self.audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
        // 2.3 Add the audio input to the session
        if ([self.session canAddInput:self.audioInput]) {
            [self.session addInput:self.audioInput];
        }
    
    Step 4: Configure the output
    - (void)setUpFileOut
    {
        // 3.1 Initialize the movie file output, which produces the recorded data
        self.FileOutput = [[AVCaptureMovieFileOutput alloc] init];
        
        // 3.2 Configure the output's connection
        AVCaptureConnection *captureConnection = [self.FileOutput connectionWithMediaType:AVMediaTypeVideo];
        // Video stabilization was introduced with iOS 6 and the iPhone 4S. The iPhone 6 added a stronger,
        // smoother mode called cinematic video stabilization, and the related API changed accordingly
        // (at the time, this was reflected only in the headers, not in the documentation).
        // Stabilization is configured on the AVCaptureConnection, not on the capture device, and not every
        // device format supports every stabilization mode, so check for support before enabling it:
        if ([captureConnection isVideoStabilizationSupported]) {
            captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
        // Keep the video orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.previewlayer connection].videoOrientation;
        
        // 3.3 Add the output to the session
        if ([_session canAddOutput:_FileOutput]) {
            [_session addOutput:_FileOutput];
        }
    }
    
    Step 5: Add the video preview layer
    - (void)setUpPreviewLayerWithType:(ATVideoViewType)type
    {
        CGRect rect = CGRectZero;
        switch (type) {
            case Type1X1:
                rect = CGRectMake(0, 0, kScreenWidth, kScreenWidth);
                break;
            case Type4X3:
                rect = CGRectMake(0, 0, kScreenWidth, kScreenWidth * 4 / 3);
                break;
            case TypeFullScreen:
            default:
                rect = [UIScreen mainScreen].bounds;
                break;
        }
        self.previewlayer.frame = rect;
        [_superView.layer insertSublayer:self.previewlayer atIndex:0];
    }
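    The method above assumes `self.previewlayer` already exists; the article never shows its creation. A minimal lazy getter, assuming the `session` property created in Step 1, might look like this:

    ```objectivec
    // Hypothetical lazy getter for the preview layer; `session` is the
    // AVCaptureSession created in Step 1.
    - (AVCaptureVideoPreviewLayer *)previewlayer
    {
        if (!_previewlayer) {
            _previewlayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
            // Fill the layer while preserving the aspect ratio (crops if needed)
            _previewlayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        }
        return _previewlayer;
    }
    ```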
    
    Step 6: Start capturing data
    [self.session startRunning];
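    Starting the session only begins capturing; file writing starts when the user taps record, which the article doesn't show. With the AVCaptureMovieFileOutput from Step 4 it would plausibly look like the sketch below (the output path and the delegate conformance are assumptions):

    ```objectivec
    // Begin writing to a file when the user taps record. The class is assumed
    // to conform to AVCaptureFileOutputRecordingDelegate.
    - (void)startRecordToFile
    {
        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"];
        [self.FileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:path]
                                     recordingDelegate:self];
    }

    - (void)stopRecordToFile
    {
        [self.FileOutput stopRecording];
    }

    // Delegate callback: the movie file has been fully written (or failed)
    - (void)captureOutput:(AVCaptureFileOutput *)output
    didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
          fromConnections:(NSArray *)connections
                    error:(NSError *)error
    {
        NSLog(@"Finished recording to %@, error: %@", outputFileURL, error);
    }
    ```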
    

    IV. Playing the Video

    Steps

      1. Create the player
      2. Set the playback file path
      3. Observe playback notifications
    Step 1: Create the player
       self.videoPlayer = [[MPMoviePlayerController alloc] init];
        [self.videoPlayer.view setFrame:self.view.bounds];
        self.videoPlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:self.videoPlayer.view];
        [self.videoPlayer prepareToPlay];
        self.videoPlayer.controlStyle = MPMovieControlStyleNone;
        self.videoPlayer.shouldAutoplay = YES;
        self.videoPlayer.repeatMode = MPMovieRepeatModeOne;
    
    Step 2: Set the playback file path
       self.videoPlayer.contentURL = self.videoUrl;
        [self.videoPlayer play];
    
    Step 3: Observe playback notifications
       // Playback finished
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(captureFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:nil];
        // Playback state changed
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(stateChanged) name:MPMoviePlayerPlaybackStateDidChangeNotification object:nil];
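    The selectors registered above are never defined in the article. Hypothetical handlers, assuming the `videoPlayer` property from Step 1, might look like this:

    ```objectivec
    // Hypothetical handler for the playback-finished observer
    - (void)captureFinished:(NSNotification *)notification
    {
        // With repeatMode set to MPMovieRepeatModeOne the player loops,
        // so this mainly matters once looping is turned off.
        NSLog(@"Playback finished: %@", notification.userInfo);
    }

    // Hypothetical handler for the playback-state observer
    - (void)stateChanged
    {
        switch (self.videoPlayer.playbackState) {
            case MPMoviePlaybackStatePlaying:
                NSLog(@"playing");
                break;
            case MPMoviePlaybackStatePaused:
                NSLog(@"paused");
                break;
            case MPMoviePlaybackStateStopped:
                NSLog(@"stopped");
                break;
            default:
                break;
        }
    }
    ```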
    

    V. Recording Audio

    Steps

      1. Configure the audio session
      2. Set the recording parameters (audio format, quality, etc.)
      3. Initialize the recorder
      4. Start recording
    Step 1: Configure the audio session
        AVAudioSession *session = [AVAudioSession sharedInstance];
        NSError *sessionError;
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
        
        if (sessionError == nil) {
            [session setActive:YES error:nil];
        }
        self.session = session;
    
    Step 2: Set the recording parameters (audio format, quality, etc.)
    // Recording settings
        NSDictionary *recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys:
                                       // Sample rate: 8000/11025/22050/44100/96000 (affects audio quality)
                                       [NSNumber numberWithFloat:8000.0], AVSampleRateKey,
                                       // Audio format
                                       [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                       // Bit depth: 8, 16, 24, or 32; the default is 16
                                       [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                       // Number of channels: 1 or 2
                                       [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                       // Encoder quality
                                       [NSNumber numberWithInt:AVAudioQualityHigh], AVEncoderAudioQualityKey,
                                       nil];
    
    Step 3: Initialize the recorder
         _recorder = [[AVAudioRecorder alloc] initWithURL:self.recordFileUrl settings:recordSetting error:nil];
        if (_recorder) {
            _recorder.meteringEnabled = YES;
            [_recorder prepareToRecord];
            [_recorder record];
            // Stop automatically after 60 seconds
            dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(60 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                [self stopRecord];
            });
        }
    
    Step 4: Start recording
            [_recorder prepareToRecord];
            [_recorder record];
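    The 60-second timer in Step 3 calls a `stopRecord` method that the article never defines. A plausible sketch, assuming nothing beyond the `_recorder` ivar already in use:

    ```objectivec
    // Hypothetical stopRecord implementation: stop the recorder and
    // deactivate the audio session so other apps can resume audio.
    - (void)stopRecord
    {
        if ([_recorder isRecording]) {
            [_recorder stop];
        }
        [[AVAudioSession sharedInstance] setActive:NO
                                      withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                            error:nil];
    }
    ```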
    

    VI. Merging Video and Audio

     Parameters:
     @param assetURL path of the original video
     @param BGMPath path of the background music
     @param needOriginalVoice whether to keep the original video's audio
     @param videoVolume volume of the original video's audio
     @param BGMVolume volume of the background music
     @param completionHandle callback invoked when merging finishes
    
    + (void)editVideoSynthesizeVideoPath:(NSURL *)assetURL BGMPath:(NSURL *)BGMPath needOriginalVoice:(BOOL)needOriginalVoice videoVolume:(CGFloat)videoVolume BGMVolume:(CGFloat)BGMVolume completion:(void (^)(NSURL *outputPath, BOOL isSucceed))completionHandle {
        //    Source assets
        AVAsset *asset = [AVAsset assetWithURL:assetURL];
        AVAsset *audioAsset = [AVAsset assetWithURL:BGMPath];
        
        CMTime duration = asset.duration;
        CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, duration);
        
        //    Extract the tracks
        AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; // video track
        AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]; // background-music track
        
        //    Create the editing composition
        AVMutableComposition *composition = [[AVMutableComposition alloc] init];
        
        //    Insert the video track into the composition
        AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoCompositionTrack insertTimeRange:video_timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
        
        //    Insert the background-music track into the composition
        AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioCompositionTrack insertTimeRange:video_timeRange ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
        //    audioCompositionTrack.preferredVolume = 0;
        
        //    Optionally keep the original video's audio
        AVMutableCompositionTrack *originalAudioCompositionTrack = nil;
        if (needOriginalVoice) {
            AVAssetTrack *originalAudioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio]objectAtIndex:0];
            originalAudioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [originalAudioCompositionTrack insertTimeRange:video_timeRange ofTrack:originalAudioAssetTrack atTime:kCMTimeZero error:nil];
        }
        
        //    Create the export session
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
        
        
        //    Set the output path
        NSURL *outputPath = [self exporterPath];
        exporter.outputURL = outputPath;
        exporter.outputFileType = AVFileTypeQuickTimeMovie; // output container format
        exporter.shouldOptimizeForNetworkUse = YES;
        
        //    Volume control
        exporter.audioMix = [self buildAudioMixWithVideoTrack:originalAudioCompositionTrack VideoVolume:videoVolume BGMTrack:audioCompositionTrack BGMVolume:BGMVolume atTime:kCMTimeZero];
        
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed: {
                    NSLog(@"Export failed: %@", [[exporter error] description]);
                    completionHandle(outputPath,NO);
                } break;
                case AVAssetExportSessionStatusCancelled: {
                    completionHandle(outputPath,NO);
                } break;
                case AVAssetExportSessionStatusCompleted: {
                    completionHandle(outputPath,YES);
                } break;
                default: {
                    completionHandle(outputPath,NO);
                } break;
            }
        }];
    }
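    The method above relies on two helpers the article doesn't show: `exporterPath` and `buildAudioMixWithVideoTrack:VideoVolume:BGMTrack:BGMVolume:atTime:`. The sketches below are assumptions about what they might contain: a unique output URL in the temporary directory, and an AVMutableAudioMix that applies the per-track volumes via AVMutableAudioMixInputParameters.

    ```objectivec
    // Hypothetical helper: a unique .mov output URL in the temporary directory.
    + (NSURL *)exporterPath
    {
        NSString *name = [NSString stringWithFormat:@"merged-%@.mov", [[NSUUID UUID] UUIDString]];
        return [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:name]];
    }

    // Hypothetical helper: build an AVAudioMix that applies the given volumes
    // to the original-audio track (may be nil) and the background-music track.
    + (AVAudioMix *)buildAudioMixWithVideoTrack:(AVMutableCompositionTrack *)videoAudioTrack
                                    VideoVolume:(CGFloat)videoVolume
                                       BGMTrack:(AVMutableCompositionTrack *)BGMTrack
                                      BGMVolume:(CGFloat)BGMVolume
                                         atTime:(CMTime)time
    {
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        NSMutableArray *params = [NSMutableArray array];
        
        if (videoAudioTrack) {
            AVMutableAudioMixInputParameters *videoParams =
                [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:videoAudioTrack];
            [videoParams setVolume:videoVolume atTime:time];
            [params addObject:videoParams];
        }
        
        AVMutableAudioMixInputParameters *bgmParams =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:BGMTrack];
        [bgmParams setVolume:BGMVolume atTime:time];
        [params addObject:bgmParams];
        
        audioMix.inputParameters = params;
        return audioMix;
    }
    ```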
    

    VII. Summary

    • Get familiar with the relevant AVFoundation APIs and their properties.
    • Demo link

      Article link: https://www.haomeiwen.com/subject/qjdheftx.html