AVFoundation -> Capturing Audio and Video: Approach and Detailed Steps

Author: 如意神王 | Published 2019-07-14 19:21

    1. Key classes in AVFoundation

    1. AVCaptureSession: the capture session
    2. AVCaptureDevice: a capture device (camera or microphone)
    3. AVCaptureDeviceInput: audio/video input
    4. AVCaptureVideoDataOutput: video data output
    5. AVCaptureAudioDataOutput: audio data output
    6. AVCaptureVideoPreviewLayer: preview layer
    7. AVCaptureConnection: the connection between an input and an output
    8. AVCaptureVideoDataOutputSampleBufferDelegate: video output delegate
    9. AVCaptureAudioDataOutputSampleBufferDelegate: audio output delegate

    2. Capture pipeline overview

    1. Create the capture session (AVCaptureSession)
    2. Configure video
    3. Configure audio
    4. Create the preview layer
    5. Start capturing
    6. Handle the delegate output

    3. Detailed configuration steps

    1. Create the capture session (AVCaptureSession)
        1. Create the AVCaptureSession
        2. Set the session preset (resolution)
    2. Configure video
        1. Create the video capture device and input: AVCaptureDevice -> AVCaptureDeviceInput
        2. Add the video input to the session: AVCaptureDeviceInput -> AVCaptureSession
        3. Create the video data output: AVCaptureVideoDataOutput
        4. Configure the video output
            4.1 Set the output pixel format to YUV420
            4.2 Choose whether to drop late frames: alwaysDiscardsLateVideoFrames
            4.3 Set the sample buffer delegate and a serial queue
            4.4 Add the video output to the session: AVCaptureVideoDataOutput -> AVCaptureSession
        5. Get the video connection: AVCaptureVideoDataOutput -> AVCaptureConnection
        6. Set the video orientation on the connection: videoOrientation
        7. Check whether video stabilization is supported and set preferredVideoStabilizationMode
    3. Configure audio
        1. Create the audio capture device and input: AVCaptureDevice -> AVCaptureDeviceInput
        2. Add the audio input to the session: AVCaptureDeviceInput -> AVCaptureSession
        3. Create the audio data output: AVCaptureAudioDataOutput
            3.1 Set the sample buffer delegate and a serial queue
            3.2 Add the audio output to the session: AVCaptureAudioDataOutput -> AVCaptureSession
        4. Get the audio connection: AVCaptureAudioDataOutput -> AVCaptureConnection
    4. Create the preview layer
        1. Create the preview layer from the session: AVCaptureSession -> AVCaptureVideoPreviewLayer
        2. Set the preview layer's video gravity so the preview fills the screen
    5. Start capturing
        1. AVCaptureSession -> startRunning
    6. Handle the delegate output
        1. The shared audio/video output delegate method:
        • (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    7. Summary
        1. Video
            1. Add the video input: AVCaptureDeviceInput -> AVCaptureSession
            2. Add the video output: AVCaptureVideoDataOutput -> AVCaptureSession
            3. Get the video connection: AVCaptureVideoDataOutput -> AVCaptureConnection
            4. Video frames are delivered to the delegate: AVCaptureConnection -> delegate
        2. Audio
            1. Add the audio input: AVCaptureDeviceInput -> AVCaptureSession
            2. Add the audio output: AVCaptureAudioDataOutput -> AVCaptureSession
            3. Get the audio connection: AVCaptureAudioDataOutput -> AVCaptureConnection
            4. Audio buffers are delivered to the delegate: AVCaptureConnection -> delegate
        3. Queues
            1. An asynchronous serial queue for the session -> AVCaptureSession
            2. An asynchronous serial queue for the video output -> AVCaptureVideoDataOutput
            3. An asynchronous serial queue for the audio output -> AVCaptureAudioDataOutput

    Once the audio and video inputs and outputs have been added to the AVCaptureSession and their connections established, the captured data is delivered to the delegate methods. The session, the video output, and the audio output should each do their work on their own asynchronous serial queue.

    4. Code implementation

    1. Header import and properties

    #import <AVFoundation/AVFoundation.h>
    
    @interface VideoCaptureView ()
    <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
    
    // Capture session
    @property (nonatomic, strong) AVCaptureSession * captureSession;
    
    // Serial queue for the session
    @property (nonatomic, strong) dispatch_queue_t captureSessionQueue;
    
    // Active (video) device input
    @property (nonatomic, strong) AVCaptureDeviceInput * activeDeviceInput;
    
    // Video connection
    @property (nonatomic, strong) AVCaptureConnection * videoConnection;
    
    // Audio connection
    @property (nonatomic, strong) AVCaptureConnection * audioConnection;
    
    // Audio data output
    @property (nonatomic, strong) AVCaptureAudioDataOutput * audioDataOutput;
    
    // Video data output
    @property (nonatomic, strong) AVCaptureVideoDataOutput * videoDataOutput;
    
    2. Create the capture session
    - (void)setupCaptureSession {
        // 1. Create the AVCaptureSession
        AVCaptureSession * session = [[AVCaptureSession alloc] init];
        self.captureSession = session;
    
        // 2. Set the session preset; the preset determines the capture resolution, which in turn drives the width/height used when encoding
        if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
            session.sessionPreset = AVCaptureSessionPresetHigh;
        }
        
        // 3. Create the serial queue for the session
        self.captureSessionQueue = dispatch_queue_create("captureSession queue", DISPATCH_QUEUE_SERIAL);
    }
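
    The comment above points out that the preset determines the frame size the encoder will receive. If the encoder expects a fixed size, a concrete preset can be requested instead of AVCaptureSessionPresetHigh. A minimal sketch, not part of the original post (the 720p preset is just an example):

    if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        // Fixed 1280x720 frames, convenient when the encoder expects a known size
        session.sessionPreset = AVCaptureSessionPreset1280x720;
    } else if ([session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        session.sessionPreset = AVCaptureSessionPresetHigh;
    }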
    

    3. Configure video

    - (void)setupVideo {
        // 1. Create the video capture device: AVCaptureDevice -> AVCaptureDeviceInput
        AVCaptureDevice * device = nil;
        if (@available(iOS 10.0, *)) {
            device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack];
            
        } else {
            device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }
        
        // Create the video input (AVCaptureDeviceInput)
        NSError * error = nil;
        AVCaptureDeviceInput * videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
        if (videoInput == nil) {
            NSLog(@"Failed to create the video AVCaptureDeviceInput: %@", error);
            return;
        }
        
        self.activeDeviceInput = videoInput;
        
        // 2. Add the video input to the AVCaptureSession
        if ([self.captureSession canAddInput:videoInput]) {
            [self.captureSession addInput:videoInput];
        }
        
        // 3. Create the video data output
        AVCaptureVideoDataOutput * videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        self.videoDataOutput = videoDataOutput;
        
        // 4. Set the output pixel format to YUV420 (bi-planar, video range)
        videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        
        // 5. Drop video frames that arrive late
        [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
    
        // 6. Set the sample buffer delegate and its serial queue
        dispatch_queue_t videoQueue = dispatch_queue_create("video output queue", DISPATCH_QUEUE_SERIAL);
        [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
        
        
        // 7. Add the video output to the session
        if ([self.captureSession canAddOutput:videoDataOutput]) {
            [self.captureSession addOutput:videoDataOutput];
        }
        
        // 8. Get the video connection (AVCaptureConnection)
        AVCaptureConnection * connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
        self.videoConnection = connection;
        
        // 9. Set the video orientation
        [connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        
        // 10. Enable video stabilization if supported; it can noticeably improve quality,
        //     and it only comes into play when the footage is actually recorded to a movie file
        if ([connection isVideoStabilizationSupported]) {
            connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
    }
    
    4. Configure audio
    - (void)setupAudio {
        // 1. Create the audio capture device: AVCaptureDevice -> AVCaptureDeviceInput
        AVCaptureDevice * device = nil;
        if (@available(iOS 10.0, *)) {
            device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInMicrophone mediaType:AVMediaTypeAudio position:AVCaptureDevicePositionUnspecified];
        } else {
            device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        }
        
        // Create the audio input (AVCaptureDeviceInput)
        NSError * error = nil;
        AVCaptureDeviceInput * audioInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
        if (audioInput == nil) {
            NSLog(@"Failed to create the audio AVCaptureDeviceInput: %@", error);
            return;
        }
        
        // 2. Add the audio input to the session
        if ([self.captureSession canAddInput:audioInput]) {
            [self.captureSession addInput:audioInput];
        }
        
        // 3. Create the audio data output
        AVCaptureAudioDataOutput * audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
        self.audioDataOutput = audioDataOutput;
        
        // 4. Set the sample buffer delegate and its serial queue
        dispatch_queue_t audioQueue = dispatch_queue_create("audio output queue", DISPATCH_QUEUE_SERIAL);
        [audioDataOutput setSampleBufferDelegate:self queue:audioQueue];
        
        // 5. Add the audio output to the session
        if ([self.captureSession canAddOutput:audioDataOutput]) {
            [self.captureSession addOutput:audioDataOutput];
        }
    
        // 6. Get the audio connection
        AVCaptureConnection * audioConnection = [audioDataOutput connectionWithMediaType:AVMediaTypeAudio];
        self.audioConnection = audioConnection;
    }
    
    5. Set up the preview layer
    // Create the preview layer and attach it to this view's layer
    - (void)setupPreviewLayer {
        AVCaptureVideoPreviewLayer * previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
        previewLayer.frame = self.bounds;
        // Fill the entire layer (aspect-fill, full screen)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.layer addSublayer:previewLayer];
    }
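
    The original post does not show where these setup methods are called. A minimal sketch of wiring them together, assuming the view overrides initWithFrame: (the initializer itself is an assumption; the call order follows the steps above):

    // Hypothetical initializer: run the setup steps in the documented order
    - (instancetype)initWithFrame:(CGRect)frame {
        self = [super initWithFrame:frame];
        if (self) {
            [self setupCaptureSession];   // 1. session, preset and session queue
            [self setupVideo];            // 2. video input, output and connection
            [self setupAudio];            // 3. audio input, output and connection
            [self setupPreviewLayer];     // 4. preview layer
        }
        return self;
    }

    Capture itself is then started separately with startRunningCaptureSession, shown in the next section.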
    

    6. Start and stop capture

    // Start capturing (on the session queue)
    - (void)startRunningCaptureSession {
        dispatch_async(self.captureSessionQueue, ^{
            if (!self.captureSession.isRunning) {
                [self.captureSession startRunning];
            }
        });
    }
    
    // Stop capturing (on the session queue)
    - (void)stopRunningCaptureSession {
        dispatch_async(self.captureSessionQueue, ^{
            if (self.captureSession.isRunning) {
                [self.captureSession stopRunning];
            }
        });
    }
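
    A practical point the original post does not cover: the app needs camera and microphone permission (the NSCameraUsageDescription and NSMicrophoneUsageDescription keys in Info.plist), otherwise no frames are delivered. A minimal sketch of requesting camera access before starting, using AVCaptureDevice's standard authorization API (where exactly this is called is up to the app):

    // Ask for camera access, then start the session only if it was granted
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
        if (granted) {
            [self startRunningCaptureSession];
        } else {
            NSLog(@"camera access was denied");
        }
    }];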
    

    7. Handle the audio/video sample buffers

    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        // Determine whether the buffer came from the audio or the video connection
        // Video
        if ([connection isEqual:self.videoConnection]) {
            NSLog(@"video sampleBuffer %@", sampleBuffer);
        }
        
        // Audio
        if ([connection isEqual:self.audioConnection]) {
            NSLog(@"audio sampleBuffer %@", sampleBuffer);
        }
        
        // Video (equivalent check against the output object)
        if ([output isEqual:self.videoDataOutput]) {
            // Process video frames here (e.g. hand them to an encoder)
        }
        
        // Audio (equivalent check against the output object)
        if ([output isEqual:self.audioDataOutput]) {
            // Process audio buffers here
        }
    }
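
    For video buffers, the CMSampleBuffer wraps a CVPixelBuffer that can be handed to an encoder or an image pipeline. A minimal sketch of pulling it out inside the video branch above (logging the dimensions is only for illustration):

    // Extract the pixel buffer from a video sample buffer
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer != NULL) {
        size_t width  = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        NSLog(@"video frame: %zu x %zu", width, height);
    }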
    
