
iOS Live Streaming: Video Capture

Author: yuandiLiao | Published 2017-03-09 16:36

    1. Basic capture classes

    1. AVFoundation: the framework whose main audio/video capture classes are listed below.

    (Figure: overview of the AVFoundation capture classes)

    2. AVCaptureDevice: represents a hardware device such as the microphone or camera; used to initialize the capture device objects (both audio and video capture devices).

    3. AVCaptureDeviceInput: device input object that manages the input data captured from a device (can be initialized for either audio or video).

    4. AVCaptureOutput: abstract base class for output objects that receive the captured data.

    5. AVCaptureVideoDataOutput: subclass of AVCaptureOutput; the video output object.

    6. AVCaptureAudioDataOutput: subclass of AVCaptureOutput; the audio output object.

    7. AVCaptureConnection: once an input and an output have been added to an AVCaptureSession, the session establishes connections between them, and the connection object can be obtained from the AVCaptureOutput. (Connections to the audio and video outputs make it possible to tell the two streams apart.)

    8. AVCaptureVideoPreviewLayer: preview layer for video capture; the layer that displays what is being recorded.

    9. AVCaptureSession: coordinates audio/video capture input and output; the management object that connects capture to data output.
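    Before any of these classes can deliver data, the app also needs camera and microphone permission. A minimal sketch of requesting camera access (on iOS 10 and later, the corresponding usage-description keys must also be present in Info.plist):

    ```objc
    #import <AVFoundation/AVFoundation.h>

    // Request camera access before configuring the session.
    // (NSCameraUsageDescription / NSMicrophoneUsageDescription are
    // required in Info.plist on iOS 10 and later.)
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL granted) {
        if (granted) {
            // Safe to build the AVCaptureSession here
        }
    }];
    ```

    The same call with AVMediaTypeAudio covers the microphone.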

    2. Steps for capturing audio and video (see the official documentation; further reading: http://www.jianshu.com/p/c71bfda055fa)

    1. Create an AVCaptureSession object.

    2. Get the AVCaptureDevice recording devices: the camera for video and the microphone for audio. Note that a device object does not itself provide input data; it is only used to configure the hardware.

    3. From each hardware device (AVCaptureDevice), create an input object (AVCaptureDeviceInput), which manages the data input.

    4. Create a video data output object (AVCaptureVideoDataOutput) and set its sample buffer delegate (setSampleBufferDelegate:queue:) to receive the captured video data.

    5. Create an audio data output object (AVCaptureAudioDataOutput) and set its sample buffer delegate (setSampleBufferDelegate:queue:) to receive the captured audio data.

    6. Add the input objects (AVCaptureDeviceInput) and output objects (AVCaptureOutput) to the session (AVCaptureSession); the session automatically connects the audio input to the audio output and the video input to the video output.

    7. Create a preview layer (AVCaptureVideoPreviewLayer) with the session and add it to the display container's layer.

    8. Start the AVCaptureSession; data only flows from inputs to outputs once the session is running.
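    The session, device, input, and output properties used in the code below are referenced but never created there; one common approach is lazy getters. A sketch for the video side (property names match section 3; the audio counterparts are analogous, and error handling is kept minimal):

    ```objc
    - (AVCaptureSession *)captureSession {
        if (!_captureSession) {
            _captureSession = [[AVCaptureSession alloc] init];
        }
        return _captureSession;
    }

    - (AVCaptureDevice *)videoDevice {
        if (!_videoDevice) {
            // Default to the back camera (see getVideoDevice: below)
            _videoDevice = [self getVideoDevice:AVCaptureDevicePositionBack];
        }
        return _videoDevice;
    }

    - (AVCaptureDeviceInput *)videoDeviceInput {
        if (!_videoDeviceInput) {
            NSError *error = nil;
            _videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.videoDevice
                                                                       error:&error];
        }
        return _videoDeviceInput;
    }

    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        }
        return _videoOutput;
    }
    ```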

    3. The code

    // Capture session
    @property (nonatomic,strong)AVCaptureSession *captureSession;
    // Camera device
    @property (nonatomic,strong)AVCaptureDevice *videoDevice;
    // Microphone device
    @property (nonatomic,strong)AVCaptureDevice *audioDevice;
    // Video input object
    @property (nonatomic,strong)AVCaptureDeviceInput *videoDeviceInput;
    // Audio input object
    @property (nonatomic,strong)AVCaptureDeviceInput *audioDeviceInput;
    // Video output object
    @property (nonatomic,strong)AVCaptureVideoDataOutput *videoOutput;
    // Audio output object
    @property (nonatomic,strong)AVCaptureAudioDataOutput *audioOutput;
    // Connection between the video input and output
    @property (nonatomic,strong)AVCaptureConnection *videoConnection;
    
    
    -(void)captureVideoAndAudio
    {
        // Begin configuration
        [self.captureSession beginConfiguration];
        // Add the video and audio inputs to the session
        if ([self.captureSession canAddInput:self.videoDeviceInput]) {
            [self.captureSession addInput:self.videoDeviceInput];
        }
        if ([self.captureSession canAddInput:self.audioDeviceInput]) {
            [self.captureSession addInput:self.audioDeviceInput];
        }
        
        // Set the delegate to receive the video output data.
        // A serial queue is required so frames are delivered in order.
        dispatch_queue_t videoOutputQueue = dispatch_queue_create("videoOutputQueue", DISPATCH_QUEUE_SERIAL);
        [self.videoOutput setSampleBufferDelegate:self queue:videoOutputQueue];
        if ([self.captureSession canAddOutput:self.videoOutput]) {
            // Add the video output to the session
            [self.captureSession addOutput:self.videoOutput];
        }
        dispatch_queue_t audioOutputQueue = dispatch_queue_create("audioOutputQueue", DISPATCH_QUEUE_SERIAL);
        [self.audioOutput setSampleBufferDelegate:self queue:audioOutputQueue];
        if ([self.captureSession canAddOutput:self.audioOutput]) {
            [self.captureSession addOutput:self.audioOutput];
        }
        // Get the video connection, used later to tell video data from audio data
        self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
        
        // Show the preview layer
        AVCaptureVideoPreviewLayer *capturePreViewLayer = [[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession];
        capturePreViewLayer.frame = self.view.bounds;
        [self.view.layer insertSublayer:capturePreViewLayer atIndex:0];
        // Commit the configuration
        [self.captureSession commitConfiguration];
        // Start the session; data flows only while it is running
        // (startRunning blocks, so consider calling it off the main thread)
        [self.captureSession startRunning];
        
    }
    
    
    // Get the camera facing the given direction
    - (AVCaptureDevice *)getVideoDevice:(AVCaptureDevicePosition)position
    {
        // Note: devicesWithMediaType: is deprecated since iOS 10;
        // AVCaptureDeviceDiscoverySession is the modern replacement.
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices) {
            if (device.position == position) {
                return device;
            }
        }
        return nil;
    }
    
    
    -(void)changeCameraPosition
    {
        // Work out the position to switch to
        AVCaptureDevicePosition currentCameraPosition = [self.videoDevice position];
        
        if (currentCameraPosition == AVCaptureDevicePositionBack)
        {
            currentCameraPosition = AVCaptureDevicePositionFront;
        }
        else
        {
            currentCameraPosition = AVCaptureDevicePositionBack;
        }
    
        // Get the camera device for the new position
        AVCaptureDevice *videoDevice = [self getVideoDevice:currentCameraPosition];
        // Rebuild the video input
        NSError *error = nil;
        AVCaptureDeviceInput *videoDeviceInput = [[AVCaptureDeviceInput alloc]initWithDevice:videoDevice error:&error];
        if (videoDeviceInput && !error) {
            [_captureSession beginConfiguration];
            [self.captureSession removeInput:self.videoDeviceInput];
            if ([self.captureSession canAddInput:videoDeviceInput]) {
                [self.captureSession addInput:videoDeviceInput];
                self.videoDeviceInput = videoDeviceInput;
                // Re-fetch the connection: it changes when the input changes
                self.videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
                self.videoDevice = videoDevice;
            }else{
                // Roll back to the previous input
                [self.captureSession addInput:self.videoDeviceInput];
            }
            [self.captureSession commitConfiguration];
    
        }
    }
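    The delegate method that actually receives the captured data is not shown above. Both the video and the audio output deliver to the same callback, which is why self.videoConnection is kept: comparing connections tells the two streams apart. A sketch, assuming the class declares conformance to AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate:

    ```objc
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (connection == self.videoConnection) {
            // Video frame: e.g. hand the pixel buffer to an encoder
            // CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        } else {
            // Audio samples
        }
        // Note: this is called on the serial queues created above,
        // not on the main thread.
    }
    ```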
    
    

    Finally, the demo project is available at https://github.com/liaoYuanDi/LYDLiveDemo
