Video/Audio Recording and Data Capture

Author: Double_Chen | Published 2018-03-01 10:56

Video and audio recording can be implemented with either AVFoundation or GPUImage. Since almost every short-video app today ships with filters, most are built on GPUImage; however, GPUImage has no video editing or concatenation features, and its recording is itself a wrapper around AVFoundation, so learning AVFoundation is still well worth it.

Classes needed to record video with AVFoundation:

  • AVCaptureSession: the manager that coordinates capture

  • AVCaptureVideoPreviewLayer: displays the data captured by the camera

  • AVCaptureInput: input (grants access to the camera and microphone)

  • AVCaptureOutput: output (delivers the video and audio data)

  • AVCaptureMovieFileOutput

  • AVAssetWriter

AVCaptureMovieFileOutput is the simplest: you can wrap up a recording framework very quickly. The drawback is that you cannot access frame data during recording; it suits WeChat-style short videos that are recorded quickly and sent.
AVAssetWriter is more involved, and it apparently cannot be reused: after finishing one video the object must be destroyed and recreated before recording again. Its advantage is real-time access to image frames during recording, which makes it a good fit for live video communication.
GPUImage also records through AVAssetWriter under the hood, but everything is already wrapped up, so it is very convenient to use; on top of that, GPUImageView can display filter effects in real time, which is great.

Video/Audio Data Capture

Declare the properties

@property(nonatomic,strong) AVCaptureSession *session;
@property(nonatomic,strong) AVCaptureVideoPreviewLayer *previewLayer;
@property(nonatomic,strong) AVCaptureDeviceInput *videoInput;
@property(nonatomic,strong) AVCaptureDeviceInput *audioInput;
- (AVCaptureSession *)session {
    if (!_session) {
        _session = [[AVCaptureSession alloc] init];
        /*
         sessionPreset:
         AVCaptureSessionPresetHigh: high quality; the actual result varies by device
         AVCaptureSessionPresetMedium: medium quality, suitable for Wi-Fi sharing; varies by device
         AVCaptureSessionPresetLow: low quality, suitable for 3G sharing; varies by device
         AVCaptureSessionPreset640x480: 640x480, VGA
         AVCaptureSessionPreset1280x720: 1280x720, 720p HD
         AVCaptureSessionPresetPhoto: full-resolution photo; cannot be used for video output
         */
        if ([_session canSetSessionPreset:AVCaptureSessionPresetMedium]) {
            _session.sessionPreset = AVCaptureSessionPresetMedium;  // default to medium quality
        }
    }
    return _session;
}

- (AVCaptureDeviceInput *)videoInput {
    if (!_videoInput) {
        if (_cameraInput == CameraInputBack) {
            _videoInput = self.backCameraInput;
        }else {
            _videoInput = self.frontCameraInput;
        }
    }
    return _videoInput;
}

- (AVCaptureDeviceInput *)audioInput {
    if (!_audioInput) {
        NSError *error = nil;
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        _audioInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];
        if (error) {
            NSLog(@"获取麦克风失败");
        }
    }
    return _audioInput;
}

//Video preview layer
- (AVCaptureVideoPreviewLayer *)previewLayer {
    if (!_previewLayer) {
        _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
        _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
    return _previewLayer;
}

//Back camera input
- (AVCaptureDeviceInput *)backCameraInput {
    if (!_backCameraInput) {
        NSError *error = nil;
        _backCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
        if (error) {
            NSLog(@"获取后置摄像头失败");
        }
    }
    return _backCameraInput;
}

//Front camera input
- (AVCaptureDeviceInput *)frontCameraInput {
    if (!_frontCameraInput) {
        NSError *error = nil;
        _frontCameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
        if (error) {
            NSLog(@"获取前置摄像头失败");
        }
    }
    return _frontCameraInput;
}

//Back camera
- (AVCaptureDevice *)backCamera {
    return [self getCameraWithPosition:AVCaptureDevicePositionBack];
}

//Front camera
- (AVCaptureDevice *)frontCamera {
    return [self getCameraWithPosition:AVCaptureDevicePositionFront];
}

//Return the device at the given position
- (AVCaptureDevice *)getCameraWithPosition:(AVCaptureDevicePosition)position {
    
#if __IPHONE_10_0
    AVCaptureDeviceDiscoverySession *discoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
    NSArray *devices = discoverySession.devices;
#else
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
#endif
    
    for (AVCaptureDevice *device in devices) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
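
Both camera inputs are exposed above, but the article doesn't show switching between them. A common pattern is to swap inputs inside a beginConfiguration/commitConfiguration pair; the sketch below assumes the properties defined above (switchCamera is a hypothetical method name, not the author's code):

//Toggle between the front and back camera inputs (sketch)
- (void)switchCamera {
    AVCaptureDeviceInput *newInput = (self.videoInput == self.backCameraInput) ? self.frontCameraInput : self.backCameraInput;
    [self.session beginConfiguration];
    [self.session removeInput:self.videoInput];
    if ([self.session canAddInput:newInput]) {
        [self.session addInput:newInput];
        _videoInput = newInput;
    } else {
        [self.session addInput:self.videoInput];    // roll back to the previous input
    }
    [self.session commitConfiguration];
}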

Usage

Add whichever inputs you need to the session. The original leaves these conditions as comments; shouldRecordVideo and shouldRecordAudio below are hypothetical flags standing in for your own logic:

    if (self.shouldRecordVideo) {   // hypothetical flag: video wanted
        [self addDeviceInput:self.videoInput];
    }
    if (self.shouldRecordAudio) {   // hypothetical flag: audio wanted
        [self addDeviceInput:self.audioInput];
    }

- (void)addDeviceInput:(AVCaptureInput *)input {
    if ([self.session canAddInput:input]) {
        [self.session addInput:input];
    }
}

The last step: start capturing data, and you have a simple working camera.

[self.session startRunning];
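
For completeness, wiring the preview into a view controller might look like this (a minimal sketch; it assumes the inputs were added as shown above):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Show the camera feed full-screen
    self.previewLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:self.previewLayer];
    // Wire up the inputs and start capturing
    [self addDeviceInput:self.videoInput];
    [self addDeviceInput:self.audioInput];
    [self.session startRunning];
}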

Using AVCaptureMovieFileOutput

With the camera in place, the next step is recording. First, create the AVCaptureMovieFileOutput object:

@property(nonatomic,strong) AVCaptureMovieFileOutput *fileOutput;
@property(nonatomic,strong) AVCaptureConnection *fileConnection;

- (AVCaptureMovieFileOutput *)fileOutput {
    if (!_fileOutput) {
        _fileOutput = [[AVCaptureMovieFileOutput alloc] init];
    }
    return _fileOutput;
}

- (AVCaptureConnection *)fileConnection {
    if (!_fileConnection) {
        _fileConnection = [self.fileOutput connectionWithMediaType:AVMediaTypeVideo];
        // Enable video stabilization if the device supports it; in practice the results are quite good
        if ([_fileConnection isVideoStabilizationSupported]) {
            _fileConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
    }
    return _fileConnection;
}

AVCaptureConnection can be used to adjust the orientation of the preview video, in addition to enabling stabilization.
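
For example, the orientation can be locked to portrait through the same connection; both calls below are standard AVCaptureConnection API:

if ([self.fileConnection isVideoOrientationSupported]) {
    self.fileConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}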

Now for the recording itself. First, conform to the <AVCaptureFileOutputRecordingDelegate> protocol.

Start recording

//Start recording
- (void)startRecording {
    NSLog(@"startRecording");
    
    // Recording can only start while the session is running and no recording is in progress
    if ([self.session isRunning] && self.fileOutput.isRecording == NO) {
        self.recordStatus = RecordStatusRecording;
        unlink([self.fileUrl.path UTF8String]);     // clear any file already at the target path
        [self.fileOutput startRecordingToOutputFileURL:self.fileUrl recordingDelegate:self];
    }else {
        NSAssert([self.session isRunning], @"session must be running");
    }
}

One thing to note: the fileUrl passed above is the path where the recording is saved. Make sure no file already exists at that path, otherwise nothing can be written (which is what the unlink call guards against).

Stop recording

- (void)stopRecording {
    NSLog(@"stopRecording");
    if (self.fileOutput.isRecording == YES) {
        [self.fileOutput stopRecording];
    }
}

Both starting and stopping recording trigger delegate callbacks, which is where you hook in your own handling:

#pragma mark - AVCaptureFileOutputRecordingDelegate
//Called when recording starts
- (void)captureOutput:(AVCaptureFileOutput *)output didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections {
    NSLog(@"didStartRecordingToOutputFileAtURL");
}

//Called when recording finishes. Note: [self.session stopRunning] will also stop the recording
- (void)captureOutput:(AVCaptureFileOutput *)output didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray<AVCaptureConnection *> *)connections error:(nullable NSError *)error {
    NSLog(@"didFinishRecordingToOutputFileAtURL");
}

And that's the complete recording feature. The code is pretty simple, right?

About pausing

I haven't found a framework that truly supports pausing a recording. Even where a pause function exists, it only pauses the video while the recording keeps running underneath. My approach is segment concatenation: every time a recording finishes, the segment is stored temporarily, and when the user taps done, the segments are stitched into one video (a sketch of the stitching step follows below).
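
The article doesn't include the concatenation code itself. One way to do it is AVMutableComposition plus AVAssetExportSession; the sketch below is my own assumption of how that step could look (mergeSegments:toUrl:completion: is a hypothetical helper), not the author's implementation:

//Concatenate recorded segments into a single MP4 (sketch)
- (void)mergeSegments:(NSArray<NSURL *> *)segmentUrls toUrl:(NSURL *)outputUrl completion:(void (^)(NSError *error))completion {
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
    // Append each segment's tracks end-to-end on the composition's timeline
    CMTime cursor = kCMTimeZero;
    for (NSURL *url in segmentUrls) {
        AVAsset *asset = [AVAsset assetWithURL:url];
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        AVAssetTrack *video = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *audio = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        if (video) [videoTrack insertTimeRange:range ofTrack:video atTime:cursor error:nil];
        if (audio) [audioTrack insertTimeRange:range ofTrack:audio atTime:cursor error:nil];
        cursor = CMTimeAdd(cursor, asset.duration);
    }
    
    // Export the stitched composition to a single file
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputUrl;
    exporter.outputFileType = AVFileTypeMPEG4;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (completion) completion(exporter.error);
    }];
}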

Using AVAssetWriter

This class is more involved, but it delivers video frames as they are recorded, which is very convenient for real-time network communication.

To start, conform to the <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate> protocols. They share one callback, which delivers the video and audio frames captured during recording:

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;

Declare the properties

/*
 Device output must be delivered on a single, consistent queue
 */
@property(nonatomic,strong) dispatch_queue_t recordQueue;
@property(nonatomic,strong) AVCaptureVideoDataOutput *videoOutput;
@property(nonatomic,strong) AVCaptureAudioDataOutput *audioOutput;

@property(nonatomic,strong) AVCaptureConnection *fileConnection;

@property(nonatomic,strong) AVAssetWriter *assetWriter;
@property(nonatomic,strong) AVAssetWriterInput *assetWriterVideoInput;
@property(nonatomic,strong) AVAssetWriterInput *assetWriterAudioInput;

@property(nonatomic,strong) NSDictionary *videoWriterSetting;   //video settings
@property(nonatomic,strong) NSDictionary *audioWriterSetting;   //audio settings

This time there is also a dispatch_queue_t: with this class you will inevitably need to decode video frames and convert them to UIImage, which is expensive, so that work should run on its own queue.

Initialization

- (dispatch_queue_t)recordQueue {
    if (!_recordQueue) {
        _recordQueue = dispatch_queue_create("media_writer", DISPATCH_QUEUE_SERIAL);
    }
    return _recordQueue;
}

- (AVCaptureVideoDataOutput *)videoOutput {
    if (!_videoOutput) {
        _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        _videoOutput.alwaysDiscardsLateVideoFrames = YES;   // discard late frames immediately to save memory (the default is YES)
        
        // Pitfall: decoding frames kept failing with CGBitmapContextCreate errors until the pixel format below was set; see https://www.jianshu.com/p/61ca3a917fe5
        NSDictionary *videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)};
        [_videoOutput setVideoSettings:videoSettings];
        [_videoOutput setSampleBufferDelegate:self queue:self.recordQueue];
    }
    return _videoOutput;
}

- (AVCaptureAudioDataOutput *)audioOutput {
    if (!_audioOutput) {
        _audioOutput = [[AVCaptureAudioDataOutput alloc] init];
        [_audioOutput setSampleBufferDelegate:self queue:self.recordQueue];
    }
    return _audioOutput;
}

- (AVCaptureConnection *)fileConnection {
    if (!_fileConnection) {
        _fileConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
        // Enable video stabilization if the device supports it; in practice the results are quite good
        if ([_fileConnection isVideoStabilizationSupported]) {
            _fileConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
    }
    return _fileConnection;
}

- (AVAssetWriter *)assetWriter {
    if (!_assetWriter) {
        NSError *error = nil;
        NSURL *fileUrl = _currentUrl;
        [self removeFilePathIfExist:fileUrl.path];
        _assetWriter = [AVAssetWriter assetWriterWithURL:fileUrl fileType:AVFileTypeMPEG4 error:&error];
        if (error) {
            NSLog(@"writer对象创建失败: %@",error);
        }
    }
    return _assetWriter;
}
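
The removeFilePathIfExist: helper called above isn't shown in the article; a minimal sketch using NSFileManager might be:

//Delete the file at path if one already exists (assumed implementation)
- (void)removeFilePathIfExist:(NSString *)path {
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:path]) {
        NSError *error = nil;
        if (![fileManager removeItemAtPath:path error:&error]) {
            NSLog(@"Failed to remove existing file: %@", error);
        }
    }
}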

- (AVAssetWriterInput *)assetWriterVideoInput {
    if (!_assetWriterVideoInput) {
        _assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:self.videoWriterSetting];
        _assetWriterVideoInput.expectsMediaDataInRealTime = YES;    // must be YES for live capture, otherwise frames get dropped
    }
    return _assetWriterVideoInput;
}

- (AVAssetWriterInput *)assetWriterAudioInput {
    if (!_assetWriterAudioInput) {
        _assetWriterAudioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:self.audioWriterSetting];
        _assetWriterAudioInput.expectsMediaDataInRealTime = YES;
    }
    return _assetWriterAudioInput;
}

- (NSDictionary *)videoWriterSetting {
    
    CGFloat videoWidth = CGRectGetWidth(self.containerView.bounds);
    CGFloat videoHeight = CGRectGetHeight(self.containerView.bounds);
    
    // Output video size in pixels
    NSInteger numPixels = videoWidth * videoHeight;
    
    // Bits per pixel
    CGFloat bitsPerPixel = 6.0f;
    NSInteger bitsPerSecond = numPixels * bitsPerPixel;
    
    /*
     Bitrate and frame rate settings
     AVVideoCodecKey: codec, here H.264
     AVVideoExpectedSourceFrameRateKey: frame rate, frames per second
     AVVideoAverageBitRateKey: bitrate, the amount of data stored per unit of time. A higher bitrate means a clearer picture, closer to the source, but is harder on transmission; too low a bitrate causes blocking artifacts.
     AVVideoMaxKeyFrameIntervalKey: keyframe (GOP size) interval, i.e. how many frames make up one GOP
     */
    NSDictionary *compressionProperties = @{AVVideoAverageBitRateKey:@(bitsPerSecond),
                                            AVVideoExpectedSourceFrameRateKey:@(30),
                                            AVVideoMaxKeyFrameIntervalKey:@(30),
                                            AVVideoProfileLevelKey:AVVideoProfileLevelH264BaselineAutoLevel,
                                            };
    
    return @{AVVideoCodecKey:AVVideoCodecH264,
             AVVideoScalingModeKey:AVVideoScalingModeResizeAspectFill,
             AVVideoWidthKey:@(videoWidth),
             AVVideoHeightKey:@(videoHeight),
             AVVideoCompressionPropertiesKey:compressionProperties,
             };
}

- (NSDictionary *)audioWriterSetting {
    return @{AVEncoderBitRatePerChannelKey:@(28000),
             AVFormatIDKey:@(kAudioFormatMPEG4AAC),
             AVNumberOfChannelsKey:@(1),
             AVSampleRateKey:@(22050),
             };
}
- (void)initializeAssetWriter {
    // Video (the original leaves this condition blank; shouldRecordVideo/shouldRecordAudio are the same hypothetical flags as before)
    if (self.shouldRecordVideo) {
        if ([self.assetWriter canAddInput:self.assetWriterVideoInput]) {
            [self.assetWriter addInput:self.assetWriterVideoInput];
        }
    }
    // Audio
    if (self.shouldRecordAudio) {
        if ([self.assetWriter canAddInput:self.assetWriterAudioInput]) {
            [self.assetWriter addInput:self.assetWriterAudioInput];
        }
    }
}

As you can see, the initialization allows a high degree of customization, covering the needs of both real-time communication and short-video capture.

Start recording

First, note that all of these operations should be handled on the same queue; by now the reason should need no explanation.

- (void)startRecording {
    dispatch_async(self.recordQueue, ^{
        if (_assetWriter == nil) {
            [self initializeAssetWriter];
        }
    });
}

Stop recording

- (void)stopRecording {
    dispatch_async(self.recordQueue, ^{
        [self stopWrite];
    });
}

- (void)stopWrite {
    dispatch_async(self.recordQueue, ^{
        if (_assetWriter) {
            [_assetWriter finishWritingWithCompletionHandler:^{
                [self destroyWrite];
            }];
        }else {
            NSLog(@"Writer is nil");
        }
    });
}

/*
 A writer object can only be used once; to record again you must create a new one. Reference:
 https://stackoverflow.com/questions/4911534/avassetwriter-multiple-sessions-and-the-status-property
*/
- (void)destroyWrite {
    dispatch_async(self.recordQueue, ^{
        @synchronized(self) {
            _assetWriter = nil;
            _assetWriterVideoInput = nil;
            _assetWriterAudioInput = nil;
            NSLog(@"Writer destroyed~");
        }
    });
}

Once the AVAssetWriter is initialized, the sample-buffer callback starts firing, and the code that actually begins writing lives inside that callback. As before, make sure the output file path is writable.

//Video frames arrive here; this is also where effects such as filters could be applied
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (sampleBuffer == NULL) {
        NSLog(@"empty sampleBuffer");
        return;
    }
        
    if (connection == [_videoOutput connectionWithMediaType:AVMediaTypeVideo]) {
        // Video
        if (_assetWriter == nil) {
            NSLog(@"_assetWriter is nil.");
            [self destroyWrite];
            return;
        }
        
        if (_assetWriter.status == AVAssetWriterStatusFailed) {
            NSLog(@"Error: %@", _assetWriter.error);
            [self stopWrite];
            return;
        }
        
        if (_assetWriter.status == AVAssetWriterStatusUnknown) {
            NSLog(@"开始写入");
            [_assetWriter startWriting];
            [_assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }
        
        if (_assetWriterVideoInput.readyForMoreMediaData) {
            BOOL success = [_assetWriterVideoInput appendSampleBuffer:sampleBuffer];    // if this errors, check whether a file already exists at the output path
            if (!success) {
                NSLog(@"append sampleBuffer fail~");
                
                @synchronized(self) {
                    [self stopRecording];
                }
            }
        }
    }
    
    if (connection == [_audioOutput connectionWithMediaType:AVMediaTypeAudio]) {
        // Audio
        if (self.assetWriterAudioInput.readyForMoreMediaData) {
            BOOL success = [_assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            if (!success) {
                NSLog(@"append sampleBuffer fail~");
                @synchronized(self) {
                    [self stopRecording];
                }
            }
        }
    }
    
}

Converting CMSampleBufferRef to UIImage

Here is a method found online:

//Convert a CMSampleBufferRef to a UIImage
-(UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer{
    // Get the Core Video image buffer for the media data in the sample buffer
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the width and height of the pixel buffer
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Create a bitmap graphics context from the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    // Release the context and color space
    CGContextRelease(context); CGColorSpaceRelease(colorSpace);
    // Create a UIImage from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage scale:1 orientation:UIImageOrientationUp];
    // Release the Quartz image
    CGImageRelease(quartzImage);
    return (image);
}
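
This conversion works because videoOutput above requests kCVPixelFormatType_32BGRA frames, which matches the bitmap context's expected layout. Calling it from the sample-buffer delegate could look like this (frameImageView is a hypothetical UIImageView; UIKit must only be touched on the main queue):

// Inside captureOutput:didOutputSampleBuffer:fromConnection:, for the video connection only
UIImage *frame = [self imageFromSampleBuffer:sampleBuffer];
dispatch_async(dispatch_get_main_queue(), ^{
    self.frameImageView.image = frame;  // hypothetical preview view
});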

Recording Video with GPUImage

  • GPUImageVideoCamera: GPUImage's camera wrapper
  • GPUImageView: displays the output
  • GPUImageFilterGroup: a group of filters
  • GPUImageMovieWriter: writes the video file

Declare the properties

@property(nonatomic,strong) GPUImageVideoCamera *videoCamera;
@property(nonatomic,strong) GPUImageView *preView;
@property(nonatomic,strong) GPUImageFilterGroup *filterGroup;
@property(nonatomic,strong) GPUImageMovieWriter *movieWriter;
@property(nonatomic,strong) NSMutableArray *filters;  // currently applied filters

Initialization

- (void)initializedCamera {
    // Video source
    _videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionFront];
    _videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    
    _videoCamera.horizontallyMirrorFrontFacingCamera = YES;
    
    // Add the audio inputs/outputs at initialization; this fixes the camera flicker when recording starts
    [_videoCamera addAudioInputsAndOutputs];
    
    // Filter group
    _filterGroup = [[GPUImageFilterGroup alloc] init];
    _filters = @[].mutableCopy;
    [self updateCamera];
}

- (void)updateCamera {
    [_videoCamera removeAllTargets];
    if (_filters.count == 0) {
        [_videoCamera addTarget:self.preView];
    }else {
        [_videoCamera addTarget:_filterGroup];
        [_filterGroup addTarget:self.preView];
    }
}

- (GPUImageView *)preView {
    if (!_preView) {
        _preView = [[GPUImageView alloc] init];
        _preView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill;
    }
    return _preView;
}

Creating the GPUImageMovieWriter. GPUImage's writer, like AVAssetWriter, has to be recreated for each recording:

- (void)resetMovieWriter {
    if (_movieWriter) _movieWriter = nil;
    
    unlink([self.fileUrl.path UTF8String]);
        
    _movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:self.fileUrl size:self.preView.bounds.size];
    _movieWriter.encodingLiveVideo = YES;
    
    _videoCamera.audioEncodingTarget = _movieWriter;
}

Capturing
Because GPUImage wraps the lower layers, capture is no longer started with [self.session startRunning]:

- (void)startRunning {
    [self.videoCamera startCameraCapture];  // start capturing
}

- (void)stopRunning {
    [self.videoCamera stopCameraCapture];   // stop capturing
}

Start recording

- (void)startRecording {
    self.recordStatus = RecordStatusRecording;

    NSLog(@"新的录制");
    [self resetMovieWriter];
    [self addMovieWriter];
    
    _videoCamera.audioEncodingTarget = _movieWriter;
    [_movieWriter startRecording];
}

//Add the movie writer as a target
- (void)addMovieWriter {
    if (self.filters.count == 0) {
        [_videoCamera addTarget:_movieWriter];
    }else {
        [_filterGroup addTarget:_movieWriter];
    }
}

Stop recording

- (void)stopRecording {
    NSLog(@"暂停录制");
        
    [self removeMovieWriter];
    
    _videoCamera.audioEncodingTarget = nil;
    [_movieWriter finishRecording];
}

//Remove the movie writer target
- (void)removeMovieWriter {
    if (self.filters.count == 0) {
        [_videoCamera removeTarget:_movieWriter];
    }else {
        [_filterGroup removeTarget:_movieWriter];
    }
}

Filters

Recording with GPUImage also lets you add filters in real time. The updateGPUImageFilters: method below relies on two helpers, removeAllFilters and addGPUImageFilter:filterGroup:, which are sketched after the code.

//Update the filter chain
- (void)updateGPUImageFilters:(NSArray <GPUImageOutput<GPUImageInput> *>*)filters {
    
    NSLog(@"update filters: %@",filters);
    
    if (filters.count == 0) {
        [self removeAllFilters];
        [self updateCamera];
        return;
    }
    
    [self removeAllFilters];
    _filterGroup = [[GPUImageFilterGroup alloc] init];
    
    for (GPUImageOutput<GPUImageInput> *target in filters) {
        [self addGPUImageFilter:target filterGroup:_filterGroup];
    }
    
    [_filters addObjectsFromArray:filters];
    [_filterGroup useNextFrameForImageCapture];
    
    [self updateCamera];    
}
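
The removeAllFilters and addGPUImageFilter:filterGroup: helpers aren't included in the article; the sketch below is one plausible implementation, not the author's code. Chaining a filter into a GPUImageFilterGroup relies on its filterCount, initialFilters, and terminalFilter properties:

//Remove every filter from the group (assumed implementation)
- (void)removeAllFilters {
    [_filterGroup removeAllTargets];
    [_filters removeAllObjects];
}

//Append a filter to the group, linking it after the current terminal filter
- (void)addGPUImageFilter:(GPUImageOutput<GPUImageInput> *)filter filterGroup:(GPUImageFilterGroup *)group {
    [group addFilter:filter];
    if (group.filterCount == 1) {
        group.initialFilters = @[filter];
        group.terminalFilter = filter;
    } else {
        [group.terminalFilter addTarget:filter];    // link the new filter after the old terminal
        group.terminalFilter = filter;
    }
}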

GitHub link
