iOS Video Recording (WeChat-style short video; audio/video capture and playback via file recording; video cropping and compression)

Author: joymake | Published 2017-06-13 15:27 | Read 2,303 times

    This chapter covers the implementation of video capture. The main features are:
    1. Audio/video file recording and playback
    2. Zoom (focal-length) adjustment
    3. Video stabilization
    4. Camera switching
    5. Torch (flashlight)
    6. Focus handling
    7. QR code scanning
    8. Video cropping and compression
    9. Raw stream capture and processing (not implemented yet; will be added later)
    10. Rotation detection (orientation detection via <CoreMotion/CoreMotion.h>)

    The implementation idea:

    Short video, streaming media, and QR code scanning all use the AVFoundation framework; they differ only in the AVCaptureInput and AVCaptureOutput objects involved and in how the output is processed, so everything is consolidated into a single utility class.
    The class is fairly full-featured and the code is not small, currently around six or seven hundred lines; browse the .h file to find the parts that interest you.
    Because it is a multi-purpose class, all input/output objects and the device-management objects are lazily loaded on demand, rather than all being created up front.
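As a sketch of how the class is meant to be consumed from a view controller (the `recorder` property and the view-controller wiring here are illustrative, not part of the class itself):

```objectivec
// Hypothetical caller: record a short video using the movie-file output type
self.recorder = [[JoyMediaRecordPlay alloc] initWithCaptureType:EAVCaptureMovieFileOutput];
self.recorder.delegate = self;                          // receive progress/finish callbacks
self.recorder.preViewLayer.frame = self.view.bounds;    // show the camera preview
[self.view.layer addSublayer:self.recorder.preViewLayer];

// Start writing to a generated file path; stop later, e.g. from a button action
NSString *path = [JoyMediaRecordPlay generateFilePathWithType:@"mp4"];
[self.recorder startRecordToFile:[NSURL fileURLWithPath:path]];
```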
    

    The .h file

    #import <Foundation/Foundation.h>
    #import <AVFoundation/AVFoundation.h>
    
    // Block typedefs used throughout; in the original project these come from the author's JoyTool pod
    typedef void (^VOIDBLOCK)(void);
    typedef void (^BOOLBLOCK)(BOOL boolValue);
    typedef void (^IDBLOCK)(id obj);
    
    typedef NS_ENUM(NSInteger,ERecordResult) {
        ERecordSucess,              // recording succeeded
        ERecordLessThanMinTime,     // recording shorter than the minimum duration
        ERecordFaile                // recording failed
    };
    
    
    typedef NS_ENUM(NSUInteger,EAVCaptureOutputType) {
        EAVCaptureMovieFileOutput,      // movie-file output
        EAVCaptureVideoDataOutput,      // raw sample-buffer (data) output
        EAVCaptureMetadataOutput        // metadata output (QR/barcode)
    };
    
    @protocol ReCordPlayProtoCol <NSObject>
    @optional
    - (void)joyRecordTimeCurrentTime:(CGFloat)currentTime
                           totalTime:(CGFloat)totalTime;
    
    - (void)joyCaptureOutput:(AVCaptureFileOutput *)captureOutput
    didStartRecordingToOutputFileAtURL:(NSURL *)fileURL
             fromConnections:(NSArray *)connections;
    
    -(void)joyCaptureOutput:(AVCaptureFileOutput *)captureOutput
    didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
            fromConnections:(NSArray *)connections error:(NSError *)error
               recordResult:(ERecordResult)recordResult;
    
    - (void)joyCaptureOutput:(AVCaptureOutput *)captureOutput
    didOutputMetadataObjects:(NSArray *)metadataObjects
              fromConnection:(AVCaptureConnection *)connection;
    @end
    
    
    @interface JoyMediaRecordPlay : NSObject
    @property (nonatomic,strong)AVCaptureSession            *captureSession;
    @property (nonatomic,strong)AVCaptureVideoPreviewLayer  *preViewLayer;       // preview layer
    //@property (nonatomic,assign)TIMERBLOCK recordProgressBlock;
    //@property (nonatomic,copy)IDBLOCK recordFinishBlock;
    @property (nonatomic,weak)id<ReCordPlayProtoCol>        delegate;
    @property (nonatomic,assign)EAVCaptureOutputType        captureOutputType;
    
    
    #pragma mark Initialize with a capture type (defaults to movie-file recording)
    -(instancetype)initWithCaptureType:(EAVCaptureOutputType)captureType;
    
    #pragma mark Prepare to record
    - (void)preareReCord;
    
    #pragma mark Set zoom factor
    - (void)updateVideoScaleAndCropFactor:(CGFloat)scale;
    
    #pragma mark Enable video stabilization
    - (void)openStabilization;
    
    #pragma mark Start recording
    - (void)startRecordToFile:(NSURL *)outPutFile;
    
    #pragma mark Stop recording
    - (void)stopCurrentVideoRecording;
    
    #pragma mark Remove inputs and outputs
    -(void)removeAVCaptureAudioDeviceInput;
    
    #pragma mark Torch (flashlight)
    - (void)switchTorch;
    
    #pragma mark Switch camera
    - (void)switchCamera;
    
    #pragma mark Set focus point
    - (void)setFoucusWithPoint:(CGPoint)point;
    @end
    
    
    @interface JoyMediaRecordPlay (JoyRecorderPrivary)
    
    - (BOOL)isAvailableWithCamera;
    
    - (BOOL)isAvailableWithMic;
    
    - (void)getVideoAuth:(BOOLBLOCK)videoAuth;
    
    - (void)showAlert;
    
    #pragma mark Crop & compress video
    + (void)mergeAndExportVideosAtFileURLs:(NSURL *)fileURL
                                    newUrl:(NSString *)mergeFilePath
                          widthHeightScale:(CGFloat)whScalle
                                presetName:(NSString *)presetName
                               mergeSucess:(VOIDBLOCK)mergeSucess;
    
    #pragma mark Save video to the photo library
    + (void)saveToPhotoWithUrl:(NSURL *)url;
    
    #pragma mark - Generate a video file path
    
    + (NSString *)generateFilePathWithType:(NSString *)fileType;
    
    #pragma mark Get file size
    + (CGFloat)getfileSize:(NSString *)filePath;
    
    @end
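On the consumer side, a view controller adopting ReCordPlayProtoCol might look like the following sketch (the `progressView` property is illustrative):

```objectivec
// Drive a progress bar from the timer callback
- (void)joyRecordTimeCurrentTime:(CGFloat)currentTime totalTime:(CGFloat)totalTime {
    self.progressView.progress = currentTime / totalTime;
}

// Save the clip to the photo library when recording finishes successfully
- (void)joyCaptureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
         fromConnections:(NSArray *)connections
                   error:(NSError *)error
            recordResult:(ERecordResult)recordResult {
    if (recordResult == ERecordSucess) {
        [JoyMediaRecordPlay saveToPhotoWithUrl:outputFileURL];
    }
}
```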
    

    The .m file

    #import "JoyMediaRecordPlay.h"
    #import <Photos/Photos.h>
    #import <AssetsLibrary/AssetsLibrary.h>
    #import <JoyAlert.h>
    
    /*
     A quick note on the three delegate protocols; implement whichever you need (details below):
     AVCaptureFileOutputRecordingDelegate          file recording, used for short videos
     AVCaptureVideoDataOutputSampleBufferDelegate  raw sample buffers, for streaming media
     AVCaptureMetadataOutputObjectsDelegate        metadata, for QR code scanning
    */
    @interface JoyMediaRecordPlay ()<AVCaptureFileOutputRecordingDelegate,AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureMetadataOutputObjectsDelegate>
    @property (nonatomic,strong)NSTimer *timer;    // timer driving the recording clock and progress bar
    @property (nonatomic,assign)CGFloat recordTime;    // elapsed recording time
    @property (nonatomic,assign)CGFloat totalTime;     // total allowed recording time
    @property (nonatomic,strong)AVCaptureDeviceInput        *mediaDeviceInput;          // video input
    @property (nonatomic,strong)AVCaptureDeviceInput        *audioDeviceInput;          // audio input
    @property (nonatomic,strong)AVCaptureMovieFileOutput    *movieFileOutput;           // movie-file output
    @property (nonatomic,strong)AVCaptureStillImageOutput   *stillImageOutput;          // still-image output
    @property (strong, nonatomic) AVCaptureVideoDataOutput  *videoDataOutput;           // video data output
    @property (strong, nonatomic) AVCaptureAudioDataOutput  *audioDataOutput;           // audio data output
    @property (strong, nonatomic) AVCaptureMetadataOutput   *metadataOutput;            // metadata output
    @property (strong, nonatomic) AVCaptureConnection       *captureConnection;
    @property (assign,nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier;   // background-task identifier
    
    @end
    
    
    static const CGFloat KTimerInterval = 0.05;
    static const CGFloat KMaxRecordTime = 20;
    static const CGFloat KMinRecordTime = 3;
    
    @implementation JoyMediaRecordPlay
    
    // Initialize with a capture type to determine what kind of data to capture
    -(instancetype)initWithCaptureType:(EAVCaptureOutputType)captureType{
        if (self = [super init])
        {
            self.captureOutputType = captureType;
            __weak __typeof (&*self)weakSelf = self;
            // Request camera/mic authorization; on success prepare to record, otherwise show an alert.
            // The alert uses the author's own pod (JoyTool); running this code without that pod will
            // crash because the class can't be found, so either remove the alert or add the JoyTool
            // pod and update.
            [self getVideoAuth:^(BOOL boolValue) {boolValue?[weakSelf preareReCord]:[weakSelf showAlert];}];
        }
        return self;
    }
    
    
    -(instancetype)init{
        if (self = [super init])
        {
            __weak __typeof (&*self)weakSelf = self;
            [self getVideoAuth:^(BOOL boolValue) {boolValue?[weakSelf preareReCord]:[weakSelf showAlert];}];
        }
        return self;
    }
    
    // Total recording time: defaults to 15 s unless explicitly set
    -(CGFloat)totalTime{
        return _totalTime = _totalTime?:15;
    }
    
    // Capture session: manages the inputs/outputs and the starting/stopping of capture
    -(AVCaptureSession *)captureSession{
        return _captureSession = _captureSession?:[[AVCaptureSession alloc]init];
    }
    
    #pragma mark private method 😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄
    #pragma mark Video input
    -(AVCaptureDeviceInput *)mediaDeviceInput{
        if (!_mediaDeviceInput) {
            __block AVCaptureDevice *frontCamera = nil;
            __block AVCaptureDevice *backCamera  = nil;
            NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
            [cameras enumerateObjectsUsingBlock:^(AVCaptureDevice *camera, NSUInteger idx, BOOL * _Nonnull stop) {
                if(camera.position == AVCaptureDevicePositionFront) {frontCamera = camera;}
                if(camera.position == AVCaptureDevicePositionBack)  {backCamera = camera;}
            }];
            [self setExposureModeWithDevice:backCamera];
            _mediaDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:nil];
        }
        return _mediaDeviceInput;
    }
    
    
    #pragma mark Audio input
    -(AVCaptureDeviceInput *)audioDeviceInput{
        if (!_audioDeviceInput) {
            NSError *error;
            _audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio] error:&error];
        }
        return _audioDeviceInput;
    }
    
    #pragma mark Still-image output
    -(AVCaptureStillImageOutput *)stillImageOutput{
        if (!_stillImageOutput) {
            _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [_stillImageOutput setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey, nil]];  // AVVideoCodecJPEG means stills are captured as JPEG
        }
        return _stillImageOutput;
    }
    
    
    #pragma mark Movie-file output
    -(AVCaptureMovieFileOutput *)movieFileOutput{
        return _movieFileOutput = _movieFileOutput?:[[AVCaptureMovieFileOutput alloc] init];
    }
    
    #pragma mark Video data output
    -(AVCaptureVideoDataOutput *)videoDataOutput{
        if  (!_videoDataOutput){
            _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
            _videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
            dispatch_queue_t videoQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
            [_videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
        }
        return _videoDataOutput;
    }
    
    
    #pragma mark Metadata output
    -(AVCaptureMetadataOutput *)metadataOutput{
        if (!_metadataOutput){
            _metadataOutput = [[AVCaptureMetadataOutput alloc]init];
    //        _metadataOutput.rectOfInterest = CGRectMake(0.2, 0.2, 0.6, 0.6);
    //        // set the metadata-objects delegate
            [_metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
        }
        return _metadataOutput;
    }
    
    
    #pragma mark Connection between input and output
    -(AVCaptureConnection *)captureConnection{
        return _captureConnection = _captureConnection?:[self.movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    }
    
    
    #pragma mark Preview layer
    -(AVCaptureVideoPreviewLayer *)preViewLayer{
        if (!_preViewLayer) {
            _preViewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
            _preViewLayer.masksToBounds = YES;
            _preViewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        }
        return _preViewLayer;
    }
    
    
    // Configure the exposure mode
    - (void)setExposureModeWithDevice:(AVCaptureDevice *)device{
        NSError *error = nil;
        // Always call lockForConfiguration: before changing device properties, and unlockForConfiguration afterwards
        if ([device lockForConfiguration:&error]) {
            // continuous auto-exposure
            if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])[device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
            [device unlockForConfiguration];
        }
    }
    
    
    -(NSTimer *)timer{
        if (!_timer)
        {
            _timer = [NSTimer scheduledTimerWithTimeInterval:KTimerInterval target:self selector:@selector(startTime:) userInfo:nil repeats:YES];
    //        [[NSRunLoop mainRunLoop] addTimer:_timer forMode:NSRunLoopCommonModes];
        }
        return _timer;
    }
    
    
    #pragma mark Timer tick handling
    - (void)startTime:(NSTimer *)timer{
    //    self.recordProgressBlock?self.recordProgressBlock(self.recordTime,self.totalTime):nil;
        if ([self.delegate respondsToSelector:@selector(joyRecordTimeCurrentTime:totalTime:)]) {
            [self.delegate joyRecordTimeCurrentTime:self.recordTime totalTime:self.totalTime];
        }
        self.recordTime += KTimerInterval;
        if(_recordTime>=KMaxRecordTime){[self stopCurrentVideoRecording];}
    }
    
    // (Re)start the timer
    - (void)startTimer{
        [self.timer invalidate];
        self.timer = nil;
        self.recordTime = 0;
        [self.timer fire];
    }
    
    // Stop the timer
    - (void)stopTimer{
        [self.timer invalidate];
        self.timer = nil;
    }
    
    // Did-start-recording delegate for the file output, i.e. the short video has begun recording
    #pragma mark - AVCaptureFileOutputRecordingDelegate
    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
        [self startTimer];
        if([self.delegate respondsToSelector:@selector(joyCaptureOutput:didStartRecordingToOutputFileAtURL:fromConnections:)]){
            [self.delegate joyCaptureOutput:captureOutput didStartRecordingToOutputFileAtURL:fileURL fromConnections:connections];
        }
    }
    
    #pragma mark File recording finished (short-video recording completes)
    -(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
        [self endBackgroundTask];
        if ([self.delegate respondsToSelector:@selector(joyCaptureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:recordResult:)])
        {
            ERecordResult result = error?ERecordFaile:(self.recordTime>KMinRecordTime?ERecordSucess:ERecordLessThanMinTime);
            [self.delegate joyCaptureOutput:captureOutput didFinishRecordingToOutputFileAtURL:outputFileURL fromConnections:connections error:error recordResult:result];
        }
    }
    
    #pragma mark Sample buffer dropped (streaming)
    -(void)captureOutput:(AVCaptureOutput *)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
        
    }
    
    #pragma mark Sample buffer output (streaming)
    -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection{
        
    }
    
    #pragma mark Metadata captured (QR code scanned successfully)
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{
        if ([self.delegate respondsToSelector:@selector(joyCaptureOutput:didOutputMetadataObjects:fromConnection:)]) {
            [self.delegate joyCaptureOutput:captureOutput didOutputMetadataObjects:metadataObjects fromConnection:connection];
        }
    }
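On the consumer side, the scanned payload can be read out of the forwarded metadata objects; this handler is a sketch (the class itself only forwards the delegate call):

```objectivec
// Hypothetical delegate implementation: extract the decoded QR/barcode string
- (void)joyCaptureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
          fromConnection:(AVCaptureConnection *)connection {
    AVMetadataObject *obj = metadataObjects.firstObject;
    if ([obj isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
        NSString *payload = [(AVMetadataMachineReadableCodeObject *)obj stringValue];
        NSLog(@"scanned: %@", payload);   // the decoded code content
    }
}
```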
    
    -(void)dealloc{
        [_timer invalidate];   // use the ivar directly: the lazy getter would recreate the timer here
        _timer = nil;
        self.recordTime = 0;
        [self stopCurrentVideoRecording];
        [self.captureSession stopRunning];
        [self.preViewLayer removeFromSuperlayer];
    }
    
    
    #pragma mark private method 😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄😄End
    
    #pragma mark public method 🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺🎺 Start
    #pragma mark 准备录制
    - (void)preareReCord{
        [self.captureSession beginConfiguration];
        [self.captureSession canSetSessionPreset:AVCaptureSessionPresetHigh]?[self.captureSession setSessionPreset:AVCaptureSessionPresetHigh]:nil;
        [self.captureSession canAddInput:self.mediaDeviceInput]?[self.captureSession addInput:self.mediaDeviceInput]:nil;
        [self.captureSession canAddInput:self.audioDeviceInput]?[self.captureSession addInput:self.audioDeviceInput]:nil;
        [self.captureSession canAddOutput:self.stillImageOutput]?[self.captureSession addOutput:self.stillImageOutput]:nil;
        switch (self.captureOutputType)
        {
        case EAVCaptureVideoDataOutput:
            [self.captureSession canAddOutput:self.videoDataOutput]?[self.captureSession addOutput:self.videoDataOutput]:nil;
            break;
        case EAVCaptureMetadataOutput:
            [self.captureSession canAddOutput:self.metadataOutput]?[self.captureSession addOutput:self.metadataOutput]:nil;
            if ([_metadataOutput.availableMetadataObjectTypes containsObject:AVMetadataObjectTypeQRCode])
            {_metadataOutput.metadataObjectTypes = [NSArray arrayWithObjects:AVMetadataObjectTypeQRCode,AVMetadataObjectTypeUPCECode,
                                                    AVMetadataObjectTypeCode39Code,
                                                    AVMetadataObjectTypeCode39Mod43Code,
                                                    AVMetadataObjectTypeEAN13Code,
                                                    AVMetadataObjectTypeEAN8Code,
                                                    AVMetadataObjectTypeCode93Code,
                                                    AVMetadataObjectTypeCode128Code,
                                                    AVMetadataObjectTypePDF417Code,
                                                    AVMetadataObjectTypeQRCode,
                                                    AVMetadataObjectTypeAztecCode, nil];}
                
            break;
        default:
            [self.captureSession canAddOutput:self.movieFileOutput]?[self.captureSession addOutput:self.movieFileOutput]:nil;
            break;
        }
        // commit the configuration and start capturing
        [self.captureSession commitConfiguration];
        [self openStabilization];
        [self.captureSession startRunning];
    }
    
    // Note: remove the inputs/outputs after every recording, otherwise the delegate callbacks won't fire on the next recording
    #pragma mark Remove inputs and outputs
    -(void)removeAVCaptureAudioDeviceInput
    {
        self.mediaDeviceInput?[self.captureSession removeInput:self.mediaDeviceInput]:nil;
        self.audioDeviceInput?[self.captureSession removeInput:self.audioDeviceInput]:nil;
        self.stillImageOutput?[self.captureSession removeOutput:self.stillImageOutput]:nil;
        switch (self.captureOutputType)
        {
        case EAVCaptureVideoDataOutput:
            self.videoDataOutput? [self.captureSession removeOutput:self.videoDataOutput]:nil;
            break;
        case EAVCaptureMetadataOutput:
            self.metadataOutput? [self.captureSession removeOutput:self.metadataOutput]:nil;
            break;
        default:
            self.movieFileOutput? [self.captureSession removeOutput:self.movieFileOutput]:nil;
            break;
        }
    }
    
    
    #pragma mark Set zoom factor
    - (void)updateVideoScaleAndCropFactor:(CGFloat)scale{
        if (scale < self.mediaDeviceInput.device.activeFormat.videoMaxZoomFactor && scale>1)
        [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
            [captureDevice rampToVideoZoomFactor:scale withRate:10];
        }];
    }
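The zoom method is typically driven by a pinch gesture on the preview view; a sketch (the `currentScale` and `recorder` properties and the gesture wiring are assumptions of the caller, not part of the class):

```objectivec
// Hypothetical pinch handler on the preview view
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch {
    self.currentScale *= pinch.scale;                        // accumulate the pinch scale
    [self.recorder updateVideoScaleAndCropFactor:self.currentScale];
    pinch.scale = 1.0;                                       // reset so each callback is incremental
}
```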
    
    
    #pragma mark Video stabilization (also maximizes the scale-and-crop factor for video quality)
    - (void)openStabilization{
        if ([self.captureConnection isVideoStabilizationSupported] && self.captureConnection.activeVideoStabilizationMode == AVCaptureVideoStabilizationModeOff)
        {
            self.captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;   // video stabilization
        }
        self.captureConnection.videoScaleAndCropFactor = _captureConnection.videoMaxScaleAndCropFactor;     // maximum lens zoom
    }
    
    
    #pragma mark Start recording
    - (void)startRecordToFile:(NSURL *)outPutFile{
        if (![self.movieFileOutput isRecording]) {  // not currently recording
            if ([[UIDevice currentDevice] isMultitaskingSupported])  // begin a background task if multitasking is supported
            {
                __weak __typeof(&*self)weakSelf = self;
                self.backgroundTaskIdentifier=[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{[weakSelf endBackgroundTask];}];
            }
            
            if ([self.captureConnection isVideoOrientationSupported])
                self.captureConnection.videoOrientation = [self.preViewLayer connection].videoOrientation;
            self.recordTime = 0.0f;
            [_movieFileOutput startRecordingToOutputFileURL:outPutFile recordingDelegate:self];
        }
        else {
            [self.movieFileOutput stopRecording];   // already recording: stop
        }
    }
    
    
    #pragma mark - End background task
    -(void)endBackgroundTask
    {
        if (self.backgroundTaskIdentifier != UIBackgroundTaskInvalid) {
            [[UIApplication sharedApplication] endBackgroundTask:self.backgroundTaskIdentifier];
        }
        self.backgroundTaskIdentifier = UIBackgroundTaskInvalid;
    }
    
    
    #pragma mark Stop recording
    - (void)stopCurrentVideoRecording
    {
        if (self.movieFileOutput.isRecording) {
            [self stopTimer];
            [_movieFileOutput stopRecording];
        }
    }
    
    #pragma mark Torch (flashlight)
    - (void)switchTorch{
        __weak __typeof (&*self)weakSelf = self;
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
            NSError *error = nil;
            // Lock the configuration before changing device properties, then unlock; the same applies to the camera
            if (![device lockForConfiguration:&error]) {
                NSLog(@"error:%@",error.description);
                return;
            }
    //        AVCaptureTorchMode torchMode = device.torchMode == AVCaptureTorchModeOff?AVCaptureTorchModeOn:AVCaptureTorchModeOff;
            
            AVCaptureTorchMode torchMode = AVCaptureTorchModeAuto;
            AVCaptureDevice *currentDevice = [weakSelf.mediaDeviceInput device];
            if(currentDevice.position == AVCaptureDevicePositionFront) torchMode = AVCaptureTorchModeOff;
            [device setTorchMode:torchMode];
            [device unlockForConfiguration];
        });
    }
    
    #pragma mark Switch camera
    - (void)switchCamera{
        [_captureSession beginConfiguration];
        // remove the old camera input
        [_captureSession removeInput:_mediaDeviceInput];
        AVCaptureDevice *swithToDevice = [self getSwitchCameraDevice];
        [self setExposureModeWithDevice:swithToDevice]; // locks/unlocks the device configuration internally
        self.mediaDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:swithToDevice error:nil];
        // add the input for the new camera
        [_captureSession addInput:_mediaDeviceInput];
        [_captureSession commitConfiguration];
    }
    
    
    - (void)cancleRecord{
        
    }
    
    
    #pragma mark Set focus
    - (void)setFoucusWithPoint:(CGPoint)point{
        CGPoint cameraPoint= [self.preViewLayer captureDevicePointOfInterestForPoint:point];
        [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
    }
    
    /**
     *  Set focus and exposure at a point
     *
     *  @param point the point of interest (device coordinates)
     */
    -(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
        [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
            // focus mode
            if ([captureDevice isFocusModeSupported:focusMode]) {
                [captureDevice setFocusMode:focusMode];
            }
            // focus point
            if ([captureDevice isFocusPointOfInterestSupported]) {
                [captureDevice setFocusPointOfInterest:point];
            }
            // exposure mode
            if ([captureDevice isExposureModeSupported:exposureMode]) {
                [captureDevice setExposureMode:exposureMode];
            }
            // exposure point
            if ([captureDevice isExposurePointOfInterestSupported]) {
                [captureDevice setExposurePointOfInterest:point];
            }
        }];
    }
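Tap-to-focus is then just a matter of forwarding the tap location to the class; a sketch with an assumed tap recognizer on the preview view:

```objectivec
// Hypothetical tap handler: setFoucusWithPoint: converts the view-coordinate
// point to device coordinates via the preview layer internally
- (void)handleTap:(UITapGestureRecognizer *)tap {
    CGPoint point = [tap locationInView:tap.view];
    [self.recorder setFoucusWithPoint:point];
}
```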
    
    
    /**
     * Common wrapper for changing device properties
     * @param propertyChange block performing the property change
     */
    - (void)changeDeviceProperty:(IDBLOCK)propertyChange
    {
        AVCaptureDevice *captureDevice = [self.mediaDeviceInput device];
        NSError *error;
        // Always call lockForConfiguration: before changing device properties, and unlockForConfiguration afterwards
        if ([captureDevice lockForConfiguration:&error]) {
            propertyChange(captureDevice);
            [captureDevice unlockForConfiguration];
        }else{
            NSLog(@"Error while setting device property: %@",error.localizedDescription);
        }
    }
    
    
    - (AVCaptureDevice *)getSwitchCameraDevice{
        AVCaptureDevice *currentDevice = [self.mediaDeviceInput device];
        AVCaptureDevicePosition currentPosition = [currentDevice position];
        BOOL isUnspecifiedOrFront = (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront);
        AVCaptureDevicePosition  swithToPosition = isUnspecifiedOrFront?AVCaptureDevicePositionBack:AVCaptureDevicePositionFront;
        NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        __block AVCaptureDevice *swithCameraDevice = nil;
        [cameras enumerateObjectsUsingBlock:^(AVCaptureDevice *camera, NSUInteger idx, BOOL * _Nonnull stop) {
            if(camera.position == swithToPosition){swithCameraDevice = camera;*stop = YES;};
        }];
        return swithCameraDevice;
    }
    
    @end
    
    
    The category: authorization handling, video cropping & compression, and video saving
    #pragma mark  Authorization and cache handling category 🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️🤖️ Start
    @implementation JoyMediaRecordPlay(JoyRecorderPrivary)
    - (BOOL)isAvailableWithCamera
    {
        return [self isAvailableWithDeviveMediaType:AVMediaTypeVideo];
    }
    
    
    - (BOOL)isAvailableWithMic
    {
        return [self isAvailableWithDeviveMediaType:AVMediaTypeAudio];
    }
    
    
    - (BOOL)isAvailableWithDeviveMediaType:(NSString *)mediaType
    {
        AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:mediaType];
        return !(status == AVAuthorizationStatusDenied||status == AVAuthorizationStatusRestricted);
    }
    
    
    - (void)getVideoAuth:(BOOLBLOCK)videoAuth{
        __weak typeof(self)weakSelf = self;
        AVAuthorizationStatus authStatus = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
        if (authStatus == AVAuthorizationStatusNotDetermined)
        {
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                     completionHandler:^(BOOL granted) {
                                         granted?[weakSelf authAudio:videoAuth]:videoAuth(NO); }];
        }
        else if (authStatus == AVAuthorizationStatusAuthorized)
        { [self authAudio:videoAuth];}
        else
        {videoAuth(NO);}
    }
    
    
    - (void)authAudio:(BOOLBLOCK)audio {
        if ([[AVAudioSession sharedInstance] respondsToSelector:@selector(requestRecordPermission:)]) {
            [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
                audio(granted);
            }];
        }
    }
    
    
    - (void)showAlert{
        [[JoyAlert shareAlert] showAlertViewWithTitle:@"Please allow this app to access your camera and microphone in Settings > Privacy."
                                              message:nil
                                               cancle:@"OK"
                                              confirm:nil
                                           alertBlock:nil];
    }
    
    
    
    #pragma mark Save video to the photo library
    + (void)saveToPhotoWithUrl:(NSURL *)url{
        [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
            [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
        } completionHandler:nil];
    }
    
    #pragma mark  Video cropping   ⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️⚙️ Start
    /*
     fileURL:       source video URL
     mergeFilePath: destination file path
     whScalle:      target width/height ratio for the crop
     presetName:    export quality preset; AVAssetExportPresetMediumQuality if nil
     */
    + (void)mergeAndExportVideosAtFileURLs:(NSURL *)fileURL newUrl:(NSString *)mergeFilePath widthHeightScale:(CGFloat)whScalle presetName:(NSString *)presetName mergeSucess:(VOIDBLOCK)mergeSucess
    
    {
        NSError *error = nil;
        
        CMTime totalDuration = kCMTimeZero;
        //转换AVAsset
        AVAsset *asset = [AVAsset assetWithURL:fileURL];
        if (!asset) {
            return;
        }
        
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
        //提取音频、视频
        NSArray * assetArray = [asset tracksWithMediaType:AVMediaTypeVideo];
        
        AVAssetTrack *assetTrack;
        if (assetArray.count) {
            assetTrack = [assetArray objectAtIndex:0];
        }
        
        [JoyMediaRecordPlay audioTrackWith:mixComposition assetTrack:assetTrack asset:asset totalDuration:totalDuration error:error];
        
        AVMutableCompositionTrack *videoTrack = [JoyMediaRecordPlay videoTrackWith:mixComposition assetTrack:assetTrack asset:asset totalDuration:totalDuration error:error];
        
        CGFloat renderW = [JoyMediaRecordPlay videoTrackRenderSizeWithassetTrack:assetTrack];
        totalDuration = CMTimeAdd(totalDuration, asset.duration);
        
        NSMutableArray *layerInstructionArray = [JoyMediaRecordPlay assetArrayWith:videoTrack totalDuration:totalDuration assetTrack:assetTrack renderW:renderW widthHeightScale:whScalle];
        
        [JoyMediaRecordPlay mergingVideoWithmergeFilePath:mergeFilePath layerInstructionArray:layerInstructionArray mixComposition:mixComposition totalDuration:totalDuration renderW:renderW widthHeightScale:whScalle presetName:presetName mergeSucess:mergeSucess];
    }
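A sketch of calling the export, e.g. from the finish callback, cropping to a square at the default quality (the `recordedURL` variable is illustrative):

```objectivec
// Crop a recorded clip to 1:1 and save the result to the photo library
NSString *outPath = [JoyMediaRecordPlay generateFilePathWithType:@"mp4"];
[JoyMediaRecordPlay mergeAndExportVideosAtFileURLs:recordedURL
                                            newUrl:outPath
                                  widthHeightScale:1.0   // width / height = 1, i.e. square
                                        presetName:nil   // falls back to AVAssetExportPresetMediumQuality
                                       mergeSucess:^{
    [JoyMediaRecordPlay saveToPhotoWithUrl:[NSURL fileURLWithPath:outPath]];
}];
```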
    
    
    // Export (compress) the video
    +(void)mergingVideoWithmergeFilePath:(NSString *)mergeFilePath
                   layerInstructionArray:(NSMutableArray*)layerInstructionArray
                          mixComposition:(AVMutableComposition *)mixComposition
                           totalDuration:(CMTime)totalDuration
                                 renderW:(CGFloat)renderW
                        widthHeightScale:(CGFloat)whScalle
                              presetName:(NSString *)presetName
                             mergeSucess:(VOIDBLOCK)mergeSucess
    
    {
        //get save path
        NSURL *mergeFileURL = [NSURL fileURLWithPath:mergeFilePath];
        
        //export
        AVMutableVideoCompositionInstruction *mainInstruciton = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        mainInstruciton.timeRange = CMTimeRangeMake(kCMTimeZero, totalDuration);
        mainInstruciton.layerInstructions = layerInstructionArray;
        AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
        mainCompositionInst.instructions = @[mainInstruciton];
        mainCompositionInst.frameDuration = CMTimeMake(1, 30);
        mainCompositionInst.renderSize = CGSizeMake(renderW, renderW/whScalle);//renderW/4*3
        
        __block AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:presetName?:AVAssetExportPresetMediumQuality];
        exporter.videoComposition = mainCompositionInst;
        exporter.outputURL = mergeFileURL;
        exporter.outputFileType = AVFileTypeMPEG4;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                switch (exporter.status) {
                    case AVAssetExportSessionStatusCompleted:
                        mergeSucess?mergeSucess():nil;
                        break;
                    default:
                        break;
                }
            });
        }];
        
    }
    
    
    // Compose the video: build the layer instructions
    + (NSMutableArray *)assetArrayWith:(AVMutableCompositionTrack *)videoTrack
                         totalDuration:(CMTime)totalDuration
                            assetTrack:(AVAssetTrack *)assetTrack
                               renderW:(CGFloat)renderW
                      widthHeightScale:(CGFloat)whScalle
    
    {
        NSMutableArray *layerInstructionArray = [[NSMutableArray alloc] init];
        
        AVMutableVideoCompositionLayerInstruction *layerInstruciton = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        CGFloat rate = renderW / MIN(assetTrack.naturalSize.width, assetTrack.naturalSize.height);
        CGAffineTransform layerTransform = CGAffineTransformMake(assetTrack.preferredTransform.a, assetTrack.preferredTransform.b, assetTrack.preferredTransform.c, assetTrack.preferredTransform.d, assetTrack.preferredTransform.tx * rate, assetTrack.preferredTransform.ty * rate);
        layerTransform = CGAffineTransformConcat(layerTransform, CGAffineTransformMake(1, 0, 0, 1, 0, -(assetTrack.naturalSize.width - assetTrack.naturalSize.height/whScalle) / 2.0));// shift up so the crop takes the middle of the frame
        layerTransform = CGAffineTransformScale(layerTransform, rate, rate);// scale to compensate for the front/back camera size mismatch
        [layerInstruciton setTransform:layerTransform atTime:kCMTimeZero];
        [layerInstruciton setOpacity:0.0 atTime:totalDuration];
        //data
        [layerInstructionArray addObject:layerInstruciton];
        
        return layerInstructionArray;
    }
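The hard-coded vertical shift above assumes the clip was captured in portrait. If clips may arrive in other orientations, the track's `preferredTransform` can be inspected first. A minimal sketch of that check (the helper name `joy_orientationFromTransform` is my own, not part of the original class; requires UIKit for `UIImageOrientation`):

```objectivec
// Illustrative helper (not in the original class): infer the capture
// orientation from a track's preferredTransform so the crop/scale
// transform can be chosen per clip instead of assuming portrait.
static UIImageOrientation joy_orientationFromTransform(CGAffineTransform t) {
    if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0)  return UIImageOrientationRight; // portrait
    if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0)  return UIImageOrientationLeft;  // portrait, upside down
    if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) return UIImageOrientationDown;  // landscape
    return UIImageOrientationUp; // identity: landscape, home button right
}
```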
    
    
    // Render width: the shorter edge of the track's natural size
    +(CGFloat)videoTrackRenderSizeWithassetTrack:(AVAssetTrack *)assetTrack{
        // naturalSize is reported before rotation is applied, so just take the shorter edge
        return MIN(assetTrack.naturalSize.width, assetTrack.naturalSize.height);
    }
    
    
    //videoTrack
    +(AVMutableCompositionTrack*)videoTrackWith:(AVMutableComposition *)mixComposition
                                     assetTrack:(AVAssetTrack *)assetTrack
                                          asset:(AVAsset *)asset
                                  totalDuration:(CMTime)totalDuration
                                          error:(NSError *)error{
        
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                            ofTrack:assetTrack
                             atTime:totalDuration
                              error:&error];
        
        
        return videoTrack;
        
    }
    
    
    //audioTrack
    +(void)audioTrackWith:(AVMutableComposition *)mixComposition
               assetTrack:(AVAssetTrack *)assetTrack
                    asset:(AVAsset *)asset
            totalDuration:(CMTime)totalDuration
                    error:(NSError *)error{
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        
        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        if (audioTracks.count > 0) {
            AVAssetTrack *audioAssetTrack = audioTracks.firstObject;
            [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                ofTrack:audioAssetTrack
                                 atTime:totalDuration
                                  error:nil];
        }
    }
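The track helpers above are meant to be called once per recorded clip while the insertion point (`totalDuration`) accumulates. A hedged sketch of that driving loop, assuming an `assetArray` of recorded clips and a `whScale` width/height ratio (both names are illustrative, not from the original):

```objectivec
// Illustrative driver: stitch several recorded clips into one composition.
// `assetArray` (NSArray<AVAsset *> *) and `whScale` are assumed inputs.
AVMutableComposition *mixComposition = [AVMutableComposition composition];
NSMutableArray *layerInstructions = [NSMutableArray array];
CMTime totalDuration = kCMTimeZero;
for (AVAsset *asset in assetArray) {
    AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (!videoAssetTrack) { continue; }
    // Insert this clip's audio and video at the current end of the composition.
    [self audioTrackWith:mixComposition assetTrack:videoAssetTrack asset:asset
           totalDuration:totalDuration error:nil];
    AVMutableCompositionTrack *videoTrack =
        [self videoTrackWith:mixComposition assetTrack:videoAssetTrack asset:asset
               totalDuration:totalDuration error:nil];
    CGFloat renderW = [self videoTrackRenderSizeWithassetTrack:videoAssetTrack];
    // Advance the cursor first: the layer instruction fades the clip out at its end.
    totalDuration = CMTimeAdd(totalDuration, asset.duration);
    [layerInstructions addObjectsFromArray:[self assetArrayWith:videoTrack
                                                  totalDuration:totalDuration
                                                     assetTrack:videoAssetTrack
                                                        renderW:renderW
                                               widthHeightScale:whScale]];
}
```

The resulting `layerInstructions` would then feed an `AVMutableVideoComposition` handed to the export session shown earlier.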
    #pragma mark Video cropping & compression (end) ⚙️⚙️⚙️
    
    
    #pragma mark - Video file path
    + (NSString *)generateFilePathWithType:(NSString *)fileType{
        // stringByAppendingPathComponent normalizes the separator between dir and file name
        return [[[self class] getVideoPathCache] stringByAppendingPathComponent:[[self class] getVideoNameWithType:fileType]];
    }
    
    
    + (NSString *)getVideoPathCache
    {
        NSString *videoCache = [NSTemporaryDirectory() stringByAppendingPathComponent:@"videos"] ;
        BOOL isDir = NO;
        NSFileManager *fileManager = [NSFileManager defaultManager];
        BOOL existed = [fileManager fileExistsAtPath:videoCache isDirectory:&isDir];
        if ( !(isDir == YES && existed == YES) ) {
            [fileManager createDirectoryAtPath:videoCache withIntermediateDirectories:YES attributes:nil error:nil];
        };
        return videoCache;
    }
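Everything is written under `tmp/videos`, so a cleanup helper may be worth pairing with this. A sketch (the method name is my own, not part of the original class):

```objectivec
// Illustrative helper (not in the original class): wipe the video cache,
// e.g. after the merged file has been uploaded or saved to the album.
// removeItemAtPath: deletes the directory and everything inside it.
+ (void)clearVideoCache {
    NSError *error = nil;
    [[NSFileManager defaultManager] removeItemAtPath:[self getVideoPathCache] error:&error];
    if (error) { NSLog(@"clear video cache failed: %@", error); }
}
```

Since the cache lives in `NSTemporaryDirectory()`, the system may also purge it on its own; explicit cleanup just keeps the footprint predictable.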
    
    
    + (NSString *)getVideoNameWithType:(NSString *)fileType
    {
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"HHmmss"];
        NSString *timeStr = [formatter stringFromDate:[NSDate date]];
        return [NSString stringWithFormat:@"/video_%@.%@", timeStr, fileType];
    }
    
    
    #pragma mark Get file size (MB)
    + (CGFloat)getfileSize:(NSString *)filePath
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        filePath = [filePath stringByReplacingOccurrencesOfString:@"file://" withString:@""];
        unsigned long long fileSize = 0;
        if ([fm fileExistsAtPath:filePath]) {
            fileSize = [[fm attributesOfItemAtPath:filePath error:nil] fileSize];
            NSLog(@"video size - %.2fMB (%.2fKB)", fileSize / (1024.0 * 1024.0), fileSize / 1024.0);
        }
        return fileSize / (1024.0 * 1024.0); // floating-point division, so sub-MB files don't truncate to 0
    }
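One use for this is gating the merged file's size and falling back to a lower-quality preset, much like WeChat's small-video compression. A sketch only; the 2 MB threshold, `mergeFileURL`, and the retry wiring are assumptions for illustration:

```objectivec
// Illustrative only: re-export with a lower-quality preset when the merged
// file exceeds a target size. Threshold and preset choice are assumptions.
CGFloat sizeMB = [JoyMediaRecordPlay getfileSize:mergeFileURL.path];
if (sizeMB > 2.0) {
    AVAssetExportSession *retry =
        [[AVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:mergeFileURL]
                                         presetName:AVAssetExportPresetMediumQuality];
    // ...set outputURL / outputFileType and export asynchronously, as above
}
```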
    
    
    
    @end
    #pragma mark Permission checks & cache handling category (end) 🤖️🤖️🤖️
    

    Main repo: https://github.com/joy-make/SmallVideo.git
    I put together a simple demo; the player implementation is the part worth reading. The view is a quick throwaway, so you can move the callbacks into your own view controller. The JoyTool and Masonry dependencies can be swapped for your own.

      Title: ios视频录制(仿微信小视频、音视频采集播放(文件录制)、视频裁

      Link: https://www.haomeiwen.com/subject/rnudqxtx.html