iOS Multimedia: The Camera

Author: 时光啊混蛋_97boy | Published 2020-09-02 17:08

An original, knowledge-exploring article.
Writing takes real effort, so please treat it kindly; it will keep being updated and refined.
I like taking notes and writing summaries, since a good memory is no match for a worn pen. These articles record my growth as an iOS developer, and I hope we can make progress together.
Tip: since Jianshu does not support jumping from the table of contents, you can press command + F and type a section title to find what you need quickly.

The demo is on my Github and is available for download:
Multi-MediaDemo

Contents

  • 1. Taking photos and recording video with UIImagePickerController

  • 2. Taking photos and recording video with AVFoundation

  • 3. Scanning QR codes

  • References

1. Taking photos and recording video with UIImagePickerController

1.1 Overview

UIImagePickerController inherits from UINavigationController. Earlier articles mainly used it to pick photos, but that is not all it can do: it can also take photos and record video. First, look at the commonly used properties and methods of this class:

#pragma mark - Properties

@property(nonatomic) UIImagePickerControllerSourceType sourceType //source type, an enum
UIImagePickerControllerSourceTypePhotoLibrary: photo library, the default
UIImagePickerControllerSourceTypeCamera: camera
UIImagePickerControllerSourceTypeSavedPhotosAlbum: saved photos album

@property(nonatomic) UIImagePickerControllerQualityType videoQuality //video quality, an enum
UIImagePickerControllerQualityTypeHigh: high quality
UIImagePickerControllerQualityTypeMedium: medium quality, suitable for transfer over WiFi
UIImagePickerControllerQualityTypeLow: low quality, suitable for transfer over a cellular network
UIImagePickerControllerQualityType640x480: 640*480
UIImagePickerControllerQualityTypeIFrame1280x720: 1280*720
UIImagePickerControllerQualityTypeIFrame960x540: 960*540

@property(nonatomic) UIImagePickerControllerCameraCaptureMode cameraCaptureMode //camera capture mode, an enum
UIImagePickerControllerCameraCaptureModePhoto: photo mode
UIImagePickerControllerCameraCaptureModeVideo: video recording mode

@property(nonatomic) UIImagePickerControllerCameraDevice cameraDevice //camera device, an enum
UIImagePickerControllerCameraDeviceRear: rear (back) camera
UIImagePickerControllerCameraDeviceFront: front camera

@property(nonatomic) UIImagePickerControllerCameraFlashMode cameraFlashMode //flash mode, an enum
UIImagePickerControllerCameraFlashModeOff: flash off
UIImagePickerControllerCameraFlashModeAuto: automatic flash
UIImagePickerControllerCameraFlashModeOn: flash on

@property(nonatomic,copy) NSArray *mediaTypes //media types; by default this array contains kUTTypeImage, so it does not need to be set for taking photos, but it must be set for video recording: use kUTTypeVideo (video without sound) or kUTTypeMovie (video with sound)
@property(nonatomic) NSTimeInterval videoMaximumDuration //maximum video recording duration, 10 minutes (600 s) by default
@property(nonatomic) BOOL showsCameraControls //whether to show the camera control panel, YES by default
@property(nonatomic,retain) UIView *cameraOverlayView //a view overlaid on the camera; use this view to customize the photo or video UI
@property(nonatomic) CGAffineTransform cameraViewTransform //transform applied to the camera view

#pragma mark - Class methods

+ (BOOL)isSourceTypeAvailable:(UIImagePickerControllerSourceType)sourceType //whether the given source type is available; sourceType is an enum
UIImagePickerControllerSourceTypePhotoLibrary: photo library
UIImagePickerControllerSourceTypeCamera: camera
UIImagePickerControllerSourceTypeSavedPhotosAlbum: saved photos album

+ (BOOL)isCameraDeviceAvailable:(UIImagePickerControllerCameraDevice)cameraDevice //whether the given camera is available; cameraDevice is an enum
UIImagePickerControllerCameraDeviceRear: rear (back) camera
UIImagePickerControllerCameraDeviceFront: front camera

+ (NSArray *)availableCaptureModesForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice //the capture modes available on the given camera; the capture mode is an enum
UIImagePickerControllerCameraCaptureModePhoto: photo mode
UIImagePickerControllerCameraCaptureModeVideo: video recording mode

+ (NSArray *)availableMediaTypesForSourceType:(UIImagePickerControllerSourceType)sourceType //the media types available on the given source, usually just images and video
+ (BOOL)isFlashAvailableForCameraDevice:(UIImagePickerControllerCameraDevice)cameraDevice //whether the flash of the given camera is available

#pragma mark - Instance methods

- (void)takePicture //take a photo programmatically
- (BOOL)startVideoCapture //start recording video programmatically
- (void)stopVideoCapture //stop recording video programmatically

#pragma mark - Delegate methods

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info //media picking finished
- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker //picking was cancelled

#pragma mark - Free functions (mainly for saving photos and videos to the album)

void UIImageWriteToSavedPhotosAlbum(UIImage *image, id completionTarget, SEL completionSelector, void *contextInfo) //save a photo to the album
BOOL UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(NSString *videoPath) //whether the video can be saved to the album
void UISaveVideoAtPathToSavedPhotosAlbum(NSString *videoPath, id completionTarget, SEL completionSelector, void *contextInfo) //save a video to the album

Using UIImagePickerController to take photos or record video usually involves the following steps:

  1. Create a UIImagePickerController object.
  2. Specify the picking source. When picking photos you would normally use the photo library or the album; here it must be the camera.
  3. Specify the camera: front or rear.
  4. Set the media type mediaType. This is required for video recording; for photos the step can be skipped, because mediaType contains kUTTypeImage by default (note that the media types are defined in MobileCoreServices.framework).
  5. Specify the capture mode: photo or video recording. (For video recording, the media type must be set before the capture mode.)
  6. Present the UIImagePickerController (usually modally).
  7. When the photo or video is done, display or save it in the delegate methods.

1.2 Demo

Many details can be configured along the way, such as whether to show the camera control panel and whether to allow editing after the shot; the property/method list above should make them easy to understand. The example below shows how to use UIImagePickerController to take photos and record video. Setting _isVideo to YES puts the program into video recording mode, and the recording is played back automatically in the main view controller when it finishes; setting _isVideo to NO switches to photo mode, and the photo is displayed in the main view controller after it is taken.

//
//  UIImagePickerControllerDemo.m
//  CameraDemo
//
//  Created by 谢佳培 on 2020/9/2.
//  Copyright © 2020 xiejiapei. All rights reserved.
//

#import "UIImagePickerControllerDemo.h"
#import <MobileCoreServices/MobileCoreServices.h>
#import <AVFoundation/AVFoundation.h>

@interface UIImagePickerControllerDemo ()<UIImagePickerControllerDelegate,UINavigationControllerDelegate>

@property (assign,nonatomic) BOOL isVideo; //whether to record video: YES records video, NO takes photos
@property (strong,nonatomic) UIImagePickerController *imagePicker;
@property (strong,nonatomic) UIImageView *photo; //image view that displays the photo
@property (strong,nonatomic) AVPlayer *player; //player used to play back the video after recording

@end

@implementation UIImagePickerControllerDemo

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    // Choose here whether the program takes photos or records video
    self.isVideo = YES;
    
    // Capture button
    UIButton *button = [[UIButton alloc] initWithFrame:CGRectMake(50, 400, 300, 80)];
    button.backgroundColor = [UIColor blackColor];
    [button setTitle:@"Capture" forState:UIControlStateNormal];
    [button addTarget:self action:@selector(takeClick) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:button];
}

#pragma mark - Events

// The capture button was tapped
- (void)takeClick
{
    [self presentViewController:self.imagePicker animated:YES completion:nil];
}

#pragma mark - UIImagePickerController delegate methods

// Finished
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) //a photo was taken
    {
        NSLog(@"Photo taken");
        UIImage *image;
        // If editing is allowed, take the edited photo; otherwise take the original
        if (self.imagePicker.allowsEditing)
        {
            // Get the edited photo
            image = [info objectForKey:UIImagePickerControllerEditedImage];
        }
        else
        {
            // Get the original photo
            image = [info objectForKey:UIImagePickerControllerOriginalImage];
        }
        // Display the photo
        [self.photo setImage:image];
        // Save it to the album
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }
    else if([mediaType isEqualToString:(NSString *)kUTTypeMovie]) //a video was recorded
    {
        NSLog(@"Video recorded");
        // Path of the video
        NSURL *url = [info objectForKey:UIImagePickerControllerMediaURL];
        NSString *urlStr = [url path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(urlStr))
        {
            // Save the video to the album; ALAssetsLibrary could also be used here
            UISaveVideoAtPathToSavedPhotosAlbum(urlStr, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}

-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    NSLog(@"Cancelled");
}

#pragma mark - Getter/Setter

- (UIImagePickerController *)imagePicker
{
    if (!_imagePicker)
    {
        _imagePicker = [[UIImagePickerController alloc] init];
        // Set the image picker's source, here the camera
        _imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        // Set which camera to use, here the rear camera
        _imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
        if (self.isVideo)
        {
            _imagePicker.mediaTypes = @[(NSString *)kUTTypeMovie];
            _imagePicker.videoQuality = UIImagePickerControllerQualityTypeIFrame1280x720;
            // Set the camera capture mode (photo or video recording)
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;
        }
        else
        {
            _imagePicker.cameraCaptureMode = UIImagePickerControllerCameraCaptureModePhoto;
        }
        // Allow editing
        _imagePicker.allowsEditing = YES;
        // Set the delegate to observe the user's actions
        _imagePicker.delegate = self;
    }
    return _imagePicker;
}

@end

2. Taking photos and recording video with AVFoundation

2.1 Overview

UIImagePickerController is certainly powerful, but because it is so highly encapsulated, any customization becomes complicated; a camera UI like a beauty-filter app, for example, is hard to build with it. In such cases consider AVFoundation. Besides ready-made players and recorders, AVFoundation exposes many classes that talk to the underlying input and output devices. With them the developer no longer works with the pre-packaged audio player AVAudioPlayer, recorder (AVAudioRecorder), or audio/video player AVPlayer, but directly with input devices (such as the microphone and camera) and outputs (images, video). First, get to know the classes involved in building photo capture and video recording with AVFoundation:

  • AVCaptureSession: a media (audio/video) capture session, responsible for delivering the captured audio and video data to the output objects. One AVCaptureSession can have multiple inputs and outputs.
  • AVCaptureDevice: an input device such as the microphone or a camera; through this object you configure properties of the physical device (such as the camera's focus and white balance).
  • AVCaptureDeviceInput: a device input object; you create one from an AVCaptureDevice and add it to the AVCaptureSession, which then manages it.
  • AVCaptureOutput: an output object that receives the various kinds of output data. You normally use one of its subclasses: AVCaptureAudioDataOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, or AVCaptureFileOutput. The object is added to the AVCaptureSession, which manages it. Note: the data-output subclasses emit NSData, while AVCaptureFileOutput writes data to a file; AVCaptureFileOutput is likewise not used directly but through its subclasses AVCaptureAudioFileOutput and AVCaptureMovieFileOutput. Once an input or output is added to an AVCaptureSession, the session builds connections (AVCaptureConnection) between all compatible inputs and outputs.
  • AVCaptureVideoPreviewLayer: the camera preview layer, a subclass of CALayer, used to watch the photo or video capture in real time; creating it requires the corresponding AVCaptureSession object.

The general steps for taking photos and recording video with AVFoundation are:

  1. Create an AVCaptureSession object.
  2. Use the static methods of AVCaptureDevice to obtain the device you need: a camera for photos and video, the microphone for audio recording.
  3. Initialize an AVCaptureDeviceInput object from the AVCaptureDevice.
  4. Initialize the output object: AVCaptureStillImageOutput for photos, AVCaptureMovieFileOutput for video recording.
  5. Add the AVCaptureDeviceInput and the AVCaptureOutput to the AVCaptureSession.
  6. Create an AVCaptureVideoPreviewLayer with the session, add the layer to its container view, and call the session's startRunning method to begin capturing.
  7. Write the captured audio or video data to the target file.

2.2 Taking photos

Let's see how to build a camera program with AVFoundation. This program implements camera preview, switching between the front and rear cameras, flash settings, tap-to-focus, and saving photos.

Once the photo part is in place, adding video recording on top of it is not complicated; the program only needs the following changes:

  1. Add an audio input to the session (obtain the input device with [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject], then create a device input object from it). The photo program already added the video input, so none is needed here.
  2. Create a movie file output object, AVCaptureMovieFileOutput, to replace the photo output object.
  3. Write the captured video data to a temporary file, and save it to the album once recording stops (via the AVCaptureMovieFileOutput delegate methods).
  4. To make the program more complete, the recording code below also handles rotating the video with the screen, auto layout, and a background task for saving.

❶ Define the session, input, output, and related objects.

#import "AVFoundationDemo.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>
#import <Photos/Photos.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

@interface AVFoundationDemo ()<AVCapturePhotoCaptureDelegate,AVCaptureFileOutputRecordingDelegate> //photo and movie-file output delegates

@property (strong,nonatomic) AVCaptureSession *captureSession; //passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; //obtains input data from the AVCaptureDevice
@property (strong,nonatomic) AVCapturePhotoOutput *capturePhotoOutput; //photo output
@property (strong,nonatomic) AVCaptureMovieFileOutput *captureMovieFileOutput; //movie file output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; //camera preview layer

@property (assign,nonatomic) BOOL enableRotation; //whether rotation is allowed (rotation must be disabled while recording video)
@property (assign,nonatomic) CGRect lastBounds; //bounds before rotation
@property (assign,nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier; //background task identifier

@property (strong, nonatomic) UIView *viewContainer;
@property (strong, nonatomic) UIImageView *focusCursor; //focus cursor
@property (strong, nonatomic) UIButton *flashAutoButton; //auto-flash button
@property (strong, nonatomic) UIButton *flashOnButton; //flash-on button
@property (strong, nonatomic) UIButton *flashOffButton; //flash-off button

@end

❷ When the controller's view is about to appear, create and initialize the session, camera device, input, output, and preview layer, and add the preview layer to the view. A few other setup chores are done here as well, such as adding a tap-to-focus gesture and initializing the UI.

#pragma mark - Life Cycle

// Configure the capture pipeline
-(void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    
    // 1. Initialize the session
    _captureSession = [[AVCaptureSession alloc] init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720])
    {
        // Set the resolution
        _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    }
    
    // 2. Obtain the input device: the rear camera
    AVCaptureDevice *captureDevice = [self cameraWithPostion:AVCaptureDevicePositionBack];
    if (!captureDevice)
    {
        NSLog(@"Problem obtaining the rear camera.");
        return;
    }
    
    NSError *error = nil;
    // 3. Initialize the device input object from the input device, used to obtain input data
    _captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error)
    {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    // (Video) add an audio input device
    AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
    AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
    if (error)
    {
        NSLog(@"Error obtaining the device input object: %@",error.localizedDescription);
        return;
    }
    
    // 4. Initialize the device output objects, used to obtain output data
    _capturePhotoOutput = [[AVCapturePhotoOutput alloc] init];
    _captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init]; // (video)
    
    // 5. Add the device inputs to the session
    if ([_captureSession canAddInput:_captureDeviceInput])
    {
        [_captureSession addInput:_captureDeviceInput];
        
        // (Video)
        [_captureSession addInput:audioCaptureDeviceInput];
        AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([captureConnection isVideoStabilizationSupported])
        {
            captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
    }
    
    // 6. Add the device outputs to the session
    if ([_captureSession canAddOutput:_capturePhotoOutput])
    {
        [_captureSession addOutput:_capturePhotoOutput];
        [_captureSession addOutput:_captureMovieFileOutput]; // (video)
    }
    
    // 7. Create the video preview layer, used to show the camera feed in real time
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    
    CALayer *layer = self.viewContainer.layer;
    layer.masksToBounds = YES;
    
    _captureVideoPreviewLayer.frame = layer.bounds;
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; //fill mode
    
    // 8. Add the video preview layer to the UI
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];
    
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
    [self setFlashModeButtonStatus];
}

❸ Start and stop the session when the controller's view appears and disappears.

// Start capturing
-(void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self.captureSession startRunning];
}

// Stop capturing
-(void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}

❹ Define the flash on/off/auto functionality. Note that whether you are changing the flash, the white balance, or any other input-device property, you must lock the device for configuration before the change and unlock it afterwards.

#pragma mark - Flash

// Enable automatic flash
- (void)flashAutoClick:(UIButton *)sender
{
    [self setFlashMode:AVCaptureFlashModeAuto];
    [self setFlashModeButtonStatus];
}

// Turn the flash on
- (void)flashOnClick:(UIButton *)sender
{
    [self setFlashMode:AVCaptureFlashModeOn];
    [self setFlashModeButtonStatus];
}

// Turn the flash off
- (void)flashOffClick:(UIButton *)sender
{
    [self setFlashMode:AVCaptureFlashModeOff];
    [self setFlashModeButtonStatus];
}

// Shared helper for changing device properties
// @param propertyChange the property change to perform
-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange
{
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Before changing a device property you must call lockForConfiguration:, and unlock afterwards with unlockForConfiguration
    if ([captureDevice lockForConfiguration:&error])
    {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    }
    else
    {
        NSLog(@"Error while setting a device property: %@",error.localizedDescription);
    }
}

// Set the flash mode (use the passed-in mode rather than a hard-coded one)
-(void)setFlashMode:(AVCaptureFlashMode)flashMode
{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode])
        {
            captureDevice.flashMode = flashMode;
        }
    }];
}

// Update the state of the flash buttons
-(void)setFlashModeButtonStatus
{
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    AVCaptureFlashMode flashMode = captureDevice.flashMode;
    if ([captureDevice isFlashAvailable])
    {
        self.flashAutoButton.hidden = NO;
        self.flashOnButton.hidden = NO;
        self.flashOffButton.hidden = NO;
        self.flashAutoButton.enabled = YES;
        self.flashOnButton.enabled = YES;
        self.flashOffButton.enabled = YES;
        switch (flashMode)
        {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled = NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled = NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled = NO;
                break;
            default:
                break;
        }
    }
    else
    {
        self.flashAutoButton.hidden = YES;
        self.flashOnButton.hidden = YES;
        self.flashOffButton.hidden = YES;
    }
}

❺ Define the camera-switching functionality. Switching cameras means removing the existing input from the session and adding a new one. Note that modifying a running session requires beginning a configuration first and committing it once the changes are done.

#pragma mark Switching between the front and rear cameras

- (void)toggleButtonClick:(UIButton *)sender
{
    AVCaptureDevice *currentDevice = [self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition = [currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront)
    {
        toChangePosition = AVCaptureDevicePositionBack;
    }
    toChangeDevice = [self cameraWithPostion:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    
    // Create the new device input object
    AVCaptureDeviceInput *toChangeDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:toChangeDevice error:nil];
    
    // Begin a configuration before changing the session, and commit it when done
    [self.captureSession beginConfiguration];
    // Remove the existing input object
    [self.captureSession removeInput:self.captureDeviceInput];
    // Add the new input object
    if ([self.captureSession canAddInput:toChangeDeviceInput])
    {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    // Commit the session configuration
    [self.captureSession commitConfiguration];
    
    [self setFlashModeButtonStatus];
}

- (AVCaptureDevice *)cameraWithPostion:(AVCaptureDevicePosition)position
{
    AVCaptureDeviceDiscoverySession *devicesIOS = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera] mediaType:AVMediaTypeVideo position:position];
    
    NSArray *devices = devicesIOS.devices;
    for (AVCaptureDevice *device in devices)
    {
        if ([device position] == position)
        {
            return device;
        }
    }
    return nil;
}

❻ Add a tap gesture: tapping the preview view sets the focus and exposure point.

#pragma mark - Focus

// Set the focus and exposure point
// @param point the point of interest
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point
{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice)
    {
        if ([captureDevice isFocusModeSupported:focusMode])
        {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported])
        {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode])
        {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported])
        {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

// Add the tap gesture; tapping focuses the camera
-(void)addGenstureRecognizer
{
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

-(void)tapScreen:(UITapGestureRecognizer *)tapGesture
{
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    
    // Convert the UI coordinate into a camera coordinate
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

// Position the focus cursor
// @param point the cursor position
-(void)setFocusCursorWithPoint:(CGPoint)point
{
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}

// Set the focus mode
// @param focusMode the focus mode
-(void)setFocusMode:(AVCaptureFocusMode)focusMode
{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice)
     {
        if ([captureDevice isFocusModeSupported:focusMode])
        {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}

// Set the exposure mode
// @param exposureMode the exposure mode
-(void)setExposureMode:(AVCaptureExposureMode)exposureMode
{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice)
    {
        if ([captureDevice isExposureModeSupported:exposureMode])
        {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}

❼ Define the photo-taking functionality: request a capture from the photo output, then save the processed photo in the delegate callback.

#pragma mark Taking a photo

- (void)takePictureClick:(UIButton *)sender
{
    AVCapturePhotoOutput *output = (AVCapturePhotoOutput *)self.capturePhotoOutput;
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
    [output capturePhotoWithSettings:settings delegate:self];
}

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error
{
    if (error)
    {
        NSLog(@"Error obtaining the image --- %@",error.localizedDescription);
        return;
    }
    
    CGImageRef cgImage = [photo CGImageRepresentation];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    NSLog(@"Image obtained successfully: %@",image);
    
    // Photos from the front camera come out rotated by 180 degrees; fix the orientation
    if (self.captureDeviceInput.device.position == AVCaptureDevicePositionFront)
    {
        UIImageOrientation imgOrientation = UIImageOrientationLeftMirrored;
        image = [[UIImage alloc] initWithCGImage:cgImage scale:1.0f orientation:imgOrientation];
    }
    else
    {
        UIImageOrientation imgOrientation = UIImageOrientationRight;
        image = [[UIImage alloc] initWithCGImage:cgImage scale:1.0f orientation:imgOrientation];
    }
    
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}

❽ Notifications.

#pragma mark - Notifications

// Add notifications for the input device
-(void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice
{
    // The device must enable subject-area change monitoring before the notification can be observed
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    
    // The subject area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

-(void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice
{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

// Remove all notifications
-(void)removeNotification
{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

-(void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession
{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    
    // The session hit a runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

// A device was connected
-(void)deviceConnected:(NSNotification *)notification
{
    NSLog(@"Device connected...");
}

// A device was disconnected
-(void)deviceDisconnected:(NSNotification *)notification
{
    NSLog(@"Device disconnected.");
}

// The subject area changed
-(void)areaChange:(NSNotification *)notification
{
    NSLog(@"Subject area changed...");
}

// The session hit a runtime error
-(void)sessionRuntimeError:(NSNotification *)notification
{
    NSLog(@"A session error occurred.");
}

❾ Video rotation.

#pragma mark - Video rotation

// (Video) whether rotation is allowed
-(BOOL)shouldAutorotate
{
    return self.enableRotation;
}

// Adjust the orientation of the video preview layer when the screen rotates
-(void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration
{
    AVCaptureConnection *captureConnection = [self.captureVideoPreviewLayer connection];
    captureConnection.videoOrientation = (AVCaptureVideoOrientation)toInterfaceOrientation;
}

// Resize the preview layer after rotation
-(void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation
{
    _captureVideoPreviewLayer.frame = self.viewContainer.bounds;
}

#pragma mark - Video recording

- (void)takeVideoClick:(UIButton *)sender
{
    // Obtain the connection from the device output
    AVCaptureConnection *captureConnection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    // Use the output's recording state to decide whether to start or stop
    if (![self.captureMovieFileOutput isRecording])
    {
        self.enableRotation = NO;
        // If multitasking is supported, begin a background task
        if ([[UIDevice currentDevice] isMultitaskingSupported])
        {
            self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        
        // Keep the recorded video's orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.captureVideoPreviewLayer connection].videoOrientation;
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
        NSLog(@"Output path: %@",outputFilePath);
        NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
        [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    }
    else
    {
        // Stop recording
        [self.captureMovieFileOutput stopRecording];
    }
}

#pragma mark - Movie file output delegate

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
    NSLog(@"Recording started...");
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"Recording finished.");
    
    // After recording completes, save the video to the album in the background
    self.enableRotation = YES;
    UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier = self.backgroundTaskIdentifier;
    self.backgroundTaskIdentifier = UIBackgroundTaskInvalid;
    
    NSURL *documentsURL = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory inDomains:NSUserDomainMask] firstObject];
    NSURL *tempURL = [documentsURL URLByAppendingPathComponent:[outputFileURL lastPathComponent]];

    [[NSFileManager defaultManager] moveItemAtURL:outputFileURL toURL:tempURL error:nil];
    
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:tempURL];

        NSLog(@"%@", changeRequest.description);
    } completionHandler:^(BOOL success, NSError *error) {
        if (success)
        {
            NSLog(@"Video saved to the album");
            if (lastBackgroundTaskIdentifier != UIBackgroundTaskInvalid)
            {
                [[UIApplication sharedApplication] endBackgroundTask:lastBackgroundTaskIdentifier];
            }
            [[NSFileManager defaultManager] removeItemAtURL:tempURL error:nil];
        }
        else
        {
            NSLog(@"Error while saving the video to the album: %@",error.localizedDescription);
            [[NSFileManager defaultManager] removeItemAtURL:tempURL error:nil];
        }
    }];
}

3. Scanning QR codes

Declaring the capture objects

#import "ScanQRCodeViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ScanQRCodeViewController () <AVCaptureMetadataOutputObjectsDelegate>

// Capture device, the rear camera by default
@property (strong, nonatomic) AVCaptureDevice *device;
// Input object
@property (strong, nonatomic) AVCaptureInput *input;
// Output object; its output types and scan region must be specified
@property (strong, nonatomic) AVCaptureMetadataOutput *output;
// The hub of the AVFoundation capture classes, coordinating the inputs and outputs to obtain data
@property (strong, nonatomic) AVCaptureSession *session;
// Layer that displays the captured image, a subclass of CALayer
@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;

// Pinch gesture
@property (strong, nonatomic) UIPinchGestureRecognizer *pinch;
// Width of the square QR scan region, adapted to the device model
@property (assign, nonatomic) CGFloat scanRegion_Width;
// Zoom scale captured at gesture start
@property (assign, nonatomic) CGFloat initScale;

@end
Configuring the devices

- (void)configBasicDevice
{
    // First initialize the five main objects
    // Scan with the rear camera by default; AVMediaTypeVideo means video
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // Initialize the device input
    self.input = [[AVCaptureDeviceInput alloc] initWithDevice:self.device error:nil];
    
    // Initialize the device output, and set its delegate and callback queue
    self.output = [[AVCaptureMetadataOutput alloc] init];
    // Scanned data is delivered through this delegate on the given queue; the main queue is usually used, and it is also the queue the callback runs on
    [self.output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    
    // Initialize the session, with a high capture-quality preset
    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetHigh];
    
    // Connect the input and output through the session
    if ([self.session canAddInput:_input])
    {
        [self.session addInput:_input];
    }
    if ([self.session canAddOutput:_output])
    {
        [self.session addOutput:_output];
    }
    
    // Specify the recognition types; here only QR codes, AVMetadataObjectTypeQRCode
    // This must happen after the output has been added to the session, otherwise the set of recognizable types is empty and the program crashes
    [self.output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];
    
    // Restrict recognition to a square region in the center of the screen whose width is scanRegion_Width
    // The navigation bar height is taken into account, which complicates the math a little; the smaller the region, the faster the recognition, so the whole screen is not used
    CGFloat navHeight = self.navigationController.navigationBar.bounds.size.height;
    CGFloat screenHeight = self.view.bounds.size.height;
    CGFloat screenWidth = self.view.bounds.size.width;
    CGFloat viewHeight = screenHeight - navHeight;
    CGFloat scanViewHeight = self.scanRegion_Width;
    
    CGFloat x = (screenWidth - scanViewHeight)/(2*screenWidth);
    CGFloat y = (viewHeight - scanViewHeight)/(2*viewHeight);
    CGFloat height = scanViewHeight/viewHeight;
    CGFloat width = scanViewHeight/screenWidth;
    [self.output setRectOfInterest:CGRectMake(x, y, width, height)];
    
    // Initialize the preview layer; self.session drives the input to collect data, and the layer renders the image
    // The preview layer covers the whole screen so the QR code can easily be moved into the scan region restricted above
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    self.previewLayer.frame = self.view.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.previewLayer];
    
    // Lay out the info label below the scan frame
    UILabel *label = [[UILabel alloc] initWithFrame:CGRectMake(0, (viewHeight+scanViewHeight)/2+10.0f, screenWidth, 20.0f)];
    label.text = @"Scan a QR code";
    label.font = [UIFont systemFontOfSize:15];
    label.textColor = [UIColor whiteColor];
    label.textAlignment = NSTextAlignmentCenter;
    [self.view addSubview:label];
}
Handling a recognized QR code

// Implement the delegate callback to handle a successfully recognized QR code
// Called when the camera scans QR code data
- (void)captureOutput:(AVCaptureOutput *)output didOutputMetadataObjects:(NSArray<__kindof AVMetadataObject *> *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    // Stop scanning
    [self.session stopRunning];
    
    if (metadataObjects.count >= 1)
    {
        // The array contains AVMetadataMachineReadableCodeObject instances, which hold the decoded data
        AVMetadataMachineReadableCodeObject *QRObject = [metadataObjects lastObject];
        // The scanned content can be processed here as needed
        NSString *result = QRObject.stringValue;
        // Parse the data and implement the corresponding logic...
        NSLog(@"Scanned QR code content: %@",result);
    }
}
Configuring the pinch-to-zoom gesture

// Add a pinch gesture
- (void)configPinchGesture
{
    self.pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(pinchDetected:)];
    [self.view addGestureRecognizer:self.pinch];
}

// Changing the camera's zoom factor achieves the zoom effect
- (void)pinchDetected:(UIPinchGestureRecognizer *)recogniser
{
    // No camera available
    if (!_device)
    {
        return;
    }
    
    // Check the gesture state
    if (recogniser.state == UIGestureRecognizerStateBegan)
    {
        _initScale = _device.videoZoomFactor;
    }
    
    // Lock the camera device; it must be locked before changing certain parameters and unlocked once the change is done
    NSError *error = nil;
    [_device lockForConfiguration:&error];
    if (!error)
    {
        CGFloat zoomFactor; //zoom factor
        CGFloat scale = recogniser.scale;
        
        if (scale < 1.0f)
        {
            zoomFactor = self.initScale - pow([self.device activeFormat].videoMaxZoomFactor, 1.0f - recogniser.scale);
        }
        else
        {
            zoomFactor = self.initScale + pow(self.device.activeFormat.videoMaxZoomFactor, (recogniser.scale - 1.0f) / 2.0f);
        }
        zoomFactor = MIN(15.0f, zoomFactor);
        zoomFactor = MAX(1.0f, zoomFactor);
        
        _device.videoZoomFactor = zoomFactor;
        [_device unlockForConfiguration];
    }
}

References

iOS开发系列--音频播放、录音、视频播放、拍照、视频录制 (iOS development series: audio playback, recording, video playback, photos, and video recording)


Original title: iOS多媒体:摄像头

Original link: https://www.haomeiwen.com/subject/bgewsktx.html