Build Your Own Video Recorder in Swift

Author: bce67c19184f | Published 2016-01-08 05:28 · 1053 reads

    I. Overview of the workflow:
    1. Create an AVCaptureSession object.
    2. Use AVCaptureDevice's class methods to obtain the devices you need: a camera for photos and video, a microphone for audio recording.
    3. Initialize an AVCaptureDeviceInput object from the AVCaptureDevice.
    4. Add an audio input to the session (obtain the device with [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject], then create a device input from it). The earlier photo-capture project already added a video input, so no extra video input is needed here.
    5. Add the input object AVCaptureDeviceInput and the output object AVCaptureOutput to the session-management object AVCaptureSession.
    6. Create a video preview layer AVCaptureVideoPreviewLayer bound to the session, add the layer to a container view, and call AVCaptureSession's startRunning method to begin capturing.
    7. Create a movie file output object, AVCaptureMovieFileOutput, to replace the earlier still-image output.
    8. Write the captured video to a temporary file and, once recording stops, save it to the photo album (via the AVCaptureMovieFileOutput delegate methods).
    II. Code:
    1. // Initialize the subviews
    func initSubViews(){

        movieView = UIView(frame: CGRectMake(0, (view.bounds.height-300)/2, view.bounds.width, 300))
        movieView.backgroundColor = UIColor.whiteColor()
        view.addSubview(movieView)
        print("\(view.bounds.height)")
        movieView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: "settingFocusCorsor:"))
        
        focusCursor = UIImageView(frame: CGRectMake(movieView.bounds.width/2-movieView.bounds.height/4, movieView.bounds.height/4, movieView.bounds.height/2, movieView.bounds.height/2))
        focusCursor.image = UIImage(named: "camera_focus_red.png")
        movieView.addSubview(focusCursor)
        focusCursor.alpha = 0
        
        takeButton = UIButton()
        takeButton.backgroundColor = UIColor.orangeColor()
        takeButton.setTitle("Record", forState: UIControlState.Normal)
        takeButton.addTarget(self, action: "beginRecorderMovie:", forControlEvents: UIControlEvents.TouchUpInside)
        view.addSubview(takeButton)
        takeButton.snp_makeConstraints { (make) -> Void in
            make.top.equalTo(150)
            make.left.equalTo(50)
            make.width.equalTo(60)
            make.height.equalTo(30)
        }
        
        let button = UIButton()
        button.backgroundColor = UIColor.orangeColor()
        button.setTitle("Switch Camera", forState: UIControlState.Normal)
        button.addTarget(self, action: "changeCamrel", forControlEvents: UIControlEvents.TouchUpInside)
        view.addSubview(button)
        button.snp_makeConstraints { (make) -> Void in
            make.top.equalTo(movieView.snp_bottom).offset(20)
            make.left.equalTo(50)
            make.width.equalTo(100)
            make.height.equalTo(30)
        }
    }
    

    Code for steps 2–6:

    func initCapture(){
        // Create the AVCaptureSession object
        captureSession = AVCaptureSession()
        if captureSession.canSetSessionPreset(AVCaptureSessionPreset1280x720){
            captureSession.sessionPreset = AVCaptureSessionPreset1280x720
        }
        
        let captureDeive = getCameraDeviceWithPosition(.Back)
        if captureDeive.position != .Back {
            print("Failed to get the back camera")
            return
        }
        
        // Add an audio device
        let audioCaptureDevice = AVCaptureDevice.devicesWithMediaType(AVMediaTypeAudio).first as! AVCaptureDevice
        // Initialize the device input objects from the capture devices
        do {
            captureDeviceInput = try AVCaptureDeviceInput(device: captureDeive)
        } catch {
            print("Failed to create the video device input")
            return
        }
        
        do {
            audioCaptureDeviceInput = try AVCaptureDeviceInput(device: audioCaptureDevice)
        } catch {
            print("Failed to create the audio device input")
            return
        }
        // Initialize the output object, used to obtain the captured data
        captureMovieFileOutPut = AVCaptureMovieFileOutput()
        
        // Add the inputs to the session
        if captureSession.canAddInput(captureDeviceInput) {
            captureSession.addInput(captureDeviceInput)
            captureSession.addInput(audioCaptureDeviceInput)
        }
        
        // Add the output to the session
        if captureSession.canAddOutput(captureMovieFileOutPut){
            captureSession.addOutput(captureMovieFileOutPut)
        }
        
        // Enable video stabilization; note this is a property of the *video* connection,
        // which only exists after the output has been added to the session
        if let captureConnection = captureMovieFileOutPut.connectionWithMediaType(AVMediaTypeVideo){
            if captureConnection.supportsVideoStabilization{
                captureConnection.preferredVideoStabilizationMode = .Auto
            }
        }
        
        // Create the video preview layer, used to show the camera feed in real time
        captureVideoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession )
        
        let layer = movieView.layer
        layer.masksToBounds = true
        
        captureVideoPreviewLayer.frame = layer.bounds
        captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        
        // Insert the preview layer below the focus cursor so the cursor stays visible
        layer.insertSublayer(captureVideoPreviewLayer, below: focusCursor.layer)
        
        // Register for the subject-area-change notification
        // Note: change monitoring must be enabled on the device before the notification will fire
        addNotificationToCaptureDevice(captureDeive)
    

    }

    Code for steps 7 and 8:
    // Record button
    func beginRecorderMovie(button:UIButton){
        // Get the video connection from the movie file output
        let captureConnection = captureMovieFileOutPut.connectionWithMediaType(AVMediaTypeVideo)
        if !captureMovieFileOutPut.recording {
            button.setTitle("Recording", forState: UIControlState.Normal)
            //enableRotation = false // declared with a default of false
            // If multitasking is supported, start a background task so recording can finish
            if UIDevice.currentDevice().multitaskingSupported {
                backgroundTaskIdentifier = UIApplication.sharedApplication().beginBackgroundTaskWithExpirationHandler(nil)
            }
            // Keep the recording orientation consistent with the preview layer
            captureConnection.videoOrientation = captureVideoPreviewLayer.connection.videoOrientation
            let outputFielPath = NSTemporaryDirectory() + "myMovie.mov"
            let url = NSURL(fileURLWithPath: outputFielPath)
            captureMovieFileOutPut.startRecordingToOutputFileURL(url, recordingDelegate: self)
        } else {
            button.setTitle("Record", forState: UIControlState.Normal)
            captureMovieFileOutPut.stopRecording()
        }
    }
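One thing to note: the recording code always writes to the same myMovie.mov, so starting a new take overwrites the previous temporary file. If you want each take to get its own file, a timestamped path works. This is a sketch in the same Swift 2 syntax as the rest of the post, and `uniqueMoviePath` is a hypothetical helper name, not part of the original code:

```swift
// Hypothetical helper: build a unique .mov path in the temp directory
// so successive recordings don't overwrite one another.
func uniqueMoviePath() -> String {
    let formatter = NSDateFormatter()
    formatter.dateFormat = "yyyyMMdd-HHmmss"
    // Produces names like movie-20160108-052800.mov
    let name = "movie-" + formatter.stringFromDate(NSDate()) + ".mov"
    return NSTemporaryDirectory() + name
}
```

You would then call `startRecordingToOutputFileURL(NSURL(fileURLWithPath: uniqueMoviePath()), recordingDelegate: self)`; the delegate still receives the actual outputFileURL, so the cleanup code after saving works unchanged.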

    // Switch between the front and back cameras
    func changeCamrel(){
        // Get the current device
        let currentDevice = captureDeviceInput.device
        // Get its position
        let currentPosition = currentDevice.position
        // Remove the subject-area notification from the old device before switching
        NSNotificationCenter.defaultCenter().removeObserver(self, name: AVCaptureDeviceSubjectAreaDidChangeNotification, object: currentDevice)
        var toChangePosition = AVCaptureDevicePosition.Front
        if currentPosition == AVCaptureDevicePosition.Unspecified || currentPosition == AVCaptureDevicePosition.Front {
            toChangePosition = AVCaptureDevicePosition.Back
        }
        let toChangeDevice = getCameraDeviceWithPosition(toChangePosition)
        addNotificationToCaptureDevice(toChangeDevice)
        // Create the input object for the new device
        var toChangeDeviceInput: AVCaptureDeviceInput!
        do {
            toChangeDeviceInput = try AVCaptureDeviceInput(device: toChangeDevice)
        } catch {}
        // Always begin a configuration block before changing the session, and commit when done
        captureSession.beginConfiguration()
        // Remove the old input
        captureSession.removeInput(captureDeviceInput)
        // Add the new input
        if captureSession.canAddInput(toChangeDeviceInput){
            captureSession.addInput(toChangeDeviceInput)
            captureDeviceInput = toChangeDeviceInput
        }
        // Commit the session configuration
        captureSession.commitConfiguration()
    }
    

    // Get the camera at the specified position
    func getCameraDeviceWithPosition(position:AVCaptureDevicePosition) -> AVCaptureDevice{
        let cameras = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
        for camera in cameras {
            if camera.position == position {
                return camera as! AVCaptureDevice
            }
        }
        return cameras.first as! AVCaptureDevice
    }

    // Tap gesture: set the focus point
    func settingFocusCorsor(tap:UITapGestureRecognizer){
        let point = tap.locationInView(tap.view)
        // Convert the UI coordinate to a camera coordinate
        let cameraPoint = captureVideoPreviewLayer.captureDevicePointOfInterestForPoint(point)
        setFocusCursorWithPoint(point)
        focusWithMode(.AutoFocus, exposureMode: .AutoExpose, point: cameraPoint)
    }
    
    // Position and animate the focus cursor
    func setFocusCursorWithPoint(point:CGPoint){
        focusCursor.center = point
        focusCursor.transform = CGAffineTransformMakeScale(1.2, 1.2)
        focusCursor.alpha = 1
        UIView.animateWithDuration(1, animations: { () -> Void in
            self.focusCursor.transform = CGAffineTransformMakeScale(1, 1)
            }) { (finish) -> Void in
                // Hide the cursor once the animation finishes
                self.focusCursor.alpha = 0
        }
    }
    // Register the subject-area-change notification for a device
    func addNotificationToCaptureDevice(captureDevice:AVCaptureDevice){
        // Monitoring must be enabled on the device before the notification will fire
        changeDeviceProperty { (captureDevice) -> Void in
            captureDevice.subjectAreaChangeMonitoringEnabled = true
        }
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "areaChange:", name: AVCaptureDeviceSubjectAreaDidChangeNotification, object: captureDevice)
    }
    // Set the focus and exposure point
    func focusWithMode(focusModel:AVCaptureFocusMode, exposureMode:AVCaptureExposureMode, point:CGPoint){
        changeDeviceProperty { (captureDevice) -> Void in
            if captureDevice.isFocusModeSupported(focusModel) {
                captureDevice.focusMode = focusModel
            }
            if captureDevice.isExposureModeSupported(exposureMode) {
                captureDevice.exposureMode = exposureMode
            }
            captureDevice.exposurePointOfInterest = point
            captureDevice.focusPointOfInterest = point
        }
    }
    
    // Notification handler
    func areaChange(notification:NSNotification){
        print("Device subject area changed")
    }
    
    // Common helper for changing device properties: lock, configure, unlock
    func changeDeviceProperty(closure: (captureDevice:AVCaptureDevice) -> Void) {
        let cDevice = captureDeviceInput.device
        do {
            try cDevice.lockForConfiguration()
            closure(captureDevice: cDevice)
            cDevice.unlockForConfiguration()
        } catch {
            print("Error while setting device properties")
        }
    }
    

    }
    extension ViewController: AVCaptureFileOutputRecordingDelegate {

    // Called when recording finishes
    func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
        print("Recording finished")
        // enableRotation = false
        let lastBackgroundTaskIdentifier = backgroundTaskIdentifier
        backgroundTaskIdentifier = UIBackgroundTaskInvalid
        let assetsLibrary = ALAssetsLibrary()
        assetsLibrary.writeVideoAtPathToSavedPhotosAlbum(outputFileURL) { (assetUrl, error) -> Void in
            if error != nil {
                print("Failed to save the video to the photo album")
            }
            // Remove the temporary file either way
            do {
                try NSFileManager.defaultManager().removeItemAtURL(outputFileURL)
            } catch {}
            // End the background task started when recording began
            if lastBackgroundTaskIdentifier != UIBackgroundTaskInvalid {
                UIApplication.sharedApplication().endBackgroundTask(lastBackgroundTaskIdentifier)
            }
        }
    }
    
    func captureOutput(captureOutput: AVCaptureFileOutput!, didStartRecordingToOutputFileAtURL fileURL: NSURL!, fromConnections connections: [AnyObject]!) {
        print("Recording started")
    }
    // Whether rotation is allowed
    override func shouldAutorotate() -> Bool {
        return enableRotation
    }
    
    // Keep the preview layer's orientation in sync when the screen rotates
    override func willRotateToInterfaceOrientation(toInterfaceOrientation: UIInterfaceOrientation, duration: NSTimeInterval) {
        let captureConnection = captureVideoPreviewLayer.connection
        switch toInterfaceOrientation {
        case .Portrait:
            captureConnection.videoOrientation = .Portrait
        case .PortraitUpsideDown:
            captureConnection.videoOrientation = .PortraitUpsideDown
        case .LandscapeLeft:
            captureConnection.videoOrientation = .LandscapeRight
        case .LandscapeRight:
            captureConnection.videoOrientation = .LandscapeLeft
        default:
            break
        }
    }
    
    // Resize the preview layer after rotation
    override func didRotateFromInterfaceOrientation(fromInterfaceOrientation: UIInterfaceOrientation) {
        captureVideoPreviewLayer.frame = movieView.bounds
    }
    

    }

    Result:

    Here's what it looks like in action. (Without a picture, pure code alone wouldn't show whether it's what you need.)

    Untitled.gif

    These are self-study notes; view them with discretion.

    Original link: https://www.haomeiwen.com/subject/rzbfkttx.html