AVFoundation Framework Analysis (26): A Simple Example of Playing, Recording, and Merging Videos (2)

Author: 刀客传奇 | Published 2020-08-13 18:37

Version History

Version  Date
V1.0     2020.08.13, Thursday

Foreword

AVFoundation is one of the most important frameworks in iOS: all software- and hardware-level control related to audio and video lives inside it. The next several posts introduce and explain this framework. If you are interested, see my previous posts:
1. AVFoundation Framework Analysis (1): Basic Overview
2. AVFoundation Framework Analysis (2): Implementing Video Preview, Recording, and Saving to the Album
3. AVFoundation Framework Analysis (3): Several Key Questions, an In-Depth Summary of the Framework
4. AVFoundation Framework Analysis (4): Several Key Questions, Exploring AVFoundation (1)
5. AVFoundation Framework Analysis (5): Several Key Questions, Exploring AVFoundation (2)
6. AVFoundation Framework Analysis (6): Composing Video and Audio (1)
7. AVFoundation Framework Analysis (7): Debugging Video Composition and Audio Mixing
8. AVFoundation Framework Analysis (8): Optimizing the User's Playback Experience
9. AVFoundation Framework Analysis (9): Changes in AVFoundation (1)
10. AVFoundation Framework Analysis (10): Changes in AVFoundation (2)
11. AVFoundation Framework Analysis (11): Changes in AVFoundation (3)
12. AVFoundation Framework Analysis (12): Changes in AVFoundation (4)
13. AVFoundation Framework Analysis (13): Building a Basic Playback Application
14. AVFoundation Framework Analysis (14): Timecode Support in AVAssetWriter and AVAssetReader (1)
15. AVFoundation Framework Analysis (15): Timecode Support in AVAssetWriter and AVAssetReader (2)
16. AVFoundation Framework Analysis (16): A Simple Example of Playing, Recording, and Mixing Video (1)
17. AVFoundation Framework Analysis (17): A Simple Example of Playing, Recording, and Mixing Video, Source Code and Results (2)
18. AVFoundation Framework Analysis (18): AVAudioEngine, a Basic Overview (1)
19. AVFoundation Framework Analysis (19): AVAudioEngine, a Detailed Explanation and a Simple Example (2)
20. AVFoundation Framework Analysis (20): AVAudioEngine, a Detailed Explanation and a Simple Example, Source Code (3)
21. AVFoundation Framework Analysis (21): A Simple Video Stream Preview and Playback Example, Analysis (1)
22. AVFoundation Framework Analysis (22): A Simple Video Stream Preview and Playback Example, Source Code (2)
23. AVFoundation Framework Analysis (23): Adding Overlays and Animations to a Video Layer (1)
24. AVFoundation Framework Analysis (24): Adding Overlays and Animations to a Video Layer (2)
25. AVFoundation Framework Analysis (25): A Simple Example of Playing, Recording, and Merging Videos (1)

Source Code

1. Swift

First, a look at how the project is organized.
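
A sketch of the file layout, inferred from the listings below (the project and storyboard names are assumptions, not taken from the original):

PlayRecordMerge/
  HomeViewController.swift
  MergeVideoViewController.swift
  PlayVideoViewController.swift
  RecordVideoViewController.swift
  VideoHelper.swift
  Main.storyboard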

Next is the content of the storyboard.

And below is the source code.

1. HomeViewController.swift
import UIKit

class HomeViewController: UIViewController {
}
2. MergeVideoViewController.swift
import AVFoundation
import MediaPlayer
import MobileCoreServices
import Photos
import UIKit

class MergeVideoViewController: UIViewController {
  var firstAsset: AVAsset?
  var secondAsset: AVAsset?
  var audioAsset: AVAsset?
  var loadingAssetOne = false

  @IBOutlet var activityMonitor: UIActivityIndicatorView!

  func exportDidFinish(_ session: AVAssetExportSession) {
    // Cleanup assets
    activityMonitor.stopAnimating()
    firstAsset = nil
    secondAsset = nil
    audioAsset = nil

    guard
      session.status == AVAssetExportSession.Status.completed,
      let outputURL = session.outputURL
      else { return }

    let saveVideoToPhotos = {
      let changes: () -> Void = {
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL)
      }
      PHPhotoLibrary.shared().performChanges(changes) { saved, error in
        DispatchQueue.main.async {
          let success = saved && (error == nil)
          let title = success ? "Success" : "Error"
          let message = success ? "Video saved" : "Failed to save video"

          let alert = UIAlertController(
            title: title,
            message: message,
            preferredStyle: .alert)
          alert.addAction(UIAlertAction(
            title: "OK",
            style: UIAlertAction.Style.cancel,
            handler: nil))
          self.present(alert, animated: true, completion: nil)
        }
      }
    }

    // Ensure permission to access Photo Library
    if PHPhotoLibrary.authorizationStatus() != .authorized {
      PHPhotoLibrary.requestAuthorization { status in
        if status == .authorized {
          saveVideoToPhotos()
        }
      }
    } else {
      saveVideoToPhotos()
    }
  }

  func savedPhotosAvailable() -> Bool {
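    // Inverted guard: if the Saved Photos album IS available, return true
    // right away; otherwise fall through, show an alert, and return false.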
    guard !UIImagePickerController.isSourceTypeAvailable(.savedPhotosAlbum)
      else { return true }

    let alert = UIAlertController(
      title: "Not Available",
      message: "No Saved Album found",
      preferredStyle: .alert)
    alert.addAction(UIAlertAction(
      title: "OK",
      style: UIAlertAction.Style.cancel,
      handler: nil))
    present(alert, animated: true, completion: nil)
    return false
  }

  @IBAction func loadAssetOne(_ sender: AnyObject) {
    if savedPhotosAvailable() {
      loadingAssetOne = true
      VideoHelper.startMediaBrowser(delegate: self, sourceType: .savedPhotosAlbum)
    }
  }

  @IBAction func loadAssetTwo(_ sender: AnyObject) {
    if savedPhotosAvailable() {
      loadingAssetOne = false
      VideoHelper.startMediaBrowser(delegate: self, sourceType: .savedPhotosAlbum)
    }
  }

  @IBAction func loadAudio(_ sender: AnyObject) {
    let mediaPickerController = MPMediaPickerController(mediaTypes: .any)
    mediaPickerController.delegate = self
    mediaPickerController.prompt = "Select Audio"
    present(mediaPickerController, animated: true, completion: nil)
  }

  // swiftlint:disable:next function_body_length
  @IBAction func merge(_ sender: AnyObject) {
    guard
      let firstAsset = firstAsset,
      let secondAsset = secondAsset
      else { return }

    activityMonitor.startAnimating()

    // 1 - Create AVMutableComposition object. This object
    // will hold your AVMutableCompositionTrack instances.
    let mixComposition = AVMutableComposition()

    // 2 - Create the first of two video tracks
    guard
      let firstTrack = mixComposition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
      else { return }

    do {
      try firstTrack.insertTimeRange(
        CMTimeRangeMake(start: .zero, duration: firstAsset.duration),
        of: firstAsset.tracks(withMediaType: .video)[0],
        at: .zero)
    } catch {
      print("Failed to load first track")
      return
    }

    guard
      let secondTrack = mixComposition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
      else { return }

    do {
      try secondTrack.insertTimeRange(
        CMTimeRangeMake(start: .zero, duration: secondAsset.duration),
        of: secondAsset.tracks(withMediaType: .video)[0],
        at: firstAsset.duration)
    } catch {
      print("Failed to load second track")
      return
    }

    // 3 - Composition Instructions
    let mainInstruction = AVMutableVideoCompositionInstruction()
    mainInstruction.timeRange = CMTimeRangeMake(
      start: .zero,
      duration: CMTimeAdd(firstAsset.duration, secondAsset.duration))

    // 4 - Set up the instructions — one for each asset
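    // Fading the first track to opacity 0 at the end of the first clip
    // keeps it from covering the second track once the second clip starts.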
    let firstInstruction = VideoHelper.videoCompositionInstruction(
      firstTrack,
      asset: firstAsset)
    firstInstruction.setOpacity(0.0, at: firstAsset.duration)
    let secondInstruction = VideoHelper.videoCompositionInstruction(
      secondTrack,
      asset: secondAsset)

    // 5 - Add all instructions together and create a mutable video composition
    mainInstruction.layerInstructions = [firstInstruction, secondInstruction]
    let mainComposition = AVMutableVideoComposition()
    mainComposition.instructions = [mainInstruction]
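    // One frame every 1/30 of a second, i.e. a 30 fps output.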
    mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    mainComposition.renderSize = CGSize(
      width: UIScreen.main.bounds.width,
      height: UIScreen.main.bounds.height)

    // 6 - Audio track
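    // If the user picked an audio asset, lay it under both video clips
    // for their combined duration.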
    if let loadedAudioAsset = audioAsset {
      let audioTrack = mixComposition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: 0)
      do {
        try audioTrack?.insertTimeRange(
          CMTimeRangeMake(
            start: CMTime.zero,
            duration: CMTimeAdd(
              firstAsset.duration,
              secondAsset.duration)),
          of: loadedAudioAsset.tracks(withMediaType: .audio)[0],
          at: .zero)
      } catch {
        print("Failed to load Audio track")
      }
    }

    // 7 - Get path
    guard
      let documentDirectory = FileManager.default.urls(
        for: .documentDirectory,
        in: .userDomainMask).first
      else { return }
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: Date())
    let url = documentDirectory.appendingPathComponent("mergeVideo-\(date).mov")

    // 8 - Create Exporter
    guard let exporter = AVAssetExportSession(
      asset: mixComposition,
      presetName: AVAssetExportPresetHighestQuality)
      else { return }
    exporter.outputURL = url
    exporter.outputFileType = AVFileType.mov
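    // Places the movie metadata at the front of the file so streaming
    // playback can begin before the whole file has downloaded.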
    exporter.shouldOptimizeForNetworkUse = true
    exporter.videoComposition = mainComposition

    // 9 - Perform the Export
    exporter.exportAsynchronously {
      DispatchQueue.main.async {
        self.exportDidFinish(exporter)
      }
    }
  }
}

// MARK: - UIImagePickerControllerDelegate
extension MergeVideoViewController: UIImagePickerControllerDelegate {
  func imagePickerController(
    _ picker: UIImagePickerController,
    didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
  ) {
    dismiss(animated: true, completion: nil)

    guard let mediaType = info[UIImagePickerController.InfoKey.mediaType] as? String,
      mediaType == (kUTTypeMovie as String),
      let url = info[UIImagePickerController.InfoKey.mediaURL] as? URL
      else { return }

    let avAsset = AVAsset(url: url)
    var message = ""
    if loadingAssetOne {
      message = "Video one loaded"
      firstAsset = avAsset
    } else {
      message = "Video two loaded"
      secondAsset = avAsset
    }
    let alert = UIAlertController(
      title: "Asset Loaded",
      message: message,
      preferredStyle: .alert)
    alert.addAction(UIAlertAction(
      title: "OK",
      style: UIAlertAction.Style.cancel,
      handler: nil))
    present(alert, animated: true, completion: nil)
  }
}

// MARK: - UINavigationControllerDelegate
extension MergeVideoViewController: UINavigationControllerDelegate {
}

// MARK: - MPMediaPickerControllerDelegate
extension MergeVideoViewController: MPMediaPickerControllerDelegate {
  func mediaPicker(
    _ mediaPicker: MPMediaPickerController,
    didPickMediaItems mediaItemCollection: MPMediaItemCollection
  ) {
    dismiss(animated: true) {
      let selectedSongs = mediaItemCollection.items
      guard let song = selectedSongs.first else { return }

      let title: String
      let message: String
      if let url = song.value(forProperty: MPMediaItemPropertyAssetURL) as? URL {
        self.audioAsset = AVAsset(url: url)
        title = "Asset Loaded"
        message = "Audio Loaded"
      } else {
        self.audioAsset = nil
        title = "Asset Not Available"
        message = "Audio Not Loaded"
      }

      let alert = UIAlertController(
        title: title,
        message: message,
        preferredStyle: .alert)
      alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
      self.present(alert, animated: true, completion: nil)
    }
  }

  func mediaPickerDidCancel(_ mediaPicker: MPMediaPickerController) {
    dismiss(animated: true, completion: nil)
  }
}
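
Note that PHPhotoLibrary.authorizationStatus() and requestAuthorization(_:) used in exportDidFinish(_:) are the pre-iOS 14 API. On iOS 14 and later a narrower access level can be requested instead. A minimal sketch of how that permission check could look (my assumption, not part of the original project; saveVideoToPhotos is the local closure defined in exportDidFinish(_:)):

if #available(iOS 14, *) {
  // .addOnly is enough here: this path only writes new videos and
  // never reads existing library content.
  PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    if status == .authorized {
      saveVideoToPhotos()
    }
  }
} else {
  // Fall back to the pre-iOS 14 flow shown above.
}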
3. PlayVideoViewController.swift
import AVKit
import MobileCoreServices
import UIKit

class PlayVideoViewController: UIViewController {
  @IBAction func playVideo(_ sender: AnyObject) {
    VideoHelper.startMediaBrowser(delegate: self, sourceType: .savedPhotosAlbum)
  }
}

// MARK: - UIImagePickerControllerDelegate
extension PlayVideoViewController: UIImagePickerControllerDelegate {
  func imagePickerController(
    _ picker: UIImagePickerController,
    didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
  ) {
    // 1
    guard
      let mediaType = info[UIImagePickerController.InfoKey.mediaType] as? String,
      mediaType == (kUTTypeMovie as String),
      let url = info[UIImagePickerController.InfoKey.mediaURL] as? URL
      else { return }

    // 2
    dismiss(animated: true) {
      //3
      let player = AVPlayer(url: url)
      let vcPlayer = AVPlayerViewController()
      vcPlayer.player = player
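      // AVPlayerViewController supplies the standard transport controls;
      // playback starts when the user taps play (or call player.play() here).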
      self.present(vcPlayer, animated: true, completion: nil)
    }
  }
}

// MARK: - UINavigationControllerDelegate
extension PlayVideoViewController: UINavigationControllerDelegate {
}
4. RecordVideoViewController.swift
import MobileCoreServices
import UIKit

class RecordVideoViewController: UIViewController {
  @objc func video(
    _ videoPath: String,
    didFinishSavingWithError error: Error?,
    contextInfo info: AnyObject
  ) {
    let title = (error == nil) ? "Success" : "Error"
    let message = (error == nil) ? "Video was saved" : "Video failed to save"

    let alert = UIAlertController(
      title: title,
      message: message,
      preferredStyle: .alert)
    alert.addAction(UIAlertAction(
      title: "OK",
      style: UIAlertAction.Style.cancel,
      handler: nil))
    present(alert, animated: true, completion: nil)
  }

  @IBAction func record(_ sender: AnyObject) {
    VideoHelper.startMediaBrowser(delegate: self, sourceType: .camera)
  }
}

// MARK: - UIImagePickerControllerDelegate
extension RecordVideoViewController: UIImagePickerControllerDelegate {
  func imagePickerController(
    _ picker: UIImagePickerController,
    didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
  ) {
    dismiss(animated: true, completion: nil)

    guard
      let mediaType = info[UIImagePickerController.InfoKey.mediaType] as? String,
      mediaType == (kUTTypeMovie as String),
      let url = info[UIImagePickerController.InfoKey.mediaURL] as? URL,
      UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(url.path)
      else { return }

    // Handle a movie capture
    UISaveVideoAtPathToSavedPhotosAlbum(
      url.path,
      self,
      #selector(video(_:didFinishSavingWithError:contextInfo:)),
      nil)
  }
}

// MARK: - UINavigationControllerDelegate
extension RecordVideoViewController: UINavigationControllerDelegate {
}
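
A practical note: because these screens use the camera, the microphone, the photo library, and the media library, the app's Info.plist must contain the matching usage-description keys (NSCameraUsageDescription and NSMicrophoneUsageDescription for recording, NSPhotoLibraryUsageDescription or NSPhotoLibraryAddUsageDescription for saving, and NSAppleMusicUsageDescription for the MPMediaPickerController on the merge screen). Without them, iOS terminates the app the first time the protected resource is accessed.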
5. VideoHelper.swift
import AVFoundation
import MobileCoreServices
import UIKit

enum VideoHelper {
  static func orientationFromTransform(
    _ transform: CGAffineTransform
  ) -> (orientation: UIImage.Orientation, isPortrait: Bool) {
    var assetOrientation = UIImage.Orientation.up
    var isPortrait = false
    let tfA = transform.a
    let tfB = transform.b
    let tfC = transform.c
    let tfD = transform.d

    if tfA == 0 && tfB == 1.0 && tfC == -1.0 && tfD == 0 {
      assetOrientation = .right
      isPortrait = true
    } else if tfA == 0 && tfB == -1.0 && tfC == 1.0 && tfD == 0 {
      assetOrientation = .left
      isPortrait = true
    } else if tfA == 1.0 && tfB == 0 && tfC == 0 && tfD == 1.0 {
      assetOrientation = .up
    } else if tfA == -1.0 && tfB == 0 && tfC == 0 && tfD == -1.0 {
      assetOrientation = .down
    }
    return (assetOrientation, isPortrait)
  }

  static func startMediaBrowser(
    delegate: UIViewController & UINavigationControllerDelegate & UIImagePickerControllerDelegate,
    sourceType: UIImagePickerController.SourceType
  ) {
    guard UIImagePickerController.isSourceTypeAvailable(sourceType)
      else { return }

    let mediaUI = UIImagePickerController()
    mediaUI.sourceType = sourceType
    mediaUI.mediaTypes = [kUTTypeMovie as String]
    mediaUI.allowsEditing = true
    mediaUI.delegate = delegate
    delegate.present(mediaUI, animated: true, completion: nil)
  }

  static func videoCompositionInstruction(
    _ track: AVCompositionTrack,
    asset: AVAsset
  ) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let assetTrack = asset.tracks(withMediaType: AVMediaType.video)[0]

    let transform = assetTrack.preferredTransform
    let assetInfo = orientationFromTransform(transform)

    var scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.width
    if assetInfo.isPortrait {
      scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.height
      let scaleFactor = CGAffineTransform(
        scaleX: scaleToFitRatio,
        y: scaleToFitRatio)
      instruction.setTransform(
        assetTrack.preferredTransform.concatenating(scaleFactor),
        at: .zero)
    } else {
      let scaleFactor = CGAffineTransform(
        scaleX: scaleToFitRatio,
        y: scaleToFitRatio)
      var concat = assetTrack.preferredTransform.concatenating(scaleFactor)
        .concatenating(CGAffineTransform(
          translationX: 0,
          y: UIScreen.main.bounds.width / 2))
      if assetInfo.orientation == .down {
        let fixUpsideDown = CGAffineTransform(rotationAngle: CGFloat(Double.pi))
        let windowBounds = UIScreen.main.bounds
        let yFix = assetTrack.naturalSize.height + windowBounds.height
        let centerFix = CGAffineTransform(
          translationX: assetTrack.naturalSize.width,
          y: yFix)
        concat = fixUpsideDown.concatenating(centerFix).concatenating(scaleFactor)
      }
      instruction.setTransform(concat, at: .zero)
    }

    return instruction
  }
}
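
To make the transform matching in orientationFromTransform(_:) concrete, here is a small standalone check with a hand-built transform, runnable in a playground alongside VideoHelper (the matrix values are illustrative; in the app they come from assetTrack.preferredTransform):

import UIKit

// b = 1, c = -1 is a 90-degree rotation, the preferredTransform a
// typical iPhone video recorded in portrait carries.
let portrait = CGAffineTransform(a: 0, b: 1, c: -1, d: 0, tx: 0, ty: 0)
let info = VideoHelper.orientationFromTransform(portrait)
print(info.orientation == .right) // true
print(info.isPortrait)            // true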

Afterword

This post walked through a simple example of playing, recording, and merging videos. If you found it interesting, please give it a like or a follow~~~
