Swift: Making a Live Wallpaper (Live Photo)

Author: HH思無邪 | Published 2023-02-21 23:56

    Preface

    I'm sure you are all trend-setters at the cutting edge, so a custom live wallpaper for your beloved phone should be a Good idea 😎. The image below shows one I made.

    [Screenshot: the finished live wallpaper]

    Making the Live Wallpaper

    On iOS, a live wallpaper is really just a Live Photo set as the lock-screen wallpaper: long-press the lock screen and the video portion plays.

    What is a Live Photo?

    • Baidu's definition:
      An Apple "Live Photo" is a moving photo (Live Photos): the camera records about 1.5 seconds of video around the moment of capture, and when the user presses firmly on the photo, the motion plays back automatically.

    • A programmer's view:
      one JPEG as the cover + one .mov video

    How do you make a Live Photo?

    • The idea:
      Prepare a cover image and a .mov video, write the same identifier into both, then save them to the system photo library. Photos pairs the two up through that identifier, and the result is the Live Photo we see. The end-to-end flow is sketched right below.
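
    Concretely, the pipeline can be sketched as follows. This is a sketch, not the demo's exact code: it assumes the three functions implemented in the steps below sit together in a helper class (here called LivePhotoMaker, a made-up name), and coverImageURL / sourceVideoURL stand in for your own input files.

    // Hypothetical end-to-end call site; the helpers are implemented in the steps below.
    let assetIdentifier = UUID().uuidString // shared by the image and the video
    let tmp = FileManager.default.temporaryDirectory
    let pairedImageURL = tmp.appendingPathComponent("livePhoto.jpeg")
    let pairedVideoURL = tmp.appendingPathComponent("livePhoto.mov")

    guard let imageURL = addAssetID(assetIdentifier, toImage: coverImageURL, saveTo: pairedImageURL) else { return }
    addAssetID(assetIdentifier, toVideo: sourceVideoURL, saveTo: pairedVideoURL, progress: { percent in
        print("Writing video: \(Int(percent * 100))%")
    }, completion: { videoURL in
        guard let videoURL = videoURL else { return }
        LivePhotoMaker.saveToLibrary((pairedImage: imageURL, pairedVideo: videoURL)) { success in
            print(success ? "Live Photo saved" : "Save failed")
        }
    })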

    • Step 1: write the identifier into the image

     import ImageIO
     import MobileCoreServices // kUTTypeJPEG lives here

     func addAssetID(_ assetIdentifier: String, toImage imageURL: URL, saveTo destinationURL: URL) -> URL? {
            guard let imageDestination = CGImageDestinationCreateWithURL(destinationURL as CFURL, kUTTypeJPEG, 1, nil),
                  let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, nil),
                  let imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, nil),
                  var imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [AnyHashable: Any] else { return nil }
            // Key "17" of the Apple Maker Note dictionary is where Live Photos
            // carry the content identifier that pairs the still with its video.
            let assetIdentifierKey = "17"
            let assetIdentifierInfo = [assetIdentifierKey: assetIdentifier]
            imageProperties[kCGImagePropertyMakerAppleDictionary] = assetIdentifierInfo
            CGImageDestinationAddImage(imageDestination, imageRef, imageProperties as CFDictionary)
            guard CGImageDestinationFinalize(imageDestination) else { return nil }
            return destinationURL
        }
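
    To sanity-check Step 1, you can read the Maker Note back out of the JPEG it produced. A quick sketch, where jpegURL stands for the URL the function returned:

    if let source = CGImageSourceCreateWithURL(jpegURL as CFURL, nil),
       let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [AnyHashable: Any],
       let makerApple = props[kCGImagePropertyMakerAppleDictionary] as? [AnyHashable: Any] {
        print("Content identifier:", makerApple["17"] ?? "missing")
    }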
    
    • Step 2: write the identifier into the video and export it as a .mov
        // Held as properties so the readers and writer outlive the async writing session.
        var audioReader: AVAssetReader?
        var videoReader: AVAssetReader?
        var assetWriter: AVAssetWriter?
        
        func addAssetID(_ assetIdentifier: String, toVideo videoURL: URL, saveTo destinationURL: URL, progress: @escaping (CGFloat) -> Void, completion: @escaping (URL?) -> Void) {
            
            var audioWriterInput: AVAssetWriterInput?
            var audioReaderOutput: AVAssetReaderOutput?
            let videoAsset = AVURLAsset(url: videoURL)
            // countFrames(exact:) is an AVAsset helper extension (sketched after
            // these listings); the frame count drives the progress callback.
            let frameCount = videoAsset.countFrames(exact: false)
            guard let videoTrack = videoAsset.tracks(withMediaType: .video).first else {
                completion(nil)
                return
            }
            do {
                // Create the Asset Writer
                assetWriter = try AVAssetWriter(outputURL: destinationURL, fileType: .mov)
                // Create Video Reader Output
                videoReader = try AVAssetReader(asset: videoAsset)
                let videoReaderSettings = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA as UInt32)]
                let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
                videoReader?.add(videoReaderOutput)
                // Create Video Writer Input
                let videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: [AVVideoCodecKey: AVVideoCodecType.h264, AVVideoWidthKey: videoTrack.naturalSize.width, AVVideoHeightKey: videoTrack.naturalSize.height])
                videoWriterInput.transform = videoTrack.preferredTransform
                videoWriterInput.expectsMediaDataInRealTime = true
                assetWriter?.add(videoWriterInput)
                // Create Audio Reader Output & Writer Input
                if let audioTrack = videoAsset.tracks(withMediaType: .audio).first {
                    do {
                        let _audioReader = try AVAssetReader(asset: videoAsset)
                        let _audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
                        _audioReader.add(_audioReaderOutput)
                        audioReader = _audioReader
                        audioReaderOutput = _audioReaderOutput
                        let _audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
                        _audioWriterInput.expectsMediaDataInRealTime = false
                        assetWriter?.add(_audioWriterInput)
                        audioWriterInput = _audioWriterInput
                    } catch {
                        print(error)
                    }
                }
                // Create necessary identifier metadata and still image time metadata
                let assetIdentifierMetadata = metadataForAssetID(assetIdentifier)
                let stillImageTimeMetadataAdapter = createMetadataAdaptorForStillImageTime()
                assetWriter?.metadata = [assetIdentifierMetadata]
                assetWriter?.add(stillImageTimeMetadataAdapter.assetWriterInput)
                // Start the Asset Writer
                assetWriter?.startWriting()
                assetWriter?.startSession(atSourceTime: CMTime.zero)
                // Add still image metadata
                // Mark the frame halfway through the video as the "still image time",
                // the moment Photos shows when the Live Photo is not animating.
                let _stillImagePercent: Float = 0.5
                stillImageTimeMetadataAdapter.append(AVTimedMetadataGroup(items: [metadataItemForStillImageTime()], timeRange: videoAsset.makeStillImageTimeRange(percent: _stillImagePercent, inFrameCount: frameCount)))
                // For end of writing / progress
                var writingVideoFinished = false
                var writingAudioFinished = false
                var currentFrameCount = 0
                // Called from both queues; finish only once both tracks are done.
                func didCompleteWriting() {
                    guard writingAudioFinished && writingVideoFinished else { return }
                    assetWriter?.finishWriting {
                        if self.assetWriter?.status == .completed {
                            completion(destinationURL)
                        } else {
                            completion(nil)
                        }
                    }
                }
                // Start writing video
                if videoReader?.startReading() ?? false {
                    videoWriterInput.requestMediaDataWhenReady(on: DispatchQueue(label: "videoWriterInputQueue")) {
                        while videoWriterInput.isReadyForMoreMediaData {
                            if let sampleBuffer = videoReaderOutput.copyNextSampleBuffer()  {
                                currentFrameCount += 1
                                let percent:CGFloat = CGFloat(currentFrameCount)/CGFloat(frameCount)
                                progress(percent)
                                if !videoWriterInput.append(sampleBuffer) {
                                    print("Cannot write: \(String(describing: self.assetWriter?.error?.localizedDescription))")
                                    self.videoReader?.cancelReading()
                                }
                            } else {
                                videoWriterInput.markAsFinished()
                                writingVideoFinished = true
                                didCompleteWriting()
                            }
                        }
                    }
                } else {
                    writingVideoFinished = true
                    didCompleteWriting()
                }
                // Start writing audio
                if audioReader?.startReading() ?? false {
                    audioWriterInput?.requestMediaDataWhenReady(on: DispatchQueue(label: "audioWriterInputQueue")) {
                        while audioWriterInput?.isReadyForMoreMediaData ?? false {
                            guard let sampleBuffer = audioReaderOutput?.copyNextSampleBuffer() else {
                                audioWriterInput?.markAsFinished()
                                writingAudioFinished = true
                                didCompleteWriting()
                                return
                            }
                            audioWriterInput?.append(sampleBuffer)
                        }
                    }
                } else {
                    writingAudioFinished = true
                    didCompleteWriting()
                }
            } catch {
                print(error)
                completion(nil)
            }
        }
        
        private func metadataForAssetID(_ assetIdentifier: String) -> AVMetadataItem {
            let item = AVMutableMetadataItem()
            let keyContentIdentifier =  "com.apple.quicktime.content.identifier"
            let keySpaceQuickTimeMetadata = "mdta"
            item.key = keyContentIdentifier as (NSCopying & NSObjectProtocol)?
            item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
            item.value = assetIdentifier as (NSCopying & NSObjectProtocol)?
            item.dataType = "com.apple.metadata.datatype.UTF-8"
            return item
        }
        
        private func createMetadataAdaptorForStillImageTime() -> AVAssetWriterInputMetadataAdaptor {
            let keyStillImageTime = "com.apple.quicktime.still-image-time"
            let keySpaceQuickTimeMetadata = "mdta"
            let spec: NSDictionary = [
                kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString:
                    "\(keySpaceQuickTimeMetadata)/\(keyStillImageTime)",
                kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString:
                    "com.apple.metadata.datatype.int8"
            ]
            var desc : CMFormatDescription? = nil
            CMMetadataFormatDescriptionCreateWithMetadataSpecifications(allocator: kCFAllocatorDefault, metadataType: kCMMetadataFormatType_Boxed, metadataSpecifications: [spec] as CFArray, formatDescriptionOut: &desc)
            let input = AVAssetWriterInput(mediaType: .metadata,
                                           outputSettings: nil, sourceFormatHint: desc)
            return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
        }
        
        private func metadataItemForStillImageTime() -> AVMetadataItem {
            let item = AVMutableMetadataItem()
            let keyStillImageTime = "com.apple.quicktime.still-image-time"
            let keySpaceQuickTimeMetadata = "mdta"
            item.key = keyStillImageTime as (NSCopying & NSObjectProtocol)?
            item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
            item.value = 0 as (NSCopying & NSObjectProtocol)?
            item.dataType = "com.apple.metadata.datatype.int8"
            return item
        }
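
    Step 2 leans on two AVAsset helpers, countFrames(exact:) and makeStillImageTimeRange(percent:inFrameCount:), that the post never shows. Below is a minimal sketch reconstructed from how the call sites use them; treat it as an assumption rather than the demo's exact code.

    extension AVAsset {
        // Count video frames: cheap estimate by default, exact decode on request.
        func countFrames(exact: Bool) -> Int {
            guard let videoTrack = tracks(withMediaType: .video).first else { return 0 }
            // Estimate: duration x nominal frame rate.
            var frameCount = Int(CMTimeGetSeconds(duration) * Float64(videoTrack.nominalFrameRate))
            if exact, let reader = try? AVAssetReader(asset: self) {
                // Exact: walk every sample buffer and count.
                let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
                reader.add(output)
                reader.startReading()
                frameCount = 0
                while output.copyNextSampleBuffer() != nil { frameCount += 1 }
                reader.cancelReading()
            }
            return frameCount
        }

        // A one-frame time range `percent` of the way into the asset, used to
        // timestamp the still-image metadata group.
        func makeStillImageTimeRange(percent: Float, inFrameCount: Int = 0) -> CMTimeRange {
            var time = duration
            let frameCount = inFrameCount == 0 ? countFrames(exact: true) : inFrameCount
            let frameDuration = Int64(Float(time.value) / Float(frameCount))
            time.value = Int64(Float(time.value) * percent)
            return CMTimeRange(start: time, duration: CMTime(value: frameDuration, timescale: time.timescale))
        }
    }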
    
    • Finally: save the Live Photo to the photo library
    
    typealias LivePhotoResources = (pairedImage: URL, pairedVideo: URL)
    
      /// Save a Live Photo to the Photo Library by passing the paired image and video.
      /// Needs `import Photos` and an NSPhotoLibraryAddUsageDescription entry in Info.plist.
        public class func saveToLibrary(_ resources: LivePhotoResources, completion: @escaping (Bool) -> Void) {
            PHPhotoLibrary.shared().performChanges({
                let creationRequest = PHAssetCreationRequest.forAsset()
                let options = PHAssetResourceCreationOptions()
                creationRequest.addResource(with: PHAssetResourceType.pairedVideo, fileURL: resources.pairedVideo, options: options)
                creationRequest.addResource(with: PHAssetResourceType.photo, fileURL: resources.pairedImage, options: options)
            }, completionHandler: { (success, error) in
                if error != nil {
                    print(error as Any)
                }
                completion(success)
            })
        }
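
    A sketch of the call site: request photo-library access, save the pair, and optionally preview it in-app with PHLivePhoto. LivePhotoMaker, jpegURL, and movURL are placeholder names.

    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        LivePhotoMaker.saveToLibrary((pairedImage: jpegURL, pairedVideo: movURL)) { success in
            print(success ? "Saved: set it as wallpaper from the Photos app" : "Save failed")
        }
    }

    // Optional in-app preview without saving: pair the same two files via
    // PHLivePhoto and hand the result to a PHLivePhotoView.
    _ = PHLivePhoto.request(withResourceFileURLs: [jpegURL, movURL],
                            placeholderImage: nil,
                            targetSize: .zero,
                            contentMode: .aspectFit) { livePhoto, _ in
        // livePhotoView.livePhoto = livePhoto
    }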
    

    If you've read this far, you should be able to make Live Photos of your own. Go make your screen shine!

    😂 Big walls of code are a headache to read, so here's a demo!

    SwiftLivePhoto-demo


    If this article helped you, give it a like! If anything is unclear or poorly written, leave a comment 🙏
