Metal Camera Development 1: Reading the Render Result into a UIImage


Author: 熊皮皮 | Published 2017-07-01 19:41

    This article uses a Metal compute shader to apply a simple gamma correction to the frame currently captured by the camera, draws the result to the screen (an MTKView), and saves the render result as a UIImage. The final section briefly discusses the dispatchThreadgroups configuration for Metal compute shaders.

    Document structure:

    1. Configure an AVCaptureSession to capture the current camera frame
    2. Set up the compute shader environment
    3. Write the gamma correction shader
    4. Render the compute-shader output to the screen
    5. Read back the Metal render result and generate a UIImage
    6. Discussion: a reasonable dispatchThreadgroups configuration for the Metal compute shader

    (Figure: rendering result)

    1. Configure an AVCaptureSession to capture the current camera frame

    Configure the camera as described in my earlier article iOS VideoToolbox硬编H.265(HEVC)H.264(AVC):1 概述. For simplicity, have the camera output portrait-oriented 32BGRA frames; a later article will implement YUV-to-RGB conversion in a Metal shader and then layer various filters on top. Reference code follows.

    let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    // try? yields an optional, so unwrap the input before adding it to the session
    if let input = try? AVCaptureDeviceInput(device: device), session.canAddInput(input) {
        session.addInput(input)
    }
    
    let output = AVCaptureVideoDataOutput()
    // request BGRA frames so they can be wrapped directly into bgra8Unorm Metal textures
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable : kCVPixelFormatType_32BGRA]
    output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "CamOutputQueue"))
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
    
    if session.canSetSessionPreset(AVCaptureSessionPreset1920x1080) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080
    }
    
    session.beginConfiguration()
    
    for connection in output.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts as! [AVCaptureInputPort] {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }
        if videoConnection != nil {
            break
        }
    }
    
    if let videoConnection = videoConnection, videoConnection.isVideoOrientationSupported {
        videoConnection.videoOrientation = .portrait
    }
    
    session.commitConfiguration()
    session.startRunning()
    

    2. Set up the compute shader environment

    Core Video provides Metal with CVMetalTextureCache, a texture-creation interface analogous to the texture cache used with OpenGL ES. Beyond that, the usual Metal preparation is needed: the default MTLLibrary, a compute pipeline state, and a command queue. Reference code follows.

    var textureCache : CVMetalTextureCache?
    var imageTexture: MTLTexture?
    
    var commandQueue: MTLCommandQueue?
    var library: MTLLibrary?
    var pipeline: MTLComputePipelineState?
    
    //------------
    device = MTLCreateSystemDefaultDevice()
    
    mtlView.device = device
    mtlView.framebufferOnly = false
    mtlView.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    
    library = device?.newDefaultLibrary()
    guard let function = library?.makeFunction(name: "gamma_filter") else {
        fatalError()
    }
    
    pipeline = try! device?.makeComputePipelineState(function: function)
    
    commandQueue = device?.makeCommandQueue()
    
    CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device!, nil, &textureCache)
    

    Because the on-screen result has to be read back later, MTKView.framebufferOnly must be set to false, as in the code above.
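    With the texture cache created, each frame the camera delivers can be wrapped into an MTLTexture inside the sample buffer delegate. Below is a minimal sketch of that callback (assumed wiring rather than verbatim project code), using the Swift 3 API and the textureCache/imageTexture properties declared above; error handling is omitted.

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let cache = textureCache else {
            return
        }

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        // .bgra8Unorm matches the kCVPixelFormatType_32BGRA output configured in section 1
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer, nil,
                                                  .bgra8Unorm, width, height, 0, &cvTexture)
        if let cvTexture = cvTexture {
            imageTexture = CVMetalTextureGetTexture(cvTexture)
        }
    }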

    3. Write the gamma correction shader

    inTexture is the frame currently captured by the camera; outTexture receives the processed data, which will then be rendered to the screen.

    #include <metal_stdlib>
    using namespace metal;
    
    kernel void gamma_filter(
            texture2d<float, access::read> inTexture [[texture(0)]],
            texture2d<float, access::write> outTexture [[texture(1)]],
            uint2 gid [[thread_position_in_grid]])
    {
        float4 inColor = inTexture.read(gid);
        const float4 outColor = float4(pow(inColor.rgb, float3(0.4/* gamma exponent */)), inColor.a);
        outTexture.write(outColor, gid);
    }
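    The 0.4 exponent above is hard-coded. If the correction strength needs to be adjustable at runtime, the kernel can instead receive it as a constant buffer argument; the following is only a sketch of such a variant (not part of the shader above), with the host side passing the value via encoder.setBytes(&gamma, length: MemoryLayout<Float>.size, at: 0).

    kernel void gamma_filter_param(
            texture2d<float, access::read> inTexture [[texture(0)]],
            texture2d<float, access::write> outTexture [[texture(1)]],
            constant float &gamma [[buffer(0)]], // gamma exponent supplied by the host
            uint2 gid [[thread_position_in_grid]])
    {
        float4 inColor = inTexture.read(gid);
        outTexture.write(float4(pow(inColor.rgb, float3(gamma)), inColor.a), gid);
    }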
    

    4. Render the compute-shader output to the screen

    In MTKViewDelegate's draw(in view: MTKView) method, draw the texture processed by the compute shader to the screen. Reference code follows.

    guard let texture = imageTexture else {
        return
    }
    guard let drawable = view.currentDrawable else {
        return
    }
    guard let commandBuffer = commandQueue?.makeCommandBuffer() else {
        return
    }
    
    let encoder = commandBuffer.makeComputeCommandEncoder()
    encoder.setComputePipelineState(pipeline!)
    encoder.setTexture(texture, at: 0)          // camera frame (inTexture in the kernel)
    encoder.setTexture(drawable.texture, at: 1) // drawable texture (outTexture in the kernel)
    
    // Note: integer division drops edge pixels when the texture size isn't a multiple of 16; see the discussion in section 6
    let threads = MTLSize(width: 16, height: 16, depth: 1)
    let threadgroups = MTLSize(width: texture.width / threads.width,
                               height: texture.height / threads.height,
                               depth: 1)
    encoder.dispatchThreadgroups(threadgroups, threadsPerThreadgroup: threads)
    encoder.endEncoding()
    
    commandBuffer.present(drawable)
    commandBuffer.commit()
    

    The key line, encoder.setTexture(drawable.texture, at: 1), directs the compute shader to write the gamma-corrected result into MTKView.currentDrawable.texture.

    5. Read back the Metal render result and generate a UIImage

    This is the Metal analogue of OpenGL ES's glReadPixels; pay attention to byte order and to the difference between the UIKit and Metal texture coordinate systems. As section 4 showed, MTKView.currentDrawable.texture holds the current render result, so reading back the Metal output boils down to converting an MTLTexture into a UIImage, which Core Graphics can do. Reference code follows.

    let image = currentDrawable?.texture.toUIImage()
    

    To make later development easier, add a conversion to UIImage as an extension on MTLTexture.

    public extension MTLTexture {

        public func toUIImage() -> UIImage {
            let bytesPerPixel: Int = 4
            let imageByteCount = self.width * self.height * bytesPerPixel
            let bytesPerRow = self.width * bytesPerPixel
            var src = [UInt8](repeating: 0, count: imageByteCount)

            // Copy the raw BGRA pixels out of the texture
            let region = MTLRegionMake2D(0, 0, self.width, self.height)
            self.getBytes(&src, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

            // Little-endian 32-bit BGRA maps to byteOrder32Little + noneSkipFirst
            let bitmapInfo = CGBitmapInfo(rawValue: (CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.noneSkipFirst.rawValue))
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let bitsPerComponent = 8
            let context = CGContext(data: &src, width: self.width, height: self.height, bitsPerComponent: bitsPerComponent, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

            let dstImageFilter = context?.makeImage()

            // For this article .downMirrored isn't strictly needed, since section 1 forces the camera to output portrait-oriented frames
            return UIImage(cgImage: dstImageFilter!, scale: 0.0, orientation: UIImageOrientation.downMirrored)
        }
    }
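    One caveat: getBytes(_:bytesPerRow:from:mipmapLevel:) only copies whatever currently sits in the texture, so the compute pass must have finished on the GPU before the conversion. A minimal sketch of grabbing a still frame at the end of the draw call (assumed usage; capturedImage is a hypothetical property):

    commandBuffer.present(drawable)
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()            // block until the gamma pass has executed
    capturedImage = drawable.texture.toUIImage()  // the drawable now holds the corrected frame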
    

    6. Discussion: a reasonable dispatchThreadgroups configuration for the compute shader

    Section 4 picked the dispatchThreadgroups values rather simplistically, so what would a reasonable configuration be? Apple's documentation Working with threads and threadgroups covers this; reference configuration code follows.

    let w = pipeline!.threadExecutionWidth
    let h = pipeline!.maxTotalThreadsPerThreadgroup / w
    let threadsPerThreadgroup = MTLSizeMake(w, h, 1)
    let threadgroupsPerGrid = MTLSize(width: (texture.width + w - 1) / w,
                                      height: (texture.height + h - 1) / h,
                                      depth: 1)
    

    With the code above, GPU time drops slightly when processing a 1080p frame on an iPhone 7 Plus.
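    Note that these sizes round the grid up, so when the texture width or height is not a multiple of w or h, some threads fall outside the texture. The kernel should then return early before reading or writing; below is a guarded variant of the gamma kernel (an assumed addition, not in the shader from section 3). The dispatch call itself is unchanged: encoder.dispatchThreadgroups(threadgroupsPerGrid, threadsPerThreadgroup: threadsPerThreadgroup).

    kernel void gamma_filter(
            texture2d<float, access::read> inTexture [[texture(0)]],
            texture2d<float, access::write> outTexture [[texture(1)]],
            uint2 gid [[thread_position_in_grid]])
    {
        // skip the extra threads created by rounding the grid up to whole threadgroups
        if (gid.x >= outTexture.get_width() || gid.y >= outTexture.get_height()) {
            return;
        }
        float4 inColor = inTexture.read(gid);
        outTexture.write(float4(pow(inColor.rgb, float3(0.4)), inColor.a), gid);
    }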
