iOS Rendering: Drawing Raw Video Data (RGB, YUV) to the Screen


Author: 小东邪啊 | Published 2019-06-18 22:42

    Requirement

    When implementing features such as beauty effects and filters, we cannot use the camera's native AVCaptureVideoPreviewLayer; instead we need some other way to render raw video frame data (RGB, NV12, and so on) onto the iOS screen.
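
    For context, the typical integration point is the capture output callback: each frame arrives as a CMSampleBufferRef, is processed (beauty, filters, etc.), and is then handed to the custom render view. A minimal sketch, assuming a hypothetical displayPixelBuffer: entry point on the render view:

    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Extract the raw frame from the sample buffer.
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ...apply beauty / filter processing to pixelBuffer here...
        [self.previewView displayPixelBuffer:pixelBuffer]; // hypothetical render-view API
    }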


    How It Works

    Rendering is done efficiently with OpenGL ES. This article only walks through the overall flow; the meaning of each line of code can be looked up in the open-source demo.

    Notes:

    • The system's AVCaptureVideoPreviewLayer can only render data coming straight from the camera, so it cannot be used once beauty effects or similar processing are involved.
    • This example only implements rendering for the RGB and NV12 data types; other formats can be added as needed.

    Prerequisites

    • Audio/video fundamentals
    • OpenGL ES basics

    Source code: iOS视频渲染

    Juejin: iOS视频渲染

    Jianshu: iOS视频渲染

    Blog: iOS视频渲染


    Detailed Steps

    1. Create the EAGLContext

    • Create the OpenGL preview layer
        CAEAGLLayer *eaglLayer       = (CAEAGLLayer *)self.layer;
        eaglLayer.opaque = YES;
        eaglLayer.drawableProperties = @{kEAGLDrawablePropertyRetainedBacking   : [NSNumber numberWithBool:NO],
                                         kEAGLDrawablePropertyColorFormat       : kEAGLColorFormatRGBA8};
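
    Note that casting self.layer to CAEAGLLayer only works if the view overrides +layerClass; this override is required for any OpenGL ES-backed UIView:

        // Back this UIView with a CAEAGLLayer so self.layer can be cast safely.
        + (Class)layerClass {
            return [CAEAGLLayer class];
        }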
    
    • Create the OpenGL ES context
        EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:context];
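
    In practice it is worth guarding both calls; a defensive sketch (this error handling is an addition, not shown in the demo excerpt):

        if (!context || ![EAGLContext setCurrentContext:context]) {
            log4cplus_error(kModuleName, "Failed to create or activate EAGLContext.");
            return;
        }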
    
    • Set up the context's render buffers
    - (void)setupBuffersWithContext:(EAGLContext *)context width:(int *)width height:(int *)height colorBufferHandle:(GLuint *)colorBufferHandle frameBufferHandle:(GLuint *)frameBufferHandle {
        // We draw a 2D video quad, so depth testing is unnecessary.
        glDisable(GL_DEPTH_TEST);
        
        // Enable the position and texture-coordinate vertex attributes.
        glEnableVertexAttribArray(ATTRIB_VERTEX);
        glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(GLfloat), 0);
        
        glEnableVertexAttribArray(ATTRIB_TEXCOORD);
        glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(GLfloat), 0);
        
        // Create a framebuffer and a color renderbuffer.
        glGenFramebuffers(1, frameBufferHandle);
        glBindFramebuffer(GL_FRAMEBUFFER, *frameBufferHandle);
        
        glGenRenderbuffers(1, colorBufferHandle);
        glBindRenderbuffer(GL_RENDERBUFFER, *colorBufferHandle);
        
        // Allocate the renderbuffer's storage from the CAEAGLLayer, then read
        // back the actual backing size in pixels.
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:(CAEAGLLayer *)self.layer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH , width);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, height);
        
        // Attach the renderbuffer as the framebuffer's color attachment.
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, *colorBufferHandle);
    }
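
    The demo stops there, but it is prudent to verify framebuffer completeness after attaching the renderbuffer (an added check, not in the original):

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            log4cplus_error(kModuleName, "Failed to make complete framebuffer object: %x",
                            glCheckFramebufferStatus(GL_FRAMEBUFFER));
        }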
    
    • Load the shaders

    This demo only ships shaders for the NV12 and RGB raw video formats.
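
    The shader sources live in the bundle as .vsh/.fsh files and are not reproduced in the article. As a rough sketch of what XDXPreviewNV12Shader.fsh presumably does (assumed, not the demo's exact source; the uniform names match the ones bound below, the varying name is hypothetical), written here as an inline string:

    // Assumed shape of the NV12 fragment shader: sample the Y and interleaved
    // CbCr planes, then convert to RGB with the uploaded matrix.
    static NSString * const kNV12FragmentShaderSketch =
        @"varying highp vec2 texCoordVarying;                                        \n"
        @"precision mediump float;                                                   \n"
        @"uniform sampler2D luminanceTexture;                                        \n"
        @"uniform sampler2D chrominanceTexture;                                      \n"
        @"uniform mat3 colorConversionMatrix;                                        \n"
        @"void main() {                                                              \n"
        @"    mediump vec3 yuv;                                                      \n"
        @"    yuv.x  = texture2D(luminanceTexture, texCoordVarying).r;               \n" // video-range input also needs a -16/255 offset
        @"    yuv.yz = texture2D(chrominanceTexture, texCoordVarying).ra - vec2(0.5);\n"
        @"    gl_FragColor = vec4(colorConversionMatrix * yuv, 1.0);                 \n"
        @"}";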

    - (void)loadShaderWithBufferType:(XDXPixelBufferType)type {
        GLuint vertShader, fragShader;
        NSURL  *vertShaderURL, *fragShaderURL;
        
        NSString *shaderName;
        GLuint   program;
        program = glCreateProgram();
        
        if (type == XDXPixelBufferTypeNV12) {
            shaderName = @"XDXPreviewNV12Shader";
            _nv12Program = program;
        } else if (type == XDXPixelBufferTypeRGB) {
            shaderName = @"XDXPreviewRGBShader";
            _rgbProgram = program;
        }
        
        vertShaderURL = [[NSBundle mainBundle] URLForResource:shaderName withExtension:@"vsh"];
        if (![self compileShader:&vertShader type:GL_VERTEX_SHADER URL:vertShaderURL]) {
            log4cplus_error(kModuleName, "Failed to compile vertex shader");
            return;
        }
        
        fragShaderURL = [[NSBundle mainBundle] URLForResource:shaderName withExtension:@"fsh"];
        if (![self compileShader:&fragShader type:GL_FRAGMENT_SHADER URL:fragShaderURL]) {
            log4cplus_error(kModuleName, "Failed to compile fragment shader");
            return;
        }
        
        glAttachShader(program, vertShader);
        glAttachShader(program, fragShader);
        
        glBindAttribLocation(program, ATTRIB_VERTEX  , "position");
        glBindAttribLocation(program, ATTRIB_TEXCOORD, "inputTextureCoordinate");
        
        if (![self linkProgram:program]) {
            if (vertShader) {
                glDeleteShader(vertShader);
                vertShader = 0;
            }
            if (fragShader) {
                glDeleteShader(fragShader);
                fragShader = 0;
            }
            if (program) {
                glDeleteProgram(program);
                program = 0;
            }
            return;
        }
        
        if (type == XDXPixelBufferTypeNV12) {
            uniforms[UNIFORM_Y] = glGetUniformLocation(program , "luminanceTexture");
            uniforms[UNIFORM_UV] = glGetUniformLocation(program, "chrominanceTexture");
            uniforms[UNIFORM_COLOR_CONVERSION_MATRIX] = glGetUniformLocation(program, "colorConversionMatrix");
        } else if (type == XDXPixelBufferTypeRGB) {
            _displayInputTextureUniform = glGetUniformLocation(program, "inputImageTexture");
        }
        
        if (vertShader) {
            glDetachShader(program, vertShader);
            glDeleteShader(vertShader);
        }
        if (fragShader) {
            glDetachShader(program, fragShader);
            glDeleteShader(fragShader);
        }
    }
    
    - (BOOL)compileShader:(GLuint *)shader type:(GLenum)type URL:(NSURL *)URL {
        NSError *error;
        NSString *sourceString = [[NSString alloc] initWithContentsOfURL:URL
                                                                encoding:NSUTF8StringEncoding
                                                                   error:&error];
        if (sourceString == nil) {
            log4cplus_error(kModuleName, "Failed to load vertex shader: %s", [error localizedDescription].UTF8String);
            return NO;
        }
        
        GLint status;
        const GLchar *source;
        source = (GLchar *)[sourceString UTF8String];
        
        *shader = glCreateShader(type);
        glShaderSource(*shader, 1, &source, NULL);
        glCompileShader(*shader);
        
        glGetShaderiv(*shader, GL_COMPILE_STATUS, &status);
        if (status == 0) {
            glDeleteShader(*shader);
            return NO;
        }
        return YES;
    }
    
    - (BOOL)linkProgram:(GLuint)prog {
        GLint status;
        glLinkProgram(prog);
        
        glGetProgramiv(prog, GL_LINK_STATUS, &status);
        if (status == 0) {
            return NO;
        }
        return YES;
    }
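
    On link failure the demo just returns NO; retrieving the program's info log makes failures much easier to diagnose (an optional addition, mirroring Apple's sample code):

        GLint logLength;
        glGetProgramiv(prog, GL_INFO_LOG_LENGTH, &logLength);
        if (logLength > 0) {
            GLchar *log = (GLchar *)malloc(logLength);
            glGetProgramInfoLog(prog, logLength, &logLength, log);
            log4cplus_error(kModuleName, "Program link log:\n%s", log);
            free(log);
        }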
    
    • Create the video texture cache
        if (!*videoTextureCache) {
            CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, videoTextureCache);
            if (err != kCVReturnSuccess)
                log4cplus_error(kModuleName, "Error at CVOpenGLESTextureCacheCreate %d", err);
        }
    

    2. Render the pixelBuffer to the screen

    • Flush cached textures before rendering
    
    - (void)cleanUpTextures {
        if (_lumaTexture) {
            CFRelease(_lumaTexture);
            _lumaTexture = NULL;
        }
        
        if (_chromaTexture) {
            CFRelease(_chromaTexture);
            _chromaTexture = NULL;
        }
        
        if (_renderTexture) {
            CFRelease(_renderTexture);
            _renderTexture = NULL;
        }
        
        // Flush the cache so the textures released above are actually reclaimed.
        CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
    }
    
    
    • Determine the data type from the pixelBuffer's format
        XDXPixelBufferType bufferType;
        OSType pixelFormat = CVPixelBufferGetPixelFormatType(pixelBuffer);
        if (pixelFormat == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange ||
            pixelFormat == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) {
            bufferType = XDXPixelBufferTypeNV12;
        } else if (pixelFormat == kCVPixelFormatType_32BGRA) {
            bufferType = XDXPixelBufferTypeRGB;
        } else {
            log4cplus_error(kModuleName, "Unsupported pixel format.");
            return;
        }
    
    • Create CVOpenGLESTexture objects from the current pixelBuffer
        CVReturn error;
        CVOpenGLESTextureRef lumaTexture, chromaTexture, renderTexture;
        if (bufferType == XDXPixelBufferTypeNV12) {
            // Y
            glActiveTexture(GL_TEXTURE0);
            
            error = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                 videoTextureCache,
                                                                 pixelBuffer,
                                                                 NULL,
                                                                 GL_TEXTURE_2D,
                                                                 GL_LUMINANCE,
                                                                 frameWidth,
                                                                 frameHeight,
                                                                 GL_LUMINANCE,
                                                                 GL_UNSIGNED_BYTE,
                                                                 0,
                                                                 &lumaTexture);
            if (error) {
                log4cplus_error(kModuleName, "Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", error);
            } else {
                _lumaTexture = lumaTexture;
            }
            
            glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture), CVOpenGLESTextureGetName(lumaTexture));
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            
            // UV
            glActiveTexture(GL_TEXTURE1);
            error = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                 videoTextureCache,
                                                                 pixelBuffer,
                                                                 NULL,
                                                                 GL_TEXTURE_2D,
                                                                 GL_LUMINANCE_ALPHA,
                                                                 frameWidth / 2,
                                                                 frameHeight / 2,
                                                                 GL_LUMINANCE_ALPHA,
                                                                 GL_UNSIGNED_BYTE,
                                                                 1,
                                                                 &chromaTexture);
            if (error) {
                log4cplus_error(kModuleName, "Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", error);
            } else {
                _chromaTexture = chromaTexture;
            }
            
            glBindTexture(CVOpenGLESTextureGetTarget(chromaTexture), CVOpenGLESTextureGetName(chromaTexture));
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
            
        } else if (bufferType == XDXPixelBufferTypeRGB) {
            // RGB
            glActiveTexture(GL_TEXTURE0);
            error = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                                 videoTextureCache,
                                                                 pixelBuffer,
                                                                 NULL,
                                                                 GL_TEXTURE_2D,
                                                                 GL_RGBA,
                                                                 frameWidth,
                                                                 frameHeight,
                                                                 GL_BGRA,
                                                                 GL_UNSIGNED_BYTE,
                                                                 0,
                                                                 &renderTexture);
            if (error) {
                log4cplus_error(kModuleName, "Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", error);
            } else {
                _renderTexture = renderTexture;
            }
            
            glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        }
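
    Why the two planes are uploaded with different sizes and formats: NV12 stores a full-resolution Y plane (plane 0, one byte per pixel, hence GL_LUMINANCE) and an interleaved CbCr plane at half resolution in both dimensions (plane 1, two bytes per sample, hence GL_LUMINANCE_ALPHA at frameWidth/2 x frameHeight/2). The plane geometry can be confirmed directly from CoreVideo (a small illustrative sketch):

        size_t lumaWidth    = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);  // full resolution
        size_t lumaHeight   = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
        size_t chromaWidth  = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1);  // == lumaWidth / 2
        size_t chromaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1); // == lumaHeight / 2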
    
    • Select the OpenGL program
        glBindFramebuffer(GL_FRAMEBUFFER, frameBufferHandle);
        
        glViewport(0, 0, backingWidth, backingHeight);
        
        glClearColor(0.1f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        
        if (bufferType == XDXPixelBufferTypeNV12) {
            if (self.lastBufferType != bufferType) {
                glUseProgram(nv12Program);
                glUniform1i(uniforms[UNIFORM_Y], 0);
                glUniform1i(uniforms[UNIFORM_UV], 1);
                glUniformMatrix3fv(uniforms[UNIFORM_COLOR_CONVERSION_MATRIX], 1, GL_FALSE, preferredConversion);
            }
        } else if (bufferType == XDXPixelBufferTypeRGB) {
            if (self.lastBufferType != bufferType) {
                glUseProgram(rgbProgram);
                glUniform1i(displayInputTextureUniform, 0);
            }
        }
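
    The preferredConversion matrix is not defined in this excerpt. This render path closely follows Apple's AVBasicVideoOutput sample, which uses the standard BT.601/BT.709 video-range matrices below (column-major, matching transpose = GL_FALSE in glUniformMatrix3fv); it is assumed the demo selects one per frame, typically from the buffer's kCVImageBufferYCbCrMatrixKey attachment:

        // YUV -> RGB, video range, column-major (from Apple's sample code).
        static const GLfloat kColorConversion601[] = {
            1.164,  1.164, 1.164,
            0.0,   -0.392, 2.017,
            1.596, -0.813, 0.0,
        };
        
        static const GLfloat kColorConversion709[] = {
            1.164,  1.164, 1.164,
            0.0,   -0.213, 2.112,
            1.793, -0.533, 0.0,
        };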
    
    • Compute the non-fullscreen quad size

    The rendered picture may fill the screen, or it may be letterboxed with black bars.

        static CGSize normalizedSamplingSize;
        
        if (self.lastFullScreen != self.isFullScreen || self.pixelbufferWidth != frameWidth || self.pixelbufferHeight != frameHeight
            || normalizedSamplingSize.width == 0 || normalizedSamplingSize.height == 0  || self.screenWidth != [UIScreen mainScreen].bounds.size.width) {
            
            normalizedSamplingSize = [self getNormalizedSamplingSize:CGSizeMake(frameWidth, frameHeight)];
            self.lastFullScreen = self.isFullScreen;
            self.pixelbufferWidth = frameWidth;
            self.pixelbufferHeight = frameHeight;
            self.screenWidth = [UIScreen mainScreen].bounds.size.width;
            
            // Vertices in triangle-strip order: bottom-left, bottom-right, top-left, top-right.
            quadVertexData[0] = -1 * normalizedSamplingSize.width;
            quadVertexData[1] = -1 * normalizedSamplingSize.height;
            quadVertexData[2] = normalizedSamplingSize.width;
            quadVertexData[3] = -1 * normalizedSamplingSize.height;
            quadVertexData[4] = -1 * normalizedSamplingSize.width;
            quadVertexData[5] = normalizedSamplingSize.height;
            quadVertexData[6] = normalizedSamplingSize.width;
            quadVertexData[7] = normalizedSamplingSize.height;
        }
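
    The helper getNormalizedSamplingSize: is not shown in the article. A minimal aspect-fit version, assuming AVFoundation is available (a sketch of the idea, not necessarily the demo's exact implementation; it returns the quad's half-extents in normalized device coordinates, so 1.0 means the quad reaches that edge of the view):

    #import <AVFoundation/AVFoundation.h>
    
    - (CGSize)getNormalizedSamplingSize:(CGSize)videoSize {
        CGRect bounds  = self.bounds;
        // Largest rect with the video's aspect ratio that fits inside the view.
        CGRect fitRect = AVMakeRectWithAspectRatioInsideRect(videoSize, bounds);
        return CGSizeMake(fitRect.size.width  / bounds.size.width,
                          fitRect.size.height / bounds.size.height);
    }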
        
    
    • Draw the frame
        glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, quadVertexData);
        glEnableVertexAttribArray(ATTRIB_VERTEX);
        
        glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, 0, 0, quadTextureData);
        glEnableVertexAttribArray(ATTRIB_TEXCOORD);
        
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        
        glBindRenderbuffer(GL_RENDERBUFFER, colorBufferHandle);
        
        if ([EAGLContext currentContext] == context) {
            [context presentRenderbuffer:GL_RENDERBUFFER];
        }
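
    The companion quadTextureData array is likewise not shown. It is assumed to map the full texture onto the quad in the same triangle-strip order as quadVertexData (whether the V axis needs flipping depends on the source's orientation):

        // Assumed texture coordinates: bottom-left, bottom-right, top-left, top-right.
        static const GLfloat quadTextureData[] = {
            0.0f, 1.0f,
            1.0f, 1.0f,
            0.0f, 0.0f,
            1.0f, 0.0f,
        };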
    
