GPUImage Source Code Analysis and Usage (Part 2)

Author: 紫水依 | Published 2020-02-12 16:07

    An overview of Sources, Filters, Outputs, and the Pipeline

    Sources

    GPUImageOutput: the GPUImage class that every source (and every filter) inherits from

    GPUImageInput: the GPUImage protocol that filters and outputs conform to

    The filter chain: input (image, video file, texture, binary data, etc.) -> output (video file, view, etc.)

    The start of the filter chain: inputs

    Whichever input you start from, what ultimately gets filtered is texture data.

    1. GPUImagePicture: static images; essentially decompress the image -> texture -> filter processing
    2. GPUImageRawDataInput: raw binary data -> texture, uploaded with the pixel format you specify
    3. GPUImageTextureInput: an existing texture, i.e. image data that has already been decompressed into a texture
    4. GPUImageUIElement: a UIView or CALayer; Core Graphics draws the view's contents into a bitmap context to get image data -> texture. Used for screenshots or for capturing the current UIView.
      • Image watermark: blend a texture image in at the desired position
      • Text watermark: render a UILabel through GPUImageUIElement (see the sketch after this list)
    5. GPUImageMovie: a video file -> read frame by frame with AVAssetReader -> convert each frame to a texture -> filter processing; AVAssetReaderOutput -> CMSampleBufferRef -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
    6. GPUImageVideoCamera: live video capture; AVFoundation captures frames -> didOutputSampleBuffer callback -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
    7. GPUImageStillCamera: a subclass of GPUImageVideoCamera for taking photos; AVFoundation captures the photo -> didOutputSampleBuffer callback -> CVImageBufferRef -> CVOpenGLESTextureRef -> texture
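
    As a quick illustration of the text-watermark case above, here is a minimal sketch that blends a UILabel over camera frames. It assumes camera (an already-configured GPUImageVideoCamera) and gpuImageView (a GPUImageView) exist; watermarkLabel is a hypothetical UILabel holding the watermark text.

    GPUImageSaturationFilter *filter = [[GPUImageSaturationFilter alloc] init];
    GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:watermarkLabel];
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0; //show the label at full strength where it has alpha
    
    [camera addTarget:filter];
    [filter addTarget:blendFilter];     //first input: the filtered camera frames
    [uiElement addTarget:blendFilter];  //second input: the rendered label
    [blendFilter addTarget:gpuImageView];
    
    //Re-render the label once per frame so the overlay stays in sync with the video
    __weak GPUImageUIElement *weakElement = uiElement;
    [filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [weakElement updateWithTimestamp:time];
    }];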

    Filters

    GPUImage ships with well over a hundred built-in filters.

    The filter base class is GPUImageFilter; custom filters must subclass it (the saturation filter analyzed below is a typical example).

    Outputs

    The end of the filter chain: outputs (a complete camera-to-file chain is sketched after the list below)

    1. GPUImageMovieWriter: records video; takes the rendered texture data from the framebuffer and writes each frame to a file via AVAssetWriter, after which the saved file can be uploaded (e.g. resumably) to a server
    2. GPUImageRawDataOutput: for uploading while recording; exposes the filtered framebuffer contents as raw binary data that can be sent to a server
    3. GPUImageTextureOutput: outputs a texture; once rendering completes you get the new texture
    4. GPUImageView: a UIView subclass; renders the output texture into its layer
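
    To make the source -> filter -> output flow concrete, here is a minimal sketch of a complete chain that previews and records at the same time. The gpuImageView property, output path, and session preset are assumptions for the example, not fixed by GPUImage.

    GPUImageVideoCamera *camera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720
                                            cameraPosition:AVCaptureDevicePositionBack];
    camera.outputImageOrientation = UIInterfaceOrientationPortrait;
    
    GPUImageSaturationFilter *filter = [[GPUImageSaturationFilter alloc] init];
    
    //AVAssetWriter will not overwrite an existing file, so clear any stale one first
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"out.m4v"];
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    GPUImageMovieWriter *writer =
        [[GPUImageMovieWriter alloc] initWithMovieURL:[NSURL fileURLWithPath:path]
                                                 size:CGSizeMake(720.0, 1280.0)];
    camera.audioEncodingTarget = writer;
    
    [camera addTarget:filter];
    [filter addTarget:self.gpuImageView]; //live preview
    [filter addTarget:writer];            //simultaneous recording
    
    [camera startCameraCapture];
    [writer startRecording];
    //... later: [writer finishRecording]; [camera stopCameraCapture];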

    Using GPUImage

    Applying filters to video

    1. Capture video with AVFoundation and configure the capture devices (GPUImageVideoCamera's initializer)
    - (id)initWithSessionPreset:(NSString *)sessionPreset cameraPosition:(AVCaptureDevicePosition)cameraPosition; 
    {
        if (!(self = [super init]))
        {
            return nil;
        }
        
        //AVFoundation video capture
        //1. Create the video processing queue, the audio processing queue, and the GCD semaphore
        cameraProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH,0);
        audioProcessingQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_LOW,0);
    
        frameRenderingSemaphore = dispatch_semaphore_create(1);
    
        //2. Initialize the frame rate, rotation flags, and related properties
        _frameRate = 0; // This will not set frame rate unless this value gets set to 1 or above
        _runBenchmark = NO;
        capturePaused = NO;
        outputRotation = kGPUImageNoRotation;
        internalRotation = kGPUImageNoRotation;
        captureAsYUV = YES;
        _preferredConversion = kColorConversion709;
        
        // Grab the back-facing or front-facing camera
        //3. Get the camera device at the requested position (back-facing by default)
        _inputCamera = nil;
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices) 
        {
            if ([device position] == cameraPosition)
            {
                _inputCamera = device;
            }
        }
        
        //4. Bail out if no camera was found
        if (!_inputCamera) {
            return nil;
        }
        
        // Create the capture session
        //5. Create the AVCaptureSession
        _captureSession = [[AVCaptureSession alloc] init];
        
        [_captureSession beginConfiguration];
        
        // Add the video input
        //6. Add the camera as the video input device
        NSError *error = nil;
        videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:_inputCamera error:&error];
        if ([_captureSession canAddInput:videoInput]) 
        {
            [_captureSession addInput:videoInput];
        }
        
        // Add the video frame output
        //7. Add the video data output
        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoOutput setAlwaysDiscardsLateVideoFrames:NO];
        
    //    if (captureAsYUV && [GPUImageContext deviceSupportsRedTextures])
        //8. Decide which YUV pixel format to capture
        /**
         supportsFastTextureUpload: available since iOS 5, a mapping between CVOpenGLESTextureCacheRef and CVImageBufferRef. Through this mapping the CVPixelBufferRef can be accessed directly, with no need to read the data back via glReadPixels, which performs much better.
         */
        if (captureAsYUV && [GPUImageContext supportsFastTextureUpload])
        {
            BOOL supportsFullYUVRange = NO;
            //Get all supported video pixel formats
            NSArray *supportedPixelFormats = videoOutput.availableVideoCVPixelFormatTypes;
            for (NSNumber *currentPixelFormat in supportedPixelFormats)
            {
                //If kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is among the supported formats, flip supportsFullYUVRange to YES
                if ([currentPixelFormat intValue] == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
                {
                    supportsFullYUVRange = YES;
                }
            }
            
            //9. The full-range kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format is supported
            if (supportsFullYUVRange)
            {
                //Set the video output format to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
                [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
                isFullYUVRange = YES;
            }
            else
            {
                [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
                isFullYUVRange = NO;
            }
        }
        else
        {
            [videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
        }
        
        runSynchronouslyOnVideoProcessingQueue(^{
            
            if (captureAsYUV)
            {
                [GPUImageContext useImageProcessingContext];
                //            if ([GPUImageContext deviceSupportsRedTextures])
                //            {
                //                yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForRGFragmentShaderString];
                //            }
                //            else
                //            {
                if (isFullYUVRange)
                {
                    yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVFullRangeConversionForLAFragmentShaderString];
                }
                else
                {
                    yuvConversionProgram = [[GPUImageContext sharedImageProcessingContext] programForVertexShaderString:kGPUImageVertexShaderString fragmentShaderString:kGPUImageYUVVideoRangeConversionForLAFragmentShaderString];
                }
    
                //            }
                
                if (!yuvConversionProgram.initialized)
                {
                    [yuvConversionProgram addAttribute:@"position"];
                    [yuvConversionProgram addAttribute:@"inputTextureCoordinate"];
                    
                    if (![yuvConversionProgram link])
                    {
                        NSString *progLog = [yuvConversionProgram programLog];
                        NSLog(@"Program link log: %@", progLog);
                        NSString *fragLog = [yuvConversionProgram fragmentShaderLog];
                        NSLog(@"Fragment shader compile log: %@", fragLog);
                        NSString *vertLog = [yuvConversionProgram vertexShaderLog];
                        NSLog(@"Vertex shader compile log: %@", vertLog);
                        yuvConversionProgram = nil;
                        NSAssert(NO, @"Filter shader link failed");
                    }
                }
                
                yuvConversionPositionAttribute = [yuvConversionProgram attributeIndex:@"position"];
                yuvConversionTextureCoordinateAttribute = [yuvConversionProgram attributeIndex:@"inputTextureCoordinate"];
                yuvConversionLuminanceTextureUniform = [yuvConversionProgram uniformIndex:@"luminanceTexture"];
                yuvConversionChrominanceTextureUniform = [yuvConversionProgram uniformIndex:@"chrominanceTexture"];
                yuvConversionMatrixUniform = [yuvConversionProgram uniformIndex:@"colorConversionMatrix"];
                
                [GPUImageContext setActiveShaderProgram:yuvConversionProgram];
                
                glEnableVertexAttribArray(yuvConversionPositionAttribute);
                glEnableVertexAttribArray(yuvConversionTextureCoordinateAttribute);
            }
        });
        
        //10. Captured frames are delivered to this delegate method:
        /**
         - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
         */
        
        [videoOutput setSampleBufferDelegate:self queue:cameraProcessingQueue];
        if ([_captureSession canAddOutput:videoOutput])
        {
            [_captureSession addOutput:videoOutput];
        }
        else
        {
            NSLog(@"Couldn't add video output");
            return nil;
        }
        
        _captureSessionPreset = sessionPreset;
        [_captureSession setSessionPreset:_captureSessionPreset];
    
    // This will let you get 60 FPS video from the 720p preset on an iPhone 4S, but only that device and that preset
    //    AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
    //    
    //    if (conn.supportsVideoMinFrameDuration)
    //        conn.videoMinFrameDuration = CMTimeMake(1,60);
    //    if (conn.supportsVideoMaxFrameDuration)
    //        conn.videoMaxFrameDuration = CMTimeMake(1,60);
        
        [_captureSession commitConfiguration];
        
        return self;
    }
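
    A quick aside on runSynchronouslyOnVideoProcessingQueue, which appears in the middle of the initializer: it runs the block on GPUImage's shared video processing queue, synchronously, while guarding against deadlock when the caller is already on that queue. Roughly, from GPUImageOutput.m:

    void runSynchronouslyOnVideoProcessingQueue(void (^block)(void))
    {
        dispatch_queue_t videoProcessingQueue = [GPUImageContext sharedContextQueue];
        
        if (dispatch_get_specific([GPUImageContext contextKey]))
        {
            block(); //already on the queue: run inline, since dispatch_sync would deadlock
        }
        else
        {
            dispatch_sync(videoProcessingQueue, block);
        }
    }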
    
    2. The captured sample buffers come back through the delegate
    #pragma mark AVCaptureVideoDataOutputSampleBufferDelegate
    //AVFoundation's delegate callback after capturing video or photos; the sample data arrives here
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        //If the capture session isn't running, ignore this buffer
        if (!self.captureSession.isRunning)
        {
            return;
        }
        else if (captureOutput == audioOutput)
        {
            //Handle audio
            [self processAudioSampleBuffer:sampleBuffer];
        }
        else
        {
            if (dispatch_semaphore_wait(frameRenderingSemaphore, DISPATCH_TIME_NOW) != 0)
            {
                return;
            }
            
            CFRetain(sampleBuffer);
            runAsynchronouslyOnVideoProcessingQueue(^{
                //Feature Detection Hook.
                if (self.delegate)
                {
                    [self.delegate willOutputSampleBuffer:sampleBuffer];
                }
                
            //Handle video: convert the sampleBuffer into a CVImageBufferRef and process it
                [self processVideoSampleBuffer:sampleBuffer];
                
                CFRelease(sampleBuffer);
                dispatch_semaphore_signal(frameRenderingSemaphore);
            });
        }
    }
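
    Note the role of frameRenderingSemaphore here. It was created with a count of 1, and dispatch_semaphore_wait is called with DISPATCH_TIME_NOW, so when the previous frame is still being processed the wait fails immediately and the new frame is simply dropped. Late frames are discarded rather than queued, which keeps preview latency bounded instead of letting work pile up on the video processing queue.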
    
    3. Process the video data
    - (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer;
    {
        if (capturePaused)
        {
            return;
        }
        
        CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
        //Convert the sampleBuffer into a CVImageBufferRef
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
        //Get the width and height
        int bufferWidth = (int) CVPixelBufferGetWidth(cameraFrame);
        int bufferHeight = (int) CVPixelBufferGetHeight(cameraFrame);
        //Read the frame's YCbCr matrix attachment to pick the right color conversion
        CFTypeRef colorAttachments = CVBufferGetAttachment(cameraFrame, kCVImageBufferYCbCrMatrixKey, NULL);
        if (colorAttachments != NULL)
        {
            if(CFStringCompare(colorAttachments, kCVImageBufferYCbCrMatrix_ITU_R_601_4, 0) == kCFCompareEqualTo)
            {
                //Check whether the data is full range or video range
                if (isFullYUVRange)
                {
                    _preferredConversion = kColorConversion601FullRange;
                }
                else
                {
                    _preferredConversion = kColorConversion601; //BT.601 video-range YUV -> RGB conversion matrix (frames arrive as YUV because RGB takes more memory)
                }
            }
            else
            {
                _preferredConversion = kColorConversion709; //BT.709 YUV -> RGB conversion matrix
            }
        }
        else
        {
            if (isFullYUVRange)
            {
                _preferredConversion = kColorConversion601FullRange;
            }
            else
            {
                _preferredConversion = kColorConversion601;
            }
        }
        ......
        //Turn the pixel buffer planes into textures and process the texture data
        luminanceTexture = CVOpenGLESTextureGetName(luminanceTextureRef);
        glBindTexture(GL_TEXTURE_2D, luminanceTexture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        ......
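
    For reference, the three constants assigned to _preferredConversion above are 3x3 YUV -> RGB matrices defined at the top of GPUImageVideoCamera.m; the conversion shader multiplies them with a vector built from the sampled luma and the chroma components minus 0.5 (the video-range variants also subtract 16/255 from luma first). As defined in the GPUImage source:

    // BT.601, the standard for SDTV (video range)
    GLfloat kColorConversion601Default[] = {
        1.164,  1.164,  1.164,
        0.0,   -0.392,  2.017,
        1.596, -0.813,  0.0,
    };
    
    // BT.601 full range
    GLfloat kColorConversion601FullRangeDefault[] = {
        1.0,    1.0,    1.0,
        0.0,   -0.343,  1.765,
        1.4,   -0.711,  0.0,
    };
    
    // BT.709, the standard for HDTV
    GLfloat kColorConversion709Default[] = {
        1.164,  1.164,  1.164,
        0.0,   -0.213,  2.112,
        1.793, -0.533,  0.0,
    };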
    

    Applying a filter to an image --- the saturation filter

    GPUImageSaturationFilter.h

    @interface GPUImageSaturationFilter : GPUImageFilter
    {
        GLint saturationUniform; //handle for the saturation uniform in the fragment shader
    }
    /** Saturation ranges from 0.0 (fully desaturated) to 2.0 (max saturation), with 1.0 as the normal level
     */
    @property(readwrite, nonatomic) CGFloat saturation; //1.0 is the normal level
    

    GPUImageSaturationFilter.m

    • The saturation filter's fragment shader
    #if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE //simulator or device
    NSString *const kGPUImageSaturationFragmentShaderString = SHADER_STRING
    ( //fragment shader; the filter pairs it with the default vertex shader
     varying highp vec2 textureCoordinate; //texture coordinate, interpolated from the vertex stage
     
     uniform sampler2D inputImageTexture; //the input texture
     uniform lowp float saturation; //saturation amount
     
     // Values from "Graphics Shaders: Theory and Practice" by Bailey and Cunningham
     const mediump vec3 luminanceWeighting = vec3(0.2125, 0.7154, 0.0721); //luminance weights W for desaturation
     
     void main()
     {
        lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate); //sample the texture color
        lowp float luminance = dot(textureColor.rgb, luminanceWeighting); //dot the RGB with the weights W
        lowp vec3 greyScaleColor = vec3(luminance); //splat the luminance into a grayscale vec3
        
        gl_FragColor = vec4(mix(greyScaleColor, textureColor.rgb, saturation), textureColor.w); //mix grayscale and original by the saturation amount
         
     }
    );
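
    In the last line, mix(a, b, t) computes a * (1.0 - t) + b * t, so saturation = 0.0 yields the fully desaturated grayscale, 1.0 reproduces the original color, and values up to 2.0 extrapolate past the original for an oversaturated look, matching the 0.0 to 2.0 range documented in the header.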
    
    • Initializing the filter with the default vertex shader
    - (id)init;
    {
        //initWithFragmentShaderFromString: pairs this fragment shader with the default vertex shader, kGPUImageVertexShaderString; use the designated initializer to supply a custom vertex shader
        if (!(self = [super initWithFragmentShaderFromString:kGPUImageSaturationFragmentShaderString]))
        {
            return nil;
        }
        
        saturationUniform = [filterProgram uniformIndex:@"saturation"];
        self.saturation = 1.0;
    
        return self;
    }
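
    The saturation property's setter is what pushes the new value into the shader whenever it changes; in GPUImageSaturationFilter.m it is essentially:

    - (void)setSaturation:(CGFloat)newValue;
    {
        _saturation = newValue;
        
        //setFloat:forUniform:program: (inherited from GPUImageFilter) updates the
        //uniform on the video processing queue with the right GL context active
        [self setFloat:_saturation forUniform:saturationUniform program:filterProgram];
    }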
    
    • The default vertex shader
    // Hardcode the vertex shader for standard filters, but this can be overridden
    NSString *const kGPUImageVertexShaderString = SHADER_STRING
    (
     attribute vec4 position; //vertex position
     attribute vec4 inputTextureCoordinate; //texture coordinate
     
     varying vec2 textureCoordinate; //passes the texture coordinate on to the fragment shader
     
     void main()
     {
         gl_Position = position; //assign the vertex position to the built-in gl_Position variable
         textureCoordinate = inputTextureCoordinate.xy; //pass the texture coordinate through the varying
     }
     );
    
    • Applying the saturation filter to an image
    1. Get the image
    //1. Get the image
        _jingImage = [UIImage imageNamed:@"jing.jpg"];
    
    2. Pick the filter you need (saturation here) and initialize it
    3. Create the data source from the static image
    4. Fetch the processed image and display it
    //2. Pick the filter
        if (_disFilter == nil) {
            //Create the filter only once, when _disFilter is still nil
            _disFilter = [[GPUImageSaturationFilter alloc] init];
        }
        
        //Set the saturation value (1.0 is the default; the slider overwrites it below)
        _disFilter.saturation = 1.0;
        
        //Constrain filter processing to the image's size
        [_disFilter forceProcessingAtSize:_jingImage.size];
        
        //Retain the result framebuffer so it can be read back as a UIImage afterwards
        [_disFilter useNextFrameForImageCapture];
        
        //Drive the filter's saturation from the slider
        _disFilter.saturation = sender.value;
        
        //3. Create the data source from the static image
        GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:_jingImage];
        
        //Attach the filter to the image source
        [stillImageSource addTarget:_disFilter];
        //Render the image through the filter chain
        [stillImageSource processImage];
        
        //4. Fetch the processed image
        UIImage *newImage = [_disFilter imageFromCurrentFramebuffer];
        
        //5. Display the new image in the image view
        _jingImageView.image = newImage;
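
    Two details worth noting in this snippet: useNextFrameForImageCapture must be called before processImage, otherwise imageFromCurrentFramebuffer has no retained framebuffer to read from (GPUImage recycles framebuffers aggressively through its framebuffer cache); and the initial saturation = 1.0 assignment is immediately overwritten by the slider value, so it only serves as a default before the slider first moves.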
    

    Applying a filter when taking a photo --- the grayscale filter

    • The grayscale (luminance) filter's fragment shader
    NSString *const kGPUImageLuminanceFragmentShaderString = SHADER_STRING
    (
     precision highp float; //default float precision for the shader
     
     varying vec2 textureCoordinate; //texture coordinate
     
     uniform sampler2D inputImageTexture; //the input texture
     
     const highp vec3 W = vec3(0.2125, 0.7154, 0.0721); //luminance weights
     
     void main()
     {
         lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate); //sample the texture color
         float luminance = dot(textureColor.rgb, W); //dot the RGB with the weights
         
         gl_FragColor = vec4(vec3(luminance), textureColor.a); //splat the luminance into RGB, keeping the original alpha
     }
    );
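
    The shader above is what GPUImageGrayscaleFilter runs. To actually take a photo through it, the usual pattern pairs GPUImageStillCamera with capturePhotoAsImageProcessedUpToFilter:withCompletionHandler:. A minimal sketch, assuming a view controller with a gpuImageView property for the live preview:

    GPUImageStillCamera *stillCamera =
        [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                            cameraPosition:AVCaptureDevicePositionBack];
    stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    
    GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
    [stillCamera addTarget:grayscaleFilter];
    [grayscaleFilter addTarget:self.gpuImageView]; //live filtered preview
    [stillCamera startCameraCapture];
    
    //Later, e.g. in a shutter-button action:
    [stillCamera capturePhotoAsImageProcessedUpToFilter:grayscaleFilter
                                  withCompletionHandler:^(UIImage *processedImage, NSError *error) {
        if (processedImage != nil) {
            UIImageWriteToSavedPhotosAlbum(processedImage, nil, nil, nil);
        }
    }];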
    
