Overview
GPUImage is a well-known open-source image-processing library that applies GPU-accelerated filters and other effects to images, video, and the camera. Unlike the Core Image framework, GPUImage lets you plug fully custom filters into its interfaces. Project page: https://github.com/BradLarson/GPUImage
This article walks through the source of the GPUImageTwoInputFilter, GPUImageThreeInputFilter, and GPUImageFourInputFilter classes. All three deal with multiple texture inputs, inherit directly or indirectly from GPUImageFilter, and are the classes to reach for when a filter needs more than one input texture. The source files covered are:
GPUImageTwoInputFilter
GPUImageThreeInputFilter
GPUImageFourInputFilter
Result
(screenshot of the rendered output omitted)
GPUImageTwoInputFilter
GPUImageTwoInputFilter accepts two framebuffer inputs and merges them into a single framebuffer output. Because it inherits from GPUImageFilter, it slots into a filter chain like any other filter.
- Instance variables. The key addition in GPUImageTwoInputFilter is secondInputFramebuffer, which receives the second input framebuffer, along with the related state for that second input.
@interface GPUImageTwoInputFilter : GPUImageFilter
{
// State for the second input framebuffer
GPUImageFramebuffer *secondInputFramebuffer;
GLint filterSecondTextureCoordinateAttribute;
GLint filterInputTextureUniform2;
GPUImageRotationMode inputRotation2;
CMTime firstFrameTime, secondFrameTime;
// Flags controlling how the two inputs are synchronized for rendering
BOOL hasSetFirstTexture, hasReceivedFirstFrame, hasReceivedSecondFrame, firstFrameWasVideo, secondFrameWasVideo;
BOOL firstFrameCheckDisabled, secondFrameCheckDisabled;
}
- Initializers. The vertex shader may be omitted (a default two-input vertex shader is substituted), but a fragment shader must be supplied.
// Initializers
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
// Initialize with only a fragment shader string
- (id)initWithFragmentShaderFromString:(NSString *)fragmentShaderString;
{
if (!(self = [self initWithVertexShaderFromString:kGPUImageTwoInputTextureVertexShaderString fragmentShaderFromString:fragmentShaderString]))
{
return nil;
}
return self;
}
// Initialize with explicit vertex and fragment shader strings
- (id)initWithVertexShaderFromString:(NSString *)vertexShaderString fragmentShaderFromString:(NSString *)fragmentShaderString;
{
if (!(self = [super initWithVertexShaderFromString:vertexShaderString fragmentShaderFromString:fragmentShaderString]))
{
return nil;
}
// Initialize the second-input state
inputRotation2 = kGPUImageNoRotation;
hasSetFirstTexture = NO;
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
firstFrameWasVideo = NO;
secondFrameWasVideo = NO;
firstFrameCheckDisabled = NO;
secondFrameCheckDisabled = NO;
firstFrameTime = kCMTimeInvalid;
secondFrameTime = kCMTimeInvalid;
runSynchronouslyOnVideoProcessingQueue(^{
[GPUImageContext useImageProcessingContext];
// Look up the attribute for the second set of texture coordinates
filterSecondTextureCoordinateAttribute = [filterProgram attributeIndex:@"inputTextureCoordinate2"];
// Look up the uniform for the second input texture
filterInputTextureUniform2 = [filterProgram uniformIndex:@"inputImageTexture2"]; // This does assume a name of "inputImageTexture2" for second input texture in the fragment shader
glEnableVertexAttribArray(filterSecondTextureCoordinateAttribute);
});
return self;
}
- Other methods. GPUImageTwoInputFilter adds little new API, but its overridden methods are where the two-input handling actually happens, notably - (void)newFrameReadyAtTime:atIndex: and - (void)renderToTextureWithVertices:textureCoordinates:.
// Added methods
- (void)disableFirstFrameCheck;
- (void)disableSecondFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
// Disable the check for the first texture input
- (void)disableFirstFrameCheck;
{
firstFrameCheckDisabled = YES;
}
// Disable the check for the second texture input
- (void)disableSecondFrameCheck;
{
secondFrameCheckDisabled = YES;
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// If both texture inputs have already been received, return early
if (hasReceivedFirstFrame && hasReceivedSecondFrame)
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
// Handle the first input texture
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
// If the second input's check is disabled, treat the second frame as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(secondFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
// If the first input's check is disabled, treat the first frame as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once both inputs have arrived, or once a movie frame has updated opposite a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame) || updatedMovieFrameOppositeStillImage)
{
CMTime passOnFrameTime = (!CMTIME_IS_INDEFINITE(firstFrameTime)) ? firstFrameTime : secondFrameTime;
[super newFrameReadyAtTime:passOnFrameTime atIndex:0]; // Bugfix when trying to record: always use time from first input (unless indefinite, in which case use the second input)
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
}
}
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first input texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second input texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
GPUImageThreeInputFilter
GPUImageThreeInputFilter accepts three framebuffer inputs and merges them into a single framebuffer output. It inherits from GPUImageTwoInputFilter and follows the same pattern, adding the handling for a third input framebuffer.
- Instance variables. The additions are the state for the third input framebuffer.
@interface GPUImageThreeInputFilter : GPUImageTwoInputFilter
{
GPUImageFramebuffer *thirdInputFramebuffer;
GLint filterThirdTextureCoordinateAttribute;
GLint filterInputTextureUniform3;
GPUImageRotationMode inputRotation3;
GLuint filterSourceTexture3;
CMTime thirdFrameTime;
BOOL hasSetSecondTexture, hasReceivedThirdFrame, thirdFrameWasVideo;
BOOL thirdFrameCheckDisabled;
}
- Method list.
// Added method
- (void)disableThirdFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
- (void)disableThirdFrameCheck;
{
thirdFrameCheckDisabled = YES;
}
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first input texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second input texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
// Bind the third input texture
glActiveTexture(GL_TEXTURE4);
glBindTexture(GL_TEXTURE_2D, [thirdInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform3, 4);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glVertexAttribPointer(filterThirdTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation3]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// You can set up infinite update loops, so this helps to short circuit them
if (hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame)
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
// If the second input's check is disabled, treat the second frame as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
// If the third input's check is disabled, treat the third frame as already received
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(secondFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 1)
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
// If the first input's check is disabled, treat the first frame as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
// If the third input's check is disabled, treat the third frame as already received
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedThirdFrame = YES;
thirdFrameTime = frameTime;
// If the first input's check is disabled, treat the first frame as already received
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
// If the second input's check is disabled, treat the second frame as already received
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once all three inputs have arrived, or once a movie frame has updated opposite a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame) || updatedMovieFrameOppositeStillImage)
{
static const GLfloat imageVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
[self renderToTextureWithVertices:imageVertices textureCoordinates:[[self class] textureCoordinatesForRotation:inputRotation]];
[self informTargetsAboutNewFrameAtTime:frameTime];
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
hasReceivedThirdFrame = NO;
}
}
GPUImageFourInputFilter
GPUImageFourInputFilter accepts four framebuffer inputs and merges them into a single framebuffer output. It inherits from GPUImageThreeInputFilter and, like the two- and three-input variants, adds the handling for a fourth input framebuffer.
- Instance variables. The additions are the state for the fourth input framebuffer.
@interface GPUImageFourInputFilter : GPUImageThreeInputFilter
{
GPUImageFramebuffer *fourthInputFramebuffer;
GLint filterFourthTextureCoordinateAttribute;
GLint filterInputTextureUniform4;
GPUImageRotationMode inputRotation4;
GLuint filterSourceTexture4;
CMTime fourthFrameTime;
BOOL hasSetThirdTexture, hasReceivedFourthFrame, fourthFrameWasVideo;
BOOL fourthFrameCheckDisabled;
}
- Method list. The key methods of GPUImageFourInputFilter mirror the logic of GPUImageTwoInputFilter and GPUImageThreeInputFilter.
// Added method
- (void)disableFourthFrameCheck;
// Overridden methods
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
- (void)disableFourthFrameCheck;
{
fourthFrameCheckDisabled = YES;
}
- (void)renderToTextureWithVertices:(const GLfloat *)vertices textureCoordinates:(const GLfloat *)textureCoordinates;
{
if (self.preventRendering)
{
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
[fourthInputFramebuffer unlock];
return;
}
[GPUImageContext setActiveShaderProgram:filterProgram];
outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:[self sizeOfFBO] textureOptions:self.outputTextureOptions onlyTexture:NO];
[outputFramebuffer activateFramebuffer];
if (usingNextFrameForImageCapture)
{
[outputFramebuffer lock];
}
[self setUniformsForProgramAtIndex:0];
glClearColor(backgroundColorRed, backgroundColorGreen, backgroundColorBlue, backgroundColorAlpha);
glClear(GL_COLOR_BUFFER_BIT);
// Bind the first input texture
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, [firstInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform, 2);
// Bind the second input texture
glActiveTexture(GL_TEXTURE3);
glBindTexture(GL_TEXTURE_2D, [secondInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform2, 3);
// Bind the third input texture
glActiveTexture(GL_TEXTURE4);
glBindTexture(GL_TEXTURE_2D, [thirdInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform3, 4);
// Bind the fourth input texture
glActiveTexture(GL_TEXTURE5);
glBindTexture(GL_TEXTURE_2D, [fourthInputFramebuffer texture]);
glUniform1i(filterInputTextureUniform4, 5);
glVertexAttribPointer(filterPositionAttribute, 2, GL_FLOAT, 0, 0, vertices);
glVertexAttribPointer(filterTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, textureCoordinates);
glVertexAttribPointer(filterSecondTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation2]);
glVertexAttribPointer(filterThirdTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation3]);
glVertexAttribPointer(filterFourthTextureCoordinateAttribute, 2, GL_FLOAT, 0, 0, [[self class] textureCoordinatesForRotation:inputRotation4]);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
[firstInputFramebuffer unlock];
[secondInputFramebuffer unlock];
[thirdInputFramebuffer unlock];
[fourthInputFramebuffer unlock];
if (usingNextFrameForImageCapture)
{
dispatch_semaphore_signal(imageCaptureSemaphore);
}
}
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;
{
// You can set up infinite update loops, so this helps to short circuit them
if (hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame)
{
return;
}
BOOL updatedMovieFrameOppositeStillImage = NO;
if (textureIndex == 0)
{
hasReceivedFirstFrame = YES;
firstFrameTime = frameTime;
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(secondFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 1)
{
hasReceivedSecondFrame = YES;
secondFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else if (textureIndex == 2)
{
hasReceivedThirdFrame = YES;
thirdFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (fourthFrameCheckDisabled)
{
hasReceivedFourthFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
else
{
hasReceivedFourthFrame = YES;
fourthFrameTime = frameTime;
if (firstFrameCheckDisabled)
{
hasReceivedFirstFrame = YES;
}
if (secondFrameCheckDisabled)
{
hasReceivedSecondFrame = YES;
}
if (thirdFrameCheckDisabled)
{
hasReceivedThirdFrame = YES;
}
if (!CMTIME_IS_INDEFINITE(frameTime))
{
if (CMTIME_IS_INDEFINITE(firstFrameTime))
{
updatedMovieFrameOppositeStillImage = YES;
}
}
}
// || (hasReceivedFirstFrame && secondFrameCheckDisabled) || (hasReceivedSecondFrame && firstFrameCheckDisabled)
// Render once all four inputs have arrived, or once a movie frame has updated opposite a still image
if ((hasReceivedFirstFrame && hasReceivedSecondFrame && hasReceivedThirdFrame && hasReceivedFourthFrame) || updatedMovieFrameOppositeStillImage)
{
static const GLfloat imageVertices[] = {
-1.0f, -1.0f,
1.0f, -1.0f,
-1.0f, 1.0f,
1.0f, 1.0f,
};
[self renderToTextureWithVertices:imageVertices textureCoordinates:[[self class] textureCoordinatesForRotation:inputRotation]];
[self informTargetsAboutNewFrameAtTime:frameTime];
hasReceivedFirstFrame = NO;
hasReceivedSecondFrame = NO;
hasReceivedThirdFrame = NO;
hasReceivedFourthFrame = NO;
}
}
Implementation
- Load an image, run it through two filters in parallel, feed both results into a GPUImageTwoInputFilter, and display the merged output in a GPUImageView.
// Source image
GPUImagePicture *picture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"1.jpg"]];
// Filters
GPUImageCannyEdgeDetectionFilter *cannyFilter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
[picture addTarget:cannyFilter];
[picture addTarget:gammaFilter];
// GPUImageTwoInputFilter
GPUImageTwoInputFilter *twoFilter = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:kTwoInputFragmentShaderString];
[cannyFilter addTarget:twoFilter];
[gammaFilter addTarget:twoFilter];
[twoFilter addTarget:_imageView];
[picture processImage];
The fragment shader:
NSString *const kTwoInputFragmentShaderString = SHADER_STRING
(
varying highp vec2 textureCoordinate;
varying highp vec2 textureCoordinate2;
uniform sampler2D inputImageTexture;
uniform sampler2D inputImageTexture2;
void main()
{
highp vec4 oneInputColor = texture2D(inputImageTexture, textureCoordinate);
highp vec4 twoInputColor = texture2D(inputImageTexture2, textureCoordinate2);
highp float range = distance(textureCoordinate, vec2(0.5, 0.5));
highp vec4 dstColor = oneInputColor;
if (range < 0.25) {
dstColor = twoInputColor;
} else {
//dstColor = vec4(vec3(1.0 - oneInputColor), 1.0);
if (oneInputColor.r < 0.001 && oneInputColor.g < 0.001 && oneInputColor.b < 0.001) {
dstColor = vec4(1.0);
} else {
dstColor = vec4(1.0, 0.0, 0.0, 1.0);
}
}
gl_FragColor = dstColor;
}
);
Summary
GPUImageTwoInputFilter, GPUImageThreeInputFilter, and GPUImageFourInputFilter come in handy whenever the outputs of several filters need to be merged, and they are straightforward to use. GPUImage itself already ships many filters that inherit from these classes to consume multiple texture inputs.
Annotated source: GPUImage source-reading series https://github.com/QinminiOS/GPUImage
Series index: GPUImage source reading http://www.jianshu.com/nb/11749791