Notes on UIImage and UIColor

Author: Baby小猪 | Published 2017-06-26 15:38

    I recently hit a problem in a project: the background image was too white, which washed out a button in the view and made for a poor user experience.

    [Image: the disappeared button.png]

    Clearly, a user would have a hard time discovering that a button was hiding in the bottom-left corner.

    • Solution 1:
      1. Extract the dominant color of the background image.
      2. If the dominant color is white, change the button's color.
    //Extract the dominant color of an image
    + (UIColor *)mainColor:(UIImage *)image{
        CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;
        //Scale the image down to halve the work
        CGSize thumbSize = CGSizeMake(image.size.width/2, image.size.height/2);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(NULL, thumbSize.width, thumbSize.height, 8, thumbSize.width*4, colorSpace, bitmapInfo);
        CGRect drawRect = CGRectMake(0, 0, thumbSize.width, thumbSize.height);
        CGContextDrawImage(context, drawRect, image.CGImage);
        CGColorSpaceRelease(colorSpace);
        
        //Read back the pixels
        unsigned char *data = CGBitmapContextGetData(context);
        if (data == NULL) {
            CGContextRelease(context);
            return nil;
        }
        NSCountedSet *cls = [NSCountedSet setWithCapacity:thumbSize.width*thumbSize.height];
        
        for (int x = 0; x < thumbSize.width; x++) {
            for (int y = 0; y < thumbSize.height; y++) {
                //Row-major offset into the RGBA buffer: 4 bytes per pixel
                int offset = 4*(y*(int)thumbSize.width + x);
                int red   = data[offset];
                int green = data[offset+1];
                int blue  = data[offset+2];
                int alpha = data[offset+3];
                if (alpha > 0) {//skip fully transparent pixels
                    if (red == 255 && green == 255 && blue == 255) {
                        //skip white pixels
                    } else {
                        NSArray *clr = @[@(red), @(green), @(blue), @(alpha)];
                        [cls addObject:clr];
                    }
                }
            }
        }
        CGContextRelease(context);
        //Step 3: find the color that occurs most often
        NSEnumerator *enumerator = [cls objectEnumerator];
        NSArray *curColor = nil;
        NSArray *maxColor = nil;
        NSUInteger maxCount = 0;
        while ((curColor = [enumerator nextObject]) != nil)
        {
            NSUInteger tmpCount = [cls countForObject:curColor];
            if (tmpCount < maxCount) continue;
            maxCount = tmpCount;
            maxColor = curColor;
        }
        if (maxColor == nil) return nil;
        return [UIColor colorWithRed:([maxColor[0] intValue]/255.0f) green:([maxColor[1] intValue]/255.0f) blue:([maxColor[2] intValue]/255.0f) alpha:([maxColor[3] intValue]/255.0f)];
    }
    
    
    

    This handles the case where the background's dominant color is white and hides the button. But what if the dominant color is not white, yet a white region happens to sit exactly where the button is? That led to:

    • Solution 2:
      Strip the white background from the image and replace it with another color, or with transparency.
    //Strip the white background from an image
    //Release callback so the pixel buffer is freed along with the data provider
    static void releasePixelData(void *info, const void *data, size_t size) {
        free((void *)data);
    }

    + (UIImage *)imageToTransparent:(UIImage *)image
    {
        // Allocate a buffer for the raw pixels
        const int imageWidth = image.size.width;
        const int imageHeight = image.size.height;
        size_t bytesPerRow = imageWidth * 4;
        uint32_t *rgbImageBuf = (uint32_t *)malloc(bytesPerRow * imageHeight);
        
        // Create a context backed by the buffer and draw the image into it
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(rgbImageBuf, imageWidth, imageHeight, 8, bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
        CGContextDrawImage(context, CGRectMake(0, 0, imageWidth, imageHeight), image.CGImage);
        
        // Walk the pixels
        int pixelNum = imageWidth * imageHeight;
        uint32_t *pCurPtr = rgbImageBuf;
        for (int i = 0; i < pixelNum; i++, pCurPtr++)
        {
            // To match exact white only (or substitute another color instead
            // of 0xFFFFFF00), you could mask the whole pixel:
            // if ((*pCurPtr & 0xFFFFFF00) >= 0xffffff00) {
            //     uint8_t* ptr = (uint8_t*)pCurPtr;
            //     ptr[0] = 0;
            // }
            
            // Near-white: treat each pixel as a byte array. With this byte
            // order, ptr[0] is the alpha slot and ptr[1..3] are the color
            // channels. If all three channels exceed 240, the pixel is close
            // enough to white, so set its alpha to 0. Thresholding (rather
            // than matching pure white) also cleans up the noisy pixels that
            // a real-world white background contains.
            uint8_t *ptr = (uint8_t *)pCurPtr;
            if (ptr[1] > 240 && ptr[2] > 240 && ptr[3] > 240) {
                ptr[0] = 0;
            }
        }
        
        // Wrap the buffer in a CGImage (the release callback frees the buffer)
        CGDataProviderRef dataProvider = CGDataProviderCreateWithData(NULL, rgbImageBuf, bytesPerRow * imageHeight, releasePixelData);
        CGImageRef imageRef = CGImageCreate(imageWidth, imageHeight, 8, 32, bytesPerRow, colorSpace,
                                            kCGImageAlphaLast | kCGBitmapByteOrder32Little, dataProvider,
                                            NULL, true, kCGRenderingIntentDefault);
        CGDataProviderRelease(dataProvider);
        UIImage *resultUIImage = [UIImage imageWithCGImage:imageRef];
        
        // Clean up
        CGImageRelease(imageRef);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        
        return resultUIImage;
    }
    
    
    

    This, however, creates a new problem: it outright destroys the original background image, so its viability is zero.

    Although I haven't yet found a better, workable approach, the search unexpectedly turned up several nice image and color utilities, listed below.
    • func1
    //Invert an image's colors (GPUImage)
    + (UIImage *)applyColorInvertFilter:(UIImage *)image{
        GPUImageColorInvertFilter *filter = [[GPUImageColorInvertFilter alloc] init];
        [filter forceProcessingAtSize:image.size];
        GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
        [pic addTarget:filter];
        [pic processImage];
        [filter useNextFrameForImageCapture];
        
        return [filter imageFromCurrentFramebuffer];
    }
    
    
    
    • func2
    //Sepia-tone an image (note: GPUImageSepiaFilter produces sepia, not true grayscale; use GPUImageGrayscaleFilter for grayscale)
    + (UIImage *)applySepiaFilter:(UIImage *)image
    {
        GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
        [filter forceProcessingAtSize:image.size];
        GPUImagePicture *pic = [[GPUImagePicture alloc] initWithImage:image];
        [pic addTarget:filter];
        [pic processImage];
        [filter useNextFrameForImageCapture];
        
        return [filter imageFromCurrentFramebuffer];
    }
    
    
    • func3
    //Gaussian-style blur via vImage (no white fringe at the edges)
    + (UIImage *)imageWithGaussianBlur:(UIImage *)image float:(CGFloat)blur{
        if (blur < 0.f || blur > 1.f) {
            blur = 0.5f;
        }
        int boxSize = (int)(blur * 40);
        boxSize = boxSize - (boxSize % 2) + 1;
        
        CGImageRef img = image.CGImage;
        
        vImage_Buffer inBuffer, outBuffer;
        vImage_Error error;
        
        void *pixelBuffer;
        //Get the raw pixel data out of the CGImage
        CGDataProviderRef inProvider = CGImageGetDataProvider(img);
        CFDataRef inBitmapData = CGDataProviderCopyData(inProvider);
        //Describe the source buffer for vImage
        inBuffer.width = CGImageGetWidth(img);
        inBuffer.height = CGImageGetHeight(img);
        inBuffer.rowBytes = CGImageGetBytesPerRow(img);
        
        inBuffer.data = (void*)CFDataGetBytePtr(inBitmapData);
        
        pixelBuffer = malloc(CGImageGetBytesPerRow(img) *
                             CGImageGetHeight(img));
        
        if(pixelBuffer == NULL)
            NSLog(@"No pixelbuffer");
        
        outBuffer.data = pixelBuffer;
        outBuffer.width = CGImageGetWidth(img);
        outBuffer.height = CGImageGetHeight(img);
        outBuffer.rowBytes = CGImageGetBytesPerRow(img);
        
        error = vImageBoxConvolve_ARGB8888(&inBuffer, &outBuffer, NULL, 0, 0, boxSize, boxSize, NULL, kvImageEdgeExtend);
        
        if (error) {
            NSLog(@"error from convolution %ld", error);
        }
        
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(
                                                 outBuffer.data,
                                                 outBuffer.width,
                                                 outBuffer.height,
                                                 8,
                                                 outBuffer.rowBytes,
                                                 colorSpace,
                                                 kCGImageAlphaNoneSkipLast);
        CGImageRef imageRef = CGBitmapContextCreateImage (ctx);
        UIImage *returnImage = [UIImage imageWithCGImage:imageRef];
        
        //clean up
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        
        free(pixelBuffer);
        CFRelease(inBitmapData);
        
        CGImageRelease(imageRef);
        
        return returnImage;
    }
    
    

    A special note: I tried Core Image and found that its rendering blocks the main thread, and worse, it leaves a white fringe. So I lean toward vImage; its algorithm is more efficient and the result is cleaner.

    • func4
    //Decide whether a color is dark or light
    + (BOOL)isDarkColor:(UIColor *)newColor{
        
        //Caveat: CGColorGetComponents returns only two components for
        //grayscale colors (e.g. [UIColor whiteColor]), so convert such
        //colors to an RGB color space before calling this.
        const CGFloat *componentColors = CGColorGetComponents(newColor.CGColor);
        CGFloat colorBrightness = ((componentColors[0] * 299) + (componentColors[1] * 587) + (componentColors[2] * 114)) / 1000;
        if (colorBrightness < 0.5){
            NSLog(@"Color is dark");
            return YES;
        }
        else{
            NSLog(@"Color is light");
            return NO;
        }
    }
    
    
    • func5
    //Frosted-glass effect (iOS 8+)
    UIBlurEffect *effect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleLight];
    UIVisualEffectView *effectView = [[UIVisualEffectView alloc] initWithEffect:effect];
    effectView.alpha = 0.9;
    effectView.frame = CGRectMake(0, 0, imageView.frame.size.width*0.5, imageView.frame.size.height);
    imageView.image = image;
    [imageView addSubview:effectView];
    
    • func6
    //Render a UIColor into a 1x1 UIImage
    + (UIImage *)imageWithColor:(UIColor *)color {
        CGRect rect = CGRectMake(0.0f, 0.0f, 1.0f, 1.0f);
        UIGraphicsBeginImageContext(rect.size);
        CGContextRef context = UIGraphicsGetCurrentContext();
        
        CGContextSetFillColorWithColor(context, [color CGColor]);
        CGContextFillRect(context, rect);
        
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        
        return image;
    }
    
    

    Several of the functions above also rely on GPUImage, a library everyone knows well. Anyone who has built filters will be familiar with it, so I won't belabor it here.

    A train of thought set off by one user-experience problem~

    Original link: https://www.haomeiwen.com/subject/hnkjcxtx.html