Using the iOS Camera to Read Ambient Light Levels


Author: 西叶lv | Published 2017-05-03 09:54 · read 2,018 times

Without further ado, here is the code:

#import "LightSensitiveViewController.h"

@import AVFoundation;

#import <ImageIO/ImageIO.h>

@interface LightSensitiveViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (nonatomic, strong) AVCaptureSession *session;

@end

@implementation LightSensitiveViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    self.view.backgroundColor = [UIColor whiteColor];
    
    self.navigationItem.title = @"光感";
    [self lightSensitive];
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

#pragma mark - Light sensing
- (void)lightSensitive {
    
    // 1. Get the capture device (the back camera by default)
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // 2. Create the device input
    NSError *error = nil;
    AVCaptureDeviceInput *input = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (!input) {
        NSLog(@"Failed to create device input: %@", error);
        return;
    }
    
    // 3. Create the video data output; deliver sample buffers on the main queue
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    
    // 4. Configure the session with a high-quality capture preset
    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetHigh];
    // Add the session's input and output
    if ([self.session canAddInput:input]) {
        [self.session addInput:input];
    }
    if ([self.session canAddOutput:output]) {
        [self.session addOutput:output];
    }
    
    // 5. Start the session
    [self.session startRunning];
}

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    
    // Read the EXIF metadata attached to the sample buffer
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    
    NSLog(@"%f", brightnessValue);
    
    // Turn the torch on or off depending on brightnessValue
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    BOOL hasTorch = [device hasTorch]; // Check whether the device has a torch
    if ((brightnessValue < 0) && hasTorch) {
        // Too dark: turn the torch on
        if ([device lockForConfiguration:nil]) {
            [device setTorchMode:AVCaptureTorchModeOn];
            [device unlockForConfiguration];
        }
    } else if ((brightnessValue > 0) && hasTorch) {
        // Bright enough: turn the torch off
        if ([device lockForConfiguration:nil]) {
            [device setTorchMode:AVCaptureTorchModeOff];
            [device unlockForConfiguration];
        }
    }
}

@end
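One practical refinement, not in the original post: brightnessValue fluctuates from frame to frame, and the torch itself raises the measured brightness once it is on, so toggling at the single threshold of 0 can make the torch flicker. A hedged sketch using two thresholds instead (the cutoff values -1.0 and 1.0 are illustrative assumptions, not from the article):

```objectivec
// Sketch: hysteresis to avoid torch flicker. The thresholds are
// assumptions; tune them for your device and use case.
- (void)updateTorchForBrightness:(float)brightnessValue device:(AVCaptureDevice *)device {
    if (![device hasTorch]) return;
    if (brightnessValue < -1.0 && device.torchMode == AVCaptureTorchModeOff) {
        // Clearly dark and torch is off: turn it on
        if ([device lockForConfiguration:nil]) {
            device.torchMode = AVCaptureTorchModeOn;
            [device unlockForConfiguration];
        }
    } else if (brightnessValue > 1.0 && device.torchMode == AVCaptureTorchModeOn) {
        // Clearly bright and torch is on: turn it off
        if ([device lockForConfiguration:nil]) {
            device.torchMode = AVCaptureTorchModeOff;
            [device unlockForConfiguration];
        }
    }
    // Readings between the two thresholds leave the torch state unchanged.
}
```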

Notes:

  • First import the AVFoundation framework and the ImageIO/ImageIO.h header
  • Conform to the AVCaptureVideoDataOutputSampleBufferDelegate protocol
  • The AVCaptureSession object must be held in a property so that something keeps a strong reference to it; if it were defined and initialized as a local variable inside the lightSensitive method, it would be deallocated as soon as the method returned, and [self.session startRunning]; would have no effect
  • In the AVCaptureVideoDataOutputSampleBufferDelegate callback, brightnessValue is the ambient brightness reading; it ranges roughly from -5 to 12, and larger values mean a brighter environment
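One requirement the post does not mention: starting with iOS 10, the app must declare camera usage in its Info.plist or the capture session will not start. A minimal fragment (the description string here is just an example):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to measure ambient light.</string>
```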

Reference article:
iOS开发 读取环境光亮度 (iOS development: reading ambient light levels)


Comments

  • 小学生_daae: Author, how would I switch this to the front camera, i.e. read the light level through the front camera? Any advice appreciated
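    The question above can be answered by selecting a device by position instead of taking the default. A hedged sketch using the API available when this post was written (devicesWithMediaType: is deprecated as of iOS 10 in favor of AVCaptureDeviceDiscoverySession):

    ```objectivec
    // Sketch: pick the front camera by filtering on position,
    // then pass it to AVCaptureDeviceInput as in the article.
    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (d.position == AVCaptureDevicePositionFront) {
            frontCamera = d;
            break;
        }
    }
    ```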
  • Cocoa_NewBee: Thanks for the approach, but it seems to have a problem: if you point the lens at a dark (black) object, the torch turns on even in a well-lit environment. Scanners like Alipay's don't behave this way — can you explain why? Many thanks
    西叶lv: @robyzhou I haven't looked into that — have you? Worth exploring 😆
    robyzhou: @郝嘉律 I suspect Alipay also analyzes the front camera's image for this
    西叶lv: @Cocoa_NewBee Not yet, I'll look into it
  • EchoZuo: CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    Could someone walk through the logic of this code?
    codermali: @蜡泪再塑 Thanks
    kongkk: CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate); ---> returns the metadata the system attached to the captured frame as a CFDictionaryRef; it includes the ambient brightness value, the camera aperture, and so on.
    NSDictionary *metadata = [[NSMutableDictionary alloc] initWithDictionary:(__bridge NSDictionary *)metadataDict]; --> converts the CFDictionaryRef into an NSDictionary, which is easier to work with and read values from.
    CFRelease(metadataDict); --> releases the Core Foundation reference to avoid a memory leak.

    The last two lines just look values up by key. You can log the metadata dictionary to see the available keys, or use the constants the SDK defines (kCGImagePropertyExifDictionary, for example, corresponds to {Exif}).



Original link: https://www.haomeiwen.com/subject/bmlktxtx.html