Converting Between Video and Image Frame Sequences

Author: KevinTing | Published 2016-10-31 14:59

    Project repository: https://github.com/tujinqiu/KTMovieImagesTransfer

    If you have a sequence of consecutive images, converting it to an MP4 before sending it over the network is a good choice, because the file size is much smaller. Extracting a sequence of consecutive frames from a video is also a problem that comes up occasionally. Below I try to solve both problems with the native iOS APIs, with FFmpeg, and with OpenCV.

    1. The native approach

    The native approach uses the APIs of the AVFoundation framework for both conversions.
    1. Extracting frames from a video

    - (NSError *)nativeTransferMovie:(NSString *)movie toImagesAtPath:(NSString *)imagesPath
    {
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:movie] options:nil];
        AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        CMTime time = asset.duration;
        NSUInteger totalFrameCount = CMTimeGetSeconds(time) * kKTImagesMovieTransferFPS;
        NSMutableArray *timesArray = [NSMutableArray arrayWithCapacity:totalFrameCount];
        for (NSUInteger ii = 0; ii < totalFrameCount; ++ii) {
            CMTime timeFrame = CMTimeMake(ii, kKTImagesMovieTransferFPS);
            NSValue *timeValue = [NSValue valueWithCMTime:timeFrame];
            [timesArray addObject:timeValue];
        }
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        generator.requestedTimeToleranceAfter = kCMTimeZero;
        __block NSError *returnError = nil;
        [generator generateCGImagesAsynchronouslyForTimes:timesArray completionHandler:^(CMTime requestedTime, CGImageRef  _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
            switch (result) {
                    
                case AVAssetImageGeneratorFailed:
                    returnError = error;
                    [self sendToMainThreadError:returnError];
                    break;
                    
                case AVAssetImageGeneratorSucceeded:
                {
                    NSString *imageFile = [imagesPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%lld.jpg", requestedTime.value]];
                    NSData *data = UIImageJPEGRepresentation([UIImage imageWithCGImage:image], 1.0);
                    if ([data writeToFile:imageFile atomically:YES]) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            if ([self.delegate respondsToSelector:@selector(transfer:didTransferedAtIndex:totalFrameCount:)]) {
                                [self.delegate transfer:self didTransferedAtIndex:requestedTime.value totalFrameCount:totalFrameCount];
                            }
                        });
                        NSUInteger index = requestedTime.value;
                        if (index == totalFrameCount - 1) {
                            if ([self.delegate respondsToSelector:@selector(transfer:didFinishedWithError:)]) {
                                [self.delegate transfer:self didFinishedWithError:nil];
                            }
                        }
                    } else {
                        returnError = [self errorWithErrorCode:KTTransferWriteError object:imageFile];
                        [self sendToMainThreadError:returnError];
                        [generator cancelAllCGImageGeneration];
                    }
                }
                    break;
                    
                default:
                    break;
            }
        }];
        
        return returnError;
    }
    

    The extraction is done with AVAssetImageGenerator. Note that before calling generateCGImagesAsynchronouslyForTimes:completionHandler:, the total frame count is computed from the frame rate and the video duration, and one CMTime is built for each frame.
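
    The code above and below relies on a project-wide frame-rate constant kKTImagesMovieTransferFPS and a background queue KTImagesMovieTransferQueue(); both are defined in the project rather than in this post. A minimal sketch of plausible definitions (the 30 fps value is an assumption, not the project's actual setting) looks like this:

    // Assumed definitions -- the real ones live in the KTMovieImagesTransfer project.
    static const int32_t kKTImagesMovieTransferFPS = 30; // assumed frame rate used in both directions

    static dispatch_queue_t KTImagesMovieTransferQueue(void)
    {
        // A serial background queue so the conversion work stays off the main thread
        static dispatch_queue_t queue = NULL;
        static dispatch_once_t onceToken;
        dispatch_once(&onceToken, ^{
            queue = dispatch_queue_create("KTImagesMovieTransferQueue", DISPATCH_QUEUE_SERIAL);
        });
        return queue;
    }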

    2. Encoding a frame sequence into a video

    - (NSError *)nativeTransferImageFiles:(NSArray<NSString *> *)imageFiles toMovie:(NSString *)movie
    {
        __block NSError *returnError = nil;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:movie] fileType:AVFileTypeQuickTimeMovie error:&returnError];
        if (returnError) {
            [self sendToMainThreadError:returnError];
            return returnError;
        }
        UIImage *firstImage = [UIImage imageWithContentsOfFile:[imageFiles firstObject]];
        if (!firstImage) {
            returnError = [self errorWithErrorCode:KTTransferReadImageError object:[imageFiles firstObject]];
            [self sendToMainThreadError:returnError];
            return returnError;
        }
        CGSize size = firstImage.size;
        // H.264 output settings
        NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264,
                                        AVVideoWidthKey: [NSNumber numberWithInt:size.width],
                                        AVVideoHeightKey: [NSNumber numberWithInt:size.height]};
        AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                             outputSettings:videoSettings];
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
        
        dispatch_async(KTImagesMovieTransferQueue(), ^{
            [videoWriter addInput:writerInput];
            [videoWriter startWriting];
            [videoWriter startSessionAtSourceTime:kCMTimeZero];
            UIImage *tmpImage = nil;
            NSUInteger index = 0;
            while (index < imageFiles.count) {
                // Only move on to the next frame once the writer input can accept more data;
                // otherwise frames would be silently skipped.
                if (!writerInput.readyForMoreMediaData) {
                    [NSThread sleepForTimeInterval:0.01];
                    continue;
                }
                CMTime presentTime = CMTimeMake(index, kKTImagesMovieTransferFPS);
                tmpImage = [UIImage imageWithContentsOfFile:[imageFiles objectAtIndex:index]];
                if (!tmpImage) {
                    returnError = [self errorWithErrorCode:KTTransferReadImageError object:[imageFiles objectAtIndex:index]];
                    [self sendToMainThreadError:returnError];
                    return;
                }
                CVPixelBufferRef buffer = [self pixelBufferFromCGImage:[tmpImage CGImage] size:size];
                if (buffer) {
                    [self appendToAdapter:adaptor pixelBuffer:buffer atTime:presentTime withInput:writerInput];
                    CFRelease(buffer);
                } else {
                    // Finish the session if no pixel buffer could be created
                    [writerInput markAsFinished];
                    [videoWriter finishWritingWithCompletionHandler:^{
                    }];
                    returnError = [self errorWithErrorCode:KTTransferGetBufferError object:nil];
                    [self sendToMainThreadError:returnError];
                    return;
                }
                dispatch_async(dispatch_get_main_queue(), ^{
                    if ([self.delegate respondsToSelector:@selector(transfer:didTransferedAtIndex:totalFrameCount:)]) {
                        [self.delegate transfer:self didTransferedAtIndex:index totalFrameCount:imageFiles.count];
                    }
                });
                index++;
            }
            // Finish the session
            [writerInput markAsFinished];
            [videoWriter finishWritingWithCompletionHandler:^{
                if (videoWriter.status != AVAssetWriterStatusCompleted) {
                    returnError = videoWriter.error;
                    [self sendToMainThreadError:returnError];
                } else {
                    if ([self.delegate respondsToSelector:@selector(transfer:didFinishedWithError:)]) {
                        [self.delegate transfer:self didFinishedWithError:nil];
                    }
                }
            }];
        });
        
        return returnError;
    }
    

    This uses AVAssetWriter and AVAssetWriterInput to turn the images into a video; the settings dictionary passed to AVAssetWriterInput is how the output video's properties (codec, width, height) are configured.
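
    The helper methods pixelBufferFromCGImage:size: and appendToAdapter:pixelBuffer:atTime:withInput: called above belong to the project and are not reproduced in this post. A minimal sketch of what they could look like (an assumption based on the call sites, not the project's exact code):

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
    {
        NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                                  (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, (size_t)size.width, (size_t)size.height,
                                              kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options, &pixelBuffer);
        if (status != kCVReturnSuccess || pixelBuffer == NULL) {
            return NULL;
        }
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        // Draw the CGImage into the pixel buffer's backing memory
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                     (size_t)size.width, (size_t)size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     colorSpace, kCGImageAlphaNoneSkipFirst);
        CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        return pixelBuffer;
    }

    - (void)appendToAdapter:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
                pixelBuffer:(CVPixelBufferRef)buffer
                     atTime:(CMTime)presentTime
                  withInput:(AVAssetWriterInput *)writerInput
    {
        // Block until the input can accept another sample, then append the frame
        while (!writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.01];
        }
        [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
    }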

    2. OpenCV

    1. Adding OpenCV to the project
    Download the prebuilt iOS framework from the official OpenCV site and drop it into the project.

    If you then get compile errors in files that include OpenCV headers, the reason is that Objective-C by default supports only C, not C++, while OpenCV uses C++. Rename the affected .m files to .mm so they are compiled as Objective-C++.
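
    For reference, a minimal sketch of how the imports in such a .mm file are typically ordered (the specific import list is an assumption, not taken from the project; the OpenCV umbrella header goes before Apple headers to avoid macro conflicts):

    // In a .mm (Objective-C++) file: OpenCV first, Apple frameworks afterwards
    #import <opencv2/opencv.hpp>
    #import <Foundation/Foundation.h>
    #import <UIKit/UIKit.h>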
    If you see the following linker error:

    Undefined symbols for architecture x86_64:
      "_jpeg_free_large", referenced from:
          _free_pool in opencv2(jmemmgr.o)
      "_jpeg_free_small", referenced from:
          _free_pool in opencv2(jmemmgr.o)
          _self_destruct in opencv2(jmemmgr.o)
      "_jpeg_get_large", referenced from:
          _alloc_large in opencv2(jmemmgr.o)
          _realize_virt_arrays in opencv2(jmemmgr.o)
      "_jpeg_get_small", referenced from:
          _jinit_memory_mgr in opencv2(jmemmgr.o)
          _alloc_small in opencv2(jmemmgr.o)
      "_jpeg_mem_available", referenced from:
          _realize_virt_arrays in opencv2(jmemmgr.o)
      "_jpeg_mem_init", referenced from:
          _jinit_memory_mgr in opencv2(jmemmgr.o)
      "_jpeg_mem_term", referenced from:
          _jinit_memory_mgr in opencv2(jmemmgr.o)
          _self_destruct in opencv2(jmemmgr.o)
      "_jpeg_open_backing_store", referenced from:
          _realize_virt_arrays in opencv2(jmemmgr.o)
    ld: symbol(s) not found for architecture x86_64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    

    This is caused by the missing libjpeg library. First download and install libjpeg-turbo from https://sourceforge.net/projects/libjpeg-turbo/files/ . After installation, run lipo -info /opt/libjpeg-turbo/lib/libjpeg.a: it shows that the libjpeg.a at this path contains slices for each processor architecture. Add that file to the project.


    If you see the following error:

    Undefined symbols for architecture x86_64:
      "_CMSampleBufferGetImageBuffer", referenced from:
          -[CaptureDelegate captureOutput:didOutputSampleBuffer:fromConnection:] in opencv2(cap_avfoundation.o)
          CvCaptureFile::retrieveFramePixelBuffer() in opencv2(cap_avfoundation.o)
      "_CMSampleBufferInvalidate", referenced from:
          CvCaptureFile::retrieveFramePixelBuffer() in opencv2(cap_avfoundation.o)
      "_CMTimeGetSeconds", referenced from:
          -[KTImagesMovieTransfer nativeTransferMovie:toImagesAtPath:] in KTImagesMovieTransfer.o
    

    This is caused by the missing CoreMedia.framework; add it to the project and the error goes away.



    2. Extracting frames from a video
    The OpenCV version is very concise: a while loop keeps grabbing frames, and each one is written to disk:

    - (NSError *)opencvTransferMovie:(NSString *)movie toImagesAtPath:(NSString *)imagesPath
    {
        __block NSError *returnError = nil;
        dispatch_async(KTImagesMovieTransferQueue(), ^{
            CvCapture *pCapture = cvCaptureFromFile(movie.UTF8String);
            // cvGetCaptureProperty only reads the frame count from the video's header, so it can be wrong:
            // NSUInteger totalFrameCount = cvGetCaptureProperty(pCapture, CV_CAP_PROP_FRAME_COUNT);
            // Instead, walk through the file twice, as below.
            NSUInteger totalFrameCount = 0;
            while (cvQueryFrame(pCapture)) {
                totalFrameCount ++;
            }
            if (pCapture) {
                cvReleaseCapture(&pCapture);
            }
            pCapture = cvCaptureFromFile(movie.UTF8String);
            NSUInteger index = 0;
            IplImage *pGrabImg = NULL;
            while ((pGrabImg = cvQueryFrame(pCapture))) {
                NSString *imagePath = [imagesPath stringByAppendingPathComponent:[NSString stringWithFormat:@"%lu.jpg", index]];
                cvSaveImage(imagePath.UTF8String, pGrabImg);
                dispatch_async(dispatch_get_main_queue(), ^{
                    if ([self.delegate respondsToSelector:@selector(transfer:didTransferedAtIndex:totalFrameCount:)]) {
                        [self.delegate transfer:self didTransferedAtIndex:index totalFrameCount:totalFrameCount];
                    }
                });
                index++;
            }
            if (pCapture) {
                cvReleaseCapture(&pCapture);
            }
            if (index == totalFrameCount) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if ([self.delegate respondsToSelector:@selector(transfer:didFinishedWithError:)]) {
                        [self.delegate transfer:self didFinishedWithError:nil];
                    }
                });
            } else {
                returnError = [self errorWithErrorCode:KTTransferOpencvWrongFrameCountError object:nil];
                [self sendToMainThreadError:returnError];
            }
        });
        
        return returnError;
    }
    

    Note that OpenCV does have a function for querying video properties, cvGetCaptureProperty, but the frame count it reports is often wrong, because OpenCV only reads it from the video's header information, which may not match the actual number of frames. The code above therefore walks the file twice: the first pass counts the frames, the second pass grabs each frame and writes it to disk. Also note that the IplImage returned by cvQueryFrame must not be released by the caller; it is enough to call cvReleaseCapture(&pCapture) once at the end.

    3. Encoding a frame sequence into a video
    Converting images into a video with OpenCV is just as straightforward:

    - (NSError *)opencvTransferImageFiles:(NSArray<NSString *> *)imageFiles toMovie:(NSString *)movie
    {
        __block NSError *returnError = nil;
        UIImage *firstImage = [UIImage imageWithContentsOfFile:[imageFiles firstObject]];
        if (!firstImage) {
            returnError = [self errorWithErrorCode:KTTransferReadImageError object:[imageFiles firstObject]];
            [self sendToMainThreadError:returnError];
            return returnError;
        }
        CvSize size = cvSize(firstImage.size.width, firstImage.size.height);
        dispatch_async(KTImagesMovieTransferQueue(), ^{
            // OpenCV does not support H.264 out of the box (it can be done by other means), so an MPEG-4 (DIVX) codec is used here
            CvVideoWriter *pWriter = cvCreateVideoWriter(movie.UTF8String, CV_FOURCC('D', 'I', 'V', 'X'), (double)kKTImagesMovieTransferFPS, size);
            for (NSUInteger ii = 0; ii < imageFiles.count; ++ii) {
                NSString *imageFile = [imageFiles objectAtIndex:ii];
                IplImage *pImage = cvLoadImage(imageFile.UTF8String);
                if (pImage) {
                    cvWriteFrame(pWriter, pImage);
                    cvReleaseImage(&pImage);
                    dispatch_async(dispatch_get_main_queue(), ^{
                        if ([self.delegate respondsToSelector:@selector(transfer:didTransferedAtIndex:totalFrameCount:)]) {
                            [self.delegate transfer:self didTransferedAtIndex:ii totalFrameCount:imageFiles.count];
                        }
                    });
                } else {
                    returnError = [self errorWithErrorCode:KTTransferReadImageError object:imageFile];
                    [self sendToMainThreadError:returnError];
                    return;
                }
            }
            cvReleaseVideoWriter(&pWriter);
            dispatch_async(dispatch_get_main_queue(), ^{
                if ([self.delegate respondsToSelector:@selector(transfer:didFinishedWithError:)]) {
                    [self.delegate transfer:self didFinishedWithError:nil];
                }
            });
        });
        
        return returnError;
    }
    

    First create a CvVideoWriter, then write frames into it one by one. The CvVideoWriter is initialized like this:

    CvVideoWriter *pWriter = cvCreateVideoWriter(movie.UTF8String, CV_FOURCC('D', 'I', 'V', 'X'), (double)kKTImagesMovieTransferFPS, size);
    

    Compared with configuring the writerInput in the native approach, this is very similar: both specify the output file, the codec/format, and the frame size.

    NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: [NSNumber numberWithInt:size.width], AVVideoHeightKey: [NSNumber numberWithInt:size.height]}; 
    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    

    3. FFmpeg

    1. Building and configuring FFmpeg
    FFmpeg is a little more involved; follow this tutorial to build it.
    After adding the compiled FFmpeg to the project, besides setting the header search paths and linking the libraries as the tutorial describes, you may run into the following error:

    Undefined symbols for architecture arm64:
      "av_register_all()", referenced from:
          -[KTImagesMovieTransfer ffmpegTransferMovie:toImagesAtPath:] in KTImagesMovieTransfer.o
    ld: symbol(s) not found for architecture arm64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    

    This happens because, when the file is compiled as C++, the FFmpeg declarations are name-mangled and no longer match the C symbols in the libraries. Include the headers as shown below (the same fix as when writing C++ with VS on Windows): extern "C" makes the symbols in the wrapped headers be treated as plain C. The .mm suffix (Objective-C++) is only used here because OpenCV required C++ support earlier; if the FFmpeg code and the OpenCV code were split into separate files, this problem would not exist.

    extern "C"
    {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    }
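
    With the headers wrapped like this, the C symbols resolve correctly from the Objective-C++ file. A tiny, hypothetical smoke test (not part of the project's code) that exercises the linked API:

    // Hypothetical check that the FFmpeg C API links and runs:
    av_register_all();
    AVFormatContext *fmtCtx = NULL;
    if (avformat_open_input(&fmtCtx, "/path/to/movie.mp4", NULL, NULL) == 0) {
        // ... demux/decode here ...
        avformat_close_input(&fmtCtx);
    }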
    

    To be continued...
