1. Command-Line Video Recording with FFmpeg on macOS
Recording video from the FFmpeg command line follows much the same steps as recording audio. First, check which devices are available on macOS:
$ ffmpeg -devices
Output:
Devices:
D. = Demuxing supported
.E = Muxing supported
--
D avfoundation AVFoundation input device
D lavfi Libavfilter virtual input device
E sdl,sdl2 SDL2 output device
We choose avfoundation as the capture device. Next, list the devices that avfoundation supports:
$ ffmpeg -f avfoundation -list_devices true -i ''
Output:
[AVFoundation indev @ 0x7fabb5505300] AVFoundation video devices:
[AVFoundation indev @ 0x7fabb5505300] [0] FaceTime HD Camera
[AVFoundation indev @ 0x7fabb5505300] [1] Capture screen 0
[AVFoundation indev @ 0x7fabb5505300] AVFoundation audio devices:
[AVFoundation indev @ 0x7fabb5505300] [0] Built-in Microphone
We only want to record video, not audio, so the FaceTime HD Camera built into my MacBook Pro (index 0) is all we need. Start recording:
$ ffmpeg -f avfoundation -i 0 out.yuv
As expected, the terminal prints an error like the following:
[avfoundation @ 0x7febb7008200] Selected framerate (29.970030) is not supported by the device.
[avfoundation @ 0x7febb7008200] Supported modes:
[avfoundation @ 0x7febb7008200] 1280x720@[1.000000 30.000000]fps
Last message repeated 2 times
[avfoundation @ 0x7febb7008200] 640x480@[1.000000 30.000000]fps
This means the device does not support a frame rate of 29.970030 fps, so set the frame rate explicitly and try again:
$ ffmpeg -f avfoundation -framerate 30 -i 0 out.yuv
Output:
[avfoundation @ 0x7fe4f8008200] Selected pixel format (yuv420p) is not supported by the input device.
[avfoundation @ 0x7fe4f8008200] Supported pixel formats:
[avfoundation @ 0x7fe4f8008200] uyvy422
[avfoundation @ 0x7fe4f8008200] yuyv422
[avfoundation @ 0x7fe4f8008200] nv12
[avfoundation @ 0x7fe4f8008200] 0rgb
[avfoundation @ 0x7fe4f8008200] bgr0
[avfoundation @ 0x7fe4f8008200] Overriding selected pixel format to use uyvy422 instead.
FFmpeg's default pixel format is yuv420p, but our device does not support it; the only supported formats are uyvy422, yuyv422, nv12, 0rgb and bgr0, so FFmpeg automatically switches to the device-supported uyvy422 and carries on:
Input #0, avfoundation, from '0':
Duration: N/A, start: 4916.385400, bitrate: N/A
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 320x240, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Stream mapping:
Stream #0:0 -> #0:0 (rawvideo (native) -> rawvideo (native))
Press [q] to stop, [?] for help
Output #0, rawvideo, to 'out_001.yuv':
Metadata:
encoder : Lavf58.45.100
Stream #0:0: Video: rawvideo (UYVY / 0x59565955), uyvy422, 320x240, q=2-31, 36864 kb/s, 30 fps, 30 tbn, 30 tbc
Metadata:
encoder : Lavc58.91.100 rawvideo
At this point FFmpeg has successfully started capturing data from the camera; press Ctrl + C to stop. The captured video uses the uyvy422 pixel format, a resolution of 320x240, and a frame rate of 30 fps. Finally, play back the captured data:
$ ffplay -pixel_format uyvy422 -video_size 320x240 -framerate 30 out.yuv
PS: To list the options supported by avfoundation:
$ ffmpeg -h demuxer=avfoundation
Output:
Demuxer avfoundation [AVFoundation input device]:
AVFoundation indev AVOptions:
-list_devices <boolean> .D........ list available devices (default false)
-video_device_index <int> .D........ select video device by index for devices with same name (starts at 0) (from -1 to INT_MAX) (default -1)
-audio_device_index <int> .D........ select audio device by index for devices with same name (starts at 0) (from -1 to INT_MAX) (default -1)
-pixel_format <pix_fmt> .D........ set pixel format (default yuv420p)
-framerate <video_rate> .D........ set frame rate (default "ntsc")
-video_size <image_size> .D........ set video size
-capture_cursor <boolean> .D........ capture the screen cursor (default false)
-capture_mouse_clicks <boolean> .D........ capture the screen mouse clicks (default false)
-capture_raw_data <boolean> .D........ capture the raw data from device connection (default false)
-drop_late_frames <boolean> .D........ drop frames that are available later than expected (default true)
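Putting these options together, we can tell FFmpeg exactly what to capture instead of relying on its fallbacks. For example, the following command (an illustrative invocation using the resolution, frame rate, and pixel format reported as supported in the earlier logs) requests 640x480 at 30 fps in uyvy422:
$ ffmpeg -f avfoundation -framerate 30 -video_size 640x480 -pixel_format uyvy422 -i 0 out.yuv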
2. Recording Video Programmatically with FFmpeg
Recording video programmatically is also very similar to recording audio programmatically. First, include the headers we need:
extern "C" {
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/avutil.h>
#include <libavutil/imgutils.h>
}
Define the macros we need:
#define FILE_NAME "/Users/mac/Downloads/pic/out.yuv"
#define FMT_NAME "avfoundation"
#define DEVICE_NAME "0"
First, register all devices. This only needs to be done once during the entire run of the program:
avdevice_register_all();
Get the input format object:
AVInputFormat *fmt = av_find_input_format(FMT_NAME);
if (!fmt) {
qDebug() << "not found input format:" << FMT_NAME;
return;
}
Open the input device:
// Format context
AVFormatContext *ctx = nullptr;
// Options passed to the input device
AVDictionary *options = nullptr;
av_dict_set(&options, "video_size", "640x480", 0);
av_dict_set(&options, "pixel_format", "yuyv422", 0);
av_dict_set(&options, "framerate", "30", 0);
// Open the input device
int ret = avformat_open_input(&ctx, DEVICE_NAME, fmt, &options);
if (ret < 0) {
char errbuf[1024];
av_strerror(ret, errbuf, sizeof (errbuf));
qDebug() << "open input error:" << errbuf;
return;
}
One detail to note here: earlier, when recording with the FFmpeg command line, omitting the frame rate (-framerate) produced an error; when calling avformat_open_input we likewise need to pass the corresponding options:
AVDictionary *options = nullptr;
av_dict_set(&options, "video_size", "640x480", 0); // 视频像素
av_dict_set(&options, "pixel_format", "yuyv422", 0); // 采样格式
av_dict_set(&options, "framerate", "30", 0); // 帧率
Open the output file:
QFile file(FILE_NAME);
if (!file.open(QFile::WriteOnly)) {
qDebug() << "open file failure:" << FILE_NAME;
// 关闭输入设备
avformat_close_input(&ctx);
return;
}
Capture the data:
// Compute the size of one frame
AVCodecParameters *params = ctx->streams[0]->codecpar;
int imageSize = av_image_get_buffer_size((AVPixelFormat)params->format, params->width, params->height, 1);
// Packet
AVPacket *pkt = av_packet_alloc();
while (!isInterruptionRequested()) {
// Keep reading frames
ret = av_read_frame(ctx, pkt);
if (ret == 0) {
// Read succeeded: write the data to the file
// Use imageSize here, not pkt->size. pkt->size can be larger than imageSize (for example on macOS); writing pkt->size bytes would put extra data into the YUV file and make it unplayable
file.write((const char *)pkt->data, imageSize);
av_packet_unref(pkt);
} else if (ret == AVERROR(EAGAIN)) {
continue;
} else {
// Other errors
char errbuf[1024];
av_strerror(ret, errbuf, sizeof (errbuf));
qDebug() << "read frame error:" << errbuf;
break;
}
}
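As a quick sanity check on imageSize: yuyv422 is a packed 4:2:2 format that uses 2 bytes per pixel, so at the 640x480 resolution requested above each frame should come out to:
// yuyv422: packed 4:2:2, 2 bytes per pixel
// imageSize = av_image_get_buffer_size(AV_PIX_FMT_YUYV422, 640, 480, 1)
//           = 640 * 480 * 2
//           = 614400 bytes per frame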
Finally, free the resources, close the file, and close the device:
// Free resources
av_packet_free(&pkt);
// Close the file
file.close();
// Close the device
avformat_close_input(&ctx);