The first and second parts of this WebRTC video-encoding source analysis covered the initialization flow and the encoding flow of the video encoding module. This part looks at several key aspects of the module.
Hardware encoding vs. software encoding
Video encoding can be done in hardware or in software. On the android platform, MediaCodec wraps both hardware and software codecs; for software encoding you can also call mainstream open-source codec libraries directly, e.g. libx264 or libopenh264 for the H264 standard and libvpx for VP8/VP9.
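As a reference point outside WebRTC, the sketch below shows roughly how an android app sets up an H.264 encoder through MediaCodec. The class and all parameter values (resolution, bitrate, frame rate) are illustrative choices, not taken from the WebRTC sources:

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

/** Illustrative sketch only: configure an H.264 encoder via MediaCodec. */
final class MediaCodecEncodeSketch {
  static MediaCodec createH264Encoder() throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    // Encode from an input Surface; raw YUV input would use a COLOR_Format* YUV constant instead.
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, /* surface= */ null, /* crypto= */ null,
        MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    // The caller feeds frames (input Surface or input buffers), drains the encoded
    // output, and eventually calls stop() and release().
    return encoder;
  }
}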
webrtc supports both hardware and software encoding. On the android platform, hardware encoding uses MediaCodec and is wrapped and invoked from the java layer; software H264 encoding uses libopenh264 and VP8/VP9 use libvpx, both wrapped and invoked from the native layer. webrtc defines the following main encoding-related classes:
The main java-layer classes are shown below:
The main native-layer classes are shown below:
Both the java layer and the native layer define VideoEncoderFactory and VideoEncoder interfaces: a VideoEncoderFactory creates VideoEncoders, and a VideoEncoder implements the actual video encoding. From an implementation point of view, they fall into the following categories:
- HardwareVideoEncoder and MediaCodecVideoEncoder, implemented on top of the MediaCodec API provided by the android system, supporting H264, VP8 and VP9.
- H264Encoder, implemented on top of libopenh264, supporting H264.
- VP8Encoder and VP9Encoder, implemented on top of libvpx, supporting VP8 and VP9 respectively.
The remaining classes are just wrappers. For example, the native-layer VideoEncoderWrapper can wrap a java-layer HardwareVideoEncoder: since encoding is always driven from the native layer, VideoEncoderWrapper gives the native layer a uniform interface. Likewise, the native-layer VideoEncoderFactoryWrapper wraps a java-layer VideoEncoderFactory object.
Taking the java layer as an example, VideoEncoderFactory is defined as follows:
/** Factory for creating VideoEncoders. */
public interface VideoEncoderFactory {
  /** Creates an encoder for the given video codec. */
  @CalledByNative VideoEncoder createEncoder(VideoCodecInfo info);

  /**
   * Enumerates the list of supported video codecs. This method will only be called once and the
   * result will be cached.
   */
  @CalledByNative VideoCodecInfo[] getSupportedCodecs();
}
createEncoder creates an encoder of the requested type from a VideoCodecInfo, and getSupportedCodecs enumerates the supported codec types, where "type" means the coding standard, e.g. H264, VP8 or VP9. The returned information is used to build the SDP that negotiates which codec the session will use. Two concrete factories are defined, HardwareVideoEncoderFactory and SoftwareVideoEncoderFactory; DefaultVideoEncoderFactory is just a wrapper around the two. Each factory creates its own kind of VideoEncoder: HardwareVideoEncoderFactory creates a HardwareVideoEncoder, while SoftwareVideoEncoderFactory creates a VP8Encoder or a VP9Encoder, depending on the info argument.
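To make that last point concrete, a simplified paraphrase of SoftwareVideoEncoderFactory.createEncoder looks roughly like this (not the verbatim source; the exact checks vary across WebRTC revisions):

// Simplified paraphrase of SoftwareVideoEncoderFactory.createEncoder (not verbatim).
public VideoEncoder createEncoder(VideoCodecInfo info) {
  if (info.name.equalsIgnoreCase("VP8")) {
    return new VP8Encoder();   // backed by libvpx in the native layer
  }
  if (info.name.equalsIgnoreCase("VP9")) {
    return new VP9Encoder();   // backed by libvpx in the native layer
  }
  return null;                 // other types (e.g. H264) are not offered by this java-layer factory
}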
The main parts of the VideoEncoder definition are:
  /**
   * Initializes the encoding process. Call before any calls to encode.
   */
  @CalledByNative VideoCodecStatus initEncode(Settings settings, Callback encodeCallback);

  /**
   * Releases the encoder. No more calls to encode will be made after this call.
   */
  @CalledByNative VideoCodecStatus release();

  /**
   * Requests the encoder to encode a frame.
   */
  @CalledByNative VideoCodecStatus encode(VideoFrame frame, EncodeInfo info);

  /**
   * Informs the encoder of the packet loss and the round-trip time of the network.
   *
   * @param packetLoss How many packets are lost on average per 255 packets.
   * @param roundTripTimeMs Round-trip time of the network in milliseconds.
   */
  @CalledByNative VideoCodecStatus setChannelParameters(short packetLoss, long roundTripTimeMs);

  /** Sets the bitrate allocation and the target framerate for the encoder. */
  @CalledByNative VideoCodecStatus setRateAllocation(BitrateAllocation allocation, int framerate);

  /** Any encoder that wants to use WebRTC provided quality scaler must implement this method. */
  @CalledByNative ScalingSettings getScalingSettings();

  /**
   * Should return a descriptive name for the implementation. Gets called once and cached. May be
   * called from arbitrary thread.
   */
  @CalledByNative String getImplementationName();
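The native layer drives an encoder through the sequence initEncode, then encode for each frame, then release. The sketch below only illustrates that calling order; the driveEncoder helper and all of its parameters are placeholders invented for this illustration, not part of WebRTC:

// Hedged sketch of the order in which a VideoEncoder's methods are typically called.
static void driveEncoder(VideoEncoder encoder, VideoEncoder.Settings settings,
    VideoEncoder.Callback encodeCallback, Iterable<VideoFrame> frames,
    VideoEncoder.EncodeInfo encodeInfo, VideoEncoder.BitrateAllocation allocation) {
  if (encoder.initEncode(settings, encodeCallback) != VideoCodecStatus.OK) {
    return;  // a wrapper such as VideoEncoderSoftwareFallbackWrapper could fall back here
  }
  for (VideoFrame frame : frames) {
    encoder.encode(frame, encodeInfo);        // encoded output is delivered via encodeCallback
  }
  encoder.setRateAllocation(allocation, 30);  // react to bandwidth estimation; 30 fps target here
  encoder.release();                          // no further encode() calls after release
}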
The member relationships in the native layer are:
- EncoderAdapter's internal_encoder_factory_ member is an InternalEncoderFactory object, and its external_encoder_factory_ member is a CricketToWebRtcEncoderFactory object.
- CricketToWebRtcEncoderFactory's external_encoder_factory_ member is a MediaCodecVideoEncoderFactory object.
- VideoEncoderFactoryWrapper's encoder_factory_ member is a java-layer VideoEncoderFactory object.
- VideoEncoderWrapper's encoder_ member is a java-layer VideoEncoder object.
With so many VideoEncoderFactory and VideoEncoder variants defined above, which one is actually used? That depends on the parameters passed through the webrtc api, on the hardware platform, and on the android system version.
The parameters are mainly the ones related to PeerConnectionFactory, for example:
public static void initializeFieldTrials(String fieldTrialsInitString) {
  nativeInitializeFieldTrials(fieldTrialsInitString);
}
The value of fieldTrialsInitString affects the behavior of VideoEncoderSoftwareFallbackWrapper.
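For example, a field-trial string is passed in before the factory is created. The concrete trial shown below (forcing the VP8 encoder onto the software-fallback path) is only illustrative; the exact trial names and parameter formats that VideoEncoderSoftwareFallbackWrapper understands depend on the WebRTC revision in use:

// Illustrative only: field-trial names and formats vary across WebRTC revisions.
PeerConnectionFactory.initializeFieldTrials(
    "WebRTC-VP8-Forced-Fallback-Encoder-v2/Enabled/");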
The other factor is the encoderFactory value passed to the PeerConnectionFactory constructor. The relevant code is shown below:
public PeerConnectionFactory(
    Options options, VideoEncoderFactory encoderFactory, VideoDecoderFactory decoderFactory) {
  checkInitializeHasBeenCalled();
  nativeFactory = nativeCreatePeerConnectionFactory(options, encoderFactory, decoderFactory);
  if (nativeFactory == 0) {
    throw new RuntimeException("Failed to initialize PeerConnectionFactory!");
  }
}
jlong CreatePeerConnectionFactoryForJava(
    JNIEnv* jni,
    const JavaParamRef<jobject>& joptions,
    const JavaParamRef<jobject>& jencoder_factory,
    const JavaParamRef<jobject>& jdecoder_factory,
    rtc::scoped_refptr<AudioProcessing> audio_processor) {
  cricket::WebRtcVideoEncoderFactory* legacy_video_encoder_factory = nullptr;
  cricket::WebRtcVideoDecoderFactory* legacy_video_decoder_factory = nullptr;
  std::unique_ptr<cricket::MediaEngineInterface> media_engine;
  if (jencoder_factory.is_null() && jdecoder_factory.is_null()) {
    // This uses the legacy API, which automatically uses the internal SW
    // codecs in WebRTC.
    if (video_hw_acceleration_enabled) {
      legacy_video_encoder_factory = CreateLegacyVideoEncoderFactory();
      legacy_video_decoder_factory = CreateLegacyVideoDecoderFactory();
    }
    media_engine.reset(CreateMediaEngine(
        adm, audio_encoder_factory, audio_decoder_factory,
        legacy_video_encoder_factory, legacy_video_decoder_factory, audio_mixer,
        audio_processor));
  } else {
    // This uses the new API, does not automatically include software codecs.
    std::unique_ptr<VideoEncoderFactory> video_encoder_factory = nullptr;
    if (jencoder_factory.is_null()) {
      legacy_video_encoder_factory = CreateLegacyVideoEncoderFactory();
      video_encoder_factory = std::unique_ptr<VideoEncoderFactory>(
          WrapLegacyVideoEncoderFactory(legacy_video_encoder_factory));
    } else {
      video_encoder_factory = std::unique_ptr<VideoEncoderFactory>(
          CreateVideoEncoderFactory(jni, jencoder_factory));
    }
    std::unique_ptr<VideoDecoderFactory> video_decoder_factory = nullptr;
    if (jdecoder_factory.is_null()) {
      legacy_video_decoder_factory = CreateLegacyVideoDecoderFactory();
      video_decoder_factory = std::unique_ptr<VideoDecoderFactory>(
          WrapLegacyVideoDecoderFactory(legacy_video_decoder_factory));
    } else {
      video_decoder_factory = std::unique_ptr<VideoDecoderFactory>(
          CreateVideoDecoderFactory(jni, jdecoder_factory));
    }
    rtc::scoped_refptr<AudioDeviceModule> adm_scoped = nullptr;
    media_engine.reset(CreateMediaEngine(
        adm_scoped, audio_encoder_factory, audio_decoder_factory,
        std::move(video_encoder_factory), std::move(video_decoder_factory),
        audio_mixer, audio_processor));
  }
  rtc::scoped_refptr<PeerConnectionFactoryInterface> factory(
      CreateModularPeerConnectionFactory(
          network_thread.get(), worker_thread.get(), signaling_thread.get(),
          std::move(media_engine), std::move(call_factory),
          std::move(rtc_event_log_factory)));
  RTC_CHECK(factory) << "Failed to create the peer connection factory; "
                     << "WebRTC/libjingle init likely failed on this device";
  // TODO(honghaiz): Maybe put the options as the argument of
  // CreatePeerConnectionFactory.
  if (has_options) {
    factory->SetOptions(options);
  }
  OwnedFactoryAndThreads* owned_factory = new OwnedFactoryAndThreads(
      std::move(network_thread), std::move(worker_thread),
      std::move(signaling_thread), legacy_video_encoder_factory,
      legacy_video_decoder_factory, network_monitor_factory, factory.release());
  owned_factory->InvokeJavaCallbacksOnFactoryThreads();
  return jlongFromPointer(owned_factory);
}
So the encoderFactory argument passed to the PeerConnectionFactory constructor directly determines which VideoEncoderFactory, and therefore which VideoEncoder, is used. The demo calls it like this:
if (peerConnectionParameters.videoCodecHwAcceleration) {
  encoderFactory = new DefaultVideoEncoderFactory(
      rootEglBase.getEglBaseContext(), true /* enableIntelVp8Encoder */, enableH264HighProfile);
  decoderFactory = new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());
} else {
  encoderFactory = new SoftwareVideoEncoderFactory();
  decoderFactory = new SoftwareVideoDecoderFactory();
}
factory = new PeerConnectionFactory(options, encoderFactory, decoderFactory);
When hardware acceleration is enabled, DefaultVideoEncoderFactory is used, and its createEncoder tries HardwareVideoEncoderFactory first and then falls back to SoftwareVideoEncoderFactory, as shown below:
public VideoEncoder createEncoder(VideoCodecInfo info) {
  final VideoEncoder videoEncoder = hardwareVideoEncoderFactory.createEncoder(info);
  if (videoEncoder != null) {
    return videoEncoder;
  }
  return softwareVideoEncoderFactory.createEncoder(info);
}
In other words, software encoding is used only when hardware encoding is unavailable, and whether hardware encoding is available depends on whether the hardware platform and the android system version support the requested coding standard, as shown below:
public VideoEncoder createEncoder(VideoCodecInfo input) {
  VideoCodecType type = VideoCodecType.valueOf(input.name);
  MediaCodecInfo info = findCodecForType(type);

  if (info == null) {
    // No hardware support for this type.
    // TODO(andersc): This is for backwards compatibility. Remove when clients have migrated to
    // new DefaultVideoEncoderFactory.
    if (fallbackToSoftware) {
      SoftwareVideoEncoderFactory softwareVideoEncoderFactory = new SoftwareVideoEncoderFactory();
      return softwareVideoEncoderFactory.createEncoder(input);
    } else {
      return null;
    }
  }

  String codecName = info.getName();
  String mime = type.mimeType();
  Integer surfaceColorFormat = MediaCodecUtils.selectColorFormat(
      MediaCodecUtils.TEXTURE_COLOR_FORMATS, info.getCapabilitiesForType(mime));
  Integer yuvColorFormat = MediaCodecUtils.selectColorFormat(
      MediaCodecUtils.ENCODER_COLOR_FORMATS, info.getCapabilitiesForType(mime));

  if (type == VideoCodecType.H264) {
    boolean isHighProfile = nativeIsSameH264Profile(input.params, getCodecProperties(type, true))
        && isH264HighProfileSupported(info);
    boolean isBaselineProfile =
        nativeIsSameH264Profile(input.params, getCodecProperties(type, false));
    if (!isHighProfile && !isBaselineProfile) {
      return null;
    }
  }

  return new HardwareVideoEncoder(codecName, type, surfaceColorFormat, yuvColorFormat,
      input.params, getKeyFrameIntervalSec(type), getForcedKeyFrameIntervalMs(type, codecName),
      createBitrateAdjuster(type, codecName), sharedContext);
}
The name field of VideoCodecInfo carries the requested codec type, e.g. H264, VP8 or VP9, and maps to a VideoCodecType, defined as follows:
enum VideoCodecType {
  VP8("video/x-vnd.on2.vp8"),
  VP9("video/x-vnd.on2.vp9"),
  H264("video/avc");

  private final String mimeType;

  private VideoCodecType(String mimeType) {
    this.mimeType = mimeType;
  }

  String mimeType() {
    return mimeType;
  }
}
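For example, a VideoCodecInfo whose name is "H264" resolves to VideoCodecType.H264, whose mimeType() is the string MediaCodec uses for AVC. The helper below is hypothetical (not part of WebRTC) and only illustrates that mapping:

// Hypothetical helper: resolve the MediaCodec mime type for a codec name carried in
// VideoCodecInfo.name, e.g. "H264" -> "video/avc", "VP8" -> "video/x-vnd.on2.vp8".
static String mimeTypeForCodecName(String codecName) {
  return VideoCodecType.valueOf(codecName).mimeType();  // IllegalArgumentException for unknown names
}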
findCodecForType checks whether the hardware platform and the android system version support that codec type. If they do, HardwareVideoEncoder is used; otherwise SoftwareVideoEncoderFactory is asked to create the corresponding software encoder. findCodecForType is defined as follows:
private MediaCodecInfo findCodecForType(VideoCodecType type) {
  for (int i = 0; i < MediaCodecList.getCodecCount(); ++i) {
    MediaCodecInfo info = null;
    try {
      info = MediaCodecList.getCodecInfoAt(i);
    } catch (IllegalArgumentException e) {
      Logging.e(TAG, "Cannot retrieve encoder codec info", e);
    }

    if (info == null || !info.isEncoder()) {
      continue;
    }

    if (isSupportedCodec(info, type)) {
      return info;
    }
  }
  return null; // No support for this type.
}
isSupportedCodec is defined as follows:
private boolean isSupportedCodec(MediaCodecInfo info, VideoCodecType type) {
  if (!MediaCodecUtils.codecSupportsType(info, type)) {
    return false;
  }
  // Check for a supported color format.
  if (MediaCodecUtils.selectColorFormat(
          MediaCodecUtils.ENCODER_COLOR_FORMATS, info.getCapabilitiesForType(type.mimeType()))
      == null) {
    return false;
  }
  return isHardwareSupportedInCurrentSdk(info, type);
}
It first calls codecSupportsType to match the mimeType:
static boolean codecSupportsType(MediaCodecInfo info, VideoCodecType type) {
  for (String mimeType : info.getSupportedTypes()) {
    if (type.mimeType().equals(mimeType)) {
      return true;
    }
  }
  return false;
}
It then calls isHardwareSupportedInCurrentSdk to check the hardware platform and the android system version:
private boolean isHardwareSupportedInCurrentSdk(MediaCodecInfo info, VideoCodecType type) {
  switch (type) {
    case VP8:
      return isHardwareSupportedInCurrentSdkVp8(info);
    case VP9:
      return isHardwareSupportedInCurrentSdkVp9(info);
    case H264:
      return isHardwareSupportedInCurrentSdkH264(info);
  }
  return false;
}
For H264, this ends up in isHardwareSupportedInCurrentSdkH264:
private boolean isHardwareSupportedInCurrentSdkH264(MediaCodecInfo info) {
  // First, H264 hardware might perform poorly on this model.
  if (H264_HW_EXCEPTION_MODELS.contains(Build.MODEL)) {
    return false;
  }
  String name = info.getName();
  // QCOM H264 encoder is supported in KITKAT or later.
  return (name.startsWith(QCOM_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT)
      // Exynos H264 encoder is supported in LOLLIPOP or later.
      || (name.startsWith(EXYNOS_PREFIX)
             && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP);
}
This is where the hardware-platform and android-version checks actually happen: the codec name prefix identifies the vendor implementation, and Build.VERSION.SDK_INT gates whether it is used.
In short, which encoder is actually used depends on the parameters passed through the webrtc api, on the hardware platform, and on the android system version.
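To see how this plays out on a particular device, the two factories can be queried directly. The probe below is a hedged sketch, not part of WebRTC; it assumes the HardwareVideoEncoderFactory constructor taking (EglBase.Context, enableIntelVp8Encoder, enableH264HighProfile), mirroring the demo arguments above, and eglContext may be null when surface input is not needed:

import android.util.Log;
import org.webrtc.EglBase;
import org.webrtc.HardwareVideoEncoderFactory;
import org.webrtc.SoftwareVideoEncoderFactory;
import org.webrtc.VideoCodecInfo;
import org.webrtc.VideoEncoderFactory;

/** Illustrative probe: list the codecs each factory offers on the current device. */
final class EncoderProbe {
  static void dumpSupportedEncoders(EglBase.Context eglContext) {
    VideoEncoderFactory hw = new HardwareVideoEncoderFactory(
        eglContext, /* enableIntelVp8Encoder= */ true, /* enableH264HighProfile= */ false);
    VideoEncoderFactory sw = new SoftwareVideoEncoderFactory();
    for (VideoCodecInfo info : hw.getSupportedCodecs()) {
      Log.d("EncoderProbe", "HW codec: " + info.name);
    }
    for (VideoCodecInfo info : sw.getSupportedCodecs()) {
      Log.d("EncoderProbe", "SW codec: " + info.name);
    }
  }
}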
Bitrate control
Todo