WebRtc Video Receiver (Part 1): Module Creation Analysis


Author: JeffreyLau | Published 2020-08-30 23:00

    1) Introduction

    • The video receive path breaks down into several major parts:
    • First, creation of the video receive module.
    • Second, the video receive module's handling of the RTP stream.
    • Third, packet-loss detection and retransmission requests in the NackModule.
    • Fourth, packet assembly and detection of complete frames.
    • Fifth, decoding.
    • Sixth, rendering.
    • This article focuses on the first part: the creation flow of the video receive module, and its relationships with other modules such as Call, RTP, and the video engine.

    2) Creation flow of WebRtcVideoReceiveStream

    • First, the engine manages all channels through the ChannelManager; after the PeerConnection is created, calling AddTransceiver() creates the corresponding channel.
    • The receive and send streams, in turn, are managed by the channel.
    • The WebRtcVideoReceiveStream class is managed at the MediaChannel level and is defined in the video engine module as an inner class of WebRtcVideoChannel; it is the engine-layer abstraction of a video receive stream.
    • WebRtcVideoChannel defines a receive_streams_ map, keyed by SSRC with WebRtcVideoReceiveStream instances as values, which manages all video receive streams of the channel.
    • The creation flow of WebRtcVideoReceiveStream is roughly as follows:
    WebRtc_Video_Stream_Receiver_00_00.png
    • The AddRecvStream() function of WebRtcVideoChannel creates the WebRtcVideoReceiveStream with new:
    bool WebRtcVideoChannel::AddRecvStream(const StreamParams& sp,
                                           bool default_stream) {
      RTC_DCHECK_RUN_ON(&thread_checker_);
      // ...
      uint32_t ssrc = sp.first_ssrc();
      // ...
      receive_streams_[ssrc] = new WebRtcVideoReceiveStream(
          this, call_, sp, std::move(config), decoder_factory_, default_stream,
          recv_codecs_, flexfec_config);

      return true;
    }
    
    • The SSRC is obtained from the parameters carried in the remote SDP; a WebRtcVideoReceiveStream is then created and recorded in the receive_streams_ container with that SSRC as the key.
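    The SSRC-keyed bookkeeping above can be sketched with a few stand-in types. FakeVideoChannel, FakeReceiveStream, and the trimmed-down StreamParams here are hypothetical illustrations, not the real WebRTC classes:

```cpp
#include <cstdint>
#include <map>
#include <memory>
#include <vector>

// Hypothetical, trimmed-down StreamParams: just the SSRC list from the SDP.
struct StreamParams {
  std::vector<uint32_t> ssrcs;
  uint32_t first_ssrc() const { return ssrcs.front(); }
};

// Stand-in for WebRtcVideoReceiveStream.
class FakeReceiveStream {
 public:
  explicit FakeReceiveStream(uint32_t ssrc) : ssrc_(ssrc) {}
  uint32_t ssrc() const { return ssrc_; }
 private:
  uint32_t ssrc_;
};

// Minimal analogue of WebRtcVideoChannel's receive_streams_ map:
// one receive stream per primary SSRC taken from the remote SDP.
class FakeVideoChannel {
 public:
  bool AddRecvStream(const StreamParams& sp) {
    uint32_t ssrc = sp.first_ssrc();
    if (receive_streams_.count(ssrc)) return false;  // duplicate SSRC
    receive_streams_[ssrc] = std::make_unique<FakeReceiveStream>(ssrc);
    return true;
  }
  const FakeReceiveStream* GetStream(uint32_t ssrc) const {
    auto it = receive_streams_.find(ssrc);
    return it == receive_streams_.end() ? nullptr : it->second.get();
  }
 private:
  std::map<uint32_t, std::unique_ptr<FakeReceiveStream>> receive_streams_;
};
```

    Lookup by SSRC is what later lets the channel route stream-level operations (removal, stats) to the right receive stream.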

    • The core work of creating the VideoReceiveStream ultimately happens in the constructor of the WebRtcVideoReceiveStream module.

    • The call stack inside the WebRtcVideoReceiveStream constructor looks roughly like this:

      WebRtc_Video_Stream_Receiver_00_01.png
    • From the relationships in the figure above, WebRtcVideoReceiveStream depends on the Call module and ultimately calls Call's CreateVideoReceiveStream() function to create a webrtc::VideoReceiveStream object, which is the real consumer of the decoded data.

    • So how does WebRtcVideoReceiveStream get hold of the data stream inside webrtc::VideoReceiveStream? Back to the WebRtcVideoReceiveStream constructor:

    WebRtcVideoChannel::WebRtcVideoReceiveStream::WebRtcVideoReceiveStream(
        WebRtcVideoChannel* channel,
        webrtc::Call* call,
        const StreamParams& sp,
        webrtc::VideoReceiveStream::Config config,
        webrtc::VideoDecoderFactory* decoder_factory,
        bool default_stream,
        const std::vector<VideoCodecSettings>& recv_codecs,
        const webrtc::FlexfecReceiveStream::Config& flexfec_config)
        : channel_(channel),
          call_(call),
          stream_params_(sp),
          stream_(NULL),
          default_stream_(default_stream),
          config_(std::move(config)),
          flexfec_config_(flexfec_config),
          flexfec_stream_(nullptr),
          decoder_factory_(decoder_factory),
          sink_(NULL),
          first_frame_timestamp_(-1),
          estimated_remote_start_ntp_time_ms_(0) {
      config_.renderer = this;
      ConfigureCodecs(recv_codecs);
      ConfigureFlexfecCodec(flexfec_config.payload_type);
      MaybeRecreateWebRtcFlexfecStream();
      RecreateWebRtcVideoStream();
    }
    
    • From the constructor and the figure above, WebRtcVideoReceiveStream derives from rtc::VideoSinkInterface<webrtc::VideoFrame>, i.e. it is a consumer of decoded YUV data.
    • Meanwhile, by parsing the remote SDP, the matched decoders and related information are stored in the config_ member once negotiation completes (this step is performed by ConfigureCodecs()).
    • The webrtc::VideoReceiveStream::Config structure has an important member, rtc::VideoSinkInterface<VideoFrame>* renderer, which the constructor above initializes to the this pointer.
    • In this way, WebRtcVideoReceiveStream receives the decoded YUV data from the Call module simply through its OnFrame() callback.
    • The role of ConfigureCodecs() is to parse the remote SDP, compare the codecs against the local side, record the decoder factory, and write the resulting information into the config_ member for the Call module to use.
    • ConfigureCodecs() mainly initializes config_'s decoders and rtp members, all derived from the remote SDP:
    void WebRtcVideoChannel::WebRtcVideoReceiveStream::ConfigureCodecs(
        const std::vector<VideoCodecSettings>& recv_codecs) {
      RTC_DCHECK(!recv_codecs.empty());
      config_.decoders.clear();
      config_.rtp.rtx_associated_payload_types.clear();
      config_.rtp.raw_payload_types.clear();
      for (const auto& recv_codec : recv_codecs) {
        webrtc::SdpVideoFormat video_format(recv_codec.codec.name,
                                            recv_codec.codec.params);
    
        webrtc::VideoReceiveStream::Decoder decoder;
        decoder.decoder_factory = decoder_factory_;
        decoder.video_format = video_format;
        decoder.payload_type = recv_codec.codec.id;
        config_.decoders.push_back(decoder);
        config_.rtp.rtx_associated_payload_types[recv_codec.rtx_payload_type] =
            recv_codec.codec.id;
        if (recv_codec.codec.packetization == kPacketizationParamRaw) {
          config_.rtp.raw_payload_types.insert(recv_codec.codec.id);
        }
      }
    
      const auto& codec = recv_codecs.front();
      config_.rtp.ulpfec_payload_type = codec.ulpfec.ulpfec_payload_type;
      config_.rtp.red_payload_type = codec.ulpfec.red_payload_type;
    
      config_.rtp.lntf.enabled = HasLntf(codec.codec);
      config_.rtp.nack.rtp_history_ms = HasNack(codec.codec) ? kNackHistoryMs : 0;
      config_.rtp.rtcp_xr.receiver_reference_time_report = HasRrtr(codec.codec);
      if (codec.ulpfec.red_rtx_payload_type != -1) {
        config_.rtp
            .rtx_associated_payload_types[codec.ulpfec.red_rtx_payload_type] =
            codec.ulpfec.red_payload_type;
      }
    }
    
    void WebRtcVideoChannel::WebRtcVideoReceiveStream::ConfigureFlexfecCodec(
        int flexfec_payload_type) {
      flexfec_config_.payload_type = flexfec_payload_type;
    }
    
    • In short, the config_ member is populated from the remote SDP for later use.
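    The renderer wiring described above — config_.renderer = this, with decoded frames flowing back through OnFrame() — can be sketched with simplified stand-ins. VideoFrame, Config, InternalStream, and EngineStream here are illustrative types, not the real WebRTC classes:

```cpp
#include <memory>

// Simplified frame and sink interface, mirroring
// rtc::VideoSinkInterface<webrtc::VideoFrame>.
struct VideoFrame { int id; };

class VideoSinkInterface {
 public:
  virtual ~VideoSinkInterface() = default;
  virtual void OnFrame(const VideoFrame& frame) = 0;
};

// Analogue of webrtc::VideoReceiveStream::Config: the renderer pointer
// is the only wiring between the internal stream and the engine wrapper.
struct Config {
  VideoSinkInterface* renderer = nullptr;
};

// Stand-in for the internal webrtc::VideoReceiveStream: after decoding,
// it pushes each frame to config_.renderer.
class InternalStream {
 public:
  explicit InternalStream(Config config) : config_(config) {}
  void DeliverDecodedFrame(const VideoFrame& frame) {
    if (config_.renderer) config_.renderer->OnFrame(frame);
  }
 private:
  Config config_;
};

// Analogue of WebRtcVideoReceiveStream: installs itself as the renderer
// in its own constructor, so OnFrame() receives the decoded data.
class EngineStream : public VideoSinkInterface {
 public:
  EngineStream() {
    Config config;
    config.renderer = this;  // the key line from the article
    internal_ = std::make_unique<InternalStream>(config);
  }
  InternalStream* internal() { return internal_.get(); }
  void OnFrame(const VideoFrame& frame) override { last_frame_id_ = frame.id; }
  int last_frame_id() const { return last_frame_id_; }
 private:
  std::unique_ptr<InternalStream> internal_;
  int last_frame_id_ = -1;
};
```

    The design keeps the internal stream unaware of the engine layer: it only knows a sink pointer handed over in its config.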

    3) Creation and initialization of webrtc::VideoReceiveStream

    webrtc::VideoReceiveStream* Call::CreateVideoReceiveStream(
        webrtc::VideoReceiveStream::Config configuration) {
      TRACE_EVENT0("webrtc", "Call::CreateVideoReceiveStream");
      RTC_DCHECK_RUN_ON(&configuration_sequence_checker_);
      // If transport-wide CC is supported, periodic feedback is not sent.
      receive_side_cc_.SetSendPeriodicFeedback(
          SendPeriodicFeedback(configuration.rtp.extensions));
    
      RegisterRateObserver();
    
      VideoReceiveStream* receive_stream = new VideoReceiveStream(
          task_queue_factory_, &video_receiver_controller_, num_cpu_cores_,
          transport_send_ptr_->packet_router(), std::move(configuration),
          module_process_thread_.get(), call_stats_.get(), clock_);
    
      const webrtc::VideoReceiveStream::Config& config = receive_stream->config();
      {
        WriteLockScoped write_lock(*receive_crit_);
        if (config.rtp.rtx_ssrc) {
          // We record identical config for the rtx stream as for the main
          // stream. Since the transport_send_cc negotiation is per payload
          // type, we may get an incorrect value for the rtx stream, but
          // that is unlikely to matter in practice.
          receive_rtp_config_.emplace(config.rtp.rtx_ssrc,
                                      ReceiveRtpConfig(config));
        }
        receive_rtp_config_.emplace(config.rtp.remote_ssrc,
                                    ReceiveRtpConfig(config));
        video_receive_streams_.insert(receive_stream);
        ConfigureSync(config.sync_group);
      }
      receive_stream->SignalNetworkState(video_network_state_);
      UpdateAggregateNetworkState();
      event_log_->Log(std::make_unique<RtcEventVideoReceiveStreamConfig>(
          CreateRtcLogStreamConfig(config)));
      return receive_stream;
    }
    
    • If a TargetTransferRateObserver has not yet been registered with the RtpTransportControllerSend module, one is registered here. The Call module derives from TargetTransferRateObserver and registers itself, so when the dynamic bitrate estimate is updated, Call is notified, mainly through OnTargetTransferRate() and OnStartRateUpdate().
    • A VideoReceiveStream instance is created and added to the video_receive_streams_ container.
    • Finally, audio/video sync is configured; this article does not analyze it.
    • Before analyzing the VideoReceiveStream constructor, let's look at how the Call module manages VideoReceiveStream instances, and how it routes received RTP packets to them.
      WebRtc_Video_Stream_Receiver_00_02.png
    • Call maintains a video_receive_streams_ container; every newly created video receive stream is added to it.
    • Call also owns an RtpStreamReceiverController member named video_receiver_controller_, which in turn owns an RtpDemuxer member named demuxer_.
    • video_receiver_controller_ is passed in when a VideoReceiveStream is constructed; what it does there is shown in the constructor below.
    VideoReceiveStream::VideoReceiveStream(
        TaskQueueFactory* task_queue_factory,
        RtpStreamReceiverControllerInterface* receiver_controller,
        int num_cpu_cores,
        PacketRouter* packet_router,
        VideoReceiveStream::Config config,
        ProcessThread* process_thread,
        CallStats* call_stats,
        Clock* clock,
        VCMTiming* timing)
        : task_queue_factory_(task_queue_factory),
          ....
          rtp_video_stream_receiver_(clock_,
                                     &transport_adapter_,
                                     call_stats,
                                     packet_router,
                                     &config_,
                                     rtp_receive_statistics_.get(),
                                     &stats_proxy_,
                                     process_thread_,
                                     this,     // NackSender
                                     nullptr,  // Use default KeyFrameRequestSender
                                     this,     // OnCompleteFrameCallback
                                     config_.frame_decryptor),
            ....  
        ) {
      RTC_LOG(LS_INFO) << "VideoReceiveStream: " << config_.ToString();
      ....
      if (config_.media_transport()) {
        config_.media_transport()->SetReceiveVideoSink(this);
        config_.media_transport()->AddRttObserver(this);
      } else {
        // Register with RtpStreamReceiverController.
        media_receiver_ = receiver_controller->CreateReceiver(
            config_.rtp.remote_ssrc, &rtp_video_stream_receiver_);
        if (config_.rtp.rtx_ssrc) {
          rtx_receive_stream_ = std::make_unique<RtxReceiveStream>(
              &rtp_video_stream_receiver_, config.rtp.rtx_associated_payload_types,
              config_.rtp.remote_ssrc, rtp_receive_statistics_.get());
          rtx_receiver_ = receiver_controller->CreateReceiver(
              config_.rtp.rtx_ssrc, rtx_receive_stream_.get());
        } else {
          rtp_receive_statistics_->EnableRetransmitDetection(config.rtp.remote_ssrc,
                                                             true);
        }
      }
    }
    
    • First the RtpVideoStreamReceiver module (the rtp_video_stream_receiver_ member) is initialized. Then CreateReceiver() is called on the RtpStreamReceiverController (held by Call) with the SSRC and rtp_video_stream_receiver_ as arguments. Inside the RtpStreamReceiverController::Receiver constructor, AddSink() associates the SSRC (key) with rtp_video_stream_receiver_ (value) in the RtpDemuxer.

    • The flow is as follows:


      WebRtc_Video_Stream_Receiver_00_03.png
    • RtpVideoStreamReceiver derives from RtpPacketSinkInterface, and VideoReceiveStream holds an important member of that type: RtpVideoStreamReceiver rtp_video_stream_receiver_.

    • Putting the pieces together: when the Call module receives an RTP packet, it routes it through the OnRtpPacket() function of the RtpStreamReceiverController it holds (the video_receiver_controller_ member).

    • From the implementation of RtpStreamReceiverController's OnRtpPacket(), the packet is ultimately dispatched to the RtpVideoStreamReceiver managed by the VideoReceiveStream (the rtp_video_stream_receiver_ member).

    • The rough flow:


      WebRtc_Video_Stream_Receiver_00_04.png
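    The routing described above can be sketched as a minimal SSRC-to-sink demuxer. FakeDemuxer and CountingSink are hypothetical stand-ins; the real RtpDemuxer can also match on MID, RSID, and payload type, not only SSRC:

```cpp
#include <cstdint>
#include <map>

struct RtpPacket {
  uint32_t ssrc;
  int seq;
};

// Mirrors webrtc::RtpPacketSinkInterface.
class RtpPacketSinkInterface {
 public:
  virtual ~RtpPacketSinkInterface() = default;
  virtual void OnRtpPacket(const RtpPacket& packet) = 0;
};

// Minimal analogue of RtpStreamReceiverController/RtpDemuxer:
// sinks register by SSRC; OnRtpPacket routes each packet to its sink.
class FakeDemuxer {
 public:
  void AddSink(uint32_t ssrc, RtpPacketSinkInterface* sink) {
    sinks_[ssrc] = sink;
  }
  void RemoveSink(uint32_t ssrc) { sinks_.erase(ssrc); }
  // Returns true if a sink consumed the packet.
  bool OnRtpPacket(const RtpPacket& packet) {
    auto it = sinks_.find(packet.ssrc);
    if (it == sinks_.end()) return false;  // no stream owns this SSRC
    it->second->OnRtpPacket(packet);
    return true;
  }
 private:
  std::map<uint32_t, RtpPacketSinkInterface*> sinks_;
};

// Stand-in for RtpVideoStreamReceiver, which derives from the sink interface.
class CountingSink : public RtpPacketSinkInterface {
 public:
  void OnRtpPacket(const RtpPacket&) override { ++packets_; }
  int packets() const { return packets_; }
 private:
  int packets_ = 0;
};
```

    In this shape, Call never needs to know which stream owns a packet; the demuxer's SSRC table resolves that per packet.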

    4) Analysis of the VideoReceiveStream constructor

    VideoReceiveStream::VideoReceiveStream(
        TaskQueueFactory* task_queue_factory,
        RtpStreamReceiverControllerInterface* receiver_controller,
        int num_cpu_cores,
        PacketRouter* packet_router,
        VideoReceiveStream::Config config,
        ProcessThread* process_thread,
        CallStats* call_stats,
        Clock* clock,
        VCMTiming* timing)
        : task_queue_factory_(task_queue_factory),
          transport_adapter_(config.rtcp_send_transport),
          config_(std::move(config)),
          num_cpu_cores_(num_cpu_cores),
          process_thread_(process_thread),
          clock_(clock),
          call_stats_(call_stats),
          source_tracker_(clock_),
          stats_proxy_(&config_, clock_),
          rtp_receive_statistics_(ReceiveStatistics::Create(clock_)),
          timing_(timing),
          video_receiver_(clock_, timing_.get()),
          rtp_video_stream_receiver_(clock_,
                                     &transport_adapter_,
                                     call_stats,
                                     packet_router,
                                     &config_,
                                     rtp_receive_statistics_.get(),
                                     &stats_proxy_,
                                     process_thread_,
                                     this,     // NackSender
                                     nullptr,  // Use default KeyFrameRequestSender
                                     this,     // OnCompleteFrameCallback
                                     config_.frame_decryptor),
          rtp_stream_sync_(this),
          max_wait_for_keyframe_ms_(KeyframeIntervalSettings::ParseFromFieldTrials()
                                        .MaxWaitForKeyframeMs()
                                        .value_or(kMaxWaitForKeyFrameMs)),
          max_wait_for_frame_ms_(KeyframeIntervalSettings::ParseFromFieldTrials()
                                     .MaxWaitForFrameMs()
                                     .value_or(kMaxWaitForFrameMs)),
          decode_queue_(task_queue_factory_->CreateTaskQueue(
              "DecodingQueue",
              TaskQueueFactory::Priority::HIGH)) {
      .....        
      module_process_sequence_checker_.Detach();
      network_sequence_checker_.Detach();
    
      std::set<int> decoder_payload_types;
      for (const Decoder& decoder : config_.decoders) {
        .....
        decoder_payload_types.insert(decoder.payload_type);
      }
      
      timing_->set_render_delay(config_.render_delay_ms);
    
      frame_buffer_.reset(
          new video_coding::FrameBuffer(clock_, timing_.get(), &stats_proxy_));
    
      process_thread_->RegisterModule(&rtp_stream_sync_, RTC_FROM_HERE);
      if (config_.media_transport()) {
        config_.media_transport()->SetReceiveVideoSink(this);
        config_.media_transport()->AddRttObserver(this);
      } else {
        // Register with RtpStreamReceiverController.
        media_receiver_ = receiver_controller->CreateReceiver(
            config_.rtp.remote_ssrc, &rtp_video_stream_receiver_);
        if (config_.rtp.rtx_ssrc) {
          rtx_receive_stream_ = std::make_unique<RtxReceiveStream>(
              &rtp_video_stream_receiver_, config.rtp.rtx_associated_payload_types,
              config_.rtp.remote_ssrc, rtp_receive_statistics_.get());
          rtx_receiver_ = receiver_controller->CreateReceiver(
              config_.rtp.rtx_ssrc, rtx_receive_stream_.get());
        } else {
          rtp_receive_statistics_->EnableRetransmitDetection(config.rtp.remote_ssrc,
                                                             true);
        }
      }
    }
    
    • The constructor mainly instantiates the member variables; the main members involved are:


      WebRtc_Video_Stream_Receiver_00_05.png
    • The inheritance relationships of the VideoReceiveStream module are as follows:


      WebRtc_Video_Stream_Receiver_00_06.png
    • For the receive-stream wiring, the constructor first calls receiver_controller->CreateReceiver() to create the regular receive stream, binding rtp_video_stream_receiver_ to the RtpDemuxer.

    • If config_.rtp.rtx_ssrc is set (i.e. retransmissions are carried on a separate SSRC), an RtxReceiveStream is created with rtp_video_stream_receiver_ as a parameter (which shows that retransmitted packets are ultimately also handled by rtp_video_stream_receiver_), and it is likewise bound to the RtpDemuxer via receiver_controller->CreateReceiver().

    • The TransportAdapter transport_adapter_ member enables or disables RTP/RTCP sending for this stream.

    • The rtc::TaskQueue decode_queue_ member is the decoding task queue; every packet awaiting decode is posted through it.

    • The RtpVideoStreamReceiver rtp_video_stream_receiver_ member receives RTP packets from the Call module, processes them, and finally delivers assembled frames to the decoder via the decode queue.

    • The std::unique_ptr<RtxReceiveStream> rtx_receive_stream_ member handles received retransmission packets, which it hands back to rtp_video_stream_receiver_ for processing.

    • The RtpStreamsSynchronizer rtp_stream_sync_ member is responsible for audio/video synchronization. VideoReceiveStream derives from Syncable and implements its GetPlayoutTimestamp() and SetMinimumPlayoutDelay() functions, which the synchronizer calls during sync; their use is analyzed later.

    • The std::unique_ptr<video_coding::FrameBuffer> frame_buffer_ member stores encoded data, i.e. the frames assembled from depacketized RTP; frames are eventually posted to the decoder through the asynchronous task queue.

    • Because VideoReceiveStream derives from video_coding::OnCompleteFrameCallback, its OnCompleteFrame() callback fires once a frame is fully assembled; there, the complete frame is inserted into frame_buffer_.

    • Because VideoReceiveStream derives from NackSender, when RtpVideoStreamReceiver detects packet loss, the SendNack() function implemented by VideoReceiveStream sends the NACK retransmission request.

    • Because VideoReceiveStream derives from rtc::VideoSinkInterface<VideoFrame>, it is also a consumer of decoded data: its OnFrame() is triggered, and inside it config_.renderer->OnFrame(video_frame) forwards the YUV frame to the WebRtcVideoReceiveStream module. The data hand-off between WebRtcVideoReceiveStream and VideoReceiveStream is wired through the config_ information, which was passed in when the Call module created the VideoReceiveStream.

    • With this, creation of the video receive stream is complete; the next step is starting the decode path.
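    Although the article defers RTX details, the hand-off from RtxReceiveStream to rtp_video_stream_receiver_ can be sketched. Per RFC 4588, an RTX packet carries the original sequence number (OSN) in its first two payload bytes; the sketch below (FakeRtxReceiveStream and MediaSink are hypothetical stand-ins) restores SSRC, payload type, and sequence number before forwarding to the media sink:

```cpp
#include <cstdint>
#include <map>
#include <utility>
#include <vector>

struct RtpPacket {
  uint32_t ssrc;
  int payload_type;
  uint16_t seq;
  std::vector<uint8_t> payload;
};

// Stand-in for rtp_video_stream_receiver_: just records the last packet.
class MediaSink {
 public:
  void OnRtpPacket(const RtpPacket& p) { last_ = p; received_ = true; }
  bool received_ = false;
  RtpPacket last_;
};

// Simplified RtxReceiveStream: unwraps an RTX packet into the original
// media packet and forwards it to the media sink.
class FakeRtxReceiveStream {
 public:
  FakeRtxReceiveStream(MediaSink* media_sink,
                       std::map<int, int> rtx_associated_payload_types,
                       uint32_t media_ssrc)
      : media_sink_(media_sink),
        rtx_associated_payload_types_(std::move(rtx_associated_payload_types)),
        media_ssrc_(media_ssrc) {}

  void OnRtpPacket(const RtpPacket& rtx) {
    if (rtx.payload.size() < 2) return;  // no OSN header, drop
    auto it = rtx_associated_payload_types_.find(rtx.payload_type);
    if (it == rtx_associated_payload_types_.end()) return;  // unknown RTX pt
    RtpPacket media;
    media.ssrc = media_ssrc_;         // restore the media SSRC
    media.payload_type = it->second;  // RTX pt -> associated media pt
    // Original sequence number: first two payload bytes, big-endian.
    media.seq = static_cast<uint16_t>((rtx.payload[0] << 8) | rtx.payload[1]);
    media.payload.assign(rtx.payload.begin() + 2, rtx.payload.end());
    media_sink_->OnRtpPacket(media);
  }

 private:
  MediaSink* media_sink_;
  std::map<int, int> rtx_associated_payload_types_;
  uint32_t media_ssrc_;
};
```

    The rtx_associated_payload_types map is exactly what ConfigureCodecs() filled in earlier from the SDP's apt parameter.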

    5) VideoReceiveStream startup analysis

    • As shown in figure (1), during construction WebRtcVideoReceiveStream calls RecreateWebRtcVideoStream(), which creates the receive stream through the Call module; once the stream is instantiated and initialized, its Start() method is called to start it:
    void WebRtcVideoChannel::WebRtcVideoReceiveStream::RecreateWebRtcVideoStream() {
      .....
      stream_ = call_->CreateVideoReceiveStream(std::move(config));
      ....
      stream_->Start();
      ....
    }
    
    • Per the derivation hierarchy of webrtc::VideoReceiveStream, what the Call module actually creates is the VideoReceiveStream implementation in the video module.
    • Before analyzing VideoReceiveStream::Start(), let's look at the member variables involved in that function:


      WebRtc_Video_Stream_Receiver_00_07.png
    void VideoReceiveStream::Start() {
      RTC_DCHECK_RUN_ON(&worker_sequence_checker_);
    
      if (decoder_running_) {
        return;
      }
    
      const bool protected_by_fec = config_.rtp.protected_by_flexfec ||
                                    rtp_video_stream_receiver_.IsUlpfecEnabled();
    
      frame_buffer_->Start();
    
      if (rtp_video_stream_receiver_.IsRetransmissionsEnabled() &&
          protected_by_fec) {
        frame_buffer_->SetProtectionMode(kProtectionNackFEC);
      }
      /* Enable RTCP and RTP sending */
      transport_adapter_.Enable();
      rtc::VideoSinkInterface<VideoFrame>* renderer = nullptr;
      /* enable_prerenderer_smoothing defaults to true.
         render_delay_ms defaults to 10 ms (defined in VideoReceiveStream::Stats);
         here its value should be 0. IncomingVideoStream smooths rendering; the
         this pointer is passed in, so each VideoFrame eventually reaches
         VideoReceiveStream::OnFrame().
      */
      if (config_.enable_prerenderer_smoothing) {
        incoming_video_stream_.reset(new IncomingVideoStream(
            task_queue_factory_, config_.render_delay_ms, this));
        renderer = incoming_video_stream_.get();
      } else {
        renderer = this;
      }
    
      for (const Decoder& decoder : config_.decoders) {
        std::unique_ptr<VideoDecoder> video_decoder =
          decoder.decoder_factory->LegacyCreateVideoDecoder(
                        decoder.video_format,config_.stream_id);
        // If we still have no valid decoder, we have to create a "Null" decoder
        // that ignores all calls. The reason we can get into this state is that the
        // old decoder factory interface doesn't have a way to query supported
        // codecs.
        if (!video_decoder) {
          video_decoder = std::make_unique<NullVideoDecoder>();
        }
          
        video_decoders_.push_back(std::move(video_decoder));
    
        video_receiver_.RegisterExternalDecoder(video_decoders_.back().get(),
                                                decoder.payload_type);
        VideoCodec codec = CreateDecoderVideoCodec(decoder);
    
        const bool raw_payload =
            config_.rtp.raw_payload_types.count(codec.plType) > 0;
        rtp_video_stream_receiver_.AddReceiveCodec(
            codec, decoder.video_format.parameters, raw_payload);
        RTC_CHECK_EQ(VCM_OK, video_receiver_.RegisterReceiveCodec(
                                 &codec, num_cpu_cores_, false));
      }
    
      RTC_DCHECK(renderer != nullptr);
      video_stream_decoder_.reset(
          new VideoStreamDecoder(&video_receiver_, &stats_proxy_, renderer));
    
      // Make sure we register as a stats observer *after* we've prepared the
      // |video_stream_decoder_|.
      call_stats_->RegisterStatsObserver(this);
    
      // Start decoding on task queue.
      video_receiver_.DecoderThreadStarting();
      stats_proxy_.DecoderThreadStarting();
      decode_queue_.PostTask([this] {
        RTC_DCHECK_RUN_ON(&decode_queue_);
        decoder_stopped_ = false;
        StartNextDecode();
      });
      decoder_running_ = true;
      rtp_video_stream_receiver_.StartReceive();
    }
    
    • frame_buffer_->Start() starts the frame-buffer loop (analyzed later).
    • The incoming_video_stream_ member is instantiated with config_.render_delay_ms and the this pointer; it implements render smoothing and then calls back into this class's OnFrame() with each frame to render.
    • The code iterates over config_.decoders (initialized in WebRtcVideoReceiveStream::ConfigureCodecs from the remote SDP): LegacyCreateVideoDecoder() creates each decoder, which is stored in video_decoders_ and registered with the VideoReceiver2 module via VideoReceiver2::RegisterExternalDecoder().
    • Still in the loop, CreateDecoderVideoCodec() builds the VideoCodec settings for each decoder, and RtpVideoStreamReceiver::AddReceiveCodec() registers the codec with the RtpVideoStreamReceiver module.
    • Each codec is also registered with VideoReceiver2 via VideoReceiver2::RegisterReceiveCodec().
    • The this pointer is registered as an observer with the CallStats module.
    • The video_stream_decoder_ member is instantiated with video_receiver_, stats_proxy_, and renderer (i.e. the IncomingVideoStream).
    • StartNextDecode() is posted to the decode queue to start decoding.
    • Finally, rtp_video_stream_receiver_.StartReceive() enables reception in the RtpVideoStreamReceiver module, so its OnRtpPacket() callbacks are processed.
      WebRtc_Video_Stream_Receiver_00_08.png
    • Figure (8) shows that RtpVideoStreamReceiver::AddReceiveCodec() ultimately stores the decoder information in RtpVideoStreamReceiver's payload_type_map_ and pt_codec_params_ containers; only parameters and payload types are stored there, not decoder instances.
    • The two VideoReceiver2 registration calls record the created decoder instances in the VCMDecoderDataBase module via VideoReceiver2's codecDataBase_ member. So after RTP depacketization and frame assembly are complete, the decoder instance used for decoding is obtained from VideoReceiver2; the details are covered later when decoding is analyzed.
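    The split described above — payload parameters in the RTP receiver, decoder instances in the decoder database — can be sketched as two registries keyed by the same payload type. FakeRtpReceiver, FakeDecoderDataBase, and FakeDecoder are hypothetical stand-ins for payload_type_map_ and codecDataBase_/VCMDecoderDataBase:

```cpp
#include <map>
#include <memory>
#include <string>
#include <utility>

struct CodecParams {
  std::string name;  // SDP fmtp-style parameters, simplified to a name
};

class FakeDecoder {
 public:
  explicit FakeDecoder(std::string name) : name_(std::move(name)) {}
  const std::string& name() const { return name_; }
 private:
  std::string name_;
};

// Analogue of RtpVideoStreamReceiver's payload_type_map_:
// parameters only, no decoder instances.
class FakeRtpReceiver {
 public:
  void AddReceiveCodec(int payload_type, CodecParams params) {
    payload_type_map_[payload_type] = std::move(params);
  }
  bool Knows(int payload_type) const {
    return payload_type_map_.count(payload_type) > 0;
  }
 private:
  std::map<int, CodecParams> payload_type_map_;
};

// Analogue of codecDataBase_ / VCMDecoderDataBase: owns the decoders.
class FakeDecoderDataBase {
 public:
  void RegisterExternalDecoder(int payload_type,
                               std::unique_ptr<FakeDecoder> decoder) {
    decoders_[payload_type] = std::move(decoder);
  }
  FakeDecoder* GetDecoder(int payload_type) {
    auto it = decoders_.find(payload_type);
    return it == decoders_.end() ? nullptr : it->second.get();
  }
 private:
  std::map<int, std::unique_ptr<FakeDecoder>> decoders_;
};
```

    The payload type in each RTP packet is thus the key that later joins the two stores: depacketization consults the parameter map, decoding consults the decoder database.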
      WebRtc_Video_Stream_Receiver_00_09.png
    • Figure (9) shows that VideoReceiveStream::Start() constructs the VideoStreamDecoder module with the VideoReceiver2 instance as a parameter. In VideoStreamDecoder's constructor, VideoReceiver2::RegisterReceiveCallback() is called with the this pointer (the VideoStreamDecoder instance), which stores that pointer into VideoReceiver2's decodedFrameCallback_ member, a VCMDecodedFrameCallback.
    • VideoStreamDecoder can therefore receive the callbacks issued by the VCMDecodedFrameCallback module.
    • So where is VideoReceiver2's VCMDecodedFrameCallback decodedFrameCallback_ member actually used? That is analyzed later.
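    The registration chain described above can be sketched as follows. FakeVideoReceiver and FakeStreamDecoder are hypothetical stand-ins for VideoReceiver2 and VideoStreamDecoder; the real decoded-frame callback passes a VideoFrame, not an int:

```cpp
// Mirrors the decoded-frame callback interface, simplified.
class DecodedImageCallback {
 public:
  virtual ~DecodedImageCallback() = default;
  virtual void FrameToRender(int frame_id) = 0;
};

// Analogue of VideoReceiver2 + VCMDecodedFrameCallback: stores the
// callback target and invokes it after a (fake) decode completes.
class FakeVideoReceiver {
 public:
  void RegisterReceiveCallback(DecodedImageCallback* cb) { callback_ = cb; }
  void Decode(int frame_id) {
    // ... actual decoding would happen here ...
    if (callback_) callback_->FrameToRender(frame_id);
  }
 private:
  DecodedImageCallback* callback_ = nullptr;
};

// Analogue of VideoStreamDecoder: registers itself in its constructor,
// exactly as described in the article.
class FakeStreamDecoder : public DecodedImageCallback {
 public:
  explicit FakeStreamDecoder(FakeVideoReceiver* receiver) {
    receiver->RegisterReceiveCallback(this);
  }
  void FrameToRender(int frame_id) override { last_rendered_ = frame_id; }
  int last_rendered() const { return last_rendered_; }
 private:
  int last_rendered_ = -1;
};
```

    Registering in the constructor guarantees no decoded frame can be emitted before a consumer exists, which is why Start() builds video_stream_decoder_ before registering the stats observer.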

    6) Summary

    • This article analyzed the creation flow and principles of the WebRTC video receive stream, and the relationships between the modules and classes involved.
    • Understanding when and how webrtc::VideoReceiveStream and WebRtcVideoReceiveStream are created lays a solid foundation for the follow-up analysis of RTP packet processing in the receive path.
    • It also prepares for future customization of the WebRTC framework, e.g. adding H.265 decode support, or forwarding encoded frames directly from a PeerConnection without decoding and rendering.
    • Mastering the data-callback and hand-off mechanisms between modules is essential.

    Original link: https://www.haomeiwen.com/subject/pjjysktx.html