Parsing TS Stream Data on Android with JNI-to-Java Callbacks


Author: Felix_lin | Published 2018-06-21 23:03

    To extract the valid video data from a TS stream delivered to the phone, for preview playback and recording, I looked at how others play TS streams on Android (e.g. UDP preview playback). Some of the approaches are quite unusual, such as saving the stream to a local .ts file and then playing that, which causes flickering. There are of course many players that can handle TS, such as ffmpeg-android, vlc-android, and Vitamio, but these libraries are either too large or do not meet my needs, so I decided to learn the format myself and parse the TS stream in C++ via JNI.

    The following articles are the ones I studied and referenced for this post and found excellent; I recommend reading them for a fuller picture:

    I. Introduction to TS Streams

    1. What is a TS stream: TS (Transport Stream), in full MPEG2-TS, is mainly used for real-time program delivery, such as live broadcast TV and set-top boxes. The typical payload formats are H.264/MPEG-4 video and AAC/MP3 audio. The defining property of MPEG2-TS is that decoding can start independently from any segment of the stream.

    2. Terms you need to know when studying TS streams:

      • ES stream: elementary stream, the continuous, unsegmented stream of audio, video, or other data.
      • PES stream: packetized ES stream, marked by adding a PES header to each packet; PES packets have variable length.
      • TS stream: transport stream, fixed-length packets (188 bytes), which makes parsing and recovering from corrupt packets easy. Each packet has three parts, as shown in the diagram below: the ts header, the adaptation field, and the payload. The ts header identifies the packet's content by PID, the adaptation field carries supplementary data, and the payload is the PES data we need to parse.

      • Note that a single TS stream may contain multiple programs. You can see them in the log output while parsing the PAT and PMT (there are comments about this in my code). The stream in my project always contains exactly one program, so you will need to adapt the code yourself for multi-program streams.
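The three-part packet layout above starts with a fixed 4-byte header that can be unpacked with plain bit operations. A minimal sketch (the struct and field names follow ISO/IEC 13818-1 terminology, not the article's TSHdrFixedPart bit-fields):

```cpp
#include <cstdint>

// Parsed view of the fixed 4-byte TS header (bit layout per ISO/IEC 13818-1).
struct TsHeader {
    uint8_t  sync_byte;                    // always 0x47
    bool     payload_unit_start_indicator; // 1: a PES packet or PSI section starts here
    uint16_t pid;                          // 13-bit packet identifier
    uint8_t  adaptation_field_control;     // 01 payload only, 10 adaptation only, 11 both
    uint8_t  continuity_counter;           // 4-bit per-PID counter
};

// Unpacks the first 4 bytes of a 188-byte TS packet.
inline TsHeader ParseTsHeader(const uint8_t* p) {
    TsHeader h;
    h.sync_byte                    = p[0];
    h.payload_unit_start_indicator = (p[1] & 0x40) != 0;
    h.pid                          = static_cast<uint16_t>(((p[1] & 0x1F) << 8) | p[2]);
    h.adaptation_field_control     = static_cast<uint8_t>((p[3] >> 4) & 0x03);
    h.continuity_counter           = static_cast<uint8_t>(p[3] & 0x0F);
    return h;
}
```

This is what the `MK_WORD(pid12_8, pid7_0)` bit-field access in the code later in this post boils down to.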

    3. The key to parsing a TS stream is understanding its table structure: the parsing flow resolves the information we need step by step through the corresponding PIDs, and then extracts the matching payload data.

      • Program Association Table (PAT), PID 0x0000: from the PAT we resolve the PID of the PMT.
      • Program Map Table (PMT): from the PMT we resolve the video and audio PIDs.
      • Conditional Access Table (CAT), PID 0x0001.
      • Network Information Table (NIT), PID 0x0010.
      • Some of the parameters and structures are explained in the code comments.
      Image source: https://www.cnblogs.com/jiayayao/p/6832614.html
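To make the PAT-to-PMT step concrete, here is a hypothetical standalone sketch of walking a PAT section's program loop (offsets per ISO/IEC 13818-1; the article's own code does the same job through its PATHdrFixedPart and PATSubSection structs):

```cpp
#include <cstdint>
#include <vector>

// Walks the program loop of one PAT section (pointer_field already skipped)
// and returns the PMT PIDs it announces; program_number 0 maps to the NIT.
std::vector<uint16_t> ExtractPmtPids(const uint8_t* sec, int len) {
    std::vector<uint16_t> pids;
    if (len < 12 || sec[0] != 0x00) return pids;        // table_id of a PAT is 0x00
    int section_length = ((sec[1] & 0x0F) << 8) | sec[2];
    // the program loop starts after the 8-byte fixed header;
    // the 4-byte CRC32 sits at the end of the section
    int loop_end = 3 + section_length - 4;
    for (int i = 8; i + 4 <= loop_end && i + 4 <= len; i += 4) {
        uint16_t program_number = static_cast<uint16_t>((sec[i] << 8) | sec[i + 1]);
        uint16_t pid = static_cast<uint16_t>(((sec[i + 2] & 0x1F) << 8) | sec[i + 3]);
        if (program_number != 0) pids.push_back(pid);   // skip the NIT entry
    }
    return pids;
}
```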
    4. Parsing flow: the structures involved are covered in great detail in the reference articles listed above; this post only gives a quick walkthrough so you can integrate the code into a project fast:
      1. Scan the TS stream for the sync byte to locate each ts header; sync_byte is 1 byte and always 0x47 (handle corruption and chunks spliced across buffer boundaries).
      2. Parse the PAT.
      3. Use the PMT_PID resolved from the PAT to find and parse the PMT.
      4. Use the PMT to resolve the corresponding VIDEO_PID and AUDIO_PID.
      5. Parse the video and audio PES packets by their PIDs (ParsePES).
      6. The parsing flow diagram follows:


        (figure: TS parsing flow diagram, 2.png)
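Step 1 (locking onto the sync byte while allowing for corruption) is the part most easily gotten wrong; the double-check used by the parse loop later in this post can be sketched as: a candidate start is only accepted when a second sync byte appears exactly 188 bytes later.

```cpp
#include <cstddef>
#include <cstdint>

constexpr int kTsPktLen = 188;        // fixed TS packet size
constexpr uint8_t kTsSyncByte = 0x47; // sync_byte value

// Returns the offset of the first plausible packet start: a 0x47 byte that is
// followed by another 0x47 exactly one packet later (this guards against a
// stray 0x47 inside payload data). Returns -1 when no packet start is found.
int FindFirstPacket(const uint8_t* buf, size_t len) {
    for (size_t i = 0; i + kTsPktLen < len; ++i) {
        if (buf[i] == kTsSyncByte && buf[i + kTsPktLen] == kTsSyncByte)
            return static_cast<int>(i);
    }
    return -1;
}
```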

    II. TS Stream Parsing Code

    The TS parsing code in this post is adapted from https://github.com/js2854/TSParser, an open-source project that parses TS files and prints the various tables. My changes: rework it to parse a live TS stream instead of a file, add parsing of the PES audio/video payload and push it to the Java layer through JNI, add the Android JNI glue, and add data buffering. Most methods carry comments; if anything is unclear, wrong, or infringing, please message me. Thanks.
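The buffer handoff described above (the Java side pushes raw TS chunks into empty native buffers; the native parse thread consumes them and recycles the buffers) is essentially a pair of free/filled queues. A simplified, single-threaded sketch of that recycling pattern (the names are illustrative, not the article's CFifo/mfxBitstreamTS API, which additionally needs locking for cross-thread use):

```cpp
#include <cstddef>
#include <cstdint>
#include <queue>
#include <vector>

// One reusable buffer, loosely modelled on the article's mfxBitstreamTS.
struct TsChunk {
    std::vector<uint8_t> data;
    size_t length = 0;
};

// Two queues implement buffer recycling: 'empty_' holds reusable chunks
// (the article's "dirty" fifo), 'filled_' holds chunks awaiting the parser.
class ChunkPool {
public:
    ChunkPool(size_t count, size_t capacity) : chunks_(count) {
        for (auto& c : chunks_) {
            c.data.resize(capacity);
            empty_.push(&c);
        }
    }
    TsChunk* GetEmpty() {                              // producer side (cf. GetEmptyTsBuf)
        if (empty_.empty()) return nullptr;
        TsChunk* c = empty_.front();
        empty_.pop();
        return c;
    }
    void PushFilled(TsChunk* c) { filled_.push(c); }   // cf. PushTsBuf
    TsChunk* GetFilled() {                             // consumer side (cf. GetTsBuf)
        if (filled_.empty()) return nullptr;
        TsChunk* c = filled_.front();
        filled_.pop();
        return c;
    }
    void Recycle(TsChunk* c) {                         // cf. the "dirty" push-back
        c->length = 0;
        empty_.push(c);
    }
private:
    std::vector<TsChunk> chunks_;                      // owns the storage
    std::queue<TsChunk*> empty_, filled_;
};
```

A real implementation must protect both queues with a mutex (or use lock-free queues), since the producer runs on the Java binder/reader thread and the consumer on the native parse thread.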

    • Application.mk

        APP_PROJECT_PATH := $(call my-dir)
        APP_BUILD_SCRIPT := $(call my-dir)/Android.mk
        APP_ABI := armeabi armeabi-v7a
        APP_PLATFORM=android-23
      
    • Android.mk

        LOCAL_PATH := $(call my-dir)
        # Program
        include $(CLEAR_VARS)
        LOCAL_MODULE := tsparse
        LOCAL_SRC_FILES := jni_lib.cpp AACDecoder.cpp MFifo.cpp TSParser.cpp
        #LOCAL_C_INCLUDES :=    \
        #$(MY_LOCAL_ANDSRC)/system/core/include \
        #$(MY_LOCAL_ANDSRC)/frameworks/native/include   \
        #$(MY_LOCAL_ANDSRC)/hardware/libhardware/include
        #LOCAL_CFLAGS := -DHAVE_PTHREADS
        LOCAL_C_INCLUDES += $(LOCAL_PATH)/prebuilt/include
        LOCAL_LDLIBS := -llog -lz -lGLESv2 -landroid -lOpenSLES 
        include $(BUILD_SHARED_LIBRARY)
      
    • jni_lib.cpp

        #ifndef UINT64_C
        #define UINT64_C(c) (c ## ULL)
        #endif
        
        #include "mdebug.h"
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <jni.h>
        #include <pthread.h>
        #include <unistd.h>
        #include <fcntl.h>
        #include "TSParser.h"
        
        static JavaVM *g_jvm = NULL;
        static TSParser * mpTSParser=NULL;
        
        pthread_mutex_t playMutex = PTHREAD_MUTEX_INITIALIZER;
        extern "C" {
        JNIEXPORT jint JNI_OnLoad(JavaVM * vm, void *reserved) {
            JNIEnv *env = NULL;
            jint result = -1;
            mInfo("JNI_OnLoad");
            if (vm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK)
                return -1;
            g_jvm = vm;
            return JNI_VERSION_1_4;
        }
        }
        ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
        int OnDestroy() {
            if(mpTSParser){
                mpTSParser->__stopThread();
                if (mpTSParser->TsDoloopThreadHandle)
                        pthread_join(mpTSParser->TsDoloopThreadHandle, NULL);
                mpTSParser->TsDoloopThreadHandle=NULL;
                delete mpTSParser;
                mpTSParser = NULL;
            }
            return 0;
        }
        
        static void *_tmain(void * cc)
        {
            if(mpTSParser!=NULL)
                mpTSParser->Parse();
            return NULL; // pthread start routines must return a value
        }
        
        extern "C" {
        JNIEXPORT jint JNICALL Java_包名0_包名1_包名2_JniLib_PushTsData(JNIEnv *env, jobject obj, jbyteArray jbArr, jint DataLen);
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_initTS(JNIEnv *env, jobject obj);
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_stopTsParse(JNIEnv *env, jobject obj);
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_startTsParse(JNIEnv *env, jobject obj);
        }
        
        JNIEXPORT jint JNICALL Java_包名0_包名1_包名2_JniLib_PushTsData(JNIEnv * env, jobject obj,  jbyteArray jbArr, jint DataLen)
        {
            if(!mpTSParser)
                return -1;
           int ret = 0;
           jbyte* jbuf = env->GetByteArrayElements(jbArr, NULL); // second arg is a jboolean*, not a jboolean
            mfxBitstreamTS *pBufTs = NULL;
            while(true){
                pBufTs = mpTSParser->GetEmptyTsBuf();
                if(pBufTs==NULL||pBufTs->Data==NULL)
                {
                    usleep(1);
                    continue;
                }
                break;
            }
            if (pBufTs == NULL ||pBufTs->Data == NULL) {
                return -2;
            }
           // mInfo("-----------------------PushTsFrame %d",DataLen); TS_TIMES
           memcpy(pBufTs->Data,(unsigned char *) jbuf, DataLen);
           pBufTs->DataLength = DataLen;
           mpTSParser->PushTsBuf(pBufTs);
           env->ReleaseByteArrayElements(jbArr, jbuf, 0);
           return ret;
        }
        
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_initTS(JNIEnv * env,
                jobject thiz) {
            if (mpTSParser == NULL) {
                mpTSParser = new TSParser();
            }
            mpTSParser->initMemory();
            mpTSParser->JavaMethodInit(g_jvm, thiz);
        }
        
        
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_stopTsParse(JNIEnv * env,
                jobject obj) {
            if (!mpTSParser)
                    return ;
            mpTSParser->__stopThread();
            if (mpTSParser->TsDoloopThreadHandle)
                    pthread_join(mpTSParser->TsDoloopThreadHandle, NULL);
            mpTSParser->TsDoloopThreadHandle=NULL;
            delete mpTSParser;
        
            mpTSParser = NULL;
        }
        
        JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_startTsParse(JNIEnv * env, jobject obj ) {
            if (!mpTSParser)
                        return ;
            int ret_t;
            struct sched_param param;
            pthread_attr_t attr;
            pthread_attr_init(&attr);
            pthread_attr_setschedpolicy(&attr, SCHED_RR);
            param.sched_priority = 90;
            pthread_attr_setschedparam(&attr, &param);
            ret_t = pthread_create(&mpTSParser->TsDoloopThreadHandle, &attr, _tmain,NULL);
            if (ret_t) {
                mLogW("pthread_create TsDoloopThreadHandle failed [%d] \n", ret_t);
            }
        }
        
      
    • types.h: I suggest copying this from the reference open-source project.

    • TSParser.h

        #ifndef __TS_PARSER_H__
        #define __TS_PARSER_H__
        struct _HANDLE_
        {
            unsigned int code;
            void *pContext;
        };
        
        #include <assert.h>
        #include <errno.h>
        #include <stdio.h>
        #include "mdebug.h"
        #include <string.h>
        #include <fcntl.h>
        #include "types.h"
        #include <string.h>
        #include <stdlib.h>
        #include "MFifo.h"
        
        using namespace std;
        
        typedef enum TS_ERR
        {
            TS_OK = 0,
            TS_IN_PARAM_ERR,
            TS_SYNC_BYTE_ERR,
            TS_FILE_OPEN_FAIL,
            TS_FILE_SEEK_FAIL,
        }TS_ERR;
        
        
        
        // PID types
        typedef enum E_PKT_TYPE
        {
            E_PAT       = 0,
            E_PMT       = 1,
            E_PCR       = 2,
            E_AUDIO     = 3,
            E_VIDEO     = 4,
            E_NIT       =5,
            E_SI       =6,
            E_MAX       = 7
        
        }E_PKT_TYPE;
        
        class TSPacket
        {
        public:
        //   uint8 * bufH264pkt;//=new uint8[1024*2000];
             uint32 pktH264Len;//=0;
             uint32 pktAccLen;//=0;
             uint32 pktindex;//=0;
             bool get_PAT_Head=false;
             bool get_PMT_Head=false;
             mfxBitstreamTS *h264Buf;
             mfxBitstreamTS *__accBuf;
        
            TSPacket()
                : m_pBuf(NULL)
                , m_pHdr(NULL)
                , m_u16PID(PID_UNSPEC)
                , m_u8CC(0)
                , m_u16PMTPID(PID_UNSPEC)
                , m_u8StreamId(0)
                , m_s64PCR(INVALID_VAL)
                , m_s64PTS(INVALID_VAL)
                , m_s64DTS(INVALID_VAL)
            {
        
        //      bufH264pkt=new uint8[1024*2000];
        
                pktH264Len=0;
                pktindex=0;
                get_PAT_Head=false;
                get_PMT_Head=false;
            }
            ~TSPacket() {}
        
            uint16 GetPID() const { return m_u16PID; }
            uint8  GetCC() const { return m_u8CC; }
        
            bool   IsPAT() { return (PID_PAT == m_u16PID); }
            uint16 GetPMTPID() const { return m_u16PMTPID; }
        
            bool   IsSIT() { return (PID_DVB_SIT == m_u16PID); }
            bool   IsNIT() { return (PID_DVB_NIT == m_u16PID); }
        
            bool   IsPMT() { return (PID_UNSPEC != m_u16PID &&s_au16PIDs[E_PMT] == m_u16PID); }//
            bool   IsVideo() { return (s_au16PIDs[E_VIDEO] == m_u16PID); }
            bool   IsAudio() { return (s_au16PIDs[E_AUDIO] == m_u16PID); }
        
            sint64 GetPCR() const { return m_s64PCR; }
            sint64 GetPTS() const { return m_s64PTS; }
            sint64 GetDTS() const { return m_s64DTS; }
        
        public:
            static uint16 s_au16PIDs[E_MAX]; // holds all known PIDs
        
        
            bool   __HasAdaptField();
            bool   __HasPayload();
            AdaptFixedPart* __GetAdaptField();
            uint8  __GetAdaptLen();
            sint64 __GetPCR();
            bool   __IsVideoStream(uint8 u8StreamType);
            bool   __IsAudioStream(uint8 u8StreamType);
            uint8  __GetPayloadOffset();
            uint8  __GetTableStartPos();
            sint64 __GetPTS(const OptionPESHdrFixedPart *pHdr);
            sint64 __GetDTS(const OptionPESHdrFixedPart *pHdr);
        
        
        
            uint8           m_count_v;
            const uint8     *m_pBuf;
            TSHdrFixedPart  *m_pHdr;
            uint16          m_u16PID;
            uint8           m_u8CC;
            uint16          m_u16PMTPID;
            uint8           m_u8StreamId;
            sint64          m_s64PCR;
            sint64          m_s64PTS;
            sint64          m_s64DTS;
        };
        typedef struct{
            // internal use
            uint8 out_videobuff[TS_MAX_OUT_BUFF];         // current video frame buffer
            int video_buflen;                             // current video frame buffer length
            uint64_t pts_video;                           // PTS of the current video frame
            uint64_t dts_video;                           // DTS of the current video frame
            int video_cc_ok;                              //
            int video_last_cc;                            // continuity counter of the previous video TS packet
            int video_intactness;                         // frame completeness flag: 1 = complete, 0 = incomplete

            uint8 out_audiobuff[TS_MAX_OUT_BUFF];         // current audio frame buffer
            int audio_buflen;                             // current audio frame buffer length
            uint64_t pts_audio;                           // PTS of the current audio frame
            uint64_t dts_audio;                           // DTS of the current audio frame
            int audio_cc_ok;
            int audio_last_cc;
            int audio_intactness;
        }ts_outdata;
        
        class TSParser :public TSPacket
        {
        public:
             TSParser();
            ~TSParser();
        
            unsigned char pcm_buffer[1024 * 20];
            long            Xferred;
            double          m_Fps;
            double          a_Fps;
            unsigned int jpg_count;
            unsigned int audio_count;
        
            CFifo m_H264BufFifo;
            CFifo m_AccBufFifo;
            CFifo m_TsBufFifo;
        
            CFifo m_DirtyH264BufFifo;
            CFifo m_DirtyAccBufFifo;
            CFifo m_DirtyTsBufFifo;
        
            int             m_H264BufCount;
            int             m_AccBufCount;
            int             m_TsBufCount;
        
            mfxBitstreamTS  m_H264Buf[100];
            mfxBitstreamTS  m_AccBuf[100];
            mfxBitstreamTS  m_TsBuf[100];
        
            TS_ERR Parse();
            HANDLE TsVedioThreadHandle;
            HANDLE TsAudioThreadHandle;
            HANDLE TsDoloopThreadHandle;
            HANDLE PrintThreadHandle;
        
            JavaVM*     m_jvm;
            jobject _javaAudioObj;
            jclass _javaAudioClass;
        
            jobject _javaVedioObj;
            jclass _javaVedioClass;
        
            jobject _javaSpeedObj;
            jclass _javaSpeedClass;
        
            jmethodID      _accCid;
            jmethodID      _h264Cid;
            jmethodID      _speedCid;
          void InitH264Memory();
          mfxBitstreamTS * GetEmptyH264Buf();
          void ResetH264Buf();
          bool PushDirytH264Buf(mfxBitstreamTS * pbuf);
          mfxBitstreamTS * GetH264Buf();
          bool PushH264Buf(mfxBitstreamTS * pbuf);
          void ReleaseH264Buf();
        
          void InitAccMemory();
          mfxBitstreamTS * GetEmptyAccBuf();
          void ReleaseAccBuf();
          bool PushAccBuf(mfxBitstreamTS * pbuf);
          mfxBitstreamTS * GetAccBuf();
          void ResetAccBuf();
          bool PushDirytAccBuf(mfxBitstreamTS * pbuf);
        
           void InitTsMemory();
           mfxBitstreamTS * GetEmptyTsBuf();
           void ReleaseTsBuf();
           bool PushTsBuf(mfxBitstreamTS * pbuf);
           bool PushTsFrame(unsigned char *pData, unsigned int len);
           mfxBitstreamTS * GetTsBuf();
           void ResetTsBuf();
           bool PushDirytTsBuf(mfxBitstreamTS * pbuf);
        
           TS_ERR __stopThread();
           TS_ERR initMemory();
           TS_ERR initAudioDecoder();
           int JavaMethodInit(JavaVM* vm, jobject obj);
        private:
        
            static void *videothread(void * cc);
            static void *audiothread(void * cc);
            static void * print_thread(void * cc) ;
        
            void ShowStat(long t,TSParser * pBc);
            bool   __SeekToFirstPkt(uint64 u64Offset=0);
            void   __PrintPacketInfo(TSPacket &tPkt, uint64 u64Offset, uint32 u32PktNo);
            const char *__TSTimeToStr(sint64 s64Time);
            TS_ERR __ParsePAT();
            TS_ERR __ParsePMT();
            TS_ERR __ParsePES();
            TS_ERR __ParsePESData();
            TS_ERR __Parse(const uint8 *pBuf, uint16 u16BufLen);
        
        private:
        //    const char* m_strFile;
        };
        
        #define DELETER_BUFFER(p)   if (NULL != p) { delete p; p = NULL;}
        
        class AutoDelCharBuf
        {
        public:
            AutoDelCharBuf(uint8 *pBuf) : m_pBuf(pBuf) {}
            ~AutoDelCharBuf() { DELETER_BUFFER(m_pBuf); }
        
            uint8 *Ptr() { return m_pBuf; }
        private:
            uint8 *m_pBuf;
        };
        
        #endif //__TS_PARSER_H__
      
      • TSParser.cpp

          #include "TSParser.h"
          #define MAX_READ_PKT_NUM                20000
          #define MAX_CHECK_PKT_NUM               3
          #define MAX_TIME_STR_LEN                20
          
          #define MK_WORD(high,low)               (((high)<<8)|(low))
          #define MK_PCR(b1,b2,b3,b4,b5)          (((sint64)(b1)<<25)|((sint64)(b2)<<17)|((sint64)(b3)<<9)|((sint64)(b4)<<1)|(b5))
          #define MK_PTS_DTS(b1,b2,b3,b4,b5)      (((sint64)(b1)<<30)|((sint64)(b2)<<22)|((sint64)(b3)<<15)|((sint64)(b4)<<7)|(b5))
          
          #define MIN(a,b)                        (((a) < (b)) ? (a) : (b))
          #define RETURN_IF_NOT_OK(ret)           if (TS_OK != ret) { return ret; }
          
          // holds all known PIDs
          uint16 TSPacket::s_au16PIDs[E_MAX] = { PID_UNSPEC, PID_UNSPEC, PID_UNSPEC,
                  PID_UNSPEC, PID_UNSPEC, PID_UNSPEC, PID_UNSPEC };
          bool isDoWritePes = false;
          static uint8 avStreamId;
          bool m_H264Running = true;
          bool m_AccRunning = true;
          bool m_TsRunning = true;
          int videoIndex = 0;
          int last_ts_cc = 0;
          int current_ts_cc = 0;
          _HANDLE_ *pHandle;
        
          /* Checks whether an adaptation field is present */
          bool TSPacket::__HasAdaptField() {
              assert(NULL != m_pHdr);
              // NOTE: 0x2 (adaptation field only, no payload) also carries an
              // adaptation field; (adaptation_field_control & 0x2) covers both.
              return (m_pHdr->adaptation_field_control == 0x3);
          }
          
          /* Checks whether a payload is present */
          bool TSPacket::__HasPayload() {
              assert(NULL != m_pHdr);
              return m_pHdr->payload_unit_start_indicator
                      || ((m_pHdr->adaptation_field_control & 0x1));
          }
          
          /* Returns a pointer to the adaptation field, or NULL when it is absent */
          AdaptFixedPart* TSPacket::__GetAdaptField() {
              assert(NULL != m_pBuf);
              assert(NULL != m_pHdr);
          
              AdaptFixedPart *pAdpt = NULL;
          
              if (__HasAdaptField()) {
                  pAdpt = (AdaptFixedPart*) (m_pBuf + sizeof(TSHdrFixedPart));
              }
          
              return pAdpt;
          }
          
          /* Returns the length of the adaptation field */
          uint8 TSPacket::__GetAdaptLen() {
              uint8 u8AdaptLen = 0;
              AdaptFixedPart *pAdpt = __GetAdaptField();
              if (NULL != pAdpt) {
                  // "adaptation_field_length" field is 1 byte
                  u8AdaptLen = pAdpt->adaptation_field_length + 1;
              }
          
              return u8AdaptLen;
          }
          
          /* Returns the PCR value when the PCR field is present, -1 otherwise */
          sint64 TSPacket::__GetPCR() {
              assert(NULL != m_pBuf);
              assert(NULL != m_pHdr);
          
              sint64 s64PCR = INVALID_VAL;
              if (__HasAdaptField()) {
                  AdaptFixedPart *pAdpt = (AdaptFixedPart*) (m_pBuf
                          + sizeof(TSHdrFixedPart));
                  if (pAdpt->adaptation_field_length > 0 && pAdpt->PCR_flag) {
                      PCR *pcr = (PCR*) ((const char*) pAdpt + sizeof(AdaptFixedPart));
                      s64PCR = MK_PCR(pcr->pcr_base32_25,
                              pcr->pcr_base24_17,
                              pcr->pcr_base16_9,
                              pcr->pcr_base8_1,
                              pcr->pcr_base0);
                  }
              }
              return s64PCR;
          }
          
          /* Returns true when the StreamType denotes a video stream */
          bool TSPacket::__IsVideoStream(uint8 u8StreamType) {
              return ((ES_TYPE_MPEG1V == u8StreamType) || (ES_TYPE_MPEG2V == u8StreamType)
                      || (ES_TYPE_MPEG4V == u8StreamType)
                      || (ES_TYPE_H264 == u8StreamType));
          }
          
          /* Returns true when the StreamType denotes an audio stream */
          bool TSPacket::__IsAudioStream(uint8 u8StreamType) {
              return ((ES_TYPE_MPEG1A == u8StreamType) || (ES_TYPE_MPEG2A == u8StreamType)
                      || (ES_TYPE_AC3 == u8StreamType) || (ES_TYPE_AAC == u8StreamType)
                      || (ES_TYPE_DTS == u8StreamType));
          }
          
          /* Returns the payload's offset from the start of the TS packet */
          uint8 TSPacket::__GetPayloadOffset() {
              uint8 u8Pos = sizeof(TSHdrFixedPart);
              if (__HasAdaptField()) {
                  u8Pos += __GetAdaptLen();
              }
              return u8Pos;
          }
          
          /* Returns the PAT/PMT table's offset from the start of the TS packet */
          uint8 TSPacket::__GetTableStartPos() {
              assert(NULL != m_pBuf);
          
              uint8 u8Pos = __GetPayloadOffset();
              if (__HasPayload()) {
                  // "pointer_field" field is 1 byte,
                  /* pointer_field is present when payload_unit_start_indicator is
                     set to 1 in a PSI packet (in non-PSI packets that flag marks
                     the start of a PES packet). It points to the start of the
                     first section in this packet; a transport packet never
                     carries more than one pointer_field. */
                  // and whose value is the number of bytes before payload
                  uint8 u8PtrFieldLen = m_pBuf[u8Pos] + 1;
                  u8Pos += u8PtrFieldLen;
              }
              return u8Pos;
          }
          
          /* Returns the PTS value when the PTS field is present, -1 otherwise */
          sint64 TSPacket::__GetPTS(const OptionPESHdrFixedPart *pHdr) {
              assert(NULL != pHdr);
          
              sint64 s64PTS = INVALID_VAL;
              if (pHdr->PTS_DTS_flags & 0x2) {
                  PTS_DTS *pPTS =
                          (PTS_DTS*) ((char*) pHdr + sizeof(OptionPESHdrFixedPart));
                  s64PTS =
                          MK_PTS_DTS(pPTS->ts32_30, pPTS->ts29_22, pPTS->ts21_15, pPTS->ts14_7, pPTS->ts6_0);
              }
          
              return s64PTS;
          }
          
          /* Returns the DTS value when the DTS field is present, -1 otherwise */
          sint64 TSPacket::__GetDTS(const OptionPESHdrFixedPart *pHdr) {
              assert(NULL != pHdr);
          
              sint64 s64DTS = INVALID_VAL;
              if (pHdr->PTS_DTS_flags & 0x1) {
                  PTS_DTS *pDTS = (PTS_DTS*) ((char*) pHdr + sizeof(OptionPESHdrFixedPart)
                          + sizeof(PTS_DTS));
                  s64DTS =
                          MK_PTS_DTS(pDTS->ts32_30, pDTS->ts29_22, pDTS->ts21_15, pDTS->ts14_7, pDTS->ts6_0);
              }
          
              return s64DTS;
          }
          
          bool has_finish = false;
          int pre_head;    // bytes captured from the truncated head of a packet
          int last_head;   // leftover bytes at the tail of the buffer
          int first_index; // position where the first sync byte was found
          
          TSParser::TSParser() {
          }
          
          TSParser::~TSParser() {
              ReleaseH264Buf();
              ReleaseAccBuf();
              ReleaseTsBuf();
          }
          
          TS_ERR TSParser::__stopThread() {
          
              m_H264Running = false;
              m_TsRunning = false;
              m_AccRunning = false;
              ReleaseAccBuf();
              ReleaseH264Buf();
              ReleaseTsBuf();
          
              if (pHandle) { // guard: initAudioDecoder may never have run
                  aac_decode_close(pHandle->pContext);
                  delete pHandle;
                  pHandle = NULL;
              }
          
              if (TsVedioThreadHandle)
                  pthread_join(TsVedioThreadHandle, NULL);
              TsVedioThreadHandle = NULL;
              if (TsAudioThreadHandle)
                  pthread_join(TsAudioThreadHandle, NULL);
              TsAudioThreadHandle = NULL;
          
              if (PrintThreadHandle)
                  pthread_join(PrintThreadHandle, NULL);
              PrintThreadHandle = NULL;
              return TS_OK;
          }
          
          TS_ERR TSParser::initMemory() {
              m_H264Running = true;
              m_TsRunning = true;
              m_AccRunning = true;
              InitTsMemory();
              InitH264Memory();
              InitAccMemory();
              return TS_OK;
          }
          
          TS_ERR TSParser::initAudioDecoder() {
              pHandle = new _HANDLE_;
              pHandle->code = AV_CODEC_ID_MP3;
              pHandle->pContext = 0;
              av_register_all();
              av_log_set_callback(my_logoutput);
              pHandle->pContext = aac_decoder_create(AV_CODEC_ID_MP3, IN_SAMPLE_RATE,
                      AUDIO_CHANNELS, SAMPLE_BIT);
              if (pHandle->pContext != NULL) {
                  mDebug("initAudioDecoder succeeded");
              }
              return TS_OK;
          }
          
          
          TS_ERR TSParser::Parse() {
              int ret_t;
              initAudioDecoder();
              mfxBitstreamTS *tsBuf = NULL;
              bool has_cache = false;
              bool is_head_start = false;
              m_H264Running = true;
              m_TsRunning = true;
              m_AccRunning = true;
              mDebug("-------------- start parsing");
              struct sched_param param;
              pthread_attr_t attr;
              pthread_attr_init(&attr);
              pthread_attr_setschedpolicy(&attr, SCHED_RR);
              param.sched_priority = 90;
              pthread_attr_setschedparam(&attr, &param);
              ret_t = pthread_create(&TsVedioThreadHandle, &attr, videothread, this);
              if (ret_t) {}
              ret_t = pthread_create(&TsAudioThreadHandle, &attr, audiothread, this);
              if (ret_t) {}
              ret_t = pthread_create(&PrintThreadHandle, &attr, print_thread, this);
              if (ret_t) {}
              TS_ERR ret = TS_OK;
              unsigned char buffer[TS_PKT_LEN * 2];
              uint8 *pCacheBuf = buffer;
              uint32 ts_temp_len = 0;
              uint32 ts_cache_len = 0;
              unsigned char *pCurrentPos = NULL;
          
              long ms = 0;
              struct timespec ts;
              struct timespec pts;
              clock_gettime(CLOCK_MONOTONIC, &pts);
              clock_gettime(CLOCK_MONOTONIC, &ts);
          
              pktH264Len = 0;
              pktAccLen = 0;
              h264Buf = 0;
              jpg_count = 0;
              audio_count = 0;
          
              for (; m_TsRunning;) {
                  while (m_TsRunning) {
                      tsBuf = GetTsBuf();
                      if (tsBuf != NULL) {
                          break;
                      } else {
                          usleep(2);
                      }
                  }
                  if (tsBuf == NULL) {
                      break;
                  }
                  Xferred += tsBuf->DataLength;
                  if (tsBuf->Data == NULL || tsBuf->DataLength == 0) {
                      PushDirytTsBuf(tsBuf);
                      tsBuf = NULL;
                      continue;
                  }
                  pCurrentPos = tsBuf->Data;
                  ts_temp_len = tsBuf->DataLength;
                  is_head_start = true;
                  if (has_cache) {
                      has_cache = false;
                      memcpy(pCacheBuf + ts_cache_len, pCurrentPos,(TS_PKT_LEN - ts_cache_len));
                      if (TS_SYNC_BYTE == buffer[0]) {
                          ret = __Parse(pCacheBuf, TS_PKT_LEN);
                          ts_temp_len = tsBuf->DataLength - (TS_PKT_LEN - ts_cache_len);
                          pCurrentPos += TS_PKT_LEN - ts_cache_len;
                  } else {
                      // the cached segment has no sync byte; drop the ts_cache_len cached bytes
                  }
                  }
          
                  while (ts_temp_len > TS_PKT_LEN && m_TsRunning) {
                      if (TS_SYNC_BYTE == *(pCurrentPos)
                              && TS_SYNC_BYTE == *(pCurrentPos + TS_PKT_LEN)) {
                          is_head_start = false;
                          ret = __Parse(pCurrentPos, TS_PKT_LEN);
                          pCurrentPos += TS_PKT_LEN;
                          ts_temp_len -= TS_PKT_LEN;
                    } else {
                        // corrupt data: scan forward byte by byte for the next sync byte
                        pCurrentPos++;
                        ts_temp_len--;
                    }
          
                  }
          
                if (ts_temp_len > 0 && TS_SYNC_BYTE == *(pCurrentPos)) { // guard against reading past the consumed data
                      if (ts_temp_len == TS_PKT_LEN) {
          
                          ret = __Parse(pCurrentPos, TS_PKT_LEN);
                      } else {
                          ts_cache_len = ts_temp_len;
                          memcpy(pCacheBuf, pCurrentPos, ts_cache_len);
                          has_cache = true;
                      }
                  } else {
                      for (int i = 0; i < ts_temp_len; i++) {
                          if (!m_TsRunning) {
                              break;
                          }
                          if (TS_SYNC_BYTE == *(pCurrentPos + i)) {
                              memcpy(pCacheBuf, pCurrentPos + i, ts_temp_len - i);
                              ts_cache_len = ts_temp_len - i;
                              has_cache = true;
                              break;
                          }
                      }
                  }
          
                  PushDirytTsBuf(tsBuf);
                  tsBuf = NULL;
                  clock_gettime(CLOCK_MONOTONIC, &ts);
                  ms = (ts.tv_sec - pts.tv_sec) * 1000
                          + (ts.tv_nsec - pts.tv_nsec) / 1000000;
                  if (ms >= 1000) {
                      clock_gettime(CLOCK_MONOTONIC, &pts);
                      m_Fps = (double) (jpg_count * 1000) / ms;
                      jpg_count = 0;
                      a_Fps = (double) (audio_count * 1000) / ms;
                      audio_count = 0;
                  }}
              return ret;
          }
          
          
          /* Parse one TS packet */
          TS_ERR TSParser::__Parse(const uint8 *pBuf, uint16 u16BufLen) {
          //
              assert(NULL != pBuf);
              TS_ERR ret = TS_OK;
              if ((NULL == pBuf) || (TS_PKT_LEN != u16BufLen)) {
                  return TS_IN_PARAM_ERR;
              }
              if (TS_SYNC_BYTE != pBuf[0]) {
                  return TS_SYNC_BYTE_ERR;
              }
          //  mInfo("--------------------__Parse 查找开始");
              m_pBuf = pBuf;
              m_pHdr = (TSHdrFixedPart*) pBuf;
              m_u16PID = MK_WORD(m_pHdr->pid12_8,m_pHdr->pid7_0);
          
              if (m_u16PID == PID_NULL) {
                  return ret;
              }
              //s_au16PIDs[E_PMT] = 256;      // the stream's PIDs are essentially fixed in my project
              //s_au16PIDs[E_VIDEO] = 4113;
              //s_au16PIDs[E_AUDIO] = 4352;
              //s_au16PIDs[E_PCR] = 4097;
              //s_au16PIDs[E_SI] = 31;
              //s_au16PIDs[E_PAT] = 0;
              /* continuity_counter is a 4-bit field that increments with every
                 transport stream packet of the same PID, wrapping back to 0 after
                 reaching its maximum. It does not increment when the packet's
                 adaptation_field_control is '00' or '10'. */
              m_u8CC = m_pHdr->continuity_counter;
              if (IsPAT()) {
                  ret = __ParsePAT();
                  return ret;
              } else if (IsSIT()) {
              } else if (IsNIT()) {
              } else if (IsPMT()) {
                  ret = __ParsePMT();
                  return ret;
              } else if (m_u16PID == s_au16PIDs[E_PCR]) {
                  // The PCR is carried at the TS layer: a TS packet's adaptation
                  // field may contain it. It specifies the intended arrival time
                  // of the packet at the decoder, similar in role to the SCR.
                  /* Every transport packet that carries elementary-stream data of a
                     PID not designated as PCR_PID, that contains a continuity-counter
                     discontinuity, or in which a PTS or DTS occurs, shall arrive at
                     the T-STD input after the system-time-base discontinuity of the
                     associated program. While the discontinuity state is true, if two
                     consecutive transport packets of the same PID carry the same
                     continuity_counter value with adaptation_field_control set to
                     '01' or '11', the second packet may be discarded; a stream shall
                     not be constructed such that discarding it loses PES payload or
                     PSI data. */
      //          m_s64PCR = __GetPCR();
              }
          //          if(m_u16PID!=PID_NULL&&(m_pHdr->adaptation_field_control != 0x2)){
              /* When the transport packet payload carries PSI data,
               payload_unit_start_indicator means: if the packet carries the first byte
               of a PSI section, the value shall be 1, indicating that the first byte
               of the payload carries the pointer_field; otherwise it shall be '0',
               indicating that no pointer_field is present in this payload.
               See 2.4.4.1 and 2.4.4.2. */
              if (IsVideo() ) { //|| IsAudio()
                  //      mDebug("TAV------------------ video data");
                  if (m_pHdr->payload_unit_start_indicator == 1) { // '1': exactly one PES packet starts in this TS packet
                  //      mDebug("-------------- found a PES header");
                      ret = __ParsePES();
                  } else { // '0': no PES packet starts in this TS packet
                      /* Null packets have payload_unit_start_indicator set to 0.
                       PID: 13 bits, identifies the payload type; PID == 0x0000 means
                       the payload is the program association table. */
                      if (IsSIT()) {
                      } else {
                          ret = __ParsePESData();
                      }
                  }
              }else if(IsAudio()){
                  mDebug("TAV------------------ audio data");
              }
              return ret;
          }
          
          TS_ERR TSParser::__ParsePAT() {
              assert(NULL != m_pBuf);
              const uint8 *pPATBuf = m_pBuf + __GetTableStartPos();
              PATHdrFixedPart *pPAT = (PATHdrFixedPart*) pPATBuf;
              uint16 u16SectionLen =
                      MK_WORD(pPAT->section_length11_8, pPAT->section_length7_0);
              uint16 u16AllSubSectionLen = u16SectionLen
                      - (sizeof(PATHdrFixedPart) - HDR_LEN_NOT_INCLUDE) - CRC32_LEN;
          
              uint16 u16SubSectionLen = sizeof(PATSubSection);
              const uint8 *ptr = pPATBuf + sizeof(PATHdrFixedPart);
              for (uint16 i = 0; i < u16AllSubSectionLen; i += u16SubSectionLen) {
                  PATSubSection *pDes = (PATSubSection*) (ptr + i);
                  uint16 u16ProgNum = pDes->program_number;
                  uint16 u16PID = MK_WORD(pDes->pid12_8, pDes->pid7_0);
                  if (0x00 == u16ProgNum) {
                      uint16 u16NetworkPID = u16PID; // network_PID (NIT); unused here
                      (void) u16NetworkPID;
                  } else {
                      m_u16PMTPID = u16PID; // program_map_PID
                      break;
                  }
              }
              s_au16PIDs[E_PMT] = m_u16PMTPID;
              return TS_OK;
          }
          
          TS_ERR TSParser::__ParsePMT() {
              assert(NULL != m_pBuf);
          
              const uint8 *pPMTBuf = m_pBuf + __GetTableStartPos();
              PMTHdrFixedPart *pPMT = (PMTHdrFixedPart*) pPMTBuf;
              s_au16PIDs[E_PCR] = MK_WORD(pPMT->PCR_PID12_8, pPMT->PCR_PID7_0);
              uint16 u16SectionLen =
                      MK_WORD(pPMT->section_length11_8, pPMT->section_length7_0);
              // length of the n * program_info_descriptor block
              uint16 u16ProgInfoLen =
                      MK_WORD(pPMT->program_info_length11_8, pPMT->program_info_length7_0);
              uint16 u16AllSubSectionLen = u16SectionLen
                      - (sizeof(PMTHdrFixedPart) - HDR_LEN_NOT_INCLUDE) - u16ProgInfoLen
                      - CRC32_LEN;
          
              const uint8 *ptr = pPMTBuf + sizeof(PMTHdrFixedPart) + u16ProgInfoLen;
              for (uint16 i = 0; i < u16AllSubSectionLen; ) {
                  PMTSubSectionFixedPart *pSec = (PMTSubSectionFixedPart*) (ptr + i);
                  uint16 u16ElementaryPID =
                          MK_WORD(pSec->elementaryPID12_8, pSec->elementaryPID7_0);
                  uint16 u16ESInfoLen =
                          MK_WORD(pSec->ES_info_lengh11_8, pSec->ES_info_lengh7_0);
                  // step over this entry's fixed part plus its ES_info descriptors
                  i += sizeof(PMTSubSectionFixedPart) + u16ESInfoLen;

                  if (__IsVideoStream(pSec->stream_type)) {
                      s_au16PIDs[E_VIDEO] = u16ElementaryPID;
                  } else if (__IsAudioStream(pSec->stream_type)) {
                      s_au16PIDs[E_AUDIO] = u16ElementaryPID;
                  }
              }
              return TS_OK;
          }
          
          TS_ERR TSParser::__ParsePES() {
              assert(NULL != m_pBuf);
              uint64 es_len = 0;

              const uint8 *pPESBuf = m_pBuf + 4; // __GetPayloadOffset(); TODO: 4 is fixed by this project's format, hard-coded while debugging to cut latency
              const uint8 *pPESData;
              PESHdrFixedPart *pPES = (PESHdrFixedPart*) pPESBuf;
          
              if (PES_START_CODE == pPES->packet_start_code_prefix) {
                  m_u8StreamId = pPES->stream_id;
                  if ((m_u8StreamId & PES_STREAM_VIDEO)
                          || (m_u8StreamId & PES_STREAM_AUDIO)) {
                      OptionPESHdrFixedPart *pHdr = (OptionPESHdrFixedPart*) (pPESBuf
                              + sizeof(PESHdrFixedPart));
                      avStreamId = m_u8StreamId;
                      pPESData = m_pBuf + (4 + sizeof(PESHdrFixedPart)
                                      + sizeof(OptionPESHdrFixedPart)
                                      + pHdr->PES_Hdr_data_length);
                      es_len = TS_PKT_LEN
                              - (4 + sizeof(PESHdrFixedPart)
                                      + sizeof(OptionPESHdrFixedPart)
                                      + pHdr->PES_Hdr_data_length);
          
                      if (IsVideo()) {
                          if (pktH264Len != 0 && h264Buf != NULL) {
                              if (h264Buf->MaxLength > pktH264Len) {
                                  h264Buf->DataLength = pktH264Len;
                                  PushH264Buf(h264Buf);
                                  h264Buf = NULL;
                                  pktH264Len = 0;
                                  jpg_count++;
                              } else {
                              mDebug("h264 buf is smaller than the H264 data - %d",
                                      pktH264Len);
                                  PushDirytH264Buf(h264Buf);
                                  h264Buf = NULL;
                                  pktH264Len = 0;
                              }
          
                          }
                          while (m_H264Running) {
          
                              h264Buf = GetEmptyH264Buf();
                              if (h264Buf == NULL || h264Buf->Data == NULL) {
                              mDebug("no empty H264 buffer available");
                                  usleep(1);
                                  continue;
                              }
                              break;
                          }
                          if (h264Buf == NULL) {
                              return TS_IN_PARAM_ERR;
                          }
                          memcpy(h264Buf->Data, pPESData, es_len);
                          pktH264Len = es_len;
          
                      } else if (IsAudio()) {
          //              if (pktAccLen != 0 && __accBuf != NULL) {
          //                  if (__accBuf->MaxLength > pktAccLen) {
          //                      __accBuf->DataLength = pktAccLen;
          //                      PushAccBuf(__accBuf);
          //                      __accBuf = NULL;
          //                      pktAccLen = 0;
          //                  } else {
          //                      mDebug("__accBuf buf is small than H264 data -%d",
          //                              pktAccLen);
          //                      PushDirytAccBuf(__accBuf);
          //                      __accBuf = NULL;
          //                      pktAccLen = 0;
          //                  }
          //                  audio_count++;
          //              }
          //              while (m_AccRunning) {
          //                  __accBuf = GetEmptyAccBuf();
          //                  if (__accBuf == NULL || __accBuf->Data == NULL) {
          //                      usleep(10);
          //                      continue;
          //                  }
          //                  break;
          //              }
          //              if (__accBuf == NULL) {
          //                  return TS_IN_PARAM_ERR;
          //              }
          //              memcpy(__accBuf->Data, pPESData, es_len);
          //              pktAccLen = es_len;
                      }
                  } else {
          //              mLogW("--- PES stream_id is neither video nor audio");
                  }
              } else {
          //          mLogW("--- payload does not start with the PES start code");
                  avStreamId = 0;
              }
              return TS_OK;
          }
          
          TS_ERR TSParser::__ParsePESData() {
              uint64 es_len = 0;
              const uint8 *pPESBuf = m_pBuf + __GetPayloadOffset();
              es_len = TS_PKT_LEN - __GetPayloadOffset();
          
              if (IsVideo()) // video
              {
                  if (h264Buf != NULL && h264Buf->Data != NULL) {
                      memcpy(h264Buf->Data + pktH264Len, pPESBuf, es_len);
                      pktH264Len = pktH264Len + es_len;
                  } else {
                  }
              } else if (IsAudio()) { // audio
          //      if (__accBuf != NULL && __accBuf->Data != NULL) {
          //          memcpy(__accBuf->Data + pktAccLen, pPESBuf, es_len);
          //          pktAccLen = pktAccLen + es_len;
          //      } else {
          ////            mDebug("CameraLib ---- __ParsePESData: audio fifo buffer is null");
          //      }
              }
              pPESBuf = NULL;
              return TS_OK;
          }
          
          const char *TSParser::__TSTimeToStr(sint64 s64Time) {
              static char s_acTimeStr[MAX_TIME_STR_LEN] = { 0 };
              sint64 s64MiliSecond = s64Time / 90; // 90 kHz ticks -> milliseconds
              sint64 s64Second = s64MiliSecond / 1000;
              snprintf(s_acTimeStr, sizeof(s_acTimeStr), "%lld.%03lds",
                      (long long) s64Second, (long) (s64MiliSecond % 1000));
              return s_acTimeStr;
          }
          
          void TSParser::InitAccMemory() {
              int i;
              bool ret;
          
              m_AccBufCount = 20;
              long len = 25000 * 200;
          
              for (i = 0; i < m_AccBufCount; i++) {
                  memset(&m_AccBuf[i], 0, sizeof(mfxBitstreamTS));
                  m_AccBuf[i].Data = new UCHAR[len];
                  if (m_AccBuf[i].Data) {
                      memset(m_AccBuf[i].Data, 0xff, len);
                  } else {
                      return;
                  }
                  m_AccBuf[i].MaxLength = len;
                  m_AccBuf[i].last_cc = -1;
                  m_AccBuf[i].intactness = 1;
          
              }
          
              ResetAccBuf();
          }
          void TSParser::ReleaseAccBuf() {
              for (int i = 0; i < m_AccBufCount; i++) {
                  if (m_AccBuf[i].Data) {
                      delete[] m_AccBuf[i].Data;
                      m_AccBuf[i].Data = NULL;
                  }
              }
          
          }
          bool TSParser::PushAccBuf(mfxBitstreamTS * pbuf) {
              bool ret = m_AccBufFifo.put((void *) pbuf);
              if (!ret) {
                  mfxBitstreamTS * pbuf1 = NULL;
                  pbuf1 = (mfxBitstreamTS *) m_AccBufFifo.get();
                  if (pbuf1) {
                      PushDirytAccBuf(pbuf1);
                  }
          
                  return m_AccBufFifo.put((void *) pbuf);
          
              } else
                  return ret;
          }
          mfxBitstreamTS * TSParser::GetAccBuf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_AccBufFifo.get();
              return pbuf;
          }
          
          bool TSParser::PushDirytAccBuf(mfxBitstreamTS * pbuf) {
              if (pbuf == NULL)
                  return false;
              pbuf->DataLength = 0;
              pbuf->DataOffset = 0;
              return m_DirtyAccBufFifo.put((void *) pbuf);
          }
          
          mfxBitstreamTS * TSParser::GetEmptyAccBuf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_DirtyAccBufFifo.get();
              if (pbuf) {
                  pbuf->DataLength = 0;
                  pbuf->DataOffset = 0;
              }
              return pbuf;
          }
          
          void TSParser::ResetAccBuf() {
              int i = 0;
              m_DirtyAccBufFifo.flush();
              m_DirtyAccBufFifo.Create(m_AccBufCount);
          
              for (i = 0; i < m_AccBufCount; i++) {
                  int ret = m_DirtyAccBufFifo.put((void *) &m_AccBuf[i]);
                  if (!ret) {
                      return;
                  }
              }
              m_AccBufFifo.flush();
              m_AccBufFifo.Create(m_AccBufCount - 1);
          }
          
          void TSParser::InitH264Memory() {
              int i;
              bool ret;
          
              m_H264BufCount = 20;
              long len = 1024 * 3500;
          
              for (i = 0; i < m_H264BufCount; i++) {
                  memset(&m_H264Buf[i], 0, sizeof(mfxBitstreamTS));
                  m_H264Buf[i].Data = new UCHAR[len];
                  if (m_H264Buf[i].Data) {
                      memset(m_H264Buf[i].Data, 0xff, len);
                  } else {
                      return;
                  }
                  m_H264Buf[i].MaxLength = len;
                  m_H264Buf[i].last_cc = -1;
                  m_H264Buf[i].intactness = 1;
          
              }
              ResetH264Buf();
          }
          void TSParser::ReleaseH264Buf() {
              for (int i = 0; i < m_H264BufCount; i++) {
                  if (m_H264Buf[i].Data) {
                      delete[] m_H264Buf[i].Data;
                      m_H264Buf[i].Data = NULL;
                  }
              }
          
          }
          
          bool TSParser::PushH264Buf(mfxBitstreamTS * pbuf) {
              if (pbuf->DataLength >= 1024 * 1000 || pbuf->DataLength <= 0) {
                  mDebug("error: abnormal H264 buf length %d", pbuf->DataLength);
              }
              bool ret = m_H264BufFifo.put((void *) pbuf);
              if (!ret) { // put failed: fifo is full, so drop the oldest entry and put again
                  mfxBitstreamTS * pbuf1 = NULL;
                  pbuf1 = (mfxBitstreamTS *) m_H264BufFifo.get();
                  if (pbuf1) {
                      PushDirytH264Buf(pbuf1);
                  }
                  return m_H264BufFifo.put((void *) pbuf);
          
              } else
          
                  return ret;
          }
          
          mfxBitstreamTS * TSParser::GetH264Buf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_H264BufFifo.get();
              return pbuf;
          }
          
          void TSParser::ResetH264Buf() {
              int i = 0;
              m_DirtyH264BufFifo.flush();
              m_DirtyH264BufFifo.Create(m_H264BufCount);
          
              for (i = 0; i < m_H264BufCount; i++) {
                  int ret = m_DirtyH264BufFifo.put((void *) &m_H264Buf[i]);
                  if (!ret) {
                      return;
                  }
              }
          
              m_H264BufFifo.flush();
              m_H264BufFifo.Create(m_H264BufCount - 1);
          }
          
          bool TSParser::PushDirytH264Buf(mfxBitstreamTS * pbuf) {
              if (pbuf == NULL)
                  return false;
              pbuf->DataLength = 0;
              pbuf->DataOffset = 0;
              return m_DirtyH264BufFifo.put((void *) pbuf);
          }
          
          mfxBitstreamTS * TSParser::GetEmptyH264Buf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_DirtyH264BufFifo.get();
              if (pbuf) {
                  pbuf->DataLength = 0;
                  pbuf->DataOffset = 0;
              }
              return pbuf;
          }
          
          void TSParser::InitTsMemory() {
              int i;
              bool ret;
              m_TsBufCount = 30;
              long len = 188 * 1024;
              for (i = 0; i < m_TsBufCount; i++) {
                  memset(&m_TsBuf[i], 0, sizeof(mfxBitstreamTS));
                  m_TsBuf[i].Data = new UCHAR[len];
                  if (m_TsBuf[i].Data) {
                      memset(m_TsBuf[i].Data, 0xff, len);
                  } else {
                      mDebug("new m_TsBuf[%d] failed:\n", i);
                      return;
                  }
                  m_TsBuf[i].MaxLength = len;
                  m_TsBuf[i].last_cc = -1;
                  m_TsBuf[i].intactness = 1;
              }
              ResetTsBuf();
          }
          void TSParser::ReleaseTsBuf() {
              for (int i = 0; i < m_TsBufCount; i++) {
                  if (m_TsBuf[i].Data) {
                      delete[] m_TsBuf[i].Data;
                      m_TsBuf[i].Data = NULL;
                  }
              }
          
          }
          
          bool TSParser::PushTsBuf(mfxBitstreamTS * pbuf) {
              bool ret = m_TsBufFifo.put((void *) pbuf);
              if (!ret) { // put failed: fifo is full, so drop the oldest entry and put again
                  mfxBitstreamTS * pbuf1 = NULL;
                  pbuf1 = (mfxBitstreamTS *) m_TsBufFifo.get();
                  if (pbuf1) {
                      PushDirytTsBuf(pbuf1);
                  }
                  return m_TsBufFifo.put((void *) pbuf);
              } else
                  return ret;
          }
          
          bool TSParser::PushTsFrame(unsigned char *pData, unsigned int len) {
              mfxBitstreamTS *pBufJpg = NULL;
              while (m_TsRunning) {
                  pBufJpg = GetEmptyTsBuf();
                  if (pBufJpg == NULL || pBufJpg->Data == NULL) {
                      usleep(1);
                      continue;
                  }
                  break;
              }
              if (pBufJpg == NULL || pBufJpg->Data == NULL) {
                  return false;
              }
              memcpy(pBufJpg->Data, pData, len);
              pBufJpg->DataLength = len;
              PushTsBuf(pBufJpg);
              return true;
          }
          
          mfxBitstreamTS * TSParser::GetTsBuf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_TsBufFifo.get();
              return pbuf;
          }
          
          void TSParser::ResetTsBuf() {
              int i = 0;
              m_DirtyTsBufFifo.flush();
              m_DirtyTsBufFifo.Create(m_TsBufCount);
          
              for (i = 0; i < m_TsBufCount; i++) {
                  int ret = m_DirtyTsBufFifo.put((void *) &m_TsBuf[i]);
                  if (!ret) {
                      return;
                  }
              }
          
              m_TsBufFifo.flush();
              m_TsBufFifo.Create(m_TsBufCount - 1);
          }
          
          bool TSParser::PushDirytTsBuf(mfxBitstreamTS * pbuf) {
              if (pbuf == NULL)
                  return false;
              pbuf->DataLength = 0;
              pbuf->DataOffset = 0;
              return m_DirtyTsBufFifo.put((void *) pbuf);
          }
          
          mfxBitstreamTS * TSParser::GetEmptyTsBuf() {
              mfxBitstreamTS * pbuf = NULL;
              pbuf = (mfxBitstreamTS *) m_DirtyTsBufFifo.get();
              if (pbuf) {
                  pbuf->DataLength = 0;
                  pbuf->DataOffset = 0;
              }
              return pbuf;
          }
          
          jobject getInstanceTs(JNIEnv* env, jclass obj_class) {
              jmethodID construction_id = env->GetMethodID(obj_class, "<init>", "()V");
              jobject obj = env->NewObject(obj_class, construction_id);
              return obj;
          }
          
          // video frame consumer: pulls frames from the fifo and calls back into Java
          void *TSParser::videothread(void * cc) {
              TSParser * pBc = (TSParser *) cc;
              mfxBitstreamTS *h264Buf = NULL;
              for (; m_H264Running;) {
                  while (m_H264Running) {
                      h264Buf = pBc->GetH264Buf();
                      if (h264Buf != NULL && h264Buf->Data != NULL) { // got a video frame  &&h264Buf->DataLength!=0
                          if (pBc->m_jvm) {
                              bool isAttached = false;
                              JNIEnv* env = NULL;
                              if (pBc->m_jvm->GetEnv((void**) &env,
                                      JNI_VERSION_1_4) != JNI_OK) {
                                  jint res = pBc->m_jvm->AttachCurrentThread(&env, NULL);
                                  // Get the JNI env for this thread
                                  if ((res < 0) || !env) {
                                      env = NULL;
                                  } else {
                                      isAttached = true;
                                  }
                              }
                              if (env && pBc->_h264Cid) {
                                  jbyteArray bytes = env->NewByteArray(
                                          h264Buf->DataLength);
                                  env->SetByteArrayRegion(bytes, 0, h264Buf->DataLength,
                                          (jbyte*) h264Buf->Data);
                                  pBc->_javaVedioObj = getInstanceTs(env,
                                          pBc->_javaVedioClass);
                                  env->CallVoidMethod(pBc->_javaVedioObj, pBc->_h264Cid,
                                          bytes);
                                  env->DeleteLocalRef(bytes);
                                  env->DeleteLocalRef(pBc->_javaVedioObj);
          
                              }
          
                              if (isAttached) {
                                  if (pBc->m_jvm->DetachCurrentThread() < 0) {
                                      mDebug( "Could not detach thread from JVM");
                                  }
                              }
                          }
                          break;
                      } else {
                          usleep(1);
                      }
                  }
                  pBc->PushDirytH264Buf(h264Buf);
                  h264Buf = NULL;
              }
              return NULL;
          }
          
          void *TSParser::audiothread(void * cc) {
              TSParser * pBc = (TSParser *) cc;
              uint32 decoderMp3State = 0;
              uint32 u32PCMLen = 0; // decoded PCM byte count, filled by aac_decode_frame
              mfxBitstreamTS *accBuf = NULL;
              for (; m_AccRunning;) {
                  while (m_AccRunning) {
                      accBuf = pBc->GetAccBuf();
          //          mDebug("callback DeliverFrame test 0");
                      if (accBuf != NULL && accBuf->Data != NULL) { // got an audio frame
                          // void *pParam, unsigned char *pData, int nLen, unsigned char *pPCM, unsigned int *outLen
                          decoderMp3State = aac_decode_frame(pHandle->pContext,
                                  accBuf->Data, accBuf->DataLength, pBc->pcm_buffer,
                                  &u32PCMLen);
                          if (pBc->m_jvm && decoderMp3State > 0) {
                              bool isAttached = false;
                              JNIEnv* env = NULL;
                              if (pBc->m_jvm->GetEnv((void**) &env,
                                      JNI_VERSION_1_4) != JNI_OK) {
                                  // try to attach the thread and get the env
                                  // Attach this thread to JVM
                                  jint res = pBc->m_jvm->AttachCurrentThread(&env, NULL);
                                  // Get the JNI env for this thread
                                  if ((res < 0) || !env) {
                                      mDebug("Could not attach thread to JVM (%d, %p)",
                                              res, env);
                                      env = NULL;
                                  } else {
                                      isAttached = true;
                                  }
                              }
          
                              if (env && pBc->_accCid) {
        
                                  jbyteArray bytes = env->NewByteArray(decoderMp3State);
                                  env->SetByteArrayRegion(bytes, 0, decoderMp3State,(jbyte*) pBc->pcm_buffer);
                                  pBc->_javaAudioObj = getInstanceTs(env,pBc->_javaAudioClass);
                                  env->CallVoidMethod(pBc->_javaAudioObj, pBc->_accCid,bytes);
                                  env->DeleteLocalRef(bytes);
                                  env->DeleteLocalRef(pBc->_javaAudioObj);
                              }
                              if (isAttached) {
                                  if (pBc->m_jvm->DetachCurrentThread() < 0) {
                                      mDebug( "Could not detach thread from JVM");
                                  }
                              }
                          }
                          break;
                      } else {
                          usleep(10);
                      }
                  }
                  pBc->PushDirytAccBuf(accBuf);
                  accBuf = NULL;
              }
              return NULL;
          }
        
          int TSParser::JavaMethodInit(JavaVM* vm, jobject obj) {
              if (m_jvm)
                  return 0;
              m_jvm = vm;
              if (!m_jvm) {
                  mDebug( " No JavaVM have been provided.");
                  return -1;
              }
              // get the JNI env for this thread
              bool isAttached = false;
              JNIEnv* env = NULL;
              if (m_jvm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
                  // try to attach the thread and get the env
                  // Attach this thread to JVM
                  jint res = m_jvm->AttachCurrentThread(&env, NULL);
          
                  // Get the JNI env for this thread
                  if ((res < 0) || !env) {
                      mDebug( "Could not attach thread to JVM (%d, %p)", res, env);
                      return -1;
                  }
                  isAttached = true;
              }
        
              // look up the JniLib callback class
              jclass javaRenderClassLocal = reinterpret_cast<jclass>(env->FindClass(
                      "包名0/包名1/包名2/JniLib"));
              //jclass javaRenderClassLocal = reinterpret_cast<jclass> (env->FindClass("包名0/包名1/包名2/JniLib.activity.xxx"));
              if (!javaRenderClassLocal) {
                  mDebug("could not find class 包名0/包名1/包名2/JniLib");
                  return -1;
              }
              mDebug("get class 包名0/包名1/包名2/JniLib success");
          
              _javaAudioClass = reinterpret_cast<jclass>(env->NewGlobalRef(
                      javaRenderClassLocal));
              if (!_javaAudioClass) {
                  mDebug( "could not create Java class reference");
                  return -1;
              }
              mDebug("create Java class reference success");
          
              _javaVedioClass = reinterpret_cast<jclass>(env->NewGlobalRef(
                      javaRenderClassLocal));
              if (!_javaVedioClass) {
                  mDebug( "could not create Java class reference");
                  return -1;
              }
              mDebug("create Java class reference success");
          
              _javaSpeedClass = reinterpret_cast<jclass>(env->NewGlobalRef(
                      javaRenderClassLocal));
              if (!_javaSpeedClass) {
                  mDebug( "could not create Java class reference");
                  return -1;
              }
          
              // Delete local class ref, we only use the global ref
              env->DeleteLocalRef(javaRenderClassLocal);
        
              _javaAudioObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
              if (!_javaAudioObj) {
                  mDebug("could not create Java object reference");
                  return -1;
              }
          
              // get the method ID for the ReDraw function
              _accCid = env->GetMethodID(_javaAudioClass, "aacCallBack", "([B)V");
              if (_accCid == NULL) {
                  mDebug( " could not get dataCallBack ID");
                  return -1;
              }
          
              _javaVedioObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
              if (!_javaVedioObj) {
                  mDebug("could not create Java object reference");
                  return -1;
              }
              mDebug(" create Global Java object reference success");
          
              // get the method ID for the ReDraw function
              _h264Cid = env->GetMethodID(_javaVedioClass, "h264CallBack", "([B)V");
              if (_h264Cid == NULL) {
                  mDebug( " could not get dataCallBack ID");
                  return -1;
              }
          
              _javaSpeedObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
              if (!_javaSpeedObj) {
                  mDebug("could not create Java object reference");
                  return -1;
              }
              mDebug(" create Global Java object reference success");
              // get the method ID for the ReDraw function
              _speedCid = env->GetMethodID(_javaSpeedClass, "speedBack", "(JDDI)V");
              if (_speedCid == NULL) {
                  mDebug( " could not get dataCallBack ID");
                  return -1;
              }
              mDebug("get dataCallBack  ID success");
              return 0;
          }
        
        • The mfxBitstreamTS struct:

            typedef struct{
                uint64_t pts;                             // PTS of the current frame
                uint64_t dts;                             // DTS of the current frame
                int cc_ok;
                int last_cc;
                int intactness;
                int tream_id; // identifies the stream
                uint8*  Data; // frame data buffer
                mfxU32  DataOffset;
                uint32  DataLength; // used length of the frame data buffer
                mfxU32  MaxLength;
            } mfxBitstreamTS;
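
The pts/dts fields above hold 90 kHz timestamps copied from the optional PES header, where each 33-bit value is spread over five bytes with interleaved marker bits. A minimal decoder sketch, with a hypothetical helper name that is not part of the project's code (the byte layout follows ISO/IEC 13818-1):

```cpp
#include <cassert>
#include <cstdint>

// Decode a 33-bit PTS/DTS from its 5-byte PES-header coding:
// byte 0: '001x' ts[32..30] marker | byte 1: ts[29..22] |
// byte 2: ts[21..15] marker | byte 3: ts[14..7] | byte 4: ts[6..0] marker
static uint64_t DecodePesTimestamp(const uint8_t *p) {
    return ((uint64_t)(p[0] & 0x0E) << 29) |  // ts[32..30]
           ((uint64_t)p[1] << 22) |           // ts[29..22]
           ((uint64_t)(p[2] & 0xFE) << 14) |  // ts[21..15]
           ((uint64_t)p[3] << 7) |            // ts[14..7]
           ((uint64_t)(p[4] >> 1));           // ts[6..0]
}
```

For example, the bytes 0x21 0x00 0x05 0xBF 0x21 decode to 90000, i.e. exactly one second on the 90 kHz clock; dividing by 90 gives milliseconds, as __TSTimeToStr does.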
        

    三、Data Processing Flow

    Taking the H264 video data path as an example:

    1. Initialize memory with InitH264Memory() --> ResetH264Buf() fills the fifo with empty buffers
    2. Take an empty buffer with GetEmptyH264Buf() and fill it with data
      --> push the H264 data into the data fifo: PushH264Buf(mfxBitstreamTS * pbuf)
    3. In the processing thread: pull from the H264 data fifo with GetH264Buf()
      ---> after processing, clear the buffer and return it to the empty fifo with PushDirytH264Buf(mfxBitstreamTS * pbuf)
    4. When the processing thread ends: release the memory with ReleaseH264Buf();
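
The recycling scheme in those steps can be sketched as a pair of queues over one fixed pool. Here std::deque and the names Frame/FramePool are illustrative stand-ins for the project's fifo class and mfxBitstreamTS, and unlike the real fifo this sketch is not thread-safe:

```cpp
#include <cassert>
#include <cstring>
#include <deque>
#include <vector>

// One buffer slot: fixed-capacity storage plus the used length.
struct Frame {
    std::vector<unsigned char> data;
    size_t length = 0;
};

// Two queues over one fixed pool of slots: `empty` plays the role of the
// "dirty" fifo, `filled` the role of the data fifo.
struct FramePool {
    std::vector<Frame> slots;
    std::deque<Frame*> empty;
    std::deque<Frame*> filled;

    FramePool(size_t count, size_t capacity) : slots(count) {
        for (Frame &f : slots) {
            f.data.resize(capacity);
            empty.push_back(&f);   // like ResetH264Buf(): all slots start empty
        }
    }

    // Producer side (parser): GetEmptyH264Buf() + memcpy + PushH264Buf().
    bool Push(const unsigned char *src, size_t len) {
        if (empty.empty()) return false;        // full: caller may drop the oldest
        Frame *f = empty.front();
        if (len > f->data.size()) return false; // frame larger than a slot
        empty.pop_front();
        memcpy(f->data.data(), src, len);
        f->length = len;
        filled.push_back(f);
        return true;
    }

    // Consumer side (callback thread): GetH264Buf(); NULL when nothing queued.
    Frame *Pop() {
        if (filled.empty()) return nullptr;
        Frame *f = filled.front();
        filled.pop_front();
        return f;
    }

    // Consumer hands the slot back when done: PushDirytH264Buf().
    void Recycle(Frame *f) {
        f->length = 0;
        empty.push_back(f);
    }
};
```

Because the slots are allocated once and only pointers move between the two queues, the steady-state path does no allocation, which is the point of the design.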

    Data processing threads:

    • C++ thread 1: loops over the TS stream, finds packet headers, parses the table structures, and pushes the extracted audio/video data into the data fifos
    • C++ thread 2: a video thread that pulls data from the video fifo and delivers it to the Java callback method
    • C++ thread 3: an audio thread that pulls data from the audio fifo and delivers it to the Java callback method
    • Java thread 1: keeps a group of cached video buffers and, in a while(flag) loop, continuously feeds them into the Android hardware decoder.
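
The per-packet work of C++ thread 1 (resynchronizing on 0x47, extracting header fields, checking continuity) can be sketched as below; the helper names are illustrative, and the parser above implements these steps inside __Parse rather than as free functions:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

static const uint8_t kSyncByte = 0x47;  // sync_byte, fixed by the spec
static const size_t  kTsPktLen = 188;   // fixed TS packet length

// 13-bit PID from header bytes 1-2.
static uint16_t TsPid(const uint8_t *pkt) {
    return (uint16_t)(((pkt[1] & 0x1F) << 8) | pkt[2]);
}

// payload_unit_start_indicator: bit 6 of byte 1.
static bool TsPusi(const uint8_t *pkt) {
    return (pkt[1] & 0x40) != 0;
}

// 4-bit continuity_counter: low nibble of byte 3.
static uint8_t TsCc(const uint8_t *pkt) {
    return pkt[3] & 0x0F;
}

// Resynchronize inside a receive buffer: a real packet start is a 0x47
// followed by another 0x47 exactly 188 bytes later (0x47 can also occur
// inside a payload). Returns -1 when no aligned start is found.
static long FindSync(const uint8_t *buf, size_t len) {
    if (len <= kTsPktLen) return -1;
    for (size_t i = 0; i + kTsPktLen < len; ++i) {
        if (buf[i] == kSyncByte && buf[i + kTsPktLen] == kSyncByte)
            return (long)i;
    }
    return -1;
}

// Dropped-packet detection per PID: continuity_counter wraps 15 -> 0.
// Pass last_cc = -1 for the first packet seen on a PID. (The spec exempts
// packets without payload; that case is ignored in this sketch.)
static bool CcContinuous(int last_cc, uint8_t cc) {
    if (last_cc < 0) return true;
    return ((last_cc + 1) & 0x0F) == cc;
}
```

The double-0x47 test is what makes buffer splicing across UDP reads safe: a lone 0x47 in the middle of a payload will not be mistaken for a packet boundary.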

    四、Summary

    When handling a 1080P TS stream, the measured end-to-end latency from stream parsing to hardware-decoded preview on Android is roughly 140 ms to 200 ms (Snapdragon 835). The hardware decoding stage takes the largest share of that time and appears to back-pressure the whole pipeline; concrete optimizations will follow in a later post.


          Link: https://www.haomeiwen.com/subject/ucxhyftx.html