
[Android Audio/Video] Parsing TS Stream Data and JNI-to-Java Callbacks

To extract the valid video data from the TS stream transmitted to the phone, for preview playback and recording, I looked at how others play TS streams on Android (e.g. UDP preview playback). Some of the approaches are surprisingly convoluted, such as saving the stream to a local .ts file and then playing that file, which causes screen flicker. There are of course many players that support TS, such as ffmpeg-android, vlc-android, Vitamio, and so on, but these libraries are either too large or simply do not meet my requirements, so I decided to learn the format myself and parse the TS stream in C++ via JNI.

The following articles are the ones I found most helpful while learning and writing this post; I recommend reading them for a more complete picture:

I. Introduction to TS Streams

1. What is a TS stream:

TS (Transport Stream), in full MPEG2-TS, is designed for real-time delivery of programs, such as live broadcast television and set-top boxes. Typical codecs are H.264/MPEG-4 for video and AAC/MP3 for audio. The defining property of MPEG2-TS is that decoding can start independently from any point in the stream.

2. Terms you need to know when learning about TS streams:

  • ES (Elementary Stream): the raw, unsegmented continuous stream of audio, video, or other data.

  • PES (Packetized Elementary Stream): the ES split into packets, each marked with a PES header; PES packets have variable length.

  • TS (Transport Stream): fixed-length packets (188 bytes), which makes parsing and recovery from corrupted packets easy. Each packet has three parts, as shown in the figure below: the TS header, an optional adaptation field, and the payload. The TS header identifies the packet's content via its PID, the adaptation field carries supplementary data, and the payload is the PES data we want to extract.

    • Note that a single TS stream may carry multiple programs. You can see them by printing the information while parsing the PAT and PMT (there are comments about this in my code). My project always carries exactly one program, so you will need to adapt the code yourself for the multi-program case.
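The three-part packet layout above starts with a fixed 4-byte header. A minimal sketch of extracting its fields with plain bit operations (the type and field names here are mine, following common MPEG2-TS terminology, not the project's types.h):

```cpp
#include <cstdint>

constexpr int TS_PKT_LEN  = 188;
constexpr uint8_t TS_SYNC = 0x47;

struct TsHeader {
    bool     payload_unit_start;       // set when a PES packet / PSI section starts here
    uint16_t pid;                      // 13-bit packet identifier
    uint8_t  adaptation_field_control; // 01 = payload only, 10 = adaptation only, 11 = both
    uint8_t  continuity_counter;       // 4-bit counter, increments per PID
};

// Parse the fixed 4-byte TS header at the start of a 188-byte packet.
// Returns false when the buffer is not aligned on a sync byte.
bool ParseTsHeader(const uint8_t *pkt, TsHeader &h) {
    if (pkt[0] != TS_SYNC) return false;
    h.payload_unit_start       = (pkt[1] & 0x40) != 0;
    h.pid                      = ((pkt[1] & 0x1F) << 8) | pkt[2];
    h.adaptation_field_control = (pkt[3] >> 4) & 0x3;
    h.continuity_counter       = pkt[3] & 0x0F;
    return true;
}
```

Doing the extraction byte by byte like this, rather than casting to a packed bit-field struct, avoids endianness and packing surprises across compilers.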

3. The key to parsing a TS stream is understanding its table structure: the parsing flow uses the relevant PIDs to locate, step by step, the information we need, and then extracts the corresponding payload data.

  • Program Association Table (PAT), PID 0x0000: from the PAT we obtain the PID of the PMT.

  • Program Map Table (PMT): from the PMT we obtain the video and audio PIDs.

  • Conditional Access Table (CAT), PID 0x0001.

  • Network Information Table (NIT), PID 0x0010.

  • Some parameter and structure details are explained in the code comments.

    Image source: https://www.cnblogs.com/jiayayao/p/6832614.html
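The fixed PIDs above, together with the PMT/video/audio PIDs discovered at runtime, drive the whole dispatch. A small sketch of that classification (the names here are illustrative; the real code below stores the discovered PIDs in the s_au16PIDs array instead of passing them around):

```cpp
#include <cstdint>

enum class TsTable { PAT, CAT, NIT, PMT, Video, Audio, Other };

// Classify a packet by PID. PAT/CAT/NIT PIDs are fixed by the MPEG2-TS
// spec; the PMT PID comes from the PAT, and the video/audio PIDs come
// from the PMT, so they have to be passed in (or kept in shared state).
TsTable ClassifyPid(uint16_t pid, uint16_t pmt_pid,
                    uint16_t video_pid, uint16_t audio_pid) {
    if (pid == 0x0000)    return TsTable::PAT;   // Program Association Table
    if (pid == 0x0001)    return TsTable::CAT;   // Conditional Access Table
    if (pid == 0x0010)    return TsTable::NIT;   // Network Information Table
    if (pid == pmt_pid)   return TsTable::PMT;
    if (pid == video_pid) return TsTable::Video;
    if (pid == audio_pid) return TsTable::Audio;
    return TsTable::Other;
}
```

The PMT/video/audio values used in the test below (256, 4113, 4352) are the ones mentioned in the commented-out hard-coded PIDs later in this post.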

4. Parsing flow: the reference articles listed above explain each structure in detail; this post only gives a quick walkthrough, aimed at integrating the parser into a project quickly.

  1. Scan the TS stream for the sync byte to find each TS header. sync_byte is 1 byte and its value is always 0x47 (corrupted data and buffers spliced across packet boundaries must be handled).
  2. Parse the PAT.
  3. Use the PMT_PID found in the PAT to locate and parse the PMT.
  4. From the PMT, obtain the VIDEO_PID and AUDIO_PID.
  5. Parse the video and audio PES packets carried on those PIDs (ParsePES).
  6. The parsing flow diagram: 2.png
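Step 1, locating a trustworthy sync byte, is the part that needs the most care, because 0x47 can also occur inside payload data. A sketch of the double-check idea (verify that the byte 188 positions later is also 0x47 before trusting the alignment), which is what the parser loop below does as well:

```cpp
#include <cstdint>
#include <cstddef>

constexpr int TS_PKT_LEN  = 188;
constexpr uint8_t TS_SYNC = 0x47;

// Find the first offset where two consecutive 188-byte packets both start
// with the sync byte; checking two packets guards against a stray 0x47
// inside payload data. Returns -1 when no aligned packet is found.
long FindSync(const uint8_t *buf, size_t len) {
    if (len < 2 * TS_PKT_LEN) return -1;
    for (size_t i = 0; i + TS_PKT_LEN < len; ++i) {
        if (buf[i] == TS_SYNC && buf[i + TS_PKT_LEN] == TS_SYNC)
            return (long)i;
    }
    return -1;
}
```

Any bytes before the returned offset (and any partial packet at the end of the buffer) must be cached and re-spliced with the next receive, which is exactly what the `has_cache` logic in `TSParser::Parse()` below handles.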

II. TS Stream Parsing Code

The parsing code below is adapted from https://github.com/js2854/TSParser, an open-source project that parses TS files and prints various information about them. My changes on top of it: reworked it to parse a live TS stream instead of a file, added PES parsing to extract the raw audio/video data, output to the Java layer via JNI, the Android JNI glue, and data buffering. Most methods carry at least partial comments; if anything is unclear, wrong, or infringing, please message me. Thanks.

  • Application.mk (the APP_* variables below belong in this file)

      APP_PROJECT_PATH := $(call my-dir)
      APP_BUILD_SCRIPT := $(call my-dir)/Android.mk
      APP_ABI := armeabi armeabi-v7a
      APP_PLATFORM := android-23
    
  • Android.mk

      LOCAL_PATH := $(call my-dir)
      # Program
      include $(CLEAR_VARS)
      LOCAL_MODULE := tsparse
      LOCAL_SRC_FILES := jni_lib.cpp AACDecoder.cpp MFifo.cpp TSParser.cpp
      #LOCAL_C_INCLUDES := 	\
      #$(MY_LOCAL_ANDSRC)/system/core/include	\
      #$(MY_LOCAL_ANDSRC)/frameworks/native/include	\
      #$(MY_LOCAL_ANDSRC)/hardware/libhardware/include
      #LOCAL_CFLAGS := -DHAVE_PTHREADS
      LOCAL_C_INCLUDES += $(LOCAL_PATH)/prebuilt/include
      LOCAL_LDLIBS := -llog -lz -lGLESv2 -landroid -lOpenSLES 
      include $(BUILD_SHARED_LIBRARY)
    
  • jni_lib.cpp

      #ifndef UINT64_C
      #define UINT64_C(c) (c ## ULL)
      #endif
      
      #include "mdebug.h"
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>
      #include <jni.h>
      #include <pthread.h>
      #include <unistd.h>
      #include <fcntl.h>
      #include "TSParser.h"
      
      static JavaVM *g_jvm = NULL;
      static TSParser * mpTSParser=NULL;
      
      pthread_mutex_t playMutex = PTHREAD_MUTEX_INITIALIZER;
      extern "C" {
      JNIEXPORT jint JNI_OnLoad(JavaVM * vm, void *reserved) {
      	JNIEnv *env = NULL;
      	jint result = -1;
      	mInfo("JNI_OnLoad");
      	if (vm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK)
      		return -1;
      	g_jvm = vm;
      	return JNI_VERSION_1_4;
      }
      }
      /////////////////////////////////////////////////////////////////////////
      int OnDestroy() {
      	if(mpTSParser){
      		mpTSParser->__stopThread();
      	    if (mpTSParser->TsDoloopThreadHandle)
      	            pthread_join(mpTSParser->TsDoloopThreadHandle, NULL);
      	    mpTSParser->TsDoloopThreadHandle=NULL;
      		delete mpTSParser;
      		mpTSParser = NULL;
      	}
      	return 0;
      }
      
      static void *_tmain(void *cc)
      {
      	if (mpTSParser != NULL)
      		mpTSParser->Parse();
      	return NULL; // a pthread start routine must return a value
      }
      
      extern "C" {
      JNIEXPORT jint JNICALL Java_包名0_包名1_包名2_JniLib_PushTsData(JNIEnv *env, jobject obj, jbyteArray jbArr, jint DataLen);
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_initTS(JNIEnv *env, jobject obj);
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_stopTsParse(JNIEnv *env, jobject obj);
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_startTsParse(JNIEnv *env, jobject obj);
      }
      
      JNIEXPORT jint JNICALL Java_包名0_包名1_包名2_JniLib_PushTsData(JNIEnv *env, jobject obj, jbyteArray jbArr, jint DataLen)
      {
      	if (!mpTSParser)
      		return -1;
      	// the second argument is a jboolean* (isCopy), so pass NULL, not JNI_FALSE
      	jbyte *jbuf = env->GetByteArrayElements(jbArr, NULL);
      	mfxBitstreamTS *pBufTs = NULL;
      	// block until an empty TS buffer is available in the FIFO
      	while (true) {
      		pBufTs = mpTSParser->GetEmptyTsBuf();
      		if (pBufTs == NULL || pBufTs->Data == NULL) {
      			usleep(1);
      			continue;
      		}
      		break;
      	}
      	// mInfo("-----------------------PushTsFrame %d", DataLen);
      	memcpy(pBufTs->Data, (unsigned char *)jbuf, DataLen);
      	pBufTs->DataLength = DataLen;
      	mpTSParser->PushTsBuf(pBufTs);
      	env->ReleaseByteArrayElements(jbArr, jbuf, 0);
      	return 0;
      }
      
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_initTS(JNIEnv *env, jobject thiz)
      {
      	if (mpTSParser == NULL)
      		mpTSParser = new TSParser();
      	mpTSParser->initMemory();
      	mpTSParser->JavaMethodInit(g_jvm, thiz);
      }
      
      
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_stopTsParse(JNIEnv *env, jobject obj)
      {
      	if (!mpTSParser)
      		return;
      	mpTSParser->__stopThread();
      	if (mpTSParser->TsDoloopThreadHandle)
      		pthread_join(mpTSParser->TsDoloopThreadHandle, NULL);
      	mpTSParser->TsDoloopThreadHandle = NULL;
      	delete mpTSParser;
      	mpTSParser = NULL;
      }
      
      JNIEXPORT void JNICALL Java_包名0_包名1_包名2_JniLib_startTsParse(JNIEnv *env, jobject obj)
      {
      	if (!mpTSParser)
      		return;
      	int ret_t;
      	struct sched_param param;
      	pthread_attr_t attr;
      	pthread_attr_init(&attr);
      	pthread_attr_setschedpolicy(&attr, SCHED_RR);
      	param.sched_priority = 90;
      	pthread_attr_setschedparam(&attr, &param);
      	ret_t = pthread_create(&mpTSParser->TsDoloopThreadHandle, &attr, _tmain, NULL);
      	if (ret_t) {
      		mLogW("pthread_create TsDoloopThreadHandle failed [%d] \n", ret_t);
      	}
      }
      
      // Leftover from a separate camera module: mpCamera / CUCamera are not
      // declared in this file, so this entry point is kept only for reference.
      //JNIEXPORT jint JNICALL Java_包名0_包名1_包名2_CameraLib_init(JNIEnv *env,
      //		jobject obj) {
      //	mpCamera->JavaMethodInit(g_jvm, obj);
      //	return 0;
      //}
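The "Jni2Java" direction of the title (TSParser pushing parsed frames back up to the Java layer) follows the usual cached-jmethodID pattern. A hedged sketch of what `JavaMethodInit` plus the worker-thread delivery look like; the Java method name `onVideoFrame` and its `([BI)V` signature are illustrative assumptions, not necessarily what the real project registers:

```cpp
#include <jni.h>
#include <cstdint>

static JavaVM   *s_vm     = NULL;
static jobject   s_obj    = NULL; // global ref to the Java-side listener
static jmethodID s_method = NULL; // e.g. void onVideoFrame(byte[] data, int len)

// Called once from a JNI entry point (cf. JavaMethodInit): cache what the
// worker thread will need. Local refs die when the JNI call returns, so
// the jobject must be promoted to a global ref.
void CacheJavaCallback(JavaVM *vm, JNIEnv *env, jobject listener) {
    s_vm  = vm;
    s_obj = env->NewGlobalRef(listener);
    jclass cls = env->GetObjectClass(listener);
    s_method = env->GetMethodID(cls, "onVideoFrame", "([BI)V");
}

// Called from the parser thread for every completed frame. That thread was
// created with pthread_create, not by the JVM, so it must attach first.
void DeliverFrameToJava(const uint8_t *data, int len) {
    JNIEnv *env = NULL;
    if (s_vm->AttachCurrentThread(&env, NULL) != JNI_OK)
        return;
    jbyteArray arr = env->NewByteArray(len);
    env->SetByteArrayRegion(arr, 0, len, (const jbyte *)data);
    env->CallVoidMethod(s_obj, s_method, arr, len);
    env->DeleteLocalRef(arr); // avoid piling up one local ref per frame
    s_vm->DetachCurrentThread();
}
```

Attaching and detaching per frame is shown here for clarity but is costly; in practice a long-lived decoder thread attaches once at start-up and detaches when it exits.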
    
  • types.h — copy this from the reference open-source project linked above

  • TSParser.h

      #ifndef __TS_PARSER_H__
      #define __TS_PARSER_H__
      struct _HANDLE_
      {
      	unsigned int code;
      	void *pContext;
      };
      
      #include <assert.h>
      #include <errno.h>
      #include <stdio.h>
      #include "mdebug.h"
      #include <string.h>
      #include <fcntl.h>
      #include <stdlib.h>
      #include <jni.h>      // JavaVM / jobject / jmethodID members below
      #include <pthread.h>  // thread handles used by TSParser
      #include "types.h"
      #include "MFifo.h"
      
      using namespace std;
      
      typedef enum TS_ERR
      {
          TS_OK = 0,
          TS_IN_PARAM_ERR,
          TS_SYNC_BYTE_ERR,
          TS_FILE_OPEN_FAIL,
          TS_FILE_SEEK_FAIL,
      }TS_ERR;
      
      
      
      // PID types (indices into s_au16PIDs)
      typedef enum E_PKT_TYPE
      {
          E_PAT       = 0,
          E_PMT       = 1,
          E_PCR       = 2,
          E_AUDIO     = 3,
          E_VIDEO     = 4,
          E_NIT       = 5,
          E_SI        = 6,
          E_MAX       = 7
      
      }E_PKT_TYPE;
      
      class TSPacket
      {
      public:
      //	 uint8 * bufH264pkt;//=new uint8[1024*2000];
      	 uint32 pktH264Len;//=0;
      	 uint32 pktAccLen;//=0;
      	 uint32 pktindex;//=0;
      	 bool get_PAT_Head=false;
      	 bool get_PMT_Head=false;
      	 mfxBitstreamTS *h264Buf;
      	 mfxBitstreamTS *__accBuf;
      
          TSPacket()
              : m_pBuf(NULL)
              , m_pHdr(NULL)
              , m_u16PID(PID_UNSPEC)
              , m_u8CC(0)
              , m_u16PMTPID(PID_UNSPEC)
              , m_u8StreamId(0)
              , m_s64PCR(INVALID_VAL)
              , m_s64PTS(INVALID_VAL)
              , m_s64DTS(INVALID_VAL)
          {
      
      //    	bufH264pkt=new uint8[1024*2000];
      
          	pktH264Len=0;
          	pktindex=0;
          	get_PAT_Head=false;
          	get_PMT_Head=false;
          }
          ~TSPacket() {}
      
          uint16 GetPID() const { return m_u16PID; }
          uint8  GetCC() const { return m_u8CC; }
      
          bool   IsPAT() { return (PID_PAT == m_u16PID); }
          uint16 GetPMTPID() const { return m_u16PMTPID; }
      
          bool   IsSIT() { return (PID_DVB_SIT == m_u16PID); }
          bool   IsNIT() { return (PID_DVB_NIT == m_u16PID); }
      
          bool   IsPMT() { return (PID_UNSPEC != m_u16PID &&s_au16PIDs[E_PMT] == m_u16PID); }//
          bool   IsVideo() { return (s_au16PIDs[E_VIDEO] == m_u16PID); }
          bool   IsAudio() { return (s_au16PIDs[E_AUDIO] == m_u16PID); }
      
          sint64 GetPCR() const { return m_s64PCR; }
          sint64 GetPTS() const { return m_s64PTS; }
          sint64 GetDTS() const { return m_s64DTS; }
      
      public:
          static uint16 s_au16PIDs[E_MAX]; // records all discovered PIDs
      
      
          bool   __HasAdaptField();
          bool   __HasPayload();
          AdaptFixedPart* __GetAdaptField();
          uint8  __GetAdaptLen();
          sint64 __GetPCR();
          bool   __IsVideoStream(uint8 u8StreamType);
          bool   __IsAudioStream(uint8 u8StreamType);
          uint8  __GetPayloadOffset();
          uint8  __GetTableStartPos();
          sint64 __GetPTS(const OptionPESHdrFixedPart *pHdr);
          sint64 __GetDTS(const OptionPESHdrFixedPart *pHdr);
      
      
      
          uint8           m_count_v;
          const uint8     *m_pBuf;
          TSHdrFixedPart  *m_pHdr;
          uint16          m_u16PID;
          uint8           m_u8CC;
          uint16          m_u16PMTPID;
          uint8           m_u8StreamId;
          sint64          m_s64PCR;
          sint64          m_s64PTS;
          sint64          m_s64DTS;
      };
      typedef struct{
            // internal use
      	uint8 out_videobuff[TS_MAX_OUT_BUFF];         // current video frame buffer
      	int video_buflen;                             // current video frame buffer length
      	uint64_t pts_video;                           // current video frame PTS
      	uint64_t dts_video;                           // current video frame DTS
          int video_cc_ok;
          int video_last_cc;                            // continuity counter of the previous video TS packet
      	int video_intactness;                         // frame completeness flag: 1 = complete, 0 = incomplete
      
      	uint8 out_audiobuff[TS_MAX_OUT_BUFF];         // current audio frame buffer
      	int audio_buflen;                             // current audio frame buffer length
      	uint64_t pts_audio;                           // current audio frame PTS
      	uint64_t dts_audio;                           // current audio frame DTS
          int audio_cc_ok;
      	int audio_last_cc;                            // continuity counter of the previous audio TS packet
      	int audio_intactness;
      }ts_outdata;
      
      class TSParser :public TSPacket
      {
      public:
      	 TSParser();
      	~TSParser();
      
      	unsigned char pcm_buffer[1024 * 20];
      	long 			Xferred;
      	double			m_Fps;
      	double			a_Fps;
      	unsigned int jpg_count;
      	unsigned int audio_count;
      
      	CFifo m_H264BufFifo;
      	CFifo m_AccBufFifo;
      	CFifo m_TsBufFifo;
      
      	CFifo m_DirtyH264BufFifo;
      	CFifo m_DirtyAccBufFifo;
      	CFifo m_DirtyTsBufFifo;
      
          int				m_H264BufCount;
          int				m_AccBufCount;
          int				m_TsBufCount;
      
          mfxBitstreamTS	m_H264Buf[100];
          mfxBitstreamTS	m_AccBuf[100];
          mfxBitstreamTS	m_TsBuf[100];
      
          TS_ERR Parse();
          HANDLE TsVedioThreadHandle;
          HANDLE TsAudioThreadHandle;
          HANDLE TsDoloopThreadHandle;
          HANDLE PrintThreadHandle;
      
          JavaVM*     m_jvm;
      	jobject _javaAudioObj;
      	jclass _javaAudioClass;
      
      	jobject _javaVedioObj;
      	jclass _javaVedioClass;
      
      	jobject _javaSpeedObj;
      	jclass _javaSpeedClass;
      
      	jmethodID      _accCid;
      	jmethodID      _h264Cid;
      	jmethodID      _speedCid;
        void InitH264Memory();
        mfxBitstreamTS * GetEmptyH264Buf();
        void ResetH264Buf();
        bool PushDirytH264Buf(mfxBitstreamTS * pbuf);
        mfxBitstreamTS * GetH264Buf();
        bool PushH264Buf(mfxBitstreamTS * pbuf);
        void ReleaseH264Buf();
      
        void InitAccMemory();
        mfxBitstreamTS * GetEmptyAccBuf();
        void ReleaseAccBuf();
        bool PushAccBuf(mfxBitstreamTS * pbuf);
        mfxBitstreamTS * GetAccBuf();
        void ResetAccBuf();
        bool PushDirytAccBuf(mfxBitstreamTS * pbuf);
      
         void InitTsMemory();
         mfxBitstreamTS * GetEmptyTsBuf();
         void ReleaseTsBuf();
         bool PushTsBuf(mfxBitstreamTS * pbuf);
         bool PushTsFrame(unsigned char *pData, unsigned int len);
         mfxBitstreamTS * GetTsBuf();
         void ResetTsBuf();
         bool PushDirytTsBuf(mfxBitstreamTS * pbuf);
      
         TS_ERR __stopThread();
         TS_ERR initMemory();
         TS_ERR initAudioDecoder();
         int JavaMethodInit(JavaVM* vm, jobject obj);
      private:
      
          static void *videothread(void * cc);
          static void *audiothread(void * cc);
          static void * print_thread(void * cc) ;
      
          void ShowStat(long t,TSParser * pBc);
          bool   __SeekToFirstPkt(uint64 u64Offset=0);
          void   __PrintPacketInfo(TSPacket &tPkt, uint64 u64Offset, uint32 u32PktNo);
          const char *__TSTimeToStr(sint64 s64Time);
      	TS_ERR __ParsePAT();
          TS_ERR __ParsePMT();
          TS_ERR __ParsePES();
          TS_ERR __ParsePESData();
      	TS_ERR __Parse(const uint8 *pBuf, uint16 u16BufLen);
      
      private:
      //    const char* m_strFile;
      };
      
      #define DELETER_BUFFER(p)   do { if (NULL != (p)) { delete[] (p); (p) = NULL; } } while (0) // assumes the buffer came from new[]
      
      class AutoDelCharBuf
      {
      public:
          AutoDelCharBuf(uint8 *pBuf) : m_pBuf(pBuf) {}
          ~AutoDelCharBuf() { DELETER_BUFFER(m_pBuf); }
      
          uint8 *Ptr() { return m_pBuf; }
      private:
          uint8 *m_pBuf;
      };
      
      #endif //__TS_PARSER_H__
    
    • TSParser.cpp

        #include "TSParser.h"
        #define MAX_READ_PKT_NUM                20000
        #define MAX_CHECK_PKT_NUM               3
        #define MAX_TIME_STR_LEN                20
        
        #define MK_WORD(high,low)               (((high)<<8)|(low))
        #define MK_PCR(b1,b2,b3,b4,b5)          (((sint64)(b1)<<25)|((sint64)(b2)<<17)|((sint64)(b3)<<9)|((sint64)(b4)<<1)|(b5))
        #define MK_PTS_DTS(b1,b2,b3,b4,b5)      (((sint64)(b1)<<30)|((sint64)(b2)<<22)|((sint64)(b3)<<15)|((sint64)(b4)<<7)|(b5))
        
        #define MIN(a,b)                        (((a) < (b)) ? (a) : (b))
        #define RETURN_IF_NOT_OK(ret)           if (TS_OK != ret) { return ret; }
        
        // records all discovered PIDs
        uint16 TSPacket::s_au16PIDs[E_MAX] = { PID_UNSPEC, PID_UNSPEC, PID_UNSPEC,
        		PID_UNSPEC, PID_UNSPEC, PID_UNSPEC, PID_UNSPEC };
        bool isDoWritePes = false;
        static uint8 avStreamId;
        bool m_H264Running = true;
        bool m_AccRunning = true;
        bool m_TsRunning = true;
        int videoIndex = 0;
        int last_ts_cc = 0;
        int current_ts_cc = 0;
        _HANDLE_ *pHandle;
      
        /* Check whether an adaptation field is present */
        bool TSPacket::__HasAdaptField() {
        	assert(NULL != m_pHdr);
        	return (m_pHdr->adaptation_field_control == 0x3); //(0 != (m_pHdr->adaptation_field_control & 0x2));//m_pHdr->adaptation_field_control == 0x2 ||
        }
        
        /* Check whether a payload is present */
        bool TSPacket::__HasPayload() {
        	assert(NULL != m_pHdr);
        	return m_pHdr->payload_unit_start_indicator
        			|| ((m_pHdr->adaptation_field_control & 0x1));
        }
        
        /* Get a pointer to the adaptation field; returns NULL when it is absent */
        AdaptFixedPart* TSPacket::__GetAdaptField() {
        	assert(NULL != m_pBuf);
        	assert(NULL != m_pHdr);
        
        	AdaptFixedPart *pAdpt = NULL;
        
        	if (__HasAdaptField()) {
        		pAdpt = (AdaptFixedPart*) (m_pBuf + sizeof(TSHdrFixedPart));
        	}
        
        	return pAdpt;
        }
        
        /* Get the length of the adaptation field */
        uint8 TSPacket::__GetAdaptLen() {
        	uint8 u8AdaptLen = 0;
        	AdaptFixedPart *pAdpt = __GetAdaptField();
        	if (NULL != pAdpt) {
        		// "adaptation_field_length" field is 1 byte
        		u8AdaptLen = pAdpt->adaptation_field_length + 1;
        	}
        
        	return u8AdaptLen;
        }
        
        /* Return the PCR value when the PCR field is present; otherwise -1 */
        sint64 TSPacket::__GetPCR() {
        	assert(NULL != m_pBuf);
        	assert(NULL != m_pHdr);
        
        	sint64 s64PCR = INVALID_VAL;
        	if (__HasAdaptField()) {
        		AdaptFixedPart *pAdpt = (AdaptFixedPart*) (m_pBuf
        				+ sizeof(TSHdrFixedPart));
        		if (pAdpt->adaptation_field_length > 0 && pAdpt->PCR_flag) {
        			PCR *pcr = (PCR*) ((const char*) pAdpt + sizeof(AdaptFixedPart));
        			s64PCR = MK_PCR(pcr->pcr_base32_25,
        					pcr->pcr_base24_17,
        					pcr->pcr_base16_9,
        					pcr->pcr_base8_1,
        					pcr->pcr_base0);
        		}
        	}
        	return s64PCR;
        }
        
        /* Use StreamType to decide whether this is a video stream */
        bool TSPacket::__IsVideoStream(uint8 u8StreamType) {
        	return ((ES_TYPE_MPEG1V == u8StreamType) || (ES_TYPE_MPEG2V == u8StreamType)
        			|| (ES_TYPE_MPEG4V == u8StreamType)
        			|| (ES_TYPE_H264 == u8StreamType));
        }
        
        /* Use StreamType to decide whether this is an audio stream */
        bool TSPacket::__IsAudioStream(uint8 u8StreamType) {
        	return ((ES_TYPE_MPEG1A == u8StreamType) || (ES_TYPE_MPEG2A == u8StreamType)
        			|| (ES_TYPE_AC3 == u8StreamType) || (ES_TYPE_AAC == u8StreamType)
        			|| (ES_TYPE_DTS == u8StreamType));
        }
        
        /* Get the offset of the payload relative to the start of the TS packet */
        uint8 TSPacket::__GetPayloadOffset() {
        	uint8 u8Pos = sizeof(TSHdrFixedPart);
        	if (__HasAdaptField()) {
        		u8Pos += __GetAdaptLen();
        	}
        	return u8Pos;
        }
        
        /* Get the offset of the PAT/PMT table relative to the start of the TS packet */
        uint8 TSPacket::__GetTableStartPos() {
        	assert(NULL != m_pBuf);
        
        	uint8 u8Pos = __GetPayloadOffset();
        	if (__HasPayload()) {
        		// The "pointer_field" is 1 byte. Its presence is signalled by
        		// payload_unit_start_indicator == 1 in a PSI packet (in non-PSI
        		// packets that flag marks the start of a PES packet instead).
        		// pointer_field gives the number of bytes before the first
        		// section starts; a TS packet never carries more than one.
        		uint8 u8PtrFieldLen = m_pBuf[u8Pos] + 1;
        		u8Pos += u8PtrFieldLen;
        	}
        	return u8Pos;
        }
        
        /* Return the PTS value when the PTS field is present; otherwise -1 */
        sint64 TSPacket::__GetPTS(const OptionPESHdrFixedPart *pHdr) {
        	assert(NULL != pHdr);
        
        	sint64 s64PTS = INVALID_VAL;
        	if (pHdr->PTS_DTS_flags & 0x2) {
        		PTS_DTS *pPTS =
        				(PTS_DTS*) ((char*) pHdr + sizeof(OptionPESHdrFixedPart));
        		s64PTS =
        				MK_PTS_DTS(pPTS->ts32_30, pPTS->ts29_22, pPTS->ts21_15, pPTS->ts14_7, pPTS->ts6_0);
        	}
        
        	return s64PTS;
        }
        
        /* Return the DTS value when the DTS field is present; otherwise -1 */
        sint64 TSPacket::__GetDTS(const OptionPESHdrFixedPart *pHdr) {
        	assert(NULL != pHdr);
        
        	sint64 s64DTS = INVALID_VAL;
        	if (pHdr->PTS_DTS_flags & 0x1) {
        		PTS_DTS *pDTS = (PTS_DTS*) ((char*) pHdr + sizeof(OptionPESHdrFixedPart)
        				+ sizeof(PTS_DTS));
        		s64DTS =
        				MK_PTS_DTS(pDTS->ts32_30, pDTS->ts29_22, pDTS->ts21_15, pDTS->ts14_7, pDTS->ts6_0);
        	}
        
        	return s64DTS;
        }
        
        bool has_finish = false;
        int pre_head;    // bytes clipped from the head of the buffer
        int last_head;   // bytes left over at the tail
        int first_index; // position where the first sync header was found
        
        TSParser::TSParser() {
        }
        
        TSParser::~TSParser() {
        	ReleaseH264Buf();
        	ReleaseAccBuf();
        	ReleaseTsBuf();
        }
        
        TS_ERR TSParser::__stopThread() {
        
        	m_H264Running = false;
        	m_TsRunning = false;
        	m_AccRunning = false;
        	ReleaseAccBuf();
        	ReleaseH264Buf();
        	ReleaseTsBuf();
        
        	if (pHandle) { // only valid if initAudioDecoder() ran
        		aac_decode_close(pHandle->pContext);
        		delete pHandle;
        		pHandle = NULL;
        	}
        
        	if (TsVedioThreadHandle)
        		pthread_join(TsVedioThreadHandle, NULL);
        	TsVedioThreadHandle = NULL;
        	if (TsAudioThreadHandle)
        		pthread_join(TsAudioThreadHandle, NULL);
        	TsAudioThreadHandle = NULL;
        
        	if (PrintThreadHandle)
        		pthread_join(PrintThreadHandle, NULL);
        	PrintThreadHandle = NULL;
        	return TS_OK;
        }
        
        TS_ERR TSParser::initMemory() {
        	m_H264Running = true;
        	m_TsRunning = true;
        	m_AccRunning = true;
        	InitTsMemory();
        	InitH264Memory();
        	InitAccMemory();
        	return TS_OK;
        }
        
        TS_ERR TSParser::initAudioDecoder() {
        	pHandle = new _HANDLE_;
        	pHandle->code = AV_CODEC_ID_MP3;
        	pHandle->pContext = 0;
        	av_register_all();
        	av_log_set_callback(my_logoutput);
        	pHandle->pContext = aac_decoder_create(AV_CODEC_ID_MP3, IN_SAMPLE_RATE,
        			AUDIO_CHANNELS, SAMPLE_BIT);
        	if (pHandle->pContext != NULL) {
        		mDebug("initAudioDecoder succeeded");
        	}
        	return TS_OK;
        }
        
        
        TS_ERR TSParser::Parse() {
        	int ret_t;
        	initAudioDecoder();
        	mfxBitstreamTS *tsBuf = NULL;
        	bool has_cache = false;
        	bool is_head_start = false;
        	m_H264Running = true;
        	m_TsRunning = true;
        	m_AccRunning = true;
        	mDebug("-------------- parsing started");
        	struct sched_param param;
        	pthread_attr_t attr;
        	pthread_attr_init(&attr);
        	pthread_attr_setschedpolicy(&attr, SCHED_RR);
        	param.sched_priority = 90;
        	pthread_attr_setschedparam(&attr, &param);
        	ret_t = pthread_create(&TsVedioThreadHandle, &attr, videothread, this);
        	if (ret_t) {}
        	ret_t = pthread_create(&TsAudioThreadHandle, &attr, audiothread, this);
        	if (ret_t) {}
        	ret_t = pthread_create(&PrintThreadHandle, &attr, print_thread, this);
        	if (ret_t) {}
        	TS_ERR ret = TS_OK;
        	unsigned char buffer[TS_PKT_LEN * 2];
        	uint8 *pCacheBuf = buffer;
        	uint32 ts_temp_len = 0;
        	uint32 ts_cache_len = 0;
        	unsigned char *pCurrentPos = NULL;
        
        	long ms = 0;
        	struct timespec ts;
        	struct timespec pts;
        	clock_gettime(CLOCK_MONOTONIC, &pts);
        	clock_gettime(CLOCK_MONOTONIC, &ts);
        
        	pktH264Len = 0;
        	pktAccLen = 0;
        	h264Buf = 0;
        	jpg_count = 0;
        	audio_count = 0;
        
        	for (; m_TsRunning;) {
        		while (m_TsRunning) {
        			tsBuf = GetTsBuf();
        			if (tsBuf != NULL) {
        				break;
        			} else {
        				usleep(2);
        			}
        		}
        		if (tsBuf == NULL) {
        			break;
        		}
        		Xferred += tsBuf->DataLength;
        		if (tsBuf->Data == NULL || tsBuf->DataLength == 0) {
        			PushDirytTsBuf(tsBuf);
        			tsBuf = NULL;
        			continue;
        		}
        		pCurrentPos = tsBuf->Data;
        		ts_temp_len = tsBuf->DataLength;
        		is_head_start = true;
        		if (has_cache) {
        			has_cache = false;
        			memcpy(pCacheBuf + ts_cache_len, pCurrentPos,(TS_PKT_LEN - ts_cache_len));
        			if (TS_SYNC_BYTE == buffer[0]) {
        				ret = __Parse(pCacheBuf, TS_PKT_LEN);
        				ts_temp_len = tsBuf->DataLength - (TS_PKT_LEN - ts_cache_len);
        				pCurrentPos += TS_PKT_LEN - ts_cache_len;
        			} else {
        			// cached fragment has no sync byte; drop its ts_cache_len bytes
        			}
        		}
        
        		while (ts_temp_len > TS_PKT_LEN && m_TsRunning) {
        			if (TS_SYNC_BYTE == *(pCurrentPos)
        					&& TS_SYNC_BYTE == *(pCurrentPos + TS_PKT_LEN)) {
        				is_head_start = false;
        				ret = __Parse(pCurrentPos, TS_PKT_LEN);
        				pCurrentPos += TS_PKT_LEN;
        				ts_temp_len -= TS_PKT_LEN;
        			} else {
        			// corrupt data: scan forward byte by byte for the next sync header
        				pCurrentPos++;
        				ts_temp_len--;
        
        			}
        
        		}
        
        		if (TS_SYNC_BYTE == *(pCurrentPos)) {
        			if (ts_temp_len == TS_PKT_LEN) {
        
        				ret = __Parse(pCurrentPos, TS_PKT_LEN);
        			} else {
        				ts_cache_len = ts_temp_len;
        				memcpy(pCacheBuf, pCurrentPos, ts_cache_len);
        				has_cache = true;
        			}
        		} else {
        			for (int i = 0; i < ts_temp_len; i++) {
        				if (!m_TsRunning) {
        					break;
        				}
        				if (TS_SYNC_BYTE == *(pCurrentPos + i)) {
        					memcpy(pCacheBuf, pCurrentPos + i, ts_temp_len - i);
        					ts_cache_len = ts_temp_len - i;
        					has_cache = true;
        					break;
        				}
        			}
        		}
        
        		PushDirytTsBuf(tsBuf);
        		tsBuf = NULL;
        		clock_gettime(CLOCK_MONOTONIC, &ts);
        		ms = (ts.tv_sec - pts.tv_sec) * 1000
        				+ (ts.tv_nsec - pts.tv_nsec) / 1000000;
        		if (ms >= 1000) {
        			clock_gettime(CLOCK_MONOTONIC, &pts);
        			m_Fps = (double) (jpg_count * 1000) / ms;
        			jpg_count = 0;
        			a_Fps = (double) (audio_count * 1000) / ms;
        			audio_count = 0;
        		}}
        	return ret;
        }
        
        
        /*解析TS包*/
        TS_ERR TSParser::__Parse(const uint8 *pBuf, uint16 u16BufLen) {
        //
        	assert(NULL != pBuf);
        	TS_ERR ret = TS_OK;
        	if ((NULL == pBuf) || (TS_PKT_LEN != u16BufLen)) {
        		return TS_IN_PARAM_ERR;
        	}
        	if (TS_SYNC_BYTE != pBuf[0]) {
        		return TS_SYNC_BYTE_ERR;
        	}
        //	mInfo("-------------------- __Parse begin");
        	m_pBuf = pBuf;
        	m_pHdr = (TSHdrFixedPart*) pBuf;
        	m_u16PID = MK_WORD(m_pHdr->pid12_8,m_pHdr->pid7_0);
        
        	if (m_u16PID == PID_NULL) {
        		return ret;
        	}
        	//s_au16PIDs[E_PMT] = 256;      // the TS stream layout is essentially fixed in this project
        	//s_au16PIDs[E_VIDEO] = 4113;
        	//s_au16PIDs[E_AUDIO] = 4352;
        	//s_au16PIDs[E_PCR] = 4097;
        	//s_au16PIDs[E_SI] = 31;
        	//s_au16PIDs[E_PAT] = 0;
        	/** continuity_counter is a 4-bit field that increments with every TS
        	 packet of the same PID and wraps back to 0 after its maximum value.
        	 It does not increment when adaptation_field_control is '00' or '10'. */
        	m_u8CC = m_pHdr->continuity_counter;
        	if (IsPAT()) {
        		ret = __ParsePAT();
        		return ret;
        	} else if (IsSIT()) {
        	} else if (IsNIT()) {
        	} else if (IsPMT()) {
        		ret = __ParsePMT();
        		return ret;
        	} else if (m_u16PID == s_au16PIDs[E_PCR]) {
        		// The PCR lives in the TS layer (a TS packet header may carry it); it
        		// specifies the intended arrival time of this packet at the decoder,
        		// similar in role to the SCR.
        		/** Every TS packet carrying elementary-stream data whose PID is not
        		 flagged as the PCR_PID, in which a continuity-counter discontinuity
        		 occurs, or in which a PTS or DTS occurs, shall arrive at the T-STD
        		 input after the discontinuity in the program's system time base.
        		 While the discontinuity state is true, if two consecutive packets of
        		 the same PID carry the same continuity_counter value with
        		 adaptation_field_control '01' or '11', the second packet may be
        		 discarded; a stream must not be built so that discarding it would
        		 lose PES payload or PSI data. */
        //			m_s64PCR = __GetPCR();
        	}
        //			if(m_u16PID!=PID_NULL&&(m_pHdr->adaptation_field_control != 0x2)){
        	/* When the payload carries PSI data, payload_unit_start_indicator means:
        	 1 if this packet carries the first byte of a PSI section (the first
        	 payload byte is then the pointer_field); 0 otherwise (no pointer_field
        	 in this payload). See 2.4.4.1 and 2.4.4.2. */
        	if (IsVideo()) { //|| IsAudio()
                //	mDebug("TAV -- video data");
        		if (m_pHdr->payload_unit_start_indicator == 1) { // '1': exactly one PES packet starts in this TS packet
                //	mDebug("-------------- PES header found");
        			ret = __ParsePES();
        		} else { // '0': no PES packet starts in this TS packet
        			/* Null packets must have payload_unit_start_indicator set to 0.
        			 PID: 13 bits, identifies the payload type; PID == 0x0000 means
        			 the payload is the Program Association Table. */
                    if (IsSIT()) {
                    } else {
                        ret = __ParsePESData();
                    }
        		}
        	} else if (IsAudio()) {
        		mDebug("TAV -- audio data");
        	}
        	return ret;
        }
        
        TS_ERR TSParser::__ParsePAT() {
        	assert(NULL != m_pBuf);
        	const uint8 *pPATBuf = m_pBuf + __GetTableStartPos();
        	PATHdrFixedPart *pPAT = (PATHdrFixedPart*) pPATBuf;
        	uint16 u16SectionLen =
        			MK_WORD(pPAT->section_length11_8, pPAT->section_length7_0);
        	uint16 u16AllSubSectionLen = u16SectionLen
        			- (sizeof(PATHdrFixedPart) - HDR_LEN_NOT_INCLUDE) - CRC32_LEN;
        
        	uint16 u16SubSectionLen = sizeof(PATSubSection);
        	const uint8 *ptr = pPATBuf + sizeof(PATHdrFixedPart);
        	for (uint16 i = 0; i < u16AllSubSectionLen; i += u16SubSectionLen) {
        		PATSubSection *pDes = (PATSubSection*) (ptr + i);
        		uint16 u16ProgNum = pDes->program_number;
        		uint16 u16PID = MK_WORD(pDes->pid12_8, pDes->pid7_0);
        		if (0x00 == u16ProgNum) {
        			uint16 u16NetworkPID = u16PID;
        		} else {
        			m_u16PMTPID = u16PID; // program_map_PID
        			break;
        		}
        	}
        	s_au16PIDs[E_PMT] = m_u16PMTPID;
        	return TS_OK;
        }
        
        TS_ERR TSParser::__ParsePMT() {
        	assert(NULL != m_pBuf);
        
        	const uint8 *pPMTBuf = m_pBuf + __GetTableStartPos();
        	PMTHdrFixedPart *pPMT = (PMTHdrFixedPart*) pPMTBuf;
        	s_au16PIDs[E_PCR] = MK_WORD(pPMT->PCR_PID12_8, pPMT->PCR_PID7_0);
        	uint16 u16SectionLen =
        			MK_WORD(pPMT->section_length11_8, pPMT->section_length7_0);
        	// total length of the n * program_info_descriptor entries
        	uint16 u16ProgInfoLen =
        			MK_WORD(pPMT->program_info_length11_8, pPMT->program_info_length7_0);
        	uint16 u16AllSubSectionLen = u16SectionLen
        			- (sizeof(PMTHdrFixedPart) - HDR_LEN_NOT_INCLUDE) - u16ProgInfoLen
        			- CRC32_LEN;
        
        	uint16 u16SubSectionLen = sizeof(PMTSubSectionFixedPart);
        	const uint8 *ptr = pPMTBuf + sizeof(PMTHdrFixedPart) + u16ProgInfoLen;
        	for (uint16 i = 0; i < u16AllSubSectionLen; i += u16SubSectionLen) {
        		PMTSubSectionFixedPart *pSec = (PMTSubSectionFixedPart*) (ptr + i);
        		uint16 u16ElementaryPID =
        				MK_WORD(pSec->elementaryPID12_8, pSec->elementaryPID7_0);
        		uint16 u16ESInfoLen =
        				MK_WORD(pSec->ES_info_lengh11_8, pSec->ES_info_lengh7_0);
        		u16SubSectionLen += u16ESInfoLen;
        
        		if (__IsVideoStream(pSec->stream_type)) {
        			s_au16PIDs[E_VIDEO] = u16ElementaryPID;
        		} else if (__IsAudioStream(pSec->stream_type)) {
        			s_au16PIDs[E_AUDIO] = u16ElementaryPID;
        		} else {
        		}
        
        	}
        	return TS_OK;
        }
        
        TS_ERR TSParser::__ParsePES() {
        	assert(NULL != m_pBuf);
        	uint64 es_len = 0;
        
        	const uint8 *pPESBuf = m_pBuf + 4; // __GetPayloadOffset(); TODO: 此處4為TS header固定長度,寫死以降低延遲(未處理adaptation field)
        	const uint8 *pPESData;
        	PESHdrFixedPart *pPES = (PESHdrFixedPart*) pPESBuf;
        
        	if (PES_START_CODE == pPES->packet_start_code_prefix) {
        		m_u8StreamId = pPES->stream_id;
        		if ((m_u8StreamId & PES_STREAM_VIDEO)
        				|| (m_u8StreamId & PES_STREAM_AUDIO)) {
        			OptionPESHdrFixedPart *pHdr = (OptionPESHdrFixedPart*) (pPESBuf
        					+ sizeof(PESHdrFixedPart));
        			avStreamId = m_u8StreamId;
        			pPESData = m_pBuf + (4 + sizeof(PESHdrFixedPart)
        							+ sizeof(OptionPESHdrFixedPart)
        							+ pHdr->PES_Hdr_data_length);
        			es_len = TS_PKT_LEN
        					- (4 + sizeof(PESHdrFixedPart)
        							+ sizeof(OptionPESHdrFixedPart)
        							+ pHdr->PES_Hdr_data_length);
        
        			if (IsVideo()) {
        				if (pktH264Len != 0 && h264Buf != NULL) {
        					if (h264Buf->MaxLength > pktH264Len) {
        						h264Buf->DataLength = pktH264Len;
        						PushH264Buf(h264Buf);
        						h264Buf = NULL;
        						pktH264Len = 0;
        						jpg_count++;
        					} else {
        						mDebug("h264 buf is small than H264 data -%d",
        								pktH264Len);
        						PushDirytH264Buf(h264Buf);
        						h264Buf = NULL;
        						pktH264Len = 0;
        					}
        
        				}
        				while (m_H264Running) {
        
        					h264Buf = GetEmptyH264Buf();
        					if (h264Buf == NULL || h264Buf->Data == NULL) {
        						mDebug("取不到空的H264寄存物件");
        						usleep(1);
        						continue;
        					}
        					break;
        				}
        				if (h264Buf == NULL) {
        					return TS_IN_PARAM_ERR;
        				}
        				memcpy(h264Buf->Data, pPESData, es_len);
        				pktH264Len = es_len;
        
        			} else if (IsAudio()) {
        //				if (pktAccLen != 0 && __accBuf != NULL) {
        //					if (__accBuf->MaxLength > pktAccLen) {
        //						__accBuf->DataLength = pktAccLen;
        //						PushAccBuf(__accBuf);
        //						__accBuf = NULL;
        //						pktAccLen = 0;
        //					} else {
        //						mDebug("__accBuf buf is small than H264 data -%d",
        //								pktAccLen);
        //						PushDirytAccBuf(__accBuf);
        //						__accBuf = NULL;
        //						pktAccLen = 0;
        //					}
        //					audio_count++;
        //				}
        //				while (m_AccRunning) {
        //					__accBuf = GetEmptyAccBuf();
        //					if (__accBuf == NULL || __accBuf->Data == NULL) {
        //						usleep(10);
        //						continue;
        //					}
        //					break;
        //				}
        //				if (__accBuf == NULL) {
        //					return TS_IN_PARAM_ERR;
        //				}
        //				memcpy(__accBuf->Data, pPESData, es_len);
        //				pktAccLen = es_len;
        			}
        		} else {
        //				mLogW("---PES視訊打頭非視訊幀");
        		}
        	} else {
        //			mLogW("---PES非視訊打頭");
        		avStreamId = 0;
        	}
        	return TS_OK;
        }
        
        TS_ERR TSParser::__ParsePESData() {
        	uint8 u8PayloadOffset = __GetPayloadOffset();
        	const uint8 *pPESBuf = m_pBuf + u8PayloadOffset;
        	uint64 es_len = TS_PKT_LEN - u8PayloadOffset;
        
        	if (IsVideo()) //視訊
        	{
        		if (h264Buf != NULL && h264Buf->Data != NULL) {
        			memcpy(h264Buf->Data + pktH264Len, pPESBuf, es_len);
        			pktH264Len = pktH264Len + es_len;
        		} else {
        		}
        	} else if (IsAudio()) { //音訊
        //		if (__accBuf != NULL && __accBuf->Data != NULL) {
        //			memcpy(__accBuf->Data + pktAccLen, pPESBuf, es_len);
        //			pktAccLen = pktAccLen + es_len;
        //		} else {
        ////			mDebug("CameraLib ----__ParsePESData fifo存入為空  ");
        //		}
        	}
        	pPESBuf = NULL;
        	return TS_OK;
        }
        
        const char *TSParser::__TSTimeToStr(sint64 s64Time) {
        	static char s_acTimeStr[MAX_TIME_STR_LEN] = { 0 };
        	sint64 s64MiliSecond = s64Time / 90; // 90kHz 時基轉為毫秒
        	sint64 s64Second = s64MiliSecond / 1000;
        	snprintf(s_acTimeStr, MAX_TIME_STR_LEN, "%02lld:%02lld:%02lld.%03lld",
        			(long long) (s64Second / 3600), (long long) ((s64Second / 60) % 60),
        			(long long) (s64Second % 60), (long long) (s64MiliSecond % 1000));
        	return s_acTimeStr;
        }
        
        void TSParser::InitAccMemory() {
        	int i;
        	bool ret;
        
        	m_AccBufCount = 20;
        	long len = 25000 * 200;
        
        	for (i = 0; i < m_AccBufCount; i++) {
        		memset(&m_AccBuf[i], 0, sizeof(mfxBitstreamTS));
        		m_AccBuf[i].Data = new UCHAR[len];
        		if (m_AccBuf[i].Data) {
        			memset(m_AccBuf[i].Data, 0xff, len);
        		} else {
        			return;
        		}
        		m_AccBuf[i].MaxLength = len;
        		m_AccBuf[i].last_cc = -1;
        		m_AccBuf[i].intactness = 1;
        
        	}
        
        	ResetAccBuf();
        }
        void TSParser::ReleaseAccBuf() {
        	for (int i = 0; i < m_AccBufCount; i++) {
        		if (m_AccBuf[i].Data) {
        			delete[] m_AccBuf[i].Data;
        			m_AccBuf[i].Data = NULL;
        		}
        	}
        
        }
        bool TSParser::PushAccBuf(mfxBitstreamTS * pbuf) {
        	bool ret = m_AccBufFifo.put((void *) pbuf);
        	if (!ret) {
        		mfxBitstreamTS * pbuf1 = NULL;
        		pbuf1 = (mfxBitstreamTS *) m_AccBufFifo.get();
        		if (pbuf1) {
        			PushDirytAccBuf(pbuf1);
        		}
        
        		return m_AccBufFifo.put((void *) pbuf);
        
        	} else
        		return ret;
        }
        mfxBitstreamTS * TSParser::GetAccBuf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_AccBufFifo.get();
        	return pbuf;
        }
        
        bool TSParser::PushDirytAccBuf(mfxBitstreamTS * pbuf) {
        	if (pbuf == NULL)
        		return false;
        	pbuf->DataLength = 0;
        	pbuf->DataOffset = 0;
        	return m_DirtyAccBufFifo.put((void *) pbuf);
        }
        
        mfxBitstreamTS * TSParser::GetEmptyAccBuf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_DirtyAccBufFifo.get();
        	if (pbuf) {
        		pbuf->DataLength = 0;
        		pbuf->DataOffset = 0;
        	}
        	return pbuf;
        }
        
        void TSParser::ResetAccBuf() {
        	int i = 0;
        	m_DirtyAccBufFifo.flush();
        	m_DirtyAccBufFifo.Create(m_AccBufCount);
        
        	for (i = 0; i < m_AccBufCount; i++) {
        		int ret = m_DirtyAccBufFifo.put((void *) &m_AccBuf[i]);
        		if (!ret) {
        			return;
        		}
        	}
        	m_AccBufFifo.flush();
        	m_AccBufFifo.Create(m_AccBufCount - 1);
        }
        
        void TSParser::InitH264Memory() {
        	int i;
        	bool ret;
        
        	m_H264BufCount = 20;
        	long len = 1024 * 3500;
        
        	for (i = 0; i < m_H264BufCount; i++) {
        		memset(&m_H264Buf[i], 0, sizeof(mfxBitstreamTS));
        		m_H264Buf[i].Data = new UCHAR[len];
        		if (m_H264Buf[i].Data) {
        			memset(m_H264Buf[i].Data, 0xff, len);
        		} else {
        			return;
        		}
        		m_H264Buf[i].MaxLength = len;
        		m_H264Buf[i].last_cc = -1;
        		m_H264Buf[i].intactness = 1;
        
        	}
        	ResetH264Buf();
        }
        void TSParser::ReleaseH264Buf() {
        	for (int i = 0; i < m_H264BufCount; i++) {
        		if (m_H264Buf[i].Data) {
        			delete[] m_H264Buf[i].Data;
        			m_H264Buf[i].Data = NULL;
        		}
        	}
        
        }
        
        bool TSParser::PushH264Buf(mfxBitstreamTS * pbuf) {
        	if (pbuf->DataLength >= 1024 * 1000 || pbuf->DataLength <= 0) {
        		mDebug("error H264buf長度異常  %d", pbuf->DataLength);
        	}
        	bool ret = m_H264BufFifo.put((void *) pbuf);
        	if (!ret) { //判斷是否存入,沒有存入說明已滿,去除頭丟掉,再存
        		mfxBitstreamTS * pbuf1 = NULL;
        		pbuf1 = (mfxBitstreamTS *) m_H264BufFifo.get();
        		if (pbuf1) {
        			PushDirytH264Buf(pbuf1);
        		}
        		return m_H264BufFifo.put((void *) pbuf);
        
        	} else
        
        		return ret;
        }
        
        mfxBitstreamTS * TSParser::GetH264Buf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_H264BufFifo.get();
        	return pbuf;
        }
        
        void TSParser::ResetH264Buf() {
        	int i = 0;
        	m_DirtyH264BufFifo.flush();
        	m_DirtyH264BufFifo.Create(m_H264BufCount);
        
        	for (i = 0; i < m_H264BufCount; i++) {
        		int ret = m_DirtyH264BufFifo.put((void *) &m_H264Buf[i]);
        		if (!ret) {
        			return;
        		}
        	}
        
        	m_H264BufFifo.flush();
        	m_H264BufFifo.Create(m_H264BufCount - 1);
        }
        
        bool TSParser::PushDirytH264Buf(mfxBitstreamTS * pbuf) {
        	if (pbuf == NULL)
        		return false;
        	pbuf->DataLength = 0;
        	pbuf->DataOffset = 0;
        	return m_DirtyH264BufFifo.put((void *) pbuf);
        }
        
        mfxBitstreamTS * TSParser::GetEmptyH264Buf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_DirtyH264BufFifo.get();
        	if (pbuf) {
        		pbuf->DataLength = 0;
        		pbuf->DataOffset = 0;
        	}
        	return pbuf;
        }
        
        void TSParser::InitTsMemory() {
        	int i;
        	bool ret;
        	m_TsBufCount = 30;
        	long len = 188 * 1024;
        	for (i = 0; i < m_TsBufCount; i++) {
        		memset(&m_TsBuf[i], 0, sizeof(mfxBitstreamTS));
        		m_TsBuf[i].Data = new UCHAR[len];
        		if (m_TsBuf[i].Data) {
        			memset(m_TsBuf[i].Data, 0xff, len);
        		} else {
        			mDebug("new m_TsBuf[%d] failed:\n", i);
        			return;
        		}
        		m_TsBuf[i].MaxLength = len;
        		m_TsBuf[i].last_cc = -1;
        		m_TsBuf[i].intactness = 1;
        	}
        	ResetTsBuf();
        }
        void TSParser::ReleaseTsBuf() {
        	for (int i = 0; i < m_TsBufCount; i++) {
        		if (m_TsBuf[i].Data) {
        			delete[] m_TsBuf[i].Data;
        			m_TsBuf[i].Data = NULL;
        		}
        	}
        
        }
        
        bool TSParser::PushTsBuf(mfxBitstreamTS * pbuf) {
        	bool ret = m_TsBufFifo.put((void *) pbuf);
        	if (!ret) { //判斷是否存入,沒有存入說明已滿,去除頭丟掉,再存
        		mfxBitstreamTS * pbuf1 = NULL;
        		pbuf1 = (mfxBitstreamTS *) m_TsBufFifo.get();
        		if (pbuf1) {
        			PushDirytTsBuf(pbuf1);
        		}
        		return m_TsBufFifo.put((void *) pbuf);
        	} else
        		return ret;
        }
        
        bool TSParser::PushTsFrame(unsigned char *pData, unsigned int len) {
        	mfxBitstreamTS *pBufJpg = NULL;
        	while (m_TsRunning) {
        		pBufJpg = GetEmptyTsBuf();
        		if (pBufJpg == NULL || pBufJpg->Data == NULL) {
        			usleep(1);
        			continue;
        		}
        		break;
        	}
        	if (pBufJpg == NULL || pBufJpg->Data == NULL) {
        		return false;
        	}
        	memcpy(pBufJpg->Data, pData, len);
        	pBufJpg->DataLength = len;
        	PushTsBuf(pBufJpg);
        	return true;
        }
        
        mfxBitstreamTS * TSParser::GetTsBuf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_TsBufFifo.get();
        	return pbuf;
        }
        
        void TSParser::ResetTsBuf() {
        	int i = 0;
        	m_DirtyTsBufFifo.flush();
        	m_DirtyTsBufFifo.Create(m_TsBufCount);
        
        	for (i = 0; i < m_TsBufCount; i++) {
        		int ret = m_DirtyTsBufFifo.put((void *) &m_TsBuf[i]);
        		if (!ret) {
        			return;
        		}
        	}
        
        	m_TsBufFifo.flush();
        	m_TsBufFifo.Create(m_TsBufCount - 1);
        }
        
        bool TSParser::PushDirytTsBuf(mfxBitstreamTS * pbuf) {
        	if (pbuf == NULL)
        		return false;
        	pbuf->DataLength = 0;
        	pbuf->DataOffset = 0;
        	return m_DirtyTsBufFifo.put((void *) pbuf);
        }
        
        mfxBitstreamTS * TSParser::GetEmptyTsBuf() {
        	mfxBitstreamTS * pbuf = NULL;
        	pbuf = (mfxBitstreamTS *) m_DirtyTsBufFifo.get();
        	if (pbuf) {
        		pbuf->DataLength = 0;
        		pbuf->DataOffset = 0;
        	}
        	return pbuf;
        }
        
        jobject getInstanceTs(JNIEnv* env, jclass obj_class) {
        	jmethodID construction_id = env->GetMethodID(obj_class, "<init>", "()V");
        	jobject obj = env->NewObject(obj_class, construction_id);
        	return obj;
        }
        
        //獲取視訊幀
        void *TSParser::videothread(void * cc) {
        	TSParser * pBc = (TSParser *) cc;
        	mfxBitstreamTS *h264Buf = NULL;
        	for (; m_H264Running;) {
        		while (m_H264Running) {
        			h264Buf = pBc->GetH264Buf();
        			if (h264Buf != NULL && h264Buf->Data != NULL) { // 取到有效的視訊資料
        				if (pBc->m_jvm) {
        					bool isAttached = false;
        					JNIEnv* env = NULL;
        					if (pBc->m_jvm->GetEnv((void**) &env,
        							JNI_VERSION_1_4) != JNI_OK) {
        						jint res = pBc->m_jvm->AttachCurrentThread(&env, NULL);
        						// Get the JNI env for this thread
        						if ((res < 0) || !env) {
        							env = NULL;
        						} else {
        							isAttached = true;
        						}
        					}
        					if (env && pBc->_h264Cid) {
        						jbyteArray bytes = env->NewByteArray(
        								h264Buf->DataLength);
        						env->SetByteArrayRegion(bytes, 0, h264Buf->DataLength,
        								(jbyte*) h264Buf->Data);
        						pBc->_javaVedioObj = getInstanceTs(env,
        								pBc->_javaVedioClass);
        						env->CallVoidMethod(pBc->_javaVedioObj, pBc->_h264Cid,
        								bytes);
        						env->DeleteLocalRef(bytes);
        						env->DeleteLocalRef(pBc->_javaVedioObj);
        
        					}
        
        					if (isAttached) {
        						if (pBc->m_jvm->DetachCurrentThread() < 0) {
        							mDebug( "Could not detach thread from JVM");
        						}
        					}
        				}
        				break;
        			} else {
        				usleep(1);
        			}
        		}
        		pBc->PushDirytH264Buf(h264Buf);
        		h264Buf = NULL;
        	}
        	return NULL;
        }
        
        void *TSParser::audiothread(void * cc) {
        	TSParser * pBc = (TSParser *) cc;
        	uint32 decoderMp3State = 0;
        	uint32 u32PCMLen = 0; // 接收 aac_decode_frame 輸出的 PCM 長度(原為空指標,會導致解碼時崩潰)
        	mfxBitstreamTS *accBuf = NULL;
        	for (; m_AccRunning;) {
        		while (m_AccRunning) {
        			accBuf = pBc->GetAccBuf();
        			if (accBuf != NULL && accBuf->Data != NULL) { // 取到有效的音訊資料
        				//void *pParam, unsigned char *pData, int nLen, unsigned char *pPCM, unsigned int *outLen
        				decoderMp3State = aac_decode_frame(pHandle->pContext,
        						accBuf->Data, accBuf->DataLength, pBc->pcm_buffer,
        						&u32PCMLen);
        				if (pBc->m_jvm && decoderMp3State > 0) {
        					bool isAttached = false;
        					JNIEnv* env = NULL;
        					if (pBc->m_jvm->GetEnv((void**) &env,
        							JNI_VERSION_1_4) != JNI_OK) {
        						// try to attach the thread and get the env
        						// Attach this thread to JVM
        						jint res = pBc->m_jvm->AttachCurrentThread(&env, NULL);
        						// Get the JNI env for this thread
        						if ((res < 0) || !env) {
        							mDebug("Could not attach thread to JVM (%d, %p)",
        									res, env);
        							env = NULL;
        						} else {
        							isAttached = true;
        						}
        					}
        
        					if (env && pBc->_accCid) {
      
        						jbyteArray bytes = env->NewByteArray(decoderMp3State);
        						env->SetByteArrayRegion(bytes, 0, decoderMp3State,(jbyte*) pBc->pcm_buffer);
        						pBc->_javaAudioObj = getInstanceTs(env,pBc->_javaAudioClass);
        						env->CallVoidMethod(pBc->_javaAudioObj, pBc->_accCid,bytes);
        						env->DeleteLocalRef(bytes);
        						env->DeleteLocalRef(pBc->_javaAudioObj);
        					}
        					if (isAttached) {
        						if (pBc->m_jvm->DetachCurrentThread() < 0) {
        							mDebug( "Could not detach thread from JVM");
        						}
        					}
        				}
        				break;
        			} else {
        				usleep(10);
        			}
        		}
        		pBc->PushDirytAccBuf(accBuf);
        		accBuf = NULL;
        	}
        	return NULL;
        }
      
        int TSParser::JavaMethodInit(JavaVM* vm, jobject obj) {
        	if (m_jvm)
        		return 0;
        	m_jvm = vm;
        	if (!m_jvm) {
        		mDebug( " No JavaVM have been provided.");
        		return -1;
        	}
        	// get the JNI env for this thread
        	bool isAttached = false;
        	JNIEnv* env = NULL;
        	if (m_jvm->GetEnv((void**) &env, JNI_VERSION_1_4) != JNI_OK) {
        		// try to attach the thread and get the env
        		// Attach this thread to JVM
        		jint res = m_jvm->AttachCurrentThread(&env, NULL);
        
        		// Get the JNI env for this thread
        		if ((res < 0) || !env) {
        			mDebug( "Could not attach thread to JVM (%d, %p)", res, env);
        			return -1;
        		}
        		isAttached = true;
        	}
      
        	// get the ViEAndroidGLES20 class
        	jclass javaRenderClassLocal = reinterpret_cast<jclass>(env->FindClass(
        			"包名0/包名1/包名2/JniLib"));
        	//jclass javaRenderClassLocal = reinterpret_cast<jclass> (env->FindClass("包名0/包名1/包名2/JniLib.activity.xxx"));
        	if (!javaRenderClassLocal) {
        		mDebug("could not find class 包名0/包名1/包名2/JniLib");
        		return -1;
        	}
        	mDebug("get class 包名0/包名1/包名2/JniLib success");
        
        	_javaAudioClass = reinterpret_cast<jclass>(env->NewGlobalRef(
        			javaRenderClassLocal));
        	if (!_javaAudioClass) {
        		mDebug( "could not create Java class reference");
        		return -1;
        	}
        	mDebug("create Java class reference success");
        
        	_javaVedioClass = reinterpret_cast<jclass>(env->NewGlobalRef(
        			javaRenderClassLocal));
        	if (!_javaVedioClass) {
        		mDebug( "could not create Java class reference");
        		return -1;
        	}
        	mDebug("create Java class reference success");
        
        	_javaSpeedClass = reinterpret_cast<jclass>(env->NewGlobalRef(
        			javaRenderClassLocal));
        	if (!_javaSpeedClass) {
        		mDebug( "could not create Java class reference");
        		return -1;
        	}
        
        	// Delete local class ref, we only use the global ref
        	env->DeleteLocalRef(javaRenderClassLocal);
      
        	_javaAudioObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
        	if (!_javaAudioObj) {
        		mDebug("could not create Java object reference");
        		return -1;
        	}
        
        	// get the method ID for the ReDraw function
        	_accCid = env->GetMethodID(_javaAudioClass, "aacCallBack", "([B)V");
        	if (_accCid == NULL) {
        		mDebug( " could not get dataCallBack ID");
        		return -1;
        	}
        
        	_javaVedioObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
        	if (!_javaVedioObj) {
        		mDebug("could not create Java object reference");
        		return -1;
        	}
        	mDebug(" create Global Java object reference success");
        
        	// get the method ID for the ReDraw function
        	_h264Cid = env->GetMethodID(_javaVedioClass, "h264CallBack", "([B)V");
        	if (_h264Cid == NULL) {
        		mDebug( " could not get dataCallBack ID");
        		return -1;
        	}
        
        	_javaSpeedObj = reinterpret_cast<jobject>(env->NewGlobalRef(obj));
        	if (!_javaSpeedObj) {
        		mDebug("could not create Java object reference");
        		return -1;
        	}
        	mDebug(" create Global Java object reference success");
        	// get the method ID for the ReDraw function
        	_speedCid = env->GetMethodID(_javaSpeedClass, "speedBack", "(JDDI)V");
        	if (_speedCid == NULL) {
        		mDebug( " could not get dataCallBack ID");
        		return -1;
        	}
        	mDebug("get dataCallBack  ID success");
        	return 0;
        }
      
    • mfxBitstreamTS結構體:

        typedef struct {
        	uint64_t pts;        // 當前幀 PTS
        	uint64_t dts;        // 當前幀 DTS
        	int     cc_ok;
        	int     last_cc;
        	int     intactness;
        	int     stream_id;   // 標識流
        	uint8*  Data;        // 當前幀資料快取
        	mfxU32  DataOffset;
        	uint32  DataLength;  // 當前幀資料快取有效長度
        	mfxU32  MaxLength;   // 快取最大長度
        } mfxBitstreamTS;
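上面的程式碼大量以 MK_WORD 之類的位元拼接從 TS header 取出 13-bit PID 等欄位。以下用一段獨立的 C++ 小片段示意 4 位元組 TS header 的取位方式(TsHeader、parse_ts_header 為示意命名,非本專案程式碼):

```cpp
#include <cassert>
#include <cstdint>

// 依 MPEG2-TS 的 4 位元組固定 header 佈局手動取位
struct TsHeader {
    bool     sync_ok;               // sync_byte 是否為 0x47
    bool     payload_unit_start;    // PUSI:payload 內含 PES/表 的起始
    uint16_t pid;                   // 13-bit PID
    uint8_t  adaptation_field_ctrl; // 2-bit:01 僅payload、10 僅AF、11 兩者皆有
    uint8_t  continuity_counter;    // 4-bit 連續計數器
};

static TsHeader parse_ts_header(const uint8_t *p) {
    TsHeader h;
    h.sync_ok               = (p[0] == 0x47);
    h.payload_unit_start    = (p[1] & 0x40) != 0;
    h.pid                   = (uint16_t)(((p[1] & 0x1F) << 8) | p[2]); // 等同 MK_WORD(高5位, 低8位)
    h.adaptation_field_ctrl = (p[3] >> 4) & 0x03;
    h.continuity_counter    = p[3] & 0x0F;
    return h;
}
```

例如位元組序列 `47 41 00 10`,解析後 PID 為 0x0100,且 payload_unit_start 為 true。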
      

三、資料處理過程

以下以H264視訊資料處理為例:

  1. 初始化記憶體 InitH264Memory() -> 在空 fifo 中存入空的 buffer:ResetH264Buf()
  2. 從空 fifo 中取出 GetEmptyH264Buf(),填入資料 -> 將 H264 資料存入資料 fifo:PushH264Buf(mfxBitstreamTS * pbuf)
  3. 在執行緒中處理資料時:從 H264 資料 fifo 裡取出 GetH264Buf() -> 處理完後將資料置空,存回空 fifo:PushDirytH264Buf(mfxBitstreamTS * pbuf)
  4. 處理執行緒結束:清理記憶體 ReleaseH264Buf()
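上述 1~4 步的「空 fifo / 資料 fifo」雙佇列生命週期,可用下面這段單執行緒的 C++ 小示意重現(BufPool、Buf 為示意命名;原始碼以 MFifo 實作,並需另外處理執行緒安全):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <queue>
#include <vector>

// 以兩個佇列模擬「空 fifo / 資料 fifo」的 buffer 生命週期
struct Buf { std::vector<uint8_t> data; size_t len = 0; };

class BufPool {
public:
    BufPool(int count, size_t cap) : bufs_(count) {
        for (auto &b : bufs_) { b.data.resize(cap); empty_.push(&b); } // 步驟1:預填空 buffer
    }
    Buf* getEmpty() { return pop(empty_); }              // 步驟2:取空 buffer 準備填資料
    void pushFull(Buf *b) { full_.push(b); }             // 步驟2:填完後存入資料 fifo
    Buf* getFull() { return pop(full_); }                // 步驟3:消費端取資料
    void recycle(Buf *b) { b->len = 0; empty_.push(b); } // 步驟3:用完置空歸還
private:
    static Buf* pop(std::queue<Buf*> &q) {
        if (q.empty()) return nullptr;
        Buf *b = q.front(); q.pop(); return b;
    }
    std::vector<Buf> bufs_;
    std::queue<Buf*> empty_, full_;
};
```

消費端取不到 buffer 時會拿到 nullptr,對應原始碼中輪詢重試的分支。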

資料處理執行緒:

  • C++執行緒1: 迴圈處理TS流,查找TS封包頭,解析表結構,將擷取出的音視訊資料填入資料fifo中
  • C++執行緒2: 建立一個Video執行緒,從video fifo中取資料並回調到java方法中
  • C++執行緒3: 建立一個Audio執行緒,從audio fifo中取資料並回調到java方法中
  • java執行緒1: 建立一個buff快取組,存入視訊資料,並通過一個while(flag)執行緒,不斷送入Android Decoder硬解碼。
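上面C++執行緒2/3在fifo取不到資料時以usleep(1)輪詢,會讓CPU空轉;一種常見的替代做法是用condition_variable阻塞等待,示意如下(BlockingQueue為示意命名,並非原始碼的一部分):

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

// 消費端以 condition_variable 阻塞等待,取代 usleep 輪詢
template <typename T>
class BlockingQueue {
public:
    void put(T v) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(v)); }
        cv_.notify_one();                  // 喚醒等待中的消費執行緒
    }
    T take() {                             // 佇列為空時休眠,不空轉 CPU
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [this] { return !q_.empty(); });
        T v = std::move(q_.front()); q_.pop();
        return v;
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<T> q_;
};
```

解析執行緒 put、回調執行緒 take,即可去掉 while + usleep 的忙等待分支。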

四、總結

在處理1080P的TS流資料時,實測從解析流到Android硬解碼預覽顯示,總延遲約140ms~200ms(驍龍835處理器)。其中硬解碼部分耗時較大,有堵塞整個資料通道的嫌疑,具體優化方案在以後給出。