
Using Live555 as an RTSP client: synchronizing audio and video via RTP timestamps (RTCP support is required)

http://www.mworkbox.com/wp/work/551.html

First, two frequently asked questions from the official Live555 website:
Question 1: Why do most RTP sessions use separate streams for audio and video? How can a receiving client synchronize these streams?
Answer: Sending audio and video in separate RTP streams provides a great deal of flexibility. For example, this makes it possible for a player to receive only the audio stream, but not video (or vice-versa). It would even be possible to have one computer receive and play audio, and a separate computer receive and play video.

These audio and video streams are synchronized using RTCP “Sender Report” (SR) packets – which map each stream’s RTP timestamp to ‘wall clock’ (NTP) time. For more information, see the IETF’s RTP/RTCP specification.

Receivers can then use this mapping to synchronize the incoming RTP streams. The LIVE555 Streaming Media code does this automatically: For subclasses of “RTPSource”, the “presentationTime” parameter that’s passed to the ‘afterGettingFunc’ of “getNextFrame()” (see “liveMedia/include/FramedSource.hh”) will be an accurate, time-synchronized time. (For this to work, you need to have also created a “RTCPInstance” for each RTP source.)

For example, if you use “openRTSP” to receive RTSP/RTP streams, then the contents of each RTP stream (audio and video) are written into separate files. This is done using the “FileSink” class. If you look at the “FileSink::afterGettingFrame()” member function, you’ll notice that there’s a “presentationTime” parameter for each incoming frame. Some other receiver could use the “presentationTime” parameter to synchronize audio and video.

Question 2: But I notice that there’s an abrupt change in a stream’s presentation times after the first RTCP “SR” packet has been received. Is this a bug?
Answer: No, this is normal, and expected; there’s no bug here. This happens because the first few presentation times – before RTCP synchronization occurs – are just ‘guesses’ made by the receiving code (based on the receiver’s ‘wall clock’ and the RTP timestamp). However, once RTCP synchronization occurs, all subsequent presentation times will be accurate.

This means that a receiver should be prepared for the fact that the first few presentation times (until RTCP synchronization starts) will not be accurate. The code, however, can check this by calling “RTPSource::hasBeenSynchronizedUsingRTCP()”. If this returns False, then the presentation times are not accurate, and should not be used for synchronization. However, once the call returns True, then the presentation times (from then on) will be accurate.

My notes:
1. For an example of synchronizing the audio and video streams before writing to a file, see QuickTimeFileSink.cpp in the Live555 source:

void QuickTimeFileSink
::afterGettingFrame(void* clientData, unsigned packetDataSize,
            unsigned numTruncatedBytes,
            struct timeval presentationTime,
            unsigned /*durationInMicroseconds*/) {
  SubsessionIOState* ioState = (SubsessionIOState*)clientData;
  if (!ioState->syncOK(presentationTime)) {
    // Ignore this data: the streams are not yet synchronized; syncOK() calls hasBeenSynchronizedUsingRTCP()
    ioState->fOurSink.continuePlaying();
    return;
  }
  ...
  ioState->afterGettingFrame(packetDataSize, presentationTime);
}

2.1 Searching the Live555 source to find where the RTPSource sync flag (fCurPacketHasBeenSynchronizedUsingRTCP) is set:
---- hasBeenSynchronizedUsingRTCP Matches (17 in 9 files) ----
MediaSession.cpp (livemedia): if (!rtpSource()->hasBeenSynchronizedUsingRTCP()) {
MultiFramedRTPSource.cpp (livemedia): fPresentationTime, fCurPacketHasBeenSynchronizedUsingRTCP,
RTPSource.cpp (livemedia): fCurPacketHasBeenSynchronizedUsingRTCP(False), fLastReceivedSSRC(0),
RTPSource.hh (livemedia\include): virtual Boolean hasBeenSynchronizedUsingRTCP();
RTPSource.hh (livemedia\include): Boolean fCurPacketHasBeenSynchronizedUsingRTCP;
testRTSPClient.cpp: if (fSubsession.rtpSource() != NULL && !fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) {
---- fHasBeenSyncedUsingRTCP Matches (3 in 2 files) ----
MultiFramedRTPSource.cpp (livemedia): fHasBeenSyncedUsingRTCP = hasBeenSyncedUsingRTCP;
MultiFramedRTPSource.cpp (livemedia): hasBeenSyncedUsingRTCP = fHasBeenSyncedUsingRTCP;
MultiFramedRTPSource.hh (livemedia\include): Boolean fHasBeenSyncedUsingRTCP;
---- assignMiscParams Matches (3 in 2 files) ----
MultiFramedRTPSource.cpp (livemedia): bPacket->assignMiscParams(rtpSeqNo, rtpTimestamp, presentationTime,
---- hasBeenSyncedUsingRTCP Matches (18 in 4 files) ----
MultiFramedRTPSource.cpp (livemedia): fHasBeenSyncedUsingRTCP = hasBeenSyncedUsingRTCP;
RTPSource.cpp (livemedia): resultHasBeenSyncedUsingRTCP = fHasBeenSynchronized;

2.2 The place where the RTCP sync flag is actually determined and stored. It is ultimately assigned to the RTPSource::fCurPacketHasBeenSynchronizedUsingRTCP member via the bPacket->assignMiscParams() call:

void MultiFramedRTPSource::networkReadHandler1() 
{
...
    struct timeval presentationTime; // computed by:
    Boolean hasBeenSyncedUsingRTCP; // computed by:
    receptionStatsDB()
      .noteIncomingPacket(rtpSSRC, rtpSeqNo, rtpTimestamp,
              timestampFrequency(),
              usableInJitterCalculation, presentationTime,
              hasBeenSyncedUsingRTCP, bPacket->dataSize());

    // Fill in the rest of the packet descriptor, and store it:
    struct timeval timeNow;
    gettimeofday(&timeNow, NULL);
    bPacket->assignMiscParams(rtpSeqNo, rtpTimestamp, presentationTime,
                  hasBeenSyncedUsingRTCP, rtpMarkerBit,
                  timeNow);
...
}

3. Conclusion: by following the audio/video synchronization logic in the QuickTimeFileSink class in the Live555 source, you can implement the same synchronization in your own FileSink or MemorySink.

Other: while studying the source, I found that MediaSubsession::getNormalPlayTime() returns the normal play time (NPT) of the current RTP packet; it is worth a try:
---- curPacketRTPTimestamp( Matches (3 in 2 files) ----
MediaSession.cpp (livemedia): u_int32_t timestampOffset = rtpSource()->curPacketRTPTimestamp() - rtpInfo.timestamp;
MediaSession.cpp (livemedia): u_int32_t timestampOffset = rtpSource()->curPacketRTPTimestamp() - rtpInfo.timestamp;
RTPSource.hh (livemedia\include): u_int32_t curPacketRTPTimestamp() const { return fCurPacketRTPTimestamp; }
---- getNormalPlayTime Matches (4 in 3 files) ----
MediaSession.cpp (livemedia): double MediaSubsession::getNormalPlayTime(struct timeval const& presentationTime) {
MediaSession.hh (livemedia\include): double getNormalPlayTime(struct timeval const& presentationTime);
MediaSession.hh (livemedia\include): double fNPT_PTS_Offset; // set by "getNormalPlayTime()"; add this to a PTS to get NPT
testRTSPClient.cpp: envir() << "\tNPT: " << fSubsession.getNormalPlayTime(presentationTime);

