
WebRTC Study Part 9: Camera Capture and Display


[Figure: the webrtc/media source tree, with abstract classes under the base folder and their implementations under the engine folder]

In newer WebRTC source code there is no longer a VideoEngine mirroring the structure of VoiceEngine; it has been replaced by MediaEngine. MediaEngine comprises the MediaEngineInterface interface and its implementation, CompositeMediaEngine. CompositeMediaEngine is itself a template class whose two template parameters are the audio engine and the video engine; its derived class WebRtcMediaEngine supplies WebRtcVoiceEngine and WebRtcVideoEngine2 as those parameters.
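A simplified sketch of that structure (a sketch only, with members abbreviated; the real definitions live under webrtc/media/engine and differ in detail across revisions):

// Sketch, not verbatim WebRTC source: CompositeMediaEngine composes an
// audio engine and a video engine behind the MediaEngineInterface API.
template <class VOICE, class VIDEO>
class CompositeMediaEngine : public MediaEngineInterface {
 protected:
  VOICE voice_;  // audio engine, e.g. WebRtcVoiceEngine
  VIDEO video_;  // video engine, e.g. WebRtcVideoEngine2
};

// The concrete engine pairs WebRtcVoiceEngine with WebRtcVideoEngine2.
class WebRtcMediaEngine
    : public CompositeMediaEngine<WebRtcVoiceEngine, WebRtcVideoEngine2> {};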


In the figure above, the base folder contains the abstract classes and the engine folder contains their implementations; in practice you call the interfaces in the engine folder directly. WebRtcVoiceEngine is essentially a further wrapper around VoiceEngine and delegates audio processing to it.

Note the naming: WebRtcVideoEngine2 carries a "2", plainly marking it as an upgraded VideoEngine; there is also an older WebRtcVideoEngine class.

WebRtcVideoEngine2 improves on WebRtcVideoEngine by splitting the video stream in two, a send stream (WebRtcVideoSendStream) and a receive stream (WebRtcVideoReceiveStream), which makes the structure more reasonable and the source code clearer.
This article's implementation mainly uses the WebRtcVideoCapturer class from WebRtcVideoEngine2.


1. Environment
Same as in the earlier post: WebRTC Study Part 3: Recording and Playback.
2. Implementation
Open the WebRtcVideoCapturer header, webrtcvideocapturer.h. Its public functions are essentially implementations of the VideoCapturer base class from the base folder and are used to initialize the device and start capture. The private functions OnIncomingCapturedFrame and OnCaptureDelayChanged are invoked as callbacks by the capture module, VideoCaptureModule: captured frames are passed to OnIncomingCapturedFrame, and changes in capture delay are passed to OnCaptureDelayChanged.
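For reference, these two callbacks come from the webrtc::VideoCaptureDataCallback interface, which WebRtcVideoCapturer implements and registers with the module. Its approximate shape is below; the exact signatures vary between WebRTC revisions:

// Approximate shape of the capture module's data callback interface.
class VideoCaptureDataCallback {
 public:
  // Invoked on the capture thread with each captured frame.
  virtual void OnIncomingCapturedFrame(const int32_t id,
                                       const webrtc::VideoFrame& frame) = 0;
  // Invoked when the estimated capture delay (in ms) changes.
  virtual void OnCaptureDelayChanged(const int32_t id,
                                     const int32_t delay) = 0;
 protected:
  virtual ~VideoCaptureDataCallback() {}
};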
WebRTC also implements a signal and slot mechanism similar to Qt's; see WebRTC Study Part 7: A Concise Signal and Slot Mechanism. As noted in that post, however, the emit function name in sigslot.h collides with Qt's emit macro. I renamed emit in sigslot.h to Emit; after that change the rtc_base project has to be rebuilt.
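(An alternative to patching sigslot.h: add CONFIG += no_keywords to the Qt .pro file, which disables the emit/signals/slots macros so that Q_EMIT/Q_SIGNALS/Q_SLOTS are used instead. This series sticks with the rename.)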


The VideoCapturer class exposes two signals: sigslot::signal2<VideoCapturer*, CaptureState> SignalStateChange and sigslot::signal2<VideoCapturer*, const CapturedFrame*, sigslot::multi_threaded_local> SignalFrameCaptured. From SignalFrameCaptured's parameters you can see that we only need to implement a matching slot to receive each CapturedFrame, then convert and display it inside the slot. SignalStateChange's CaptureState parameter is an enum identifying the capture state (stopped, starting, running, failed).
The SignalFrameCaptured signal is emitted from within the OnIncomingCapturedFrame callback.

OnIncomingCapturedFrame uses asynchronous function invocation; see WebRTC Study Part 8: Asynchronous Function Invocation.
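The hand-off pattern looks roughly like the hypothetical sketch below; MyCapturer and its members are invented names, the header paths match this era's source tree, and in newer revisions AsyncInvoke also takes an RTC_FROM_HERE location argument:

#include "webrtc/base/asyncinvoker.h"
#include "webrtc/base/bind.h"
#include "webrtc/base/sigslot.h"
#include "webrtc/base/thread.h"

// Hypothetical sketch: marshal work from the camera driver's thread onto
// the thread that started the capturer before emitting any signals.
class MyCapturer : public sigslot::has_slots<> {
 public:
  sigslot::signal1<int> SignalFrameReady;

  void OnFrameFromDriver(int frame_id) {
    // Runs on the driver's capture thread: do not emit here.
    invoker_.AsyncInvoke<void>(
        start_thread_, rtc::Bind(&MyCapturer::DeliverFrame, this, frame_id));
  }

 private:
  void DeliverFrame(int frame_id) {
    // Now on start_thread_: safe to emit the signal.
    SignalFrameReady(frame_id);
  }

  rtc::Thread* start_thread_;  // thread on which Start() was called
  rtc::AsyncInvoker invoker_;  // pending invocations die with the invoker
};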

mainwindow.h

#ifndef MAINWINDOW_H
#define MAINWINDOW_H

#include <QMainWindow>
#include <QDebug>

#include <map>
#include <memory>
#include <string>

#include "webrtc/base/sigslot.h"
#include "webrtc/modules/video_capture/video_capture.h"
#include "webrtc/modules/video_capture/video_capture_factory.h"
#include "webrtc/media/base/videocapturer.h"
#include "webrtc/media/engine/webrtcvideocapturer.h"
#include "webrtc/media/engine/webrtcvideoframe.h"

namespace Ui {
class MainWindow;
}

class MainWindow : public QMainWindow,public sigslot::has_slots<>
{
    Q_OBJECT

public:
    explicit MainWindow(QWidget *parent = 0);
    ~MainWindow();
    //Slots for WebRTC's sigslot signals (not Qt slots): OnFrameCaptured
    //receives each captured frame, OnStateChange each capture-state change
    void OnFrameCaptured(cricket::VideoCapturer* capturer, const cricket::CapturedFrame* frame);
    void OnStateChange(cricket::VideoCapturer* capturer, cricket::CaptureState state);

private slots:
    void on_pushButtonOpen_clicked();

private:
     void getDeviceList();

private:
    Ui::MainWindow *ui;
    cricket::WebRtcVideoCapturer *videoCapturer;  //camera capture source
    cricket::WebRtcVideoFrame *videoFrame;        //reused for RGB conversion
    std::unique_ptr<uint8_t[]> videoImage;        //ARGB buffer shown in the UI
    QStringList deviceNameList;
    QStringList deviceIDList;
};

#endif // MAINWINDOW_H
mainwindow.cpp

#include "mainwindow.h"
#include "ui_mainwindow.h"

MainWindow::MainWindow(QWidget *parent) :
    QMainWindow(parent),
    ui(new Ui::MainWindow),
    videoCapturer(new cricket::WebRtcVideoCapturer()),
    videoFrame(new cricket::WebRtcVideoFrame())
{
   ui->setupUi(this);
   getDeviceList();
}

MainWindow::~MainWindow()
{
    //Disconnect and stop capture before tearing anything down
    videoCapturer->SignalFrameCaptured.disconnect(this);
    videoCapturer->SignalStateChange.disconnect(this);
    videoCapturer->Stop();
    delete videoCapturer;
    delete videoFrame;
    delete ui;
}

void MainWindow::OnFrameCaptured(cricket::VideoCapturer* capturer,const cricket::CapturedFrame* frame)
{
    videoFrame->Init(frame, frame->width, frame->height,true);
    //Convert the captured frame to an ARGB buffer
    //(32 bits per pixel, so the row stride is width*4 bytes)
    videoFrame->ConvertToRgbBuffer(cricket::FOURCC_ARGB,
                                  videoImage.get(),
                                  videoFrame->width()*videoFrame->height()*32/8,
                                  videoFrame->width()*32/8);

    //QImage::Format_RGB32 matches the ARGB layout on little-endian hosts
    QImage image(videoImage.get(), videoFrame->width(), videoFrame->height(), QImage::Format_RGB32);
    ui->label->setPixmap(QPixmap::fromImage(image));
}


void MainWindow::OnStateChange(cricket::VideoCapturer* capturer, cricket::CaptureState state)
{
    //The capture state is not acted on in this demo
}

void MainWindow::getDeviceList()
{
    deviceNameList.clear();
    deviceIDList.clear();
    webrtc::VideoCaptureModule::DeviceInfo *info=webrtc::VideoCaptureFactory::CreateDeviceInfo(0);
    int deviceNum=info->NumberOfDevices();

    for (int i = 0; i < deviceNum; ++i)
    {
        const uint32_t kSize = 256;
        char name[kSize] = {0};
        char id[kSize] = {0};
        if (info->GetDeviceName(i, name, kSize, id, kSize) != -1)
        {
            deviceNameList.append(QString(name));
            deviceIDList.append(QString(id));
            ui->comboBoxDeviceList->addItem(QString(name));
        }
    }
    //The DeviceInfo returned by the factory is owned by the caller
    delete info;

    if(deviceNum==0)
    {
        ui->pushButtonOpen->setEnabled(false);
    }
}

void MainWindow::on_pushButtonOpen_clicked()
{
    static bool flag=true;
    if(flag)
    {
        ui->pushButtonOpen->setText(QStringLiteral("Close"));

        const std::string kDeviceName = ui->comboBoxDeviceList->currentText().toStdString();
        const std::string kDeviceId = deviceIDList.at(ui->comboBoxDeviceList->currentIndex()).toStdString();

        videoCapturer->Init(cricket::Device(kDeviceName, kDeviceId));
        //Capture with the first supported format
        cricket::VideoFormat format(videoCapturer->GetSupportedFormats()->at(0));
        int width=format.width;
        int height=format.height;

        //Allocate the ARGB buffer (4 bytes per pixel) and connect the
        //WebRTC signals and slots before starting capture, so the first
        //frames cannot arrive before we are ready for them
        videoImage.reset(new uint8_t[width*height*32/8]);
        videoCapturer->SignalFrameCaptured.connect(this,&MainWindow::OnFrameCaptured);
        videoCapturer->SignalStateChange.connect(this,&MainWindow::OnStateChange);

        //Start capturing
        if(cricket::CS_STARTING == videoCapturer->Start(format))
        {
            qDebug()<<"Capture is started";
        }

        if(videoCapturer->IsRunning())
        {
            qDebug()<<"Capture is running";
        }
    }
    else
    {
        ui->pushButtonOpen->setText(QStringLiteral("Open"));
        //Connecting twice raises an error, so disconnect first;
        //only then can the signals be connected again next time
        videoCapturer->SignalFrameCaptured.disconnect(this);
        videoCapturer->SignalStateChange.disconnect(this);
        videoCapturer->Stop();
        if(!videoCapturer->IsRunning())
        {
            qDebug()<<"Capture is stopped";
        }
        ui->label->clear();
    }
    flag=!flag;
}
main.cpp

#include "mainwindow.h"
#include <QApplication>

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    MainWindow w;
    w.show();
    while(true)
    {
        //Pump the WebRTC message loop
        rtc::Thread::Current()->ProcessMessages(0);
        rtc::Thread::Current()->SleepMs(1);
        //Pump the Qt event loop
        a.processEvents();
    }
}
Note how main interleaves the WebRTC message loop with Qt's event loop; this is the key to doing camera capture and display with WebRTC from a Qt application.
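A tidier variant of the same idea is to let Qt own the event loop and pump WebRTC's messages from a QTimer; a sketch (the webrtc/base/thread.h include path matches this era's source tree):

#include "mainwindow.h"
#include <QApplication>
#include <QTimer>
#include "webrtc/base/thread.h"

int main(int argc, char *argv[])
{
    QApplication a(argc, argv);
    MainWindow w;
    w.show();

    //Drain pending WebRTC messages about once per millisecond,
    //from inside the regular Qt event loop
    QTimer pump;
    QObject::connect(&pump, &QTimer::timeout, []{
        rtc::Thread::Current()->ProcessMessages(0);
    });
    pump.start(1);

    return a.exec();
}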

3. Result

[Screenshots: the demo application displaying the captured camera feed]



