
Writing Code to Push a Live Stream


Original by 花滿樓


小白: You've already covered live streaming with nginx and recording from a camera; with that knowledge, pushing a stream is already doable.

花滿樓: Before, we pushed the stream with the ffmpeg command, which doesn't give much control. This time we'll implement it in code, which allows flexible control.

This article shows how to implement the pushing side of a live stream in code.

The final effect looks like this:
[Screenshot: the pushed stream being pulled and played]

Demo code for pushing the stream

#include <stdio.h>
#include "ffmpeg/include/libavformat/avformat.h"
#include "ffmpeg/include/libavcodec/avcodec.h"
#include "ffmpeg/include/libavutil/time.h" // needed for av_gettime() and av_usleep()

void publishstream() {
    const char* srcfile = "t.mp4";
    const char* streamseverurl = "rtmp://localhost/rtmpdemo/test1";
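    // register all formats/codecs and initialize networking (required for rtmp)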
    av_register_all();
    avformat_network_init();
    av_log_set_level(AV_LOG_DEBUG);
    int status = 0;
    AVFormatContext* formatcontext = avformat_alloc_context();
    status = avformat_open_input(&formatcontext, srcfile, NULL, NULL);
    if (status >= 0) {
        status = avformat_find_stream_info(formatcontext, NULL);
        if (status >= 0) {
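            // find the first video stream in the input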
            int videoindex = -1;
            for (int i = 0; i < formatcontext->nb_streams; i ++) {
                if (formatcontext->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
                    videoindex = i;
                    break;
                }
            }
            if (videoindex >= 0) {
                AVFormatContext* outformatcontext;
                avformat_alloc_output_context2(&outformatcontext, NULL, "flv", streamseverurl);
                if (outformatcontext) {
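                    // create one output stream per input stream and copy its codec context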
                    status = -1;
                    for (int i = 0; i < formatcontext->nb_streams; i ++) {
                        AVStream* onestream = formatcontext->streams[i];
                        AVStream* newstream = avformat_new_stream(outformatcontext, onestream->codec->codec);
                        status = newstream ? 0 : -1;
                        if (status == 0) {
                            status = avcodec_copy_context(newstream->codec, onestream->codec);
                            if (status >= 0) {
                                newstream->codec->codec_tag = 0;
                                if (outformatcontext->oformat->flags & AVFMT_GLOBALHEADER) {
                                    newstream->codec->flags |= CODEC_FLAG_GLOBAL_HEADER;
                                }
                            }
                        }
                    }
                    if (status >= 0) {
                        AVOutputFormat* outformat = outformatcontext->oformat;
                        av_usleep(5*1000*1000); // deliberately wait before pushing, so the pulling client has time to start and receive the video's pps/sps
                        if (!(outformat->flags & AVFMT_NOFILE)) {
                            av_dump_format(outformatcontext, 0, streamseverurl, 1);
                            status = avio_open(&outformatcontext->pb, streamseverurl, AVIO_FLAG_WRITE);
                            if (status >= 0) {
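                                // write the FLV stream header (for h264 it also carries the sps/pps)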
                                status = avformat_write_header(outformatcontext, NULL);
                                if (status >= 0) {
                                    AVPacket packet;
                                    int videoframeidx = 0;
                                    int64_t starttime = av_gettime();
                                    while (1) {
                                        status = av_read_frame(formatcontext, &packet);
                                        if (status < 0) {
                                            break;
                                        }
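                                        // some inputs lack timestamps: synthesize pts/dts/duration from the frame rate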
                                        if (packet.pts == AV_NOPTS_VALUE) {
                                            av_log(NULL, AV_LOG_DEBUG, "set packet.pts\n");
                                            AVRational video_time_base = formatcontext->streams[videoindex]->time_base;
                                            int64_t frameduration = (double)AV_TIME_BASE / av_q2d(formatcontext->streams[videoindex]->r_frame_rate);
                                            packet.pts = (double)(videoframeidx * frameduration) / (double)(av_q2d(video_time_base) * AV_TIME_BASE);
                                            packet.dts = packet.pts;
                                            packet.duration = (double)frameduration / (double)(av_q2d(video_time_base) * AV_TIME_BASE);
                                        }
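                                        // pace the push in real time (like ffmpeg -re): sleep until this frame is due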
                                        if (packet.stream_index == videoindex) {
                                            AVRational video_time_base = formatcontext->streams[videoindex]->time_base;
                                            AVRational time_base_q = {1, AV_TIME_BASE};
                                            int64_t cur_pts = av_rescale_q(packet.dts, video_time_base, time_base_q);
                                            int64_t curtime = av_gettime() - starttime;
                                            av_log(NULL, AV_LOG_DEBUG, "on video frame curpts=%lld curtime=%lld\n", cur_pts, curtime);
                                            if (cur_pts > curtime) {
                                                av_usleep(cur_pts - curtime);
                                            }
                                        }
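                                        // rescale pts/dts/duration from the input stream's time_base to the output's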
                                        AVStream* instream = formatcontext->streams[packet.stream_index];
                                        AVStream* outstream = outformatcontext->streams[packet.stream_index];
                                        packet.pts = av_rescale_q_rnd(packet.pts, instream->time_base, outstream->time_base, AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
                                        packet.dts = av_rescale_q_rnd(packet.dts, instream->time_base, outstream->time_base, AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX);
                                        packet.duration = av_rescale_q(packet.duration, instream->time_base, outstream->time_base);
                                        packet.pos = -1;
                                        if (packet.stream_index == videoindex) {
                                            videoframeidx ++;
                                        }
                                        status = av_interleaved_write_frame(outformatcontext, &packet);
                                        if (status < 0) {
                                            break;
                                        }
                                    }
                                    av_write_trailer(outformatcontext);
                                }
                                avio_close(outformatcontext->pb);
                            }
                        }
                    }
                    avformat_free_context(outformatcontext);
                }
            }
        }
        avformat_close_input(&formatcontext); // frees the context and resets the pointer to NULL
    }
    // no avformat_free_context() needed here: avformat_open_input() already
    // frees the context itself when it fails
}

int main(int argc, char *argv[])
{
    publishstream();
    return 0;
}
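To build the demo, link against libavformat, libavcodec, and libavutil. A possible command, assuming the FFmpeg headers and libraries sit under ./ffmpeg as the include paths suggest and the source is saved as push.c (both names are assumptions; extra libraries such as -lswscale or -lz may be needed depending on how FFmpeg was built):

gcc push.c -o push -L./ffmpeg/lib -lavformat -lavcodec -lavutil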

Here a local video file serves as the content to simulate a live push; functionally it is equivalent to calling the ffmpeg command directly:

sudo ffmpeg -re -i Movie-1.mp4 -vcodec copy -f flv rtmp://localhost/rtmpdemo/test1

Of course, you can also record and push at the same time, as sketched below.
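For example, capturing from a camera and pushing it with ffmpeg might look like this on Linux (the v4l2 device path and encoder settings are assumptions):

ffmpeg -f v4l2 -i /dev/video0 -vcodec libx264 -preset ultrafast -tune zerolatency -f flv rtmp://localhost/rtmpdemo/test1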

You can also pull and play the stream on a different computer or phone, as shown below.
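For instance, the pull side can be tested on a desktop with ffplay (replace localhost with the pushing machine's address when playing from another device):

ffplay rtmp://localhost/rtmpdemo/test1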

One prerequisite: nginx must be set up and running. For that, see the earlier article "流媒體服務器,給你好看", which explains how to implement both video-on-demand and live streaming with nginx.
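For reference, a minimal sketch of the nginx configuration this setup relies on, assuming nginx was built with the nginx-rtmp-module (the port and application name are assumptions that must match the URL used in the code):

rtmp {
    server {
        listen 1935;            # default rtmp port
        application rtmpdemo {  # matches rtmp://localhost/rtmpdemo/...
            live on;            # accept live push and pull
        }
    }
}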

After the live stream has started, this streaming server does not resend the parameters required to decode the video (pps/sps) to clients that join midway. So when testing, make sure the pulling side receives the very first data: the demo code deliberately sleeps a few seconds before pushing, giving the pulling side time to start and receive everything that is pushed (including those key parameters).
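To see where those parameters live, one could log the video stream's extradata inside publishstream() once the video stream has been found; this is just an illustrative fragment for the demo above (it reuses the demo's deprecated codec field and is not a standalone program):

// for h264 in mp4, the sps/pps sit in the codec extradata; avformat_write_header()
// writes them once into the FLV stream header, which is exactly what a late joiner misses
AVCodecContext* videocodec = formatcontext->streams[videoindex]->codec;
av_log(NULL, AV_LOG_DEBUG, "sps/pps extradata size: %d bytes\n", videocodec->extradata_size);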

For more on h264, or on using FFmpeg in general, see the earlier articles, or keep an eye on future updates.

小白: That was a long speech, and not very useful. I'll just read the code!

花滿樓: If you need to master the details, you'd best write the code out yourself once.

小白: I'm just looking!

