
Implementing Hardware Video Encoding on Android

0. Preface

Video recording on Android has long been a pain point. A short-video app I worked on earlier used FFmpeg for recording; despite heavy optimization, the picture quality and frame rate never met our requirements, and certainly not the level achieved on iOS. That is what finally pushed me to study the hardware encoding options on the Android platform.

Hardware encoding

Hardware encoding means using a dedicated DSP to encode video frames. Compared with software encoding, the difference in encoding efficiency is enormous, and higher encoding efficiency means that at the same frame rate you can get higher resolution and better picture quality.

1. The Typical Android Video Recording Pipeline

Video Recording Pipeline

1.1 The general flow

(1) Video capture
Grab video frames from the Camera
(2) Frame processing
Apply video filters, face detection, etc.
May also require RGB-to-YUV conversion
(3) Encoding
FFmpeg / MediaCodec
(4) Muxing
FFmpeg / MediaMuxer

1.2 Challenges for short-video apps

Current short-video apps all demand real-time recording at a frame rate of at least 24 FPS. On top of that, most also need video filters, face detection, virtual accessories, and so on.
At a minimum of 24 FPS, each frame has at most 1000 / 24 = 41.67 ms to pass through every stage; exceeding that budget drops frames and makes the video stutter.
Stages (1) capture and (4) muxing are generally not a bottleneck; the bottlenecks are (2) frame processing and (3) encoding. Even when the stages run asynchronously, each one must still finish within roughly 42 ms per frame.
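The per-frame budget arithmetic above can be checked with a trivial helper (a minimal sketch; `FrameBudget` is a hypothetical class, not part of the original code):

```java
public class FrameBudget {
    /** Milliseconds available to process one frame at the given frame rate. */
    public static double perFrameMs(int fps) {
        if (fps <= 0) throw new IllegalArgumentException("fps must be positive");
        return 1000.0 / fps;
    }

    public static void main(String[] args) {
        // At 24 FPS each stage (filter + encode) must fit within ~41.67 ms.
        System.out.printf("Budget at 24 FPS: %.2f ms%n", perFrameMs(24));
        System.out.printf("Budget at 30 FPS: %.2f ms%n", perFrameMs(30));
    }
}
```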

2. Hardware Encoding on Android

2.1 API level requirements

Hardware video encoding on Android relies on MediaCodec and MediaMuxer, both of which have API level requirements.
MediaCodec

API Level >= 16, and a few very important interfaces require 18+, e.g. createInputSurface.

MediaMuxer
API Level >= 18

MediaRecorder
API Level >= 1, but the important setInputSurface interface requires API Level >= 23.

That said, devices below API level 18 are rare by now, and even where they survive, the hardware is so underpowered that recordings come out extremely poor even with a software encoding path.

2.2 Limited device compatibility

Because hardware encoding is closely tied to the handset's hardware platform, incompatibilities have already surfaced on some models. Software encoding therefore cannot be abandoned entirely; it remains the fallback for the hardware path.
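The fallback decision can be captured in a small helper (illustrative only; the method name and the device blacklist are assumptions, not from the original code):

```java
import java.util.Set;

public class EncoderSelector {
    // Hypothetical blacklist of device models with known-broken hardware encoders.
    private static final Set<String> HW_BLACKLIST = Set.of("SomeBrokenModel");

    /**
     * Choose hardware encoding only when the API level supports
     * MediaCodec surface input (18+) and the device is not blacklisted;
     * otherwise fall back to software encoding.
     */
    public static boolean useHardwareEncoder(int apiLevel, String deviceModel) {
        return apiLevel >= 18 && !HW_BLACKLIST.contains(deviceModel);
    }
}
```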

2.3 Pipeline

//TODO Hardware encoding on Android Platform

3. Capturing Camera Data

The main steps:

  1. Open the camera
  2. Set the texture
  3. Set parameters
  4. Start capturing
  5. Stop
    // (1) open the Camera
    cam = Camera.open();
    // (2) set the preview texture (a SurfaceTexture backed by an OpenGL texture)
    cam.setPreviewTexture(surfaceTexture);
    // (3) set Camera parameters
    Camera.Parameters parameters = cam.getParameters();
    cam.setParameters(parameters);
    // (4) start capturing
    cam.startPreview();
    // (5) stop
    cam.stopPreview();
    cam.release();

3.1 Opening the Camera

Based on the Grafika sample code:

        try {
            if (Build.VERSION.SDK_INT > Build.VERSION_CODES.FROYO) {
                int numberOfCameras = Camera.getNumberOfCameras();

                Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
                for (int i = 0; i < numberOfCameras; i++) {
                    Camera.getCameraInfo(i, cameraInfo);
                    if (cameraInfo.facing == facing) {
                        mDefaultCameraID = i;
                        mFacing = facing;
                    }
                }
            }
            stopPreview();
            if (mCameraDevice != null)
                mCameraDevice.release();

            if (mDefaultCameraID >= 0) {
                mCameraDevice = Camera.open(mDefaultCameraID);
            } else {
                mCameraDevice = Camera.open();
                mFacing = Camera.CameraInfo.CAMERA_FACING_BACK; //default: back facing
            }
        } catch (Exception e) {
            LogUtil.e(TAG, "Open Camera Failed!");
            e.printStackTrace();
            mCameraDevice = null;
            return false;
        }

3.2 Setting Camera parameters

The most important parameters:
setPreviewSize
setPictureSize
setFocusMode
setPreviewFrameRate
setPictureFormat

Reference code for choosing the preview size:

    /**
     * Attempts to find a preview size that matches the provided width and height (which
     * specify the dimensions of the encoded video).  If it fails to find a match it just
     * uses the default preview size for video.
     * <p>
     * TODO: should do a best-fit match, e.g.
     * https://github.com/commonsguy/cwac-camera/blob/master/camera/src/com/commonsware/cwac/camera/CameraUtils.java
     */
    public static void choosePreviewSize(Camera.Parameters parms, int width, int height) {
        // We should make sure that the requested MPEG size is less than the preferred
        // size, and has the same aspect ratio.
        Camera.Size ppsfv = parms.getPreferredPreviewSizeForVideo();
        if (ppsfv != null) {
            Log.d(TAG, "Camera preferred preview size for video is " +
                    ppsfv.width + "x" + ppsfv.height);
        }

        //for (Camera.Size size : parms.getSupportedPreviewSizes()) {
        //    Log.d(TAG, "supported: " + size.width + "x" + size.height);
        //}

        for (Camera.Size size : parms.getSupportedPreviewSizes()) {
            if (size.width == width && size.height == height) {
                parms.setPreviewSize(width, height);
                return;
            }
        }

        Log.w(TAG, "Unable to set preview size to " + width + "x" + height);
        if (ppsfv != null) {
            parms.setPreviewSize(ppsfv.width, ppsfv.height);
        }
        // else use whatever the default size is
    }
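The TODO in the snippet above asks for a best-fit match instead of an exact one. Here is a sketch of that idea over plain {width, height} pairs so it can run off-device; the class name, `bestFit`, and the `int[]` representation are assumptions for illustration:

```java
public class PreviewSizeChooser {
    /**
     * From a list of supported {width, height} pairs, pick the one whose
     * aspect ratio matches the target and whose pixel area is closest to it.
     * Returns null if no supported size has the target aspect ratio.
     */
    public static int[] bestFit(int[][] supported, int width, int height) {
        int[] best = null;
        long targetArea = (long) width * height;
        long bestDiff = Long.MAX_VALUE;
        for (int[] sz : supported) {
            // Compare aspect ratios by cross-multiplication to avoid float error.
            if ((long) sz[0] * height != (long) sz[1] * width) continue;
            long diff = Math.abs((long) sz[0] * sz[1] - targetArea);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = sz;
            }
        }
        return best;
    }
}
```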

Setting the image format
This parameter deserves attention:
if the video frames need no processing, set the preview format to a YUV format so the frames can be handed to the encoder directly; otherwise a format conversion is required.

mParams.setPreviewFormat(ImageFormat.YV12);

If the frames need rendering work such as filters, they must end up in RGB. Note that the following call only sets the still-picture format to JPEG; preview frames themselves still arrive in YUV, and the YUV-to-RGB conversion is typically done during rendering:

mParams.setPictureFormat(PixelFormat.JPEG);

The full code:

    public void initCamera(int previewRate) {
        if (mCameraDevice == null) {
            LogUtil.e(TAG, "initCamera: Camera is not opened!");
            return;
        }

        mParams = mCameraDevice.getParameters();
        List<Integer> supportedPictureFormats = mParams.getSupportedPictureFormats();

        for (int fmt : supportedPictureFormats) {
            LogUtil.i(TAG, String.format("Picture Format: %x", fmt));
        }

        mParams.setPictureFormat(PixelFormat.JPEG);

        List<Camera.Size> picSizes = mParams.getSupportedPictureSizes();
        Camera.Size picSz = null;

        Collections.sort(picSizes, comparatorBigger);

        for (Camera.Size sz : picSizes) {
            LogUtil.i(TAG, String.format("Supported picture size: %d x %d", sz.width, sz.height));
            if (picSz == null || (sz.width >= mPictureWidth && sz.height >= mPictureHeight)) {
                picSz = sz;
            }
        }

        List<Camera.Size> prevSizes = mParams.getSupportedPreviewSizes();
        Camera.Size prevSz = null;

        Collections.sort(prevSizes, comparatorBigger);

        for (Camera.Size sz : prevSizes) {
            LogUtil.i(TAG, String.format("Supported preview size: %d x %d", sz.width, sz.height));
            if (prevSz == null || (sz.width >= mPreferPreviewWidth && sz.height >=
                    mPreferPreviewHeight)) {
                prevSz = sz;
            }
        }

        List<Integer> frameRates = mParams.getSupportedPreviewFrameRates();

        int fpsMax = 0;

        for (Integer n : frameRates) {
            LogUtil.i(TAG, "Supported frame rate: " + n);
            if (fpsMax < n) {
                fpsMax = n;
            }
        }

        mParams.setPreviewSize(prevSz.width, prevSz.height);
        mParams.setPictureSize(picSz.width, picSz.height);

        List<String> focusModes = mParams.getSupportedFocusModes();
        if (focusModes.contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
            mParams.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
        }

        previewRate = fpsMax;
        mParams.setPreviewFrameRate(previewRate); // set the camera preview frame rate
//        mParams.setPreviewFpsRange(20, 60);

        try {
            mCameraDevice.setParameters(mParams);
        } catch (Exception e) {
            e.printStackTrace();
        }


        mParams = mCameraDevice.getParameters();

        Camera.Size szPic = mParams.getPictureSize();
        Camera.Size szPrev = mParams.getPreviewSize();

        mPreviewWidth = szPrev.width;
        mPreviewHeight = szPrev.height;

        mPictureWidth = szPic.width;
        mPictureHeight = szPic.height;

        LogUtil.i(TAG, String.format("Camera Picture Size: %d x %d", szPic.width, szPic.height));
        LogUtil.i(TAG, String.format("Camera Preview Size: %d x %d", szPrev.width, szPrev.height));
    }

4. Encoding with MediaCodec

Encoding audio and video with MediaCodec follows the same pattern.
A few things to note about MediaCodec:
(1) MediaCodec maintains its own internal buffers
(2) Input (raw data): either a ByteBuffer or a Surface.
Audio data must be 16-bit signed integers in native byte order.
For video a Surface is preferable: at the native level a Surface is simply a buffer, so it is considerably more efficient. As the Google API docs put it:

Codecs operate on three kinds of data: compressed data, raw audio data and raw video data. All three kinds of data can be processed using ByteBuffers, but you should use a Surface for raw video data to improve codec performance. Surface uses native video buffers without mapping or copying them to ByteBuffers; thus, it is much more efficient. You normally cannot access the raw video data when using a Surface, but you can use the ImageReader class to access unsecured decoded (raw) video frames. This may still be more efficient than using ByteBuffers, as some native buffers may be mapped into direct ByteBuffers. When using ByteBuffer mode, you can access raw video frames using the Image class and getInput/OutputImage(int).

The main flow:
(1) prepare
(2) encode
(3) drainEnd & flushBuffer

4.1 Initialization (prepare)

The core flow:
format = new MediaFormat();
// set format
encoder = MediaCodec.createEncoderByType(mime);
encoder.configure(format, ...);
encoder.start();

        mBufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, config.mWidth, config
                .mHeight);

        // Set some properties.  Failing to specify some of these can cause the MediaCodec
        // configure() call to throw an unhelpful exception.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, config.mVBitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, config.mFPS);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);
        if (VERBOSE) LogUtil.d(TAG, "format: " + format);

        // Create a MediaCodec encoder, and configure it with our format.  Get a Surface
        // we can use for input and wrap it with a class that handles the EGL work.
        try {
            mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
        } catch (Exception e) {
            e.printStackTrace();

            if (mOnPrepare != null) {
                mOnPrepare.onPrepare(false);
            }

            return;
        }
        mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // With COLOR_FormatSurface the encoder is fed through an input Surface (API 18+)
        mInputSurface = mEncoder.createInputSurface();
        mEncoder.start();

4.2 Encoding

The core loop:
while(1):
{
(1) get status

encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);

encoderStatus:
- wait for input: MediaCodec.INFO_TRY_AGAIN_LATER
- end
- error < 0
- ok

(2) get encoded data

ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];

(3) setup buffer info

                    mBufferInfo.presentationTimeUs = mTimestamp;
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

(4) write to muxer

mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);

(5) release buffer

mEncoder.releaseOutputBuffer(encoderStatus, false);

}

The detailed code:

protected void drain(boolean endOfStream) {
        if (mWeakMuxer == null) {
            LogUtil.w(TAG, "muxer is unexpectedly null");
            return;
        }
        IMuxer muxer = mWeakMuxer.get();
        if (muxer == null) {
            // the weak reference may have been cleared after the muxer was released
            LogUtil.w(TAG, "muxer has already been released");
            return;
        }

        if (VERBOSE) LogUtil.d(TAG, "drain(" + endOfStream + ")");

        if (endOfStream) {
            if (VERBOSE) LogUtil.d(TAG, "sending EOS to encoder");
            mEncoder.signalEndOfInputStream();
        }

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (VERBOSE) LogUtil.d(TAG, "drain: status = " + encoderStatus);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (!endOfStream) {
                    break;      // out of while
                } else {
                    if (VERBOSE) LogUtil.d(TAG, "no output available, spinning to await EOS");
                }
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // should happen before receiving buffers, and should only happen once
//                if (muxer.isStarted()) {
//                    throw new RuntimeException("format changed twice");
//                }
                MediaFormat newFormat = mEncoder.getOutputFormat();
                LogUtil.d(TAG, "encoder output format changed: " + newFormat);

                // now that we have the Magic Goodies, start the muxer
                mTrackIndex = muxer.addTrack(IMuxer.TRACK_VIDEO_ID, newFormat);
                if (!muxer.start()) {
                    synchronized (muxer) {
                        while (!muxer.isStarted() && !endOfStream)
                            try {
                                LogUtil.d(TAG, "drain: wait...");
                                muxer.wait(100);
                            } catch (final InterruptedException e) {
                                break;
                            }
                    }
                }
            } else if (encoderStatus < 0) {
                LogUtil.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                        encoderStatus);
                // let's ignore it
            } else {
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                    if (VERBOSE) LogUtil.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    mBufferInfo.size = 0;
                }

                if (mBufferInfo.size != 0) {
                    if (!muxer.isStarted()) {
//                        throw new RuntimeException("muxer hasn't started");
                        return;
                    }

                    mBufferInfo.presentationTimeUs = mTimestamp;
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                    muxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);

                    if (VERBOSE) {
                        LogUtil.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                                mBufferInfo.presentationTimeUs);
                    }
                }

                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    if (!endOfStream) {
                        LogUtil.w(TAG, "reached end of stream unexpectedly");
                    } else {
                        if (VERBOSE) LogUtil.d(TAG, "end of stream reached");
                    }
                    break;      // out of while
                }

            }//if encode status
        }//while
    }

4.3 drainEnd & flushBuffer

mEncoder.signalEndOfInputStream();

4.4 Feeding input data

For video, frames are delivered through the encoder's input Surface: render the texture and drive the MediaCodec encode from GLSurfaceView.onDrawFrame.
For audio, call:

    protected void encode(final ByteBuffer buffer, final int length, final long
            presentationTimeUs) {
        if (!mRunning || mEncoder == null || !mCodecPrepared) {
            LogUtil.w(TAG, "encode: Audio encode thread is not running yet.");
            return;
        }
        final ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();       //illegal state
        // exception
        while (mRunning) {
            final int inputBufferIndex = mEncoder.dequeueInputBuffer(TIMEOUT_USEC);
            if (inputBufferIndex >= 0) {
                final ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                if (buffer != null) {
                    inputBuffer.put(buffer);
                }
//              if (DEBUG) LogUtil.v(TAG, "encode:queueInputBuffer");
                if (length <= 0) {
                    // send EOS
                    mIsEOS = true;
                    LogUtil.i(TAG, "send BUFFER_FLAG_END_OF_STREAM");
                    mEncoder.queueInputBuffer(inputBufferIndex, 0, 0,
                            presentationTimeUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    break;
                } else {
                    mEncoder.queueInputBuffer(inputBufferIndex, 0, length,
                            presentationTimeUs, 0);
                }
                break;
            } else if (inputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // wait for MediaCodec encoder is ready to encode
                // nothing to do here because MediaCodec#dequeueInputBuffer(TIMEOUT_USEC)
                // will wait for maximum TIMEOUT_USEC(10msec) on each call
                LogUtil.d(TAG, "encode: wait for MediaCodec encoder is ready to encode");
            }
        }
    }

5. MediaMuxer

(1) Initialization

mMuxer = new MediaMuxer(mOutputPath, format);

(2) Adding audio/video tracks

    /**
     * assign encoder to muxer
     *
     * @param trackID
     * @param format
     * @return minus value indicate error
     */
    public synchronized int addTrack(int trackID, final MediaFormat format) {
        if (mIsStarted)
            throw new IllegalStateException("muxer already started");
        final int trackIndex = mMuxer.addTrack(format);
        if (trackID == TRACK_VIDEO_ID) {
            LogUtil.d(TAG, "addTrack: add video track = " + trackIndex);
            mIsVideoAdded = true;
        } else if (trackID == TRACK_AUDIO_ID) {
            LogUtil.d(TAG, "addTrack: add audio track = " + trackIndex);
            mIsAudioAdded = true;
        }
        return trackIndex;
    }

(3) Starting

    /**
     * request readyStart recording from encoder
     *
     * @return true when muxer is ready to write
     */
    @Override
    public synchronized boolean start() {
        LogUtil.v(TAG, "readyStart:");
        if ((mHasAudio == mIsAudioAdded)
                && (mHasVideo == mIsVideoAdded)) {
            mMuxer.start();
            mIsStarted = true;
            if (mOnPrepared != null) {
                mOnPrepared.onPrepare(true);
            }
            LogUtil.v(TAG, "MediaMuxer started:");
        }
        return mIsStarted;
    }

(4) Writing data

    /**
     * write encoded data to muxer
     *
     * @param trackIndex
     * @param byteBuf
     * @param bufferInfo
     */
    @Override
    public synchronized void writeSampleData(final int trackIndex, final ByteBuffer byteBuf,
                                             final MediaCodec.BufferInfo bufferInfo) {
        if (mIsStarted)
            mMuxer.writeSampleData(trackIndex, byteBuf, bufferInfo);
    }

6. Hardware vs. Software Encoding

The FFmpeg encoder was configured with:

av_opt_set(c->priv_data, "preset", "superfast", 0);

With the same bitrate, FPS, and other settings:
(1) FPS
Hardware encoding sustains 30 FPS; software encoding stays below 17 FPS.
(2) Picture quality
So far the two look quite close; a professional video-quality analysis tool will be used to verify this later.
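For that planned quality comparison, PSNR is a common objective metric. A minimal sketch over raw 8-bit luma planes (the `Psnr` class and method are hypothetical helpers, not from the original code):

```java
public class Psnr {
    /** PSNR in dB between two equally sized 8-bit planes; +Inf if identical. */
    public static double psnr(byte[] ref, byte[] test) {
        if (ref.length != test.length) throw new IllegalArgumentException("size mismatch");
        double mse = 0;
        for (int i = 0; i < ref.length; i++) {
            // mask to treat the bytes as unsigned 0..255 sample values
            int d = (ref[i] & 0xFF) - (test[i] & 0xFF);
            mse += (double) d * d;
        }
        mse /= ref.length;
        if (mse == 0) return Double.POSITIVE_INFINITY;
        return 10 * Math.log10(255.0 * 255.0 / mse);
    }
}
```

In practice the decoded frames from both encoders would be compared frame by frame against the raw camera input; higher average PSNR means less encoding loss.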

Software encoding:
(screenshot)

Hardware encoding:
(screenshot)

7. References