
Android Face Detection Optimization

I. Porting the Face Detection Module

1. Copy the opencv-3.3.0-android-sdk\OpenCV-android-sdk\samples\face-detection\jni directory into the main directory of your project's app module.
 

2. Modify Android.mk in the jni directory.

(1) Change

     #OPENCV_INSTALL_MODULES:=off
     #OPENCV_LIB_TYPE:=SHARED

to:

     OPENCV_INSTALL_MODULES:=on
     OPENCV_LIB_TYPE:=SHARED

Here, OPENCV_INSTALL_MODULES causes the OpenCV dynamic libraries to be packaged into the APK, and OPENCV_LIB_TYPE specifies that the OpenCV library is linked as a shared (dynamic) library.

(2) Change

ifdef OPENCV_ANDROID_SDK
    ifneq ("","$(wildcard $(OPENCV_ANDROID_SDK)/OpenCV.mk)")
        include ${OPENCV_ANDROID_SDK}/OpenCV.mk
    else
        include ${OPENCV_ANDROID_SDK}/sdk/native/jni/OpenCV.mk
    endif
else
    include ../../sdk/native/jni/OpenCV.mk
endif

to:

   include E:\Environment\opencv-3.3.0-android-sdk\OpenCV-android-sdk\sdk\native\jni\OpenCV.mk

Here, include points to the absolute path at which the OpenCV SDK's OpenCV.mk file is stored. The final, modified Android.mk looks like this:
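A sketch of the complete file, assuming that everything apart from the lines changed above is left exactly as in the stock face-detection sample (the SDK path will of course differ from machine to machine):

LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

#OPENCV_CAMERA_MODULES:=off
OPENCV_INSTALL_MODULES:=on
OPENCV_LIB_TYPE:=SHARED

include E:\Environment\opencv-3.3.0-android-sdk\OpenCV-android-sdk\sdk\native\jni\OpenCV.mk

LOCAL_SRC_FILES  := DetectionBasedTracker_jni.cpp
LOCAL_C_INCLUDES += $(LOCAL_PATH)
LOCAL_LDLIBS     += -llog -ldl

LOCAL_MODULE     := detection_based_tracker

include $(BUILD_SHARED_LIBRARY)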


3. Modify Application.mk in the jni directory. Since only the armeabi, armeabi-v7a, and arm64-v8a libraries were copied when the OpenCV libs were imported, restrict the build to those three ABIs, and set APP_PLATFORM to android-16 (adjust this to your own project):

      APP_STL := gnustl_static
      APP_CPPFLAGS := -frtti -fexceptions
      # Target ABIs
      APP_ABI := armeabi armeabi-v7a arm64-v8a
      # Minimum Android platform
      APP_PLATFORM := android-16

4. Modify DetectionBasedTracker_jni.h and DetectionBasedTracker_jni.cpp, replacing every occurrence of the prefix "Java_org_opencv_samples_facedetect_" with "Java_com_jiangdg_opencv4android_natives_". Here, com.jiangdg.opencv4android.natives is the package of the Java class DetectionBasedTracker.java, which declares the face detection native methods; if the prefix does not match that package, calling into the freshly built .so will fail with a "native method not found" error (UnsatisfiedLinkError). Taking DetectionBasedTracker_jni.h as an example:

/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class org_opencv_samples_fd_DetectionBasedTracker */

#ifndef _Included_org_opencv_samples_fd_DetectionBasedTracker
#define _Included_org_opencv_samples_fd_DetectionBasedTracker
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeCreateObject
 * Signature: (Ljava/lang/String;F)J
 */
JNIEXPORT jlong JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeCreateObject
  (JNIEnv *, jclass, jstring, jint);

/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeDestroyObject
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeDestroyObject
  (JNIEnv *, jclass, jlong);

/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeStart
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeStart
  (JNIEnv *, jclass, jlong);

/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeStop
 * Signature: (J)V
 */
JNIEXPORT void JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeStop
  (JNIEnv *, jclass, jlong);

/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeSetFaceSize
 * Signature: (JI)V
 */
JNIEXPORT void JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeSetFaceSize
  (JNIEnv *, jclass, jlong, jint);

/*
 * Class:     org_opencv_samples_fd_DetectionBasedTracker
 * Method:    nativeDetect
 * Signature: (JJJ)V
 */
JNIEXPORT void JNICALL Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeDetect
  (JNIEnv *, jclass, jlong, jlong, jlong);

#ifdef __cplusplus
}
#endif
#endif
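
The renamed functions only resolve correctly if the Java class really lives in that package. As a minimal sketch, the matching Java side looks roughly like this (the full wrapper class appears in Part II, step 4):

// Package must match the JNI prefix: Java_com_jiangdg_opencv4android_natives_...
package com.jiangdg.opencv4android.natives;

public class DetectionBasedTracker {
    // Resolves to Java_com_jiangdg_opencv4android_natives_DetectionBasedTracker_nativeCreateObject
    private static native long nativeCreateObject(String cascadeName, int minFaceSize);

    // The remaining native methods are declared the same way (see Part II, step 4).
}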

5. Open the Terminal window in Android Studio, cd into the directory that contains the project's jni folder, and run the ndk-build command. This automatically creates libs and obj directories under app/src/main; the libs directory holds the target shared library libdetection_based_tracker.so.


Note: if running ndk-build reports that the command cannot be found, your NDK environment variable is not configured correctly.

6. Modify the sourceSets block in the app module's build.gradle to stop Gradle from invoking ndk-build itself and to point at the directory holding the generated .so files:

android {
    compileSdkVersion 25
    defaultConfig {
        applicationId "com.jiangdg.opencv4android"
        minSdkVersion 15
        targetSdkVersion 25
        versionCode 1
        versionName "1.0"
    }
    ... // code omitted
    sourceSets {
        main {
            jni.srcDirs = []                // disable the automatic ndk-build invocation
            jniLibs.srcDir 'src/main/libs'  // location of the prebuilt .so files
        }
    }
    ... // code omitted
}

Here, jni.srcDirs = [] disables Gradle's default ndk-build integration, preventing Android Studio from generating its own Android.mk and building the jni project itself, while jniLibs.srcDir 'src/main/libs' sets the directory that holds the target .so files so that the libraries we built are packaged into the APK.
II. Source Code Walkthrough

Implementing face detection with the OpenCV 3.3.0 library involves four main steps:

(1) Initialize and load the OpenCV library;
(2) Open and render the camera through the OpenCV library;
(3) Load the face detection model;
(4) Call the native face detection library to detect faces.

1. Initializing and loading the OpenCV library

There are two ways to load the OpenCV library. The first, and officially recommended, way is dynamic loading through OpenCV Manager: it requires the separate OpenCV Manager app to be installed and is initialized by calling OpenCVLoader.initAsync(). The second way is static loading, which is what this article uses: the .so files for the required ABIs are copied into the project's libs directory beforehand, and initialization is done by calling OpenCVLoader.initDebug(), which behaves much like calling System.loadLibrary("opencv_java").

if (!OpenCVLoader.initDebug()) {
    // Static loading failed; fall back to OpenCV Manager
    // Arguments: OpenCV version, context, callback for the load result
    OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_3_0,
            this, mLoaderCallback);
} else {
    // Static loading succeeded; invoke onManagerConnected directly
    mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
}

Here, mLoaderCallback is the callback interface that reports the OpenCV initialization status. Once OpenCV has been initialized successfully, its onManagerConnected(int status) method is invoked, and that is where we load the native library, load the face detection model file, initialize the face detection engine, and start rendering the camera:

private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        switch (status) {
            case LoaderCallbackInterface.SUCCESS:
                // OpenCV loaded successfully; now load the local .so library
                System.loadLibrary("detection_based_tracker");
                // Load the face detection model
                ...
                // Initialize the face detection engine
                ...
                // Start rendering the camera
                mCameraView.enableView();
                break;
            default:
                super.onManagerConnected(status);
                break;
        }
    }
};

2. Opening and rendering the camera through the OpenCV library

In OpenCV, two classes are closely tied to the camera: CameraBridgeViewBase and JavaCameraView. CameraBridgeViewBase is a base class that extends SurfaceView and implements the SurfaceHolder.Callback interface, and it handles the interaction between the camera and the OpenCV library. Its main job is to control the camera, process video frames, call the relevant internal interfaces to adjust each frame, and render the adjusted frames to the screen. For example, enableView() and disableView() connect to and disconnect from the camera:

public void enableView() {
    synchronized(mSyncObject) {
        mEnabled = true;
        checkCurrentState();
    }
}

public void disableView() {
    synchronized(mSyncObject) {
        mEnabled = false;
        checkCurrentState();
    }
}

Here, checkCurrentState() updates the camera's rendering state; it calls processEnterState() to start or stop the camera and to report the camera state to the outside. To let developers track the camera connection state in real time, CameraBridgeViewBase also provides setCvCameraViewListener(CvCameraViewListener2 listener). The listener parameter is an inner interface with three methods: onCameraViewStarted(int width, int height), onCameraViewStopped(), and Mat onCameraFrame(CvCameraViewFrame inputFrame), which report the camera connection state and deliver the camera's real-time video frames.

private void checkCurrentState() {
    Log.d(TAG, "call checkCurrentState");
    int targetState;

    if (mEnabled && mSurfaceExist && getVisibility() == VISIBLE) {
        targetState = STARTED;
    } else {
        targetState = STOPPED;
    }

    if (targetState != mState) {
        /* The state change detected. Need to exit the current state and enter target state */
        processExitState(mState);
        mState = targetState;
        processEnterState(mState);
    }
}

private void processEnterState(int state) {
    Log.d(TAG, "call processEnterState: " + state);
    switch(state) {
        case STARTED:
            // Calls the abstract connectCamera() method to start the camera
            onEnterStartedState();
            // Notify the listener that the camera view has started
            if (mListener != null) {
                mListener.onCameraViewStarted(mFrameWidth, mFrameHeight);
            }
            break;
        case STOPPED:
            // Calls the abstract disconnectCamera() method to stop the camera
            onEnterStoppedState();
            // Notify the listener that the camera view has stopped
            if (mListener != null) {
                mListener.onCameraViewStopped();
            }
            break;
    }
}
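
To make the listener mechanism above concrete, here is a minimal sketch of an Activity that registers itself as a CvCameraViewListener2 (the class name is illustrative; the layout and view id are the ones used in Part III, where the complete FaceDetectActivity is shown):

import android.app.Activity;
import android.os.Bundle;

import org.opencv.android.CameraBridgeViewBase;
import org.opencv.core.Mat;

public class CameraPreviewActivity extends Activity
        implements CameraBridgeViewBase.CvCameraViewListener2 {

    private CameraBridgeViewBase mOpenCvCameraView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_facedetect);
        // Register this Activity as the receiver of camera state changes and frames
        mOpenCvCameraView = (CameraBridgeViewBase) findViewById(R.id.cameraView_face);
        mOpenCvCameraView.setCvCameraViewListener(this);
    }

    @Override
    public void onCameraViewStarted(int width, int height) {
        // Called from processEnterState(STARTED); allocate per-frame Mats here
    }

    @Override
    public void onCameraViewStopped() {
        // Called from processEnterState(STOPPED); release the Mats here
    }

    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        // Called via deliverAndDrawFrame() for every frame; the returned Mat is rendered
        return inputFrame.rgba();
    }
}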

Since CameraBridgeViewBase is a base class and the camera-related connectCamera() and disconnectCamera() methods are abstract, there must be a subclass that implements them, and that subclass is JavaCameraView. JavaCameraView extends CameraBridgeViewBase and implements the PreviewCallback interface; it is the bridge between OpenCV and the camera and the place where the camera is actually started and stopped, so it contains many familiar camera operations. The source looks like this:

@Override
protected boolean connectCamera(int width, int height) {
    // Initialize and connect to the camera
    if (!initializeCamera(width, height))
        return false;
    mCameraFrameReady = false;
    // Start the camera worker thread (CameraWorker)
    Log.d(TAG, "Starting processing thread");
    mStopThread = false;
    mThread = new Thread(new CameraWorker());
    mThread.start();
    return true;
}

@Override
protected void disconnectCamera() {
    // Disconnect from the camera and release related resources
    try {
        mStopThread = true;
        Log.d(TAG, "Notify thread");
        synchronized (this) {
            this.notify();
        }
        // Stop the worker thread
        if (mThread != null)
            mThread.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    } finally {
        mThread = null;
    }

    /* Now release camera */
    releaseCamera();

    mCameraFrameReady = false;
}

CameraWorker is a worker thread that processes the video frames obtained from onPreviewFrame; the frames are stored in an array of Mat objects. It keeps calling the parent class CameraBridgeViewBase's deliverAndDrawFrame method, which passes each processed frame to the outside through the inner interface CvCameraViewListener2's onCameraFrame(CvCameraViewFrame frame).

private class CameraWorker implements Runnable {
    @Override
    public void run() {
        do {
            ... // code omitted
            if (!mStopThread && hasFrame) {
                if (!mFrameChain[1 - mChainIdx].empty())
                    deliverAndDrawFrame(mCameraFrame[1 - mChainIdx]);
            }
        } while (!mStopThread);
    }
}

3. Loading the face detection model

To get better face detection performance, the OpenCV SDK ships several frontal-face detectors (face models) under ..\opencv-3.3.0-android-sdk\OpenCV-android-sdk\sdk\etc\. A comparison of the bundled face detection models shows that the LBP cascade offers the better real-time performance, so this article uses the lbpcascade_frontalface.xml model from the lbpcascades directory, which was trained with 3000 positive and 1500 negative samples. Copy it into the Android Studio project's res/raw directory and write it out at runtime via getDir to /data/data/com.jiangdg.opencv4android/cascade:

InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);
File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);
mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");
FileOutputStream os = new FileOutputStream(mCascadeFile);
byte[] buffer = new byte[4096];
int bytesRead;
while ((bytesRead = is.read(buffer)) != -1) {
    os.write(buffer, 0, bytesRead);
}
is.close();
os.close();

Note: model training will be covered in a future post.

4. Face detection

The face-detection sample project in opencv-3.3.0-android-sdk offers two ways to detect faces: CascadeClassifier and DetectionBasedTracker. CascadeClassifier is OpenCV's cascade classifier for face detection driven from Java code, while DetectionBasedTracker performs the detection in native code through JNI. Having tried both, I found DetectionBasedTracker noticeably more stable; CascadeClassifier produces false detections at a certain rate. A sketch of the CascadeClassifier approach is shown next, followed by the Java wrapper class for the native DetectionBasedTracker.
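
A minimal sketch of the pure-Java CascadeClassifier path (the class name is illustrative, and the detectMultiScale parameters simply mirror the ones used in FaceDetectActivity in Part III):

import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Size;
import org.opencv.objdetect.CascadeClassifier;

public class JavaFaceDetector {
    private final CascadeClassifier mClassifier;

    public JavaFaceDetector(String cascadePath) {
        // cascadePath: absolute path of lbpcascade_frontalface.xml copied out of res/raw
        mClassifier = new CascadeClassifier(cascadePath);
    }

    // Runs the cascade on a grayscale frame and returns the detected face rectangles
    public MatOfRect detect(Mat gray, int minFaceSize) {
        MatOfRect faces = new MatOfRect();
        if (!mClassifier.empty()) {
            // scaleFactor = 1.1, minNeighbors = 2, flags = 2, minimum face size, no maximum
            mClassifier.detectMultiScale(gray, faces, 1.1, 2, 2,
                    new Size(minFaceSize, minFaceSize), new Size());
        }
        return faces;
    }
}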

public class DetectionBasedTracker {
    private long mNativeObj = 0;

    // Constructor: initializes the native face detection engine
    public DetectionBasedTracker(String cascadeName, int minFaceSize) {
        mNativeObj = nativeCreateObject(cascadeName, minFaceSize);
    }

    // Start face detection
    public void start() {
        nativeStart(mNativeObj);
    }

    // Stop face detection
    public void stop() {
        nativeStop(mNativeObj);
    }

    // Set the minimum face size
    public void setMinFaceSize(int size) {
        nativeSetFaceSize(mNativeObj, size);
    }

    // Detect faces in a grayscale image
    public void detect(Mat imageGray, MatOfRect faces) {
        nativeDetect(mNativeObj, imageGray.getNativeObjAddr(), faces.getNativeObjAddr());
    }

    // Release native resources
    public void release() {
        nativeDestroyObject(mNativeObj);
        mNativeObj = 0;
    }

    // Native methods
    private static native long nativeCreateObject(String cascadeName, int minFaceSize);
    private static native void nativeDestroyObject(long thiz);
    private static native void nativeStart(long thiz);
    private static native void nativeStop(long thiz);
    private static native void nativeSetFaceSize(long thiz, int size);
    private static native void nativeDetect(long thiz, long inputImage, long faces);
}

Once the DetectionBasedTracker has been initialized, all that remains is to run face detection on every frame inside the onCameraFrame method of the CvCameraViewListener2 interface:

@Override
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    ... // code omitted
    // Collect the detected face data
    MatOfRect faces = new MatOfRect();
    ... // code omitted
    if (mNativeDetector != null) {
        mNativeDetector.detect(mGray, faces);
    }
    // Draw the detection rectangles
    Rect[] facesArray = faces.toArray();
    for (int i = 0; i < facesArray.length; i++) {
        Imgproc.rectangle(mRgba, facesArray[i].tl(), facesArray[i].br(), FACE_RECT_COLOR, 3);
    }
    return mRgba;
}

Note: for reasons of space, the C/C++ implementation of the face detection (and the underlying principle) will be covered in a follow-up article.

III. Demo

1. FaceDetectActivity.java

/**
 * Face detection
 *
 * Created by jiangdongguo on 2018/1/4.
 */

public class FaceDetectActivity extends AppCompatActivity implements CameraBridgeViewBase.CvCameraViewListener2 {
    private static final int JAVA_DETECTOR = 0;
    private static final int NATIVE_DETECTOR = 1;
    private static final String TAG = "FaceDetectActivity";
    @BindView(R.id.cameraView_face)
    CameraBridgeViewBase mCameraView;


    private Mat mGray;
    private Mat mRgba;
    private int mDetectorType = NATIVE_DETECTOR;
    private int mAbsoluteFaceSize = 0;
    private float mRelativeFaceSize = 0.2f;
    private DetectionBasedTracker mNativeDetector;
    private CascadeClassifier mJavaDetector;
    private static final Scalar FACE_RECT_COLOR = new Scalar(0, 255, 0, 255);


    private File mCascadeFile;
    private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            switch (status) {
                case LoaderCallbackInterface.SUCCESS:
                    // OpenCV loaded successfully; now load the local .so library
                    System.loadLibrary("detection_based_tracker");


                    try {
                        // Load the face detection model file
                        InputStream is = getResources().openRawResource(R.raw.lbpcascade_frontalface);
                        File cascadeDir = getDir("cascade", Context.MODE_PRIVATE);
                        mCascadeFile = new File(cascadeDir, "lbpcascade_frontalface.xml");
                        FileOutputStream os = new FileOutputStream(mCascadeFile);
                        byte[] buffer = new byte[4096];
                        int bytesRead;
                        while ((bytesRead = is.read(buffer)) != -1) {
                            os.write(buffer, 0, bytesRead);
                        }
                        is.close();
                        os.close();
                        // Initialize the face detection engine with the model file
                        mJavaDetector = new CascadeClassifier(mCascadeFile.getAbsolutePath());
                        if (mJavaDetector.empty()) {
                            Log.e(TAG, "載入cascade classifier失敗");
                            mJavaDetector = null;
                        } else {
                            Log.d(TAG, "Loaded cascade classifier from " + mCascadeFile.getAbsolutePath());
                        }
                        mNativeDetector = new DetectionBasedTracker(mCascadeFile.getAbsolutePath(), 0);
                        cascadeDir.delete();
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    // Start rendering the camera
                    mCameraView.enableView();
                    break;
                default:
                    super.onManagerConnected(status);
                    break;
            }
        }
    };


    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.activity_facedetect);
        // Bind views
        ButterKnife.bind(this);
        mCameraView.setVisibility(CameraBridgeViewBase.VISIBLE);
        // Register the camera render event listener
        mCameraView.setCvCameraViewListener(this);
    }


    @Override
    protected void onResume() {
        super.onResume();
        // Initialize OpenCV statically
        if (!OpenCVLoader.initDebug()) {
            Log.d(TAG, "Failed to load the OpenCV native library, falling back to OpenCV Manager");
            OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_3_0, this, mLoaderCallback);
        } else {
            Log.d(TAG, "OpenCV native library loaded successfully");
            mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
        }
    }


    @Override
    protected void onPause() {
        super.onPause();
        // Stop rendering the camera
        if (mCameraView != null) {
            mCameraView.disableView();
        }
    }


    @Override
    protected void onDestroy() {
        super.onDestroy();
        // Stop rendering the camera
        if (mCameraView != null) {
            mCameraView.disableView();
        }
    }


    @Override
    public void onCameraViewStarted(int width, int height) {
        // Grayscale frame
        mGray = new Mat();
        // Color (RGBA) frame
        mRgba = new Mat();
    }


    @Override
    public void onCameraViewStopped() {
        mGray.release();
        mRgba.release();
    }


    @Override
    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        mRgba = inputFrame.rgba();
        mGray = inputFrame.gray();
        // Set the minimum face size
        if (mAbsoluteFaceSize == 0) {
            int height = mGray.rows();
            if (Math.round(height * mRelativeFaceSize) > 0) {
                mAbsoluteFaceSize = Math.round(height * mRelativeFaceSize);
            }
            mNativeDetector.setMinFaceSize(mAbsoluteFaceSize);
        }
        // Collect the detected face data
        MatOfRect faces = new MatOfRect();
        if (mDetectorType == JAVA_DETECTOR) {
            if (mJavaDetector != null) {
                mJavaDetector.detectMultiScale(mGray, faces, 1.1, 2, 2,
                        new Size(mAbsoluteFaceSize, mAbsoluteFaceSize), new Size());
            }
        } else if (mDetectorType == NATIVE_DETECTOR) {
            if (mNativeDetector != null) {
                mNativeDetector.detect(mGray, faces);
            }
        } else {
            Log.e(TAG, "Detection method is not selected!");
        }
        // Draw the detection rectangles
        Rect[] facesArray = faces.toArray();
        for (int i = 0; i < facesArray.length; i++) {
            Imgproc.rectangle(mRgba, facesArray[i].tl(), facesArray[i].br(), FACE_RECT_COLOR, 3);
        }


        Log.i(TAG, "共檢測到 " + faces.toArray().length + " 張臉");
        return mRgba;
    }
}

2. activity_facedetect.xml

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:opencv="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">


    <org.opencv.android.JavaCameraView
        android:id="@+id/cameraView_face"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:visibility="gone"
        opencv:camera_id="any"
        opencv:show_fps="true" />
</RelativeLayout>

3. AndroidManifest.xml

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.jiangdg.opencv4android">
    <uses-permission android:name="android.permission.CAMERA"/>

    <uses-feature android:name="android.hardware.camera" android:required="false"/>
    <uses-feature android:name="android.hardware.camera.autofocus" android:required="false"/>
    <uses-feature android:name="android.hardware.camera.front" android:required="false"/>
    <uses-feature android:name="android.hardware.camera.front.autofocus" android:required="false"/>

    <supports-screens android:resizeable="true"
        android:smallScreens="true"
        android:normalScreens="true"
        android:largeScreens="true"
        android:anyDensity="true" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <activity android:name=".HelloOpenCVActivity"
            android:screenOrientation="landscape"
            android:configChanges="keyboardHidden|orientation"/>
        <activity android:name=".FaceDetectActivity"
            android:screenOrientation="landscape"
            android:configChanges="keyboardHidden|orientation"/>
    </application>
</manifest>