Android 8.0 System Source Code Analysis -- The Camera createCaptureSession Flow
Last time we analyzed the openCamera startup flow in detail. The CameraServer process creates a number of objects along the way, such as CameraDeviceClient, Camera3Device, and FrameProcessorBase, but the camera is only truly opened once the driver layer powers it up. Sometimes I'm tempted to copy our company's system-component and chipset-component sources home so I could see the complete flow, but I don't dare, haha! Huawei phones follow Qualcomm's design: the process that talks to CameraServer is CameraDaemon, and the HAL and algorithm code all run inside that CameraDaemon process. This part differs from vendor to vendor because it is vendor-defined code; it generally lives under the vendor directory of the system source tree, inside the chipset component. We have no code for it here, so that part cannot be analyzed.
As mentioned before, the four most important phases of the camera are openCamera, createCaptureSession, preview, and capture; in this article we look at the createCaptureSession flow. At the end of the openCamera analysis we saw that once the camera opens successfully, a CameraDevice object is returned to the application through the CameraDevice.StateCallback interface's public abstract void onOpened(@NonNull CameraDevice camera) method. That CameraDevice is really a CameraDeviceImpl, and the upcoming createCaptureSession is invoked on it. So let's look at how the createCaptureSession method of frameworks\base\core\java\android\hardware\camera2\impl\CameraDeviceImpl.java is implemented; the source is as follows:
@Override
public void createCaptureSession(List<Surface> outputs,
        CameraCaptureSession.StateCallback callback, Handler handler)
        throws CameraAccessException {
    List<OutputConfiguration> outConfigurations = new ArrayList<>(outputs.size());
    for (Surface surface : outputs) {
        outConfigurations.add(new OutputConfiguration(surface));
    }
    createCaptureSessionInternal(null, outConfigurations, callback, handler,
            /*operatingMode*/ICameraDeviceUser.NORMAL_MODE);
}
Let's go over this method's parameters. The first is a List of Surface objects; these Surfaces are what the streams are created from. Without special requirements we usually submit two: one for preview and one for capture. The preview Surface backs the preview area; during buffer rotation, the preview buffers are obtained from this Surface, so it must be valid, otherwise session creation fails and the preview area shows a black screen, a situation we run into regularly at work. For the capture Surface we generally use an ImageReader, a system-provided class whose creation already builds a Surface for us internally, and we use that directly as the capture Surface. When a capture succeeds, the onImageAvailable callback of the inner class ImageReader.OnImageAvailableListener hands us the ImageReader; calling acquireNextImage() on it yields an Image, getPlanes() yields the Plane array (we usually take the first Plane), and getBuffer() finally yields the byte data of the captured photo. The second parameter, callback, is of type StateCallback, an inner class of frameworks\base\core\java\android\hardware\camera2\CameraCaptureSession.java. Just as with openCamera, once the session is created successfully the framework returns a CameraCaptureSession object through this interface's public abstract void onConfigured(@NonNull CameraCaptureSession session) method; the real implementation is a CameraCaptureSessionImpl. We can use it for most session work: abortCaptures() to discard pending captures as quickly as possible, capture() to take a picture, setRepeatingRequest() to start preview, stopRepeating() to stop it; the design here is exactly the same as openCamera's. The third parameter, Handler, serves the same purpose as in openCamera: it keeps callbacks on our thread. Whichever worker thread of the application calls createCaptureSession, the framework posts its callbacks through this handler onto that thread's Looper. With the parameters covered, back to the code: it simply converts the Surface list we passed in into OutputConfiguration objects and calls createCaptureSessionInternal for further processing; the first argument, inputConfig, is normally null, so we only care about outputConfig. The source of createCaptureSessionInternal is as follows:
private void createCaptureSessionInternal(InputConfiguration inputConfig,
List<OutputConfiguration> outputConfigurations,
CameraCaptureSession.StateCallback callback, Handler handler,
int operatingMode) throws CameraAccessException {
synchronized (mInterfaceLock) {
if (DEBUG) {
Log.d(TAG, "createCaptureSessionInternal");
}
checkIfCameraClosedOrInError();
boolean isConstrainedHighSpeed =
(operatingMode == ICameraDeviceUser.CONSTRAINED_HIGH_SPEED_MODE);
if (isConstrainedHighSpeed && inputConfig != null) {
throw new IllegalArgumentException("Constrained high speed session doesn't support"
+ " input configuration yet.");
}
// Notify current session that it's going away, before starting camera operations
// After this call completes, the session is not allowed to call into CameraDeviceImpl
if (mCurrentSession != null) {
mCurrentSession.replaceSessionClose();
}
// TODO: dont block for this
boolean configureSuccess = true;
CameraAccessException pendingException = null;
Surface input = null;
try {
// configure streams and then block until IDLE
configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations,
operatingMode);
if (configureSuccess == true && inputConfig != null) {
input = mRemoteDevice.getInputSurface();
}
} catch (CameraAccessException e) {
configureSuccess = false;
pendingException = e;
input = null;
if (DEBUG) {
Log.v(TAG, "createCaptureSession - failed with exception ", e);
}
}
// Fire onConfigured if configureOutputs succeeded, fire onConfigureFailed otherwise.
CameraCaptureSessionCore newSession = null;
if (isConstrainedHighSpeed) {
newSession = new CameraConstrainedHighSpeedCaptureSessionImpl(mNextSessionId++,
callback, handler, this, mDeviceHandler, configureSuccess,
mCharacteristics);
} else {
newSession = new CameraCaptureSessionImpl(mNextSessionId++, input,
callback, handler, this, mDeviceHandler,
configureSuccess);
}
// TODO: wait until current session closes, then create the new session
mCurrentSession = newSession;
if (pendingException != null) {
throw pendingException;
}
mSessionStateCallback = mCurrentSession.getDeviceStateCallback();
}
}
This method's job is to configure the Surfaces. The key line is configureSuccess = configureStreamsChecked(inputConfig, outputConfigurations, operatingMode): it performs the Surface configuration and sets configureSuccess to true on success, false otherwise. A session implementation object is then constructed; normally the else branch runs and a CameraCaptureSessionImpl is created. The constructor of frameworks\base\core\java\android\hardware\camera2\impl\CameraCaptureSessionImpl.java is as follows:
CameraCaptureSessionImpl(int id, Surface input,
CameraCaptureSession.StateCallback callback, Handler stateHandler,
android.hardware.camera2.impl.CameraDeviceImpl deviceImpl,
Handler deviceStateHandler, boolean configureSuccess) {
if (callback == null) {
throw new IllegalArgumentException("callback must not be null");
}
mId = id;
mIdString = String.format("Session %d: ", mId);
mInput = input;
mStateHandler = checkHandler(stateHandler);
mStateCallback = createUserStateCallbackProxy(mStateHandler, callback);
mDeviceHandler = checkNotNull(deviceStateHandler, "deviceStateHandler must not be null");
mDeviceImpl = checkNotNull(deviceImpl, "deviceImpl must not be null");
/*
* Use the same handler as the device's StateCallback for all the internal coming events
*
* This ensures total ordering between CameraDevice.StateCallback and
* CameraDeviceImpl.CaptureCallback events.
*/
mSequenceDrainer = new TaskDrainer<>(mDeviceHandler, new SequenceDrainListener(),
/*name*/"seq");
mIdleDrainer = new TaskSingleDrainer(mDeviceHandler, new IdleDrainListener(),
/*name*/"idle");
mAbortDrainer = new TaskSingleDrainer(mDeviceHandler, new AbortDrainListener(),
/*name*/"abort");
// CameraDevice should call configureOutputs and have it finish before constructing us
if (configureSuccess) {
mStateCallback.onConfigured(this);
if (DEBUG) Log.v(TAG, mIdString + "Created session successfully");
mConfigureSuccess = true;
} else {
mStateCallback.onConfigureFailed(this);
mClosed = true; // do not fire any other callbacks, do not allow any other work
Log.e(TAG, mIdString + "Failed to create capture session; configuration failed");
mConfigureSuccess = false;
}
}
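The pattern in this constructor is worth pausing on: the session object is always constructed, and the constructor itself fires exactly one of onConfigured / onConfigureFailed, with a failed session born in the closed state. A plain-Java sketch of that pattern (SessionLike and its fields are illustrative names, not the framework's):

```java
// Minimal model of CameraCaptureSessionImpl's constructor behavior: the object
// is constructed either way, and the constructor fires exactly one callback.
public class SessionLike {
    interface StateCallback {
        void onConfigured(SessionLike s);
        void onConfigureFailed(SessionLike s);
    }

    final boolean configured;
    boolean closed;

    SessionLike(boolean configureSuccess, StateCallback cb) {
        if (cb == null) throw new IllegalArgumentException("callback must not be null");
        this.configured = configureSuccess;
        if (configureSuccess) {
            cb.onConfigured(this);
        } else {
            cb.onConfigureFailed(this);
            closed = true; // failed sessions are born closed: no further callbacks
        }
    }

    public static void main(String[] args) {
        java.util.List<String> events = new java.util.ArrayList<>();
        StateCallback cb = new StateCallback() {
            public void onConfigured(SessionLike s) { events.add("configured"); }
            public void onConfigureFailed(SessionLike s) { events.add("failed"); }
        };
        new SessionLike(true, cb);
        SessionLike bad = new SessionLike(false, cb);
        System.out.println(events);     // [configured, failed]
        System.out.println(bad.closed); // true
    }
}
```

This mirrors why the application sees onConfigureFailed synchronously during createCaptureSession when stream configuration has already failed.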
As you can see, the constructor itself calls mStateCallback.onConfigured(this) to notify the application layer that the session was created successfully, handing back the current impl object at the same time. Now let's look at the implementation of configureStreamsChecked, to learn how the framework actually configures the Surfaces. Its source is as follows:
public boolean configureStreamsChecked(InputConfiguration inputConfig,
List<OutputConfiguration> outputs, int operatingMode)
throws CameraAccessException {
// Treat a null input the same an empty list
if (outputs == null) {
outputs = new ArrayList<OutputConfiguration>();
}
if (outputs.size() == 0 && inputConfig != null) {
throw new IllegalArgumentException("cannot configure an input stream without " +
"any output streams");
}
checkInputConfiguration(inputConfig);
boolean success = false;
synchronized (mInterfaceLock) {
checkIfCameraClosedOrInError();
// Streams to create
HashSet<OutputConfiguration> addSet = new HashSet<OutputConfiguration>(outputs);
// Streams to delete
List<Integer> deleteList = new ArrayList<Integer>();
// Determine which streams need to be created, which to be deleted
for (int i = 0; i < mConfiguredOutputs.size(); ++i) {
int streamId = mConfiguredOutputs.keyAt(i);
OutputConfiguration outConfig = mConfiguredOutputs.valueAt(i);
if (!outputs.contains(outConfig) || outConfig.isDeferredConfiguration()) {
// Always delete the deferred output configuration when the session
// is created, as the deferred output configuration doesn't have unique surface
// related identifies.
deleteList.add(streamId);
} else {
addSet.remove(outConfig); // Don't create a stream previously created
}
}
mDeviceHandler.post(mCallOnBusy);
stopRepeating();
try {
waitUntilIdle();
mRemoteDevice.beginConfigure();
// reconfigure the input stream if the input configuration is different.
InputConfiguration currentInputConfig = mConfiguredInput.getValue();
if (inputConfig != currentInputConfig &&
(inputConfig == null || !inputConfig.equals(currentInputConfig))) {
if (currentInputConfig != null) {
mRemoteDevice.deleteStream(mConfiguredInput.getKey());
mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
REQUEST_ID_NONE, null);
}
if (inputConfig != null) {
int streamId = mRemoteDevice.createInputStream(inputConfig.getWidth(),
inputConfig.getHeight(), inputConfig.getFormat());
mConfiguredInput = new SimpleEntry<Integer, InputConfiguration>(
streamId, inputConfig);
}
}
// Delete all streams first (to free up HW resources)
for (Integer streamId : deleteList) {
mRemoteDevice.deleteStream(streamId);
mConfiguredOutputs.delete(streamId);
}
// Add all new streams
for (OutputConfiguration outConfig : outputs) {
if (addSet.contains(outConfig)) {
int streamId = mRemoteDevice.createStream(outConfig);
mConfiguredOutputs.put(streamId, outConfig);
}
}
mRemoteDevice.endConfigure(operatingMode);
success = true;
} catch (IllegalArgumentException e) {
// OK. camera service can reject stream config if it's not supported by HAL
// This is only the result of a programmer misusing the camera2 api.
Log.w(TAG, "Stream configuration failed due to: " + e.getMessage());
return false;
} catch (CameraAccessException e) {
if (e.getReason() == CameraAccessException.CAMERA_IN_USE) {
throw new IllegalStateException("The camera is currently busy." +
" You must wait until the previous operation completes.", e);
}
throw e;
} finally {
if (success && outputs.size() > 0) {
mDeviceHandler.post(mCallOnIdle);
} else {
// Always return to the 'unconfigured' state if we didn't hit a fatal error
mDeviceHandler.post(mCallOnUnconfigured);
}
}
}
return success;
}
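The bookkeeping at the top of this method, deciding which streams to create (addSet) and which to delete (deleteList), is essentially a set difference against the currently configured streams. A standalone sketch, with strings standing in for OutputConfiguration objects (all names here are illustrative):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class StreamDiff {
    // Anything configured but no longer requested must be deleted.
    static List<Integer> deleteList(Map<Integer, String> configured, List<String> requested) {
        List<Integer> toDelete = new ArrayList<>();
        for (Map.Entry<Integer, String> e : configured.entrySet()) {
            if (!requested.contains(e.getValue())) toDelete.add(e.getKey());
        }
        return toDelete;
    }

    // Anything requested but not yet configured must be created.
    static Set<String> addSet(Map<Integer, String> configured, List<String> requested) {
        Set<String> toAdd = new HashSet<>(requested);
        toAdd.removeAll(configured.values()); // don't re-create existing streams
        return toAdd;
    }

    public static void main(String[] args) {
        Map<Integer, String> configured = new LinkedHashMap<>();
        configured.put(0, "preview"); // streamId -> config
        configured.put(1, "jpeg");
        List<String> requested = Arrays.asList("preview", "callback");
        System.out.println(deleteList(configured, requested)); // [1]
        System.out.println(addSet(configured, requested));     // [callback]
    }
}
```

In the real code there is one extra wrinkle: deferred output configurations are always put on the delete list even if still requested, because they lack unique surface identities.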
As we saw before entering this method, inputConfig is null. First, mDeviceHandler.post(mCallOnBusy) posts a CallOnBusy message onto the device handler's Looper, signalling that Surface configuration is about to begin and the device is busy; then stopRepeating() stops the preview. Next, mRemoteDevice.beginConfigure() notifies the binder server object in the CameraServer process that configuration is starting. We analyzed the implementation of mRemoteDevice in the openCamera article: it is effectively the CameraDeviceClient object, and its beginConfigure() is an empty implementation. The most important part is the for (OutputConfiguration outConfig : outputs) loop, which configures the Surface list; once it completes, mRemoteDevice.endConfigure(operatingMode) notifies CameraDeviceClient that configuration has finished. If anything throws along the way, the method returns false, meaning configuration failed, and the CameraCaptureSessionImpl constructor will then call mStateCallback.onConfigureFailed(this) to notify the application that session creation failed. Inside the loop, int streamId = mRemoteDevice.createStream(outConfig) configures each Surface. Note that Huawei's camera app generally runs three streams: a preview stream, a capture stream, and a callback stream, i.e. three Surfaces at session creation, so the logs on this path print three times, with streamId incrementing each time. To tell the three streams apart, look at the format: when CameraServer configures a Surface and creates the stream, it logs each stream's info, including its format, e.g. 0x21, 0x22, 0x23. What do these mean? They are Android's pixel formats, defined in the header system\core\libsystem\include\system\graphics-base.h, whose source is as follows:
#ifndef HIDL_GENERATED_ANDROID_HARDWARE_GRAPHICS_COMMON_V1_0_EXPORTED_CONSTANTS_H_
#define HIDL_GENERATED_ANDROID_HARDWARE_GRAPHICS_COMMON_V1_0_EXPORTED_CONSTANTS_H_
#ifdef __cplusplus
extern "C" {
#endif
typedef enum {
HAL_PIXEL_FORMAT_RGBA_8888 = 1,
HAL_PIXEL_FORMAT_RGBX_8888 = 2,
HAL_PIXEL_FORMAT_RGB_888 = 3,
HAL_PIXEL_FORMAT_RGB_565 = 4,
HAL_PIXEL_FORMAT_BGRA_8888 = 5,
HAL_PIXEL_FORMAT_RGBA_1010102 = 43, // 0x2B
HAL_PIXEL_FORMAT_RGBA_FP16 = 22, // 0x16
HAL_PIXEL_FORMAT_YV12 = 842094169, // 0x32315659
HAL_PIXEL_FORMAT_Y8 = 538982489, // 0x20203859
HAL_PIXEL_FORMAT_Y16 = 540422489, // 0x20363159
HAL_PIXEL_FORMAT_RAW16 = 32, // 0x20
HAL_PIXEL_FORMAT_RAW10 = 37, // 0x25
HAL_PIXEL_FORMAT_RAW12 = 38, // 0x26
HAL_PIXEL_FORMAT_RAW_OPAQUE = 36, // 0x24
HAL_PIXEL_FORMAT_BLOB = 33, // 0x21
HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 34, // 0x22
HAL_PIXEL_FORMAT_YCBCR_420_888 = 35, // 0x23
HAL_PIXEL_FORMAT_YCBCR_422_888 = 39, // 0x27
HAL_PIXEL_FORMAT_YCBCR_444_888 = 40, // 0x28
HAL_PIXEL_FORMAT_FLEX_RGB_888 = 41, // 0x29
HAL_PIXEL_FORMAT_FLEX_RGBA_8888 = 42, // 0x2A
HAL_PIXEL_FORMAT_YCBCR_422_SP = 16, // 0x10
HAL_PIXEL_FORMAT_YCRCB_420_SP = 17, // 0x11
HAL_PIXEL_FORMAT_YCBCR_422_I = 20, // 0x14
HAL_PIXEL_FORMAT_JPEG = 256, // 0x100
} android_pixel_format_t;
typedef enum {
HAL_TRANSFORM_FLIP_H = 1, // 0x01
HAL_TRANSFORM_FLIP_V = 2, // 0x02
HAL_TRANSFORM_ROT_90 = 4, // 0x04
HAL_TRANSFORM_ROT_180 = 3, // 0x03
HAL_TRANSFORM_ROT_270 = 7, // 0x07
} android_transform_t;
typedef enum {
HAL_DATASPACE_UNKNOWN = 0, // 0x0
HAL_DATASPACE_ARBITRARY = 1, // 0x1
HAL_DATASPACE_STANDARD_SHIFT = 16,
HAL_DATASPACE_STANDARD_MASK = 4128768, // (63 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_UNSPECIFIED = 0, // (0 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT709 = 65536, // (1 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT601_625 = 131072, // (2 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT601_625_UNADJUSTED = 196608, // (3 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT601_525 = 262144, // (4 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT601_525_UNADJUSTED = 327680, // (5 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT2020 = 393216, // (6 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT2020_CONSTANT_LUMINANCE = 458752, // (7 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_BT470M = 524288, // (8 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_FILM = 589824, // (9 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_DCI_P3 = 655360, // (10 << STANDARD_SHIFT)
HAL_DATASPACE_STANDARD_ADOBE_RGB = 720896, // (11 << STANDARD_SHIFT)
HAL_DATASPACE_TRANSFER_SHIFT = 22,
HAL_DATASPACE_TRANSFER_MASK = 130023424, // (31 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_UNSPECIFIED = 0, // (0 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_LINEAR = 4194304, // (1 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_SRGB = 8388608, // (2 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_SMPTE_170M = 12582912, // (3 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_GAMMA2_2 = 16777216, // (4 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_GAMMA2_6 = 20971520, // (5 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_GAMMA2_8 = 25165824, // (6 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_ST2084 = 29360128, // (7 << TRANSFER_SHIFT)
HAL_DATASPACE_TRANSFER_HLG = 33554432, // (8 << TRANSFER_SHIFT)
HAL_DATASPACE_RANGE_SHIFT = 27,
HAL_DATASPACE_RANGE_MASK = 939524096, // (7 << RANGE_SHIFT)
HAL_DATASPACE_RANGE_UNSPECIFIED = 0, // (0 << RANGE_SHIFT)
HAL_DATASPACE_RANGE_FULL = 134217728, // (1 << RANGE_SHIFT)
HAL_DATASPACE_RANGE_LIMITED = 268435456, // (2 << RANGE_SHIFT)
HAL_DATASPACE_RANGE_EXTENDED = 402653184, // (3 << RANGE_SHIFT)
HAL_DATASPACE_SRGB_LINEAR = 512, // 0x200
HAL_DATASPACE_V0_SRGB_LINEAR = 138477568, // ((STANDARD_BT709 | TRANSFER_LINEAR) | RANGE_FULL)
HAL_DATASPACE_V0_SCRGB_LINEAR = 406913024, // ((STANDARD_BT709 | TRANSFER_LINEAR) | RANGE_EXTENDED)
HAL_DATASPACE_SRGB = 513, // 0x201
HAL_DATASPACE_V0_SRGB = 142671872, // ((STANDARD_BT709 | TRANSFER_SRGB) | RANGE_FULL)
HAL_DATASPACE_V0_SCRGB = 411107328, // ((STANDARD_BT709 | TRANSFER_SRGB) | RANGE_EXTENDED)
HAL_DATASPACE_JFIF = 257, // 0x101
HAL_DATASPACE_V0_JFIF = 146931712, // ((STANDARD_BT601_625 | TRANSFER_SMPTE_170M) | RANGE_FULL)
HAL_DATASPACE_BT601_625 = 258, // 0x102
HAL_DATASPACE_V0_BT601_625 = 281149440, // ((STANDARD_BT601_625 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
HAL_DATASPACE_BT601_525 = 259, // 0x103
HAL_DATASPACE_V0_BT601_525 = 281280512, // ((STANDARD_BT601_525 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
HAL_DATASPACE_BT709 = 260, // 0x104
HAL_DATASPACE_V0_BT709 = 281083904, // ((STANDARD_BT709 | TRANSFER_SMPTE_170M) | RANGE_LIMITED)
HAL_DATASPACE_DCI_P3_LINEAR = 139067392, // ((STANDARD_DCI_P3 | TRANSFER_LINEAR) | RANGE_FULL)
HAL_DATASPACE_DCI_P3 = 155844608, // ((STANDARD_DCI_P3 | TRANSFER_GAMMA2_6) | RANGE_FULL)
HAL_DATASPACE_DISPLAY_P3_LINEAR = 139067392, // ((STANDARD_DCI_P3 | TRANSFER_LINEAR) | RANGE_FULL)
HAL_DATASPACE_DISPLAY_P3 = 143261696, // ((STANDARD_DCI_P3 | TRANSFER_SRGB) | RANGE_FULL)
HAL_DATASPACE_ADOBE_RGB = 151715840, // ((STANDARD_ADOBE_RGB | TRANSFER_GAMMA2_2) | RANGE_FULL)
HAL_DATASPACE_BT2020_LINEAR = 138805248, // ((STANDARD_BT2020 | TRANSFER_LINEAR) | RANGE_FULL)
HAL_DATASPACE_BT2020 = 147193856, // ((STANDARD_BT2020 | TRANSFER_SMPTE_170M) | RANGE_FULL)
HAL_DATASPACE_BT2020_PQ = 163971072, // ((STANDARD_BT2020 | TRANSFER_ST2084) | RANGE_FULL)
HAL_DATASPACE_DEPTH = 4096, // 0x1000
HAL_DATASPACE_SENSOR = 4097, // 0x1001
} android_dataspace_t;
typedef enum {
HAL_COLOR_MODE_NATIVE = 0,
HAL_COLOR_MODE_STANDARD_BT601_625 = 1,
HAL_COLOR_MODE_STANDARD_BT601_625_UNADJUSTED = 2,
HAL_COLOR_MODE_STANDARD_BT601_525 = 3,
HAL_COLOR_MODE_STANDARD_BT601_525_UNADJUSTED = 4,
HAL_COLOR_MODE_STANDARD_BT709 = 5,
HAL_COLOR_MODE_DCI_P3 = 6,
HAL_COLOR_MODE_SRGB = 7,
HAL_COLOR_MODE_ADOBE_RGB = 8,
HAL_COLOR_MODE_DISPLAY_P3 = 9,
} android_color_mode_t;
typedef enum {
HAL_COLOR_TRANSFORM_IDENTITY = 0,
HAL_COLOR_TRANSFORM_ARBITRARY_MATRIX = 1,
HAL_COLOR_TRANSFORM_VALUE_INVERSE = 2,
HAL_COLOR_TRANSFORM_GRAYSCALE = 3,
HAL_COLOR_TRANSFORM_CORRECT_PROTANOPIA = 4,
HAL_COLOR_TRANSFORM_CORRECT_DEUTERANOPIA = 5,
HAL_COLOR_TRANSFORM_CORRECT_TRITANOPIA = 6,
} android_color_transform_t;
typedef enum {
HAL_HDR_DOLBY_VISION = 1,
HAL_HDR_HDR10 = 2,
HAL_HDR_HLG = 3,
} android_hdr_t;
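The hex values seen in CameraServer stream-configuration logs map directly onto the constants above, and the composite dataspace values decompose into the shift fields. A quick cross-check in plain Java (the constants are copied from this header; everything else is illustrative):

```java
public class FormatCheck {
    // Pixel formats as they appear in CameraServer logs during stream configuration.
    static final int HAL_PIXEL_FORMAT_BLOB = 0x21;                   // 33
    static final int HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 0x22; // 34
    static final int HAL_PIXEL_FORMAT_YCBCR_420_888 = 0x23;          // 35

    // Dataspace fields are packed into one int via these shifts (see header above).
    static final int STANDARD_SHIFT = 16;
    static final int TRANSFER_SHIFT = 22;
    static final int RANGE_SHIFT = 27;

    static int dataspace(int standard, int transfer, int range) {
        return (standard << STANDARD_SHIFT) | (transfer << TRANSFER_SHIFT)
                | (range << RANGE_SHIFT);
    }

    public static void main(String[] args) {
        // HAL_DATASPACE_V0_SRGB = STANDARD_BT709(1) | TRANSFER_SRGB(2) | RANGE_FULL(1)
        System.out.println(dataspace(1, 2, 1));             // 142671872, matching the header
        System.out.println(HAL_PIXEL_FORMAT_BLOB);          // 33
        System.out.println(HAL_PIXEL_FORMAT_YCBCR_420_888); // 35
    }
}
```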
Converting, 0x21 is 33, which is HAL_PIXEL_FORMAT_BLOB; as the name suggests it carries BLOB data, so it corresponds to the capture stream. 0x22 is 34, HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, which corresponds to the preview stream; this format is typically the preview Surface's, and that Surface comes from the display system. Unlike the capture stream, where defining an ImageReader is enough, the preview Surface involves UI display and is backed by a real View. (0x23 is 35, HAL_PIXEL_FORMAT_YCBCR_420_888, which typically backs the YUV callback stream.) In real-world problem analysis these logs are very useful, so keep them in mind. With formats covered, back to the code: in int streamId = mRemoteDevice.createStream(outConfig), mRemoteDevice is the CameraDeviceClient object running in the CameraServer process, so next we look at the implementation of the createStream method in frameworks\av\services\camera\libcameraservice\api2\CameraDeviceClient.cpp; the source is as follows:
binder::Status CameraDeviceClient::createStream(
const hardware::camera2::params::OutputConfiguration &outputConfiguration,
/*out*/
int32_t* newStreamId) {
ATRACE_CALL();
binder::Status res;
if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
Mutex::Autolock icl(mBinderSerializationLock);
const std::vector<sp<IGraphicBufferProducer>>& bufferProducers =
outputConfiguration.getGraphicBufferProducers();
size_t numBufferProducers = bufferProducers.size();
bool deferredConsumer = outputConfiguration.isDeferred();
bool isShared = outputConfiguration.isShared();
if (numBufferProducers > MAX_SURFACES_PER_STREAM) {
ALOGE("%s: GraphicBufferProducer count %zu for stream exceeds limit of %d",
__FUNCTION__, bufferProducers.size(), MAX_SURFACES_PER_STREAM);
return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, "Surface count is too high");
}
bool deferredConsumerOnly = deferredConsumer && numBufferProducers == 0;
int surfaceType = outputConfiguration.getSurfaceType();
bool validSurfaceType = ((surfaceType == OutputConfiguration::SURFACE_TYPE_SURFACE_VIEW) ||
(surfaceType == OutputConfiguration::SURFACE_TYPE_SURFACE_TEXTURE));
if (deferredConsumer && !validSurfaceType) {
ALOGE("%s: Target surface is invalid: bufferProducer = %p, surfaceType = %d.",
__FUNCTION__, bufferProducers[0].get(), surfaceType);
return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, "Target Surface is invalid");
}
if (!mDevice.get()) {
return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
}
std::vector<sp<Surface>> surfaces;
std::vector<sp<IBinder>> binders;
status_t err;
// Create stream for deferred surface case.
if (deferredConsumerOnly) {
return createDeferredSurfaceStreamLocked(outputConfiguration, isShared, newStreamId);
}
OutputStreamInfo streamInfo;
bool isStreamInfoValid = false;
for (auto& bufferProducer : bufferProducers) {
// Don't create multiple streams for the same target surface
sp<IBinder> binder = IInterface::asBinder(bufferProducer);
ssize_t index = mStreamMap.indexOfKey(binder);
if (index != NAME_NOT_FOUND) {
String8 msg = String8::format("Camera %s: Surface already has a stream created for it "
"(ID %zd)", mCameraIdStr.string(), index);
ALOGW("%s: %s", __FUNCTION__, msg.string());
return STATUS_ERROR(CameraService::ERROR_ALREADY_EXISTS, msg.string());
}
sp<Surface> surface;
res = createSurfaceFromGbp(streamInfo, isStreamInfoValid, surface, bufferProducer);
if (!res.isOk())
return res;
if (!isStreamInfoValid) {
// Streaming sharing is only supported for IMPLEMENTATION_DEFINED
// formats.
if (isShared && streamInfo.format != HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) {
String8 msg = String8::format("Camera %s: Stream sharing is only supported for "
"IMPLEMENTATION_DEFINED format", mCameraIdStr.string());
ALOGW("%s: %s", __FUNCTION__, msg.string());
return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
}
isStreamInfoValid = true;
}
binders.push_back(IInterface::asBinder(bufferProducer));
surfaces.push_back(surface);
}
int streamId = camera3::CAMERA3_STREAM_ID_INVALID;
err = mDevice->createStream(surfaces, deferredConsumer, streamInfo.width,
streamInfo.height, streamInfo.format, streamInfo.dataSpace,
static_cast<camera3_stream_rotation_t>(outputConfiguration.getRotation()),
&streamId, outputConfiguration.getSurfaceSetID(), isShared);
if (err != OK) {
res = STATUS_ERROR_FMT(CameraService::ERROR_INVALID_OPERATION,
"Camera %s: Error creating output stream (%d x %d, fmt %x, dataSpace %x): %s (%d)",
mCameraIdStr.string(), streamInfo.width, streamInfo.height, streamInfo.format,
streamInfo.dataSpace, strerror(-err), err);
} else {
int i = 0;
for (auto& binder : binders) {
ALOGV("%s: mStreamMap add binder %p streamId %d, surfaceId %d",
__FUNCTION__, binder.get(), streamId, i);
mStreamMap.add(binder, StreamSurfaceId(streamId, i++));
}
mStreamInfoMap[streamId] = streamInfo;
ALOGV("%s: Camera %s: Successfully created a new stream ID %d for output surface"
" (%d x %d) with format 0x%x.",
__FUNCTION__, mCameraIdStr.string(), streamId, streamInfo.width,
streamInfo.height, streamInfo.format);
// Set transform flags to ensure preview to be rotated correctly.
res = setStreamTransformLocked(streamId);
*newStreamId = streamId;
}
return res;
}
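The "don't create multiple streams for the same target surface" guard in the loop above keys a map (mStreamMap) by the producer's binder. A minimal Java model of that guard, using Object identity in place of IBinder (all names are illustrative):

```java
import java.util.IdentityHashMap;
import java.util.Map;

public class StreamMapModel {
    // mStreamMap analogue: at most one stream per unique producer "binder".
    private final Map<Object, Integer> streamMap = new IdentityHashMap<>();
    private int nextStreamId = 0;

    // Returns the new stream id, or -1 if this surface already has a stream
    // (ERROR_ALREADY_EXISTS in the real code).
    int createStream(Object producerBinder) {
        if (streamMap.containsKey(producerBinder)) {
            return -1;
        }
        int id = nextStreamId++;
        streamMap.put(producerBinder, id);
        return id;
    }

    public static void main(String[] args) {
        StreamMapModel m = new StreamMapModel();
        Object previewSurface = new Object();
        Object jpegSurface = new Object();
        System.out.println(m.createStream(previewSurface)); // 0
        System.out.println(m.createStream(jpegSurface));    // 1
        System.out.println(m.createStream(previewSurface)); // -1: already has a stream
    }
}
```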
The second parameter is an output parameter: the id of the stream we just configured. Ignoring the incidental logic, the method boils down to the for (auto& bufferProducer : bufferProducers) loop and the mDevice->createStream call. The loop takes the GraphicBufferProducers obtained from outputConfiguration.getGraphicBufferProducers() and creates the corresponding Surface for each, validating them along the way; the valid ones are appended to the surfaces vector, and then mDevice->createStream carries the stream creation forward.
This is where some knowledge of Android's display system helps. The buffers that ultimately get drawn on screen are allocated in graphics memory (everything else lives in ordinary memory), and buffer management involves two modules: the framebuffer, which puts rendered buffers onto the screen, and gralloc, which allocates them. The buffers rotated during camera preview are no exception: at the bottom they are allocated by gralloc, described at the native layer by a private_handle_t pointer, and wrapped by several layers above. These buffers are shared, but at any moment of their lifecycle they have exactly one owner, and ownership keeps moving between roles; this is Android's classic producer-consumer model, with BufferProducer as the producer and BufferConsumer as the consumer. Each buffer is locked while ownership transfers, so nothing else can modify it mid-handoff, which is how buffer synchronization is guaranteed.
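That ownership hand-off can be illustrated with two queues cycling a fixed pool of buffers between a producer and a consumer, the way a BufferQueue does. This is only an analogy in plain Java, not the real gralloc/BufferQueue code:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BufferRotation {
    // Cycle a fixed pool of `poolSize` buffers producer -> consumer -> back,
    // `frames` times; returns the sum of the frame numbers the consumer saw.
    static int rotate(int poolSize, int frames) throws InterruptedException {
        BlockingQueue<int[]> free = new ArrayBlockingQueue<>(poolSize);
        BlockingQueue<int[]> filled = new ArrayBlockingQueue<>(poolSize);
        for (int i = 0; i < poolSize; i++) free.put(new int[1]);

        Thread producer = new Thread(() -> {
            try {
                for (int frame = 1; frame <= frames; frame++) {
                    int[] buf = free.take();   // dequeueBuffer: take ownership
                    buf[0] = frame;            // "render" into the buffer
                    filled.put(buf);           // queueBuffer: hand to the consumer
                }
            } catch (InterruptedException ignored) { }
        });
        producer.start();

        int consumed = 0;
        for (int frame = 1; frame <= frames; frame++) {
            int[] buf = filled.take();         // acquireBuffer
            consumed += buf[0];                // "display" the frame
            free.put(buf);                     // releaseBuffer: return to the pool
        }
        producer.join();
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(rotate(3, 5)); // 15: frames 1..5 each seen exactly once
    }
}
```

At every instant each buffer sits in exactly one queue, i.e. has exactly one owner, which is the synchronization property described above.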
We'll skip the rest of the loop's details and look at mDevice->createStream. From the openCamera analysis we already know mDevice is of type Camera3Device, so let's look at the createStream method of frameworks\av\services\camera\libcameraservice\device3\Camera3Device.cpp; the source is as follows:
status_t Camera3Device::createStream(sp<Surface> consumer,
uint32_t width, uint32_t height, int format,
android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
int streamSetId, bool isShared, uint32_t consumerUsage) {
ATRACE_CALL();
if (consumer == nullptr) {
ALOGE("%s: consumer must not be null", __FUNCTION__);
return BAD_VALUE;
}
std::vector<sp<Surface>> consumers;
consumers.push_back(consumer);
return createStream(consumers, /*hasDeferredConsumer*/ false, width, height,
format, dataSpace, rotation, id, streamSetId, isShared, consumerUsage);
}
status_t Camera3Device::createStream(const std::vector<sp<Surface>>& consumers,
bool hasDeferredConsumer, uint32_t width, uint32_t height, int format,
android_dataspace dataSpace, camera3_stream_rotation_t rotation, int *id,
int streamSetId, bool isShared, uint32_t consumerUsage) {
ATRACE_CALL();
Mutex::Autolock il(mInterfaceLock);
Mutex::Autolock l(mLock);
ALOGV("Camera %s: Creating new stream %d: %d x %d, format %d, dataspace %d rotation %d"
" consumer usage 0x%x, isShared %d", mId.string(), mNextStreamId, width, height, format,
dataSpace, rotation, consumerUsage, isShared);
status_t res;
bool wasActive = false;
switch (mStatus) {
case STATUS_ERROR:
CLOGE("Device has encountered a serious error");
return INVALID_OPERATION;
case STATUS_UNINITIALIZED:
CLOGE("Device not initialized");
return INVALID_OPERATION;
case STATUS_UNCONFIGURED:
case STATUS_CONFIGURED:
// OK
break;
case STATUS_ACTIVE:
ALOGV("%s: Stopping activity to reconfigure streams", __FUNCTION__);
res = internalPauseAndWaitLocked();
if (res != OK) {
SET_ERR_L("Can't pause captures to reconfigure streams!");
return res;
}
wasActive = true;
break;
default:
SET_ERR_L("Unexpected status: %d", mStatus);
return INVALID_OPERATION;
}
assert(mStatus != STATUS_ACTIVE);
sp<Camera3OutputStream> newStream;
if (consumers.size() == 0 && !hasDeferredConsumer) {
ALOGE("%s: Number of consumers cannot be smaller than 1", __FUNCTION__);
return BAD_VALUE;
}
if (hasDeferredConsumer && format != HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) {
ALOGE("Deferred consumer stream creation only support IMPLEMENTATION_DEFINED format");
return BAD_VALUE;
}
if (format == HAL_PIXEL_FORMAT_BLOB) {
ssize_t blobBufferSize;
if (dataSpace != HAL_DATASPACE_DEPTH) {
blobBufferSize = getJpegBufferSize(width, height);
if (blobBufferSize <= 0) {
SET_ERR_L("Invalid jpeg buffer size %zd", blobBufferSize);
return BAD_VALUE;
}
} else {
blobBufferSize = getPointCloudBufferSize();
if (blobBufferSize <= 0) {
SET_ERR_L("Invalid point cloud buffer size %zd", blobBufferSize);
return BAD_VALUE;
}
}
newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
width, height, blobBufferSize, format, dataSpace, rotation,
mTimestampOffset, streamSetId);
} else if (format == HAL_PIXEL_FORMAT_RAW_OPAQUE) {
ssize_t rawOpaqueBufferSize = getRawOpaqueBufferSize(width, height);
if (rawOpaqueBufferSize <= 0) {
SET_ERR_L("Invalid RAW opaque buffer size %zd", rawOpaqueBufferSize);
return BAD_VALUE;
}
newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
width, height, rawOpaqueBufferSize, format, dataSpace, rotation,
mTimestampOffset, streamSetId);
} else if (isShared) {
newStream = new Camera3SharedOutputStream(mNextStreamId, consumers,
width, height, format, consumerUsage, dataSpace, rotation,
mTimestampOffset, streamSetId);
} else if (consumers.size() == 0 && hasDeferredConsumer) {
newStream = new Camera3OutputStream(mNextStreamId,
width, height, format, consumerUsage, dataSpace, rotation,
mTimestampOffset, streamSetId);
} else {
newStream = new Camera3OutputStream(mNextStreamId, consumers[0],
width, height, format, dataSpace, rotation,
mTimestampOffset, streamSetId);
}
newStream->setStatusTracker(mStatusTracker);
newStream->setBufferManager(mBufferManager);
res = mOutputStreams.add(mNextStreamId, newStream);
if (res < 0) {
SET_ERR_L("Can't add new stream to set: %s (%d)", strerror(-res), res);
return res;
}
*id = mNextStreamId++;
mNeedConfig = true;
// Continue captures if active at start
if (wasActive) {
ALOGV("%s: Restarting activity to reconfigure streams", __FUNCTION__);
// Reuse current operating mode for new stream config
res = configureStreamsLocked(mOperatingMode);
if (res != OK) {
CLOGE("Can't reconfigure device for new stream %d: %s (%d)",
mNextStreamId, strerror(-res), res);
return res;
}
internalResumeLocked();
}
ALOGV("Camera %s: Created new stream", mId.string());
return OK;
}
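The if/else chain choosing which stream object to build can be condensed into a small decision function. The format constants below come from the header shown earlier; the labels and the function itself are my sketch of the branch order in Camera3Device::createStream:

```java
public class StreamKind {
    static final int HAL_PIXEL_FORMAT_BLOB = 0x21;                   // 33
    static final int HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED = 0x22; // 34
    static final int HAL_PIXEL_FORMAT_RAW_OPAQUE = 0x24;             // 36

    // Mirrors the branch order: error checks, BLOB, RAW_OPAQUE, shared,
    // deferred (no consumers yet), then the plain output stream.
    static String pick(int format, boolean isShared, int consumerCount, boolean deferred) {
        if (consumerCount == 0 && !deferred) return "error";
        if (deferred && format != HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED) return "error";
        if (format == HAL_PIXEL_FORMAT_BLOB) return "blob";       // JPEG-sized buffer
        if (format == HAL_PIXEL_FORMAT_RAW_OPAQUE) return "rawOpaque";
        if (isShared) return "shared";                            // Camera3SharedOutputStream
        if (consumerCount == 0) return "deferred";                // surface attached later
        return "output";                                          // plain Camera3OutputStream
    }

    public static void main(String[] args) {
        System.out.println(pick(0x21, false, 1, false)); // blob (capture stream)
        System.out.println(pick(0x22, false, 1, false)); // output (preview stream)
        System.out.println(pick(0x22, true, 2, false));  // shared
    }
}
```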
First the Surface passed in from the previous step, the future consumer, is pushed into a vector, and the overloaded createStream does the real work. The width and height parameters are the dimensions of the Surface being configured, and format is its pixel format as discussed above. dataSpace has type android_dataspace and describes how the buffer contents are to be interpreted (color space and encoding) rather than, as one might guess, the buffer's size. Next a Camera3OutputStream local variable is declared: this is the configured stream we have been talking about. The if/else chain creates a different stream object depending on intent; for a capture stream, whose format is HAL_PIXEL_FORMAT_BLOB, the first branch runs and a Camera3OutputStream is created with a JPEG-sized blob buffer. After creation, *id = mNextStreamId++ assigns the output id pointer; this is the current stream's id, which therefore increments per stream. Normally mStatus was set to STATUS_UNCONFIGURED during initializeCommonLocked() via internalUpdateStatusLocked, so the switch hits case STATUS_UNCONFIGURED and simply breaks; the local wasActive stays false, and the method finally returns OK.
That completes the createStream logic. A reminder: createStream runs inside the framework's for loop, and our walk-through covered the configuration of only one Surface. With multiple Surfaces it executes multiple times, and the corresponding Camera3OutputStream logs print multiple times as well, which is very helpful when pinpointing problems.
Once the streams are created and the for loop finishes, configureStreamsChecked in CameraDeviceImpl finally calls mRemoteDevice.endConfigure(operatingMode) to finish the configuration. Its implementation lives in CameraDeviceClient; the source is as follows:
binder::Status CameraDeviceClient::endConfigure(int operatingMode) {
ALOGV("%s: ending configure (%d input stream, %zu output surfaces)",
__FUNCTION__, mInputStream.configured ? 1 : 0,
mStreamMap.size());
binder::Status res;
if (!(res = checkPidStatus(__FUNCTION__)).isOk()) return res;
Mutex::Autolock icl(mBinderSerializationLock);
if (!mDevice.get()) {
return STATUS_ERROR(CameraService::ERROR_DISCONNECTED, "Camera device no longer alive");
}
if (operatingMode < 0) {
String8 msg = String8::format(
"Camera %s: Invalid operating mode %d requested", mCameraIdStr.string(), operatingMode);
ALOGE("%s: %s", __FUNCTION__, msg.string());
return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT,
msg.string());
}
// Sanitize the high speed session against necessary capability bit.
bool isConstrainedHighSpeed = (operatingMode == ICameraDeviceUser::CONSTRAINED_HIGH_SPEED_MODE);
if (isConstrainedHighSpeed) {
CameraMetadata staticInfo = mDevice->info();
camera_metadata_entry_t entry = staticInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
bool isConstrainedHighSpeedSupported = false;
for(size_t i = 0; i < entry.count; ++i) {
uint8_t capability = entry.data.u8[i];
if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO) {
isConstrainedHighSpeedSupported = true;
break;
}
}
if (!isConstrainedHighSpeedSupported) {
String8 msg = String8::format(
"Camera %s: Try to create a constrained high speed configuration on a device"
" that doesn't support it.", mCameraIdStr.string());
ALOGE("%s: %s", __FUNCTION__, msg.string());
return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT,
msg.string());
}
}
status_t err = mDevice->configureStreams(operatingMode);
if (err == BAD_VALUE) {
String8 msg = String8::format("Camera %s: Unsupported set of inputs/outputs provided",
mCameraIdStr.string());
ALOGE("%s: %s", __FUNCTION__, msg.string());
res = STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.string());
} else if (err != OK) {
String8 msg = String8::format("Camera %s: Error configuring streams: %s (%d)",
mCameraIdStr.string(), strerror(-err), err);
ALOGE("%s: %s", __FUNCTION__, msg.string());
res = STATUS_ERROR(CameraService::ERROR_INVALID_OPERATION, msg.string());
}
return res;
}
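The constrained-high-speed sanity check in endConfigure is a linear scan of the device's AVAILABLE_CAPABILITIES byte array. A standalone sketch of that scan (the value 9 matches the public camera2 constant REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO; treat the exact number as illustrative here):

```java
public class CapabilityCheck {
    // Stand-in for ANDROID_REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO.
    static final int CONSTRAINED_HIGH_SPEED_VIDEO = 9;

    // Mirrors the loop over entry.data.u8 in CameraDeviceClient::endConfigure.
    static boolean supportsConstrainedHighSpeed(byte[] capabilities) {
        for (byte capability : capabilities) {
            if (capability == CONSTRAINED_HIGH_SPEED_VIDEO) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(supportsConstrainedHighSpeed(new byte[]{0, 6, 9})); // true
        System.out.println(supportsConstrainedHighSpeed(new byte[]{0, 6}));    // false
    }
}
```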
After validation, this method simply calls mDevice->configureStreams(operatingMode) for further processing; mDevice is of type Camera3Device, and its configureStreams method is as follows:
status_t Camera3Device::configureStreams(int operatingMode) {
ATRACE_CALL();
ALOGV("%s: E", __FUNCTION__);
Mutex::Autolock il(mInterfaceLock);
Mutex::Autolock l(mLock);
return configureStreamsLocked(operatingMode);
}
This just forwards to configureStreamsLocked; its source is as follows:
status_t Camera3Device::configureStreamsLocked(int operatingMode) {
ATRACE_CALL();
status_t res;
if (mStatus != STATUS_UNCONFIGURED && mStatus != STATUS_CONFIGURED) {
CLOGE("Not idle");
return INVALID_OPERATION;
}
if (operatingMode < 0) {
CLOGE("Invalid operating mode: %d", operatingMode);
return BAD_VALUE;
}
bool isConstrainedHighSpeed =
static_cast<int>(StreamConfigurationMode::CONSTRAINED_HIGH_SPEED_MODE) ==
operatingMode;
if (mOperatingMode != operatingMode) {
mNeedConfig = true;
mIsConstrainedHighSpeedConfiguration = isConstrainedHighSpeed;
mOperatingMode = operatingMode;
}
if (!mNeedConfig) {
ALOGV("%s: Skipping config, no stream changes", __FUNCTION__);
return OK;
}
// Workaround for device HALv3.2 or older spec bug - zero streams requires
// adding a dummy stream instead.
// TODO: Bug: 17321404 for fixing the HAL spec and removing this workaround.
if (mOutputStreams.size() == 0) {
addDummyStreamLocked();
} else {
tryRemoveDummyStreamLocked();
}
// Start configuring the streams
ALOGV("%s: Camera %s: Starting stream configuration", __FUNCTION__, mId.string());
camera3_stream_configuration config;
config.operation_mode = mOperatingMode;
config.num_streams = (mInputStream != NULL) + mOutputStreams.size();
Vector<camera3_stream_t*> streams;
streams.setCapacity(config.num_streams);
if (mInputStream != NULL) {
camera3_stream_t *inputStream;
inputStream = mInputStream->startConfiguration();
if (inputStream == NULL) {
CLOGE("Can't start input stream configuration");
cancelStreamsConfigurationLocked();
return INVALID_OPERATION;
}
streams.add(inputStream);
}
for (size_t i = 0; i < mOutputStreams.size(); i++) {
// Don't configure bidi streams twice, nor add them twice to the list
if (mOutputStreams[i].get() ==
static_cast<Camera3StreamInterface*>(mInputStream.get())) {
config.num_streams--;
continue;
}
camera3_stream_t *outputStream;
outputStream = mOutputStreams.editValueAt(i)->startConfiguration();
if (outputStream == NULL) {
CLOGE("Can't start output stream configuration");
cancelStreamsConfigurationLocked();
return INVALID_OPERATION;
}
streams.add(outputStream);
}
config.streams = streams.editArray();
// Do the HAL configuration; will potentially touch stream
// max_buffers, usage, priv fields.
res = mInterface->configureStreams(&config);
if (res == BAD_VALUE) {
// HAL rejected this set of streams as unsupported, clean up config
// attempt and return to unconfigured state
CLOGE("Set of requested inputs/outputs not supported by HAL");
cancelStreamsConfigurationLocked();
return BAD_VALUE;
} else if (res != OK) {
// Some other kind of error from configure_streams - this is not
// expected
SET_ERR_L("Unable to configure streams with HAL: %s (%d)",
strerror(-res), res);
return res;
}
// Finish all stream configuration immediately.
// TODO: Try to relax this later back to lazy completion, which should be
// faster
if (mInputStream != NULL && mInputStream->isConfiguring()) {
res = mInputStream->finishConfiguration();
if (res != OK) {
CLOGE("Can't finish configuring input stream %d: %s (%d)",
mInputStream->getId(), strerror(-res), res);
cancelStreamsConfigurationLocked();
return BAD_VALUE;
}
}
for (size_t i = 0; i < mOutputStreams.size(); i++) {
sp<Camera3OutputStreamInterface> outputStream =
mOutputStreams.editValueAt(i);
if (outputStream->isConfiguring() && !outputStream->isConsumerConfigurationDeferred()) {
res = outputStream->finishConfiguration();
if (res != OK) {
CLOGE("Can't finish configuring output stream %d: %s (%d)",
outputStream->getId(), strerror(-res), res);
cancelStreamsConfigurationLocked();
return BAD_VALUE;
}
}
}
// Request thread needs to know to avoid using repeat-last-settings protocol
// across configure_streams() calls
mRequestThread->configurationComplete(mIsConstrainedHighSpeedConfiguration);
char value[PROPERTY_VALUE_MAX];
property_get("camera.fifo.disable", value, "0");
int32_t disableFifo = atoi(value);
if (disableFifo != 1) {
// Boost priority of request thread to SCHED_FIFO.
pid_t requestThreadTid = mRequestThread->getTid();
res = requestPriority(getpid(), requestThreadTid,
kRequestThreadPriority, /*isForApp*/ false, /*asynchronous*/ false);
if (res != OK) {
ALOGW("Can't set realtime priority for request processing thread: %s (%d)",
strerror(-res), res);
} else {
ALOGD("Set real time priority for request queue thread (tid %d)", requestThreadTid);
}
}
// Update device state
mNeedConfig = false;
internalUpdateStatusLocked((mDummyStreamId == NO_STREAM) ?
STATUS_CONFIGURED : STATUS_UNCONFIGURED);
ALOGV("%s: Camera %s: Stream configuration complete", __FUNCTION__, mId.string());
// tear down the deleted streams after configure streams.
mDeletedStreams.clear();
return OK;
}
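The stream-list assembly in configureStreamsLocked boils down to a small counting rule: one input stream if present, plus every output stream, with a bidirectional stream (the same object serving as both input and output) counted only once. A minimal sketch of just that rule, under the assumption that plain pointers stand in for the sp<> handles (Stream and countHalStreams are hypothetical names, not real AOSP types):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the num_streams bookkeeping in
// configureStreamsLocked(): start with (input present) + outputs,
// then subtract any output that is the same object as the input,
// because a bidirectional stream must not be configured twice.
struct Stream { int id; };

size_t countHalStreams(const Stream* input, const std::vector<const Stream*>& outputs) {
    size_t num = (input != nullptr) + outputs.size();
    for (const Stream* out : outputs) {
        if (out == input) num--;  // don't add bidi streams twice to the list
    }
    return num;
}
```

This is why the loop in the real code decrements config.num_streams and skips the entry instead of calling startConfiguration on it a second time.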
configureStreamsLocked hands the configuration down to the HAL layer via res = mInterface->configureStreams(&config). Since the HAL logic differs from vendor to vendor, we won't dig into it here; feel free to study your own vendor's implementation. Once the HAL configuration succeeds, each stream's configuration is finalized with outputStream->finishConfiguration(), which is implemented in the base class frameworks\av\services\camera\libcameraservice\device3\Camera3Stream.cpp. Its source is as follows:
status_t Camera3Stream::finishConfiguration() {
    ATRACE_CALL();
    Mutex::Autolock l(mLock);
    switch (mState) {
        case STATE_ERROR:
            ALOGE("%s: In error state", __FUNCTION__);
            return INVALID_OPERATION;
        case STATE_IN_CONFIG:
        case STATE_IN_RECONFIG:
            // OK
            break;
        case STATE_CONSTRUCTED:
        case STATE_CONFIGURED:
            ALOGE("%s: Cannot finish configuration that hasn't been started",
                    __FUNCTION__);
            return INVALID_OPERATION;
        default:
            ALOGE("%s: Unknown state", __FUNCTION__);
            return INVALID_OPERATION;
    }

    // Register for idle tracking
    sp<StatusTracker> statusTracker = mStatusTracker.promote();
    if (statusTracker != 0) {
        mStatusId = statusTracker->addComponent();
    }

    // Check if the stream configuration is unchanged, and skip reallocation if
    // so. As documented in hardware/camera3.h:configure_streams().
    if (mState == STATE_IN_RECONFIG &&
            mOldUsage == camera3_stream::usage &&
            mOldMaxBuffers == camera3_stream::max_buffers) {
        mState = STATE_CONFIGURED;
        return OK;
    }

    // Reset prepared state, since buffer config has changed, and existing
    // allocations are no longer valid
    mPrepared = false;
    mStreamUnpreparable = false;

    status_t res;
    res = configureQueueLocked();
    if (res != OK) {
        ALOGE("%s: Unable to configure stream %d queue: %s (%d)",
                __FUNCTION__, mId, strerror(-res), res);
        mState = STATE_ERROR;
        return res;
    }

    mState = STATE_CONFIGURED;

    return res;
}
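One detail worth pausing on in finishConfiguration is the skip-reallocation check: if the stream is being reconfigured and the HAL pass changed neither the usage flags nor max_buffers, the existing buffer queue is still valid, so the stream jumps straight to STATE_CONFIGURED without touching the queue. A hedged sketch of just that predicate (the enum values and canSkipReallocation are hypothetical stand-ins for the real member state):

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of the skip check in Camera3Stream::finishConfiguration():
// reallocation is skipped only on a *re*configuration whose usage and
// max_buffers came back from the HAL unchanged.
enum State { STATE_IN_CONFIG, STATE_IN_RECONFIG, STATE_CONFIGURED };

bool canSkipReallocation(State s, uint32_t oldUsage, uint32_t newUsage,
                         uint32_t oldMaxBuffers, uint32_t newMaxBuffers) {
    return s == STATE_IN_RECONFIG &&
           oldUsage == newUsage &&
           oldMaxBuffers == newMaxBuffers;
}
```

A first-time configuration (STATE_IN_CONFIG) always falls through to configureQueueLocked, since no buffers exist yet to reuse.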
finishConfiguration then calls configureQueueLocked, which takes us back into the subclass. Let's look at the configureQueueLocked method of frameworks\av\services\camera\libcameraservice\device3\Camera3OutputStream.cpp:
status_t Camera3OutputStream::configureQueueLocked() {
    status_t res;

    mTraceFirstBuffer = true;
    if ((res = Camera3IOStreamBase::configureQueueLocked()) != OK) {
        return res;
    }

    if ((res = configureConsumerQueueLocked()) != OK) {
        return res;
    }

    // Set dequeueBuffer/attachBuffer timeout if the consumer is not hw composer or hw texture.
    // We need skip these cases as timeout will disable the non-blocking (async) mode.
    if (!(isConsumedByHWComposer() || isConsumedByHWTexture())) {
        mConsumer->setDequeueTimeout(kDequeueBufferTimeout);
    }

    return OK;
}
This in turn calls configureConsumerQueueLocked to do the real work; its source is as follows:
status_t Camera3OutputStream::configureConsumerQueueLocked() {
    status_t res;

    mTraceFirstBuffer = true;

    ALOG_ASSERT(mConsumer != 0, "mConsumer should never be NULL");

    // Configure consumer-side ANativeWindow interface. The listener may be used
    // to notify buffer manager (if it is used) of the returned buffers.
    res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA,
            /*listener*/mBufferReleasedListener,
            /*reportBufferRemoval*/true);
    if (res != OK) {
        ALOGE("%s: Unable to connect to native window for stream %d",
                __FUNCTION__, mId);
        return res;
    }

    mConsumerName = mConsumer->getConsumerName();

    res = native_window_set_usage(mConsumer.get(), camera3_stream::usage);
    if (res != OK) {
        ALOGE("%s: Unable to configure usage %08x for stream %d",
                __FUNCTION__, camera3_stream::usage, mId);
        return res;
    }

    res = native_window_set_scaling_mode(mConsumer.get(),
            NATIVE_WINDOW_SCALING_MODE_SCALE_TO_WINDOW);
    if (res != OK) {
        ALOGE("%s: Unable to configure stream scaling: %s (%d)",
                __FUNCTION__, strerror(-res), res);
        return res;
    }

    if (mMaxSize == 0) {
        // For buffers of known size
        res = native_window_set_buffers_dimensions(mConsumer.get(),
                camera3_stream::width, camera3_stream::height);
    } else {
        // For buffers with bounded size
        res = native_window_set_buffers_dimensions(mConsumer.get(),
                mMaxSize, 1);
    }
    if (res != OK) {
        ALOGE("%s: Unable to configure stream buffer dimensions"
                " %d x %d (maxSize %zu) for stream %d",
                __FUNCTION__, camera3_stream::width, camera3_stream::height,
                mMaxSize, mId);
        return res;
    }

    res = native_window_set_buffers_format(mConsumer.get(),
            camera3_stream::format);
    if (res != OK) {
        ALOGE("%s: Unable to configure stream buffer format %#x for stream %d",
                __FUNCTION__, camera3_stream::format, mId);
        return res;
    }

    res = native_window_set_buffers_data_space(mConsumer.get(),
            camera3_stream::data_space);
    if (res != OK) {
        ALOGE("%s: Unable to configure stream dataspace %#x for stream %d",
                __FUNCTION__, camera3_stream::data_space, mId);
        return res;
    }

    int maxConsumerBuffers;
    res = static_cast<ANativeWindow*>(mConsumer.get())->query(
            mConsumer.get(),
            NATIVE_WINDOW_MIN_UNDEQUEUED_BUFFERS, &maxConsumerBuffers);
    if (res != OK) {
        ALOGE("%s: Unable to query consumer undequeued"
                " buffer count for stream %d", __FUNCTION__, mId);
        return res;
    }

    ALOGV("%s: Consumer wants %d buffers, HAL wants %d", __FUNCTION__,
            maxConsumerBuffers, camera3_stream::max_buffers);
    if (camera3_stream::max_buffers == 0) {
        ALOGE("%s: Camera HAL requested max_buffer count: %d, requires at least 1",
                __FUNCTION__, camera3_stream::max_buffers);
        return INVALID_OPERATION;
    }

    mTotalBufferCount = maxConsumerBuffers + camera3_stream::max_buffers;
    mHandoutTotalBufferCount = 0;
    mFrameCount = 0;
    mLastTimestamp = 0;
    mUseMonoTimestamp = (isConsumedByHWComposer() | isVideoStream());

    res = native_window_set_buffer_count(mConsumer.get(),
            mTotalBufferCount);
    if (res != OK) {
        ALOGE("%s: Unable to set buffer count for stream %d",
                __FUNCTION__, mId);
        return res;
    }

    res = native_window_set_buffers_transform(mConsumer.get(),
            mTransform);
    if (res != OK) {
        ALOGE("%s: Unable to configure stream transform to %x: %s (%d)",
                __FUNCTION__, mTransform, strerror(-res), res);
        return res;
    }

    /**
     * Camera3 Buffer manager is only supported by HAL3.3 onwards, as the older HALs requires
     * buffers to be statically allocated for internal static buffer registration, while the
     * buffers provided by buffer manager are really dynamically allocated. Camera3Device only
     * sets the mBufferManager if device version is > HAL3.2, which guarantees that the buffer
     * manager setup is skipped in below code. Note that HAL3.2 is also excluded here, as some
     * HAL3.2 devices may not support the dynamic buffer registeration.
     */
    if (mBufferManager != 0 && mSetId > CAMERA3_STREAM_SET_ID_INVALID) {
        uint32_t consumerUsage = 0;
        getEndpointUsage(&consumerUsage);
        StreamInfo streamInfo(
                getId(), getStreamSetId(), getWidth(), getHeight(), getFormat(), getDataSpace(),
                camera3_stream::usage | consumerUsage, mTotalBufferCount,
                /*isConfigured*/true);
        wp<Camera3OutputStream> weakThis(this);
        res = mBufferManager->registerStream(weakThis,
                streamInfo);
        if (res == OK) {
            // Disable buffer allocation for this BufferQueue, buffer manager will take over
            // the buffer allocation responsibility.
            mConsumer->getIGraphicBufferProducer()->allowAllocation(false);
            mUseBufferManager = true;
        } else {
            ALOGE("%s: Unable to register stream %d to camera3 buffer manager, "
                    "(error %d %s), fall back to BufferQueue for buffer management!",
                    __FUNCTION__, mId, res, strerror(-res));
        }
    }

    return OK;
}
One line here deserves special attention: res = mConsumer->connect(NATIVE_WINDOW_API_CAMERA, /*listener*/mBufferReleasedListener, /*reportBufferRemoval*/true). mConsumer is the Surface we supplied when configuring the stream, and this is where the producer connection is made. If another client is already connected to this Surface's IGraphicBufferProducer, the connect call fails and the whole stream configuration fails with it, so take care here; the failure is logged, which helps when tracking down problems. ANativeWindow is the native window type used by OpenGL ES; on Android there are generally two implementations: Surface, used on the application side, and FramebufferNativeWindow, used by SurfaceFlinger to drive the display.
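The buffer budgeting in configureConsumerQueueLocked is also worth restating: the window's buffer count is sized to the consumer's minimum undequeued buffer count (queried via NATIVE_WINDOW_MIN_UNDEQUEUED_BUFFERS) plus the HAL's requested max_buffers, so that neither side starves the other, and a HAL reporting max_buffers of 0 is rejected outright. A simplified sketch (computeTotalBufferCount is a hypothetical stand-in; the real code returns INVALID_OPERATION rather than -1):

```cpp
#include <cassert>

// Hypothetical sketch of the buffer budgeting in
// Camera3OutputStream::configureConsumerQueueLocked(): total buffers =
// consumer's minimum undequeued count + HAL's max_buffers, with
// max_buffers == 0 treated as a HAL error.
int computeTotalBufferCount(int maxConsumerBuffers, int halMaxBuffers) {
    if (halMaxBuffers == 0) return -1;  // INVALID_OPERATION in the real code
    return maxConsumerBuffers + halMaxBuffers;
}
```

The sum is then pushed back to the window with native_window_set_buffer_count, which is why an overly greedy HAL max_buffers directly inflates graphics memory usage for the stream.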
Good — that completes our walk through createCaptureSession. It is the single most important precondition for preview: if the session is created successfully, preview will normally come up fine; if it fails, the preview is guaranteed to stay black. So if you ever run into a black camera preview, this is the first place to suspect. Once the session is configured, the framework notifies the application through CameraCaptureSession.StateCallback's public abstract void onConfigured(@NonNull CameraCaptureSession session) that the session is ready. We can then take the CameraCaptureSession parameter from that callback and call its setRepeatingRequest method to start preview; once that request is running, preview frames start flowing.
One more thing worth mentioning: the Android 8.0 sources add a rather clever method to CameraCaptureSession named finalizeOutputConfigurations, which can shorten camera startup time. The idea is this: normally the preview Surface must be supplied when the session is created, but with this API you can initially construct an OutputConfiguration with only the preview size and send it down to the CameraServer process; later, once the preview Surface is actually ready, you call finalizeOutputConfigurations to attach it. Preview still works correctly, but the time spent waiting for the preview Surface to become ready is taken off the critical path, achieving the goal of a faster camera launch.
There is still a large chunk of session-creation logic in the HAL layer. I analyzed part of it in my company's chip-vendor sources, but without access to them at home, this is as far as we can go. If you have read through the Android camera sources, you will appreciate just how complex the camera stack is: the framework part (the Java layer plus the native CameraServer process) is actually the simpler half; below it, the CameraDaemon process runs the vendor algorithm and ISP logic, and below that sit the kernel and drivers. Getting through all of it takes real effort.
That's all for today. Good night!