
      Getting raw YUV420p camera data with the Android NDK

      First of all, frameworks/av/camera/Camera.cpp is deprecated; don't use it for new code. Of course, migrating off the old Camera API is not cheap either, so most companies never do it.
      This post first introduces the common raw data formats, then shows how to use the NDK camera API; the next post will dig into the source code.
      The overall pipeline looks like this:
      CameraManager → CameraService → Camera HAL v3 → Sensor/Driver.

      Common raw video data formats

      A video is essentially a sequence of still images. Thanks to the persistence of vision, at 24 frames per second the eye can no longer pick out individual frames.
      Encoding uses algorithms to model the relationships between frames and compress them.
      Decoding is the reverse: the compressed data is restored to individual frames, which are then played back.

      yuv420p

      This is the most common format. An example:
      A 4x2-pixel image is stored as follows.
      First the Y plane, with one sample per pixel:
      YYYY
      YYYY
      Next the U plane; every four Y samples share one U sample:
      UU
      Then the V plane, likewise:
      VV
      In memory the buffer ends up as:

      YYYY
      YYYY
      UU
      VV
      

      A 5x3-pixel image is stored as follows.
      Again the Y plane has one sample per pixel, followed by the chroma planes:

      YYYYY
      YYYYY
      YYYYY
      UUU
      UUU
      VVV
      VVV

      For odd dimensions the chroma planes round up: each is ceil(5/2) x ceil(3/2) = 3 x 2 samples.
      In total the frame occupies 15 + 6 + 6 = 27 bytes in memory.
      YU12

      YU12 (also known as I420) stores all of Y, then the whole U plane, then the whole V plane. For a 4x8 image:

      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      UU
      UU
      UU
      UU
      VV
      VV
      VV
      VV

      YV12

      YV12 is the same, except the V plane comes before the U plane:

      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      VV
      VV
      VV
      VV
      UU
      UU
      UU
      UU

      yuv420sp

      The difference from yuv420p: in yuv420p the U and V samples live in two separate planes, while in yuv420sp they are interleaved in a single chroma plane.
      yuv420sp comes in two variants, NV12 and NV21.
      NV12
      4x8

      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      UV
      UV
      UV
      UV
      

      NV21
      4x8

      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      YYYY
      VU
      VU
      VU
      VU
      

      Wrapping it up in code

      CMake

      
      # For more information about using CMake with Android Studio, read the
      # documentation: https://d.android.com/studio/projects/add-native-code.html.
      # For more examples on how to use CMake, see https://github.com/android/ndk-samples.
      
      # Sets the minimum CMake version required for this project.
      cmake_minimum_required(VERSION 3.22.1)
      
      # Declares the project name. The project name can be accessed via ${ PROJECT_NAME},
      # Since this is the top level CMakeLists.txt, the project name is also accessible
      # with ${CMAKE_PROJECT_NAME} (both CMake variables are in-sync within the top level
      # build script scope).
      project(openslLearn VERSION 0.1.0 LANGUAGES C CXX)
      
      # Set the C++ standard
      set(CMAKE_CXX_STANDARD 23)  # use the C++23 standard
      set(CMAKE_CXX_STANDARD_REQUIRED ON)  # fail if the requested standard is unavailable
      set(CMAKE_CXX_EXTENSIONS OFF)        # disable compiler extensions (pure standard C++)
      
      # Creates and names a library, sets it as either STATIC
      # or SHARED, and provides the relative paths to its source code.
      # You can define multiple libraries, and CMake builds them for you.
      # Gradle automatically packages shared libraries with your APK.
      #
      # In this top level CMakeLists.txt, ${CMAKE_PROJECT_NAME} is used to define
      # the target library name; in the sub-module's CMakeLists.txt, ${PROJECT_NAME}
      # is preferred for the same purpose.
      #
      # In order to load a library into your app from Java/Kotlin, you must call
      # System.loadLibrary() and pass the name of the library defined here;
      # for GameActivity/NativeActivity derived applications, the same library name must be
      # used in the AndroidManifest.xml file.
      
      # First library
      # Collect its source files
      file(GLOB_RECURSE LEARN01_SOURCES CONFIGURE_DEPENDS
              "src/learn01/*.cpp"
              "src/learn01/*.c"
      )
      add_library(${CMAKE_PROJECT_NAME} SHARED ${LEARN01_SOURCES})
      
      # Header include paths
      target_include_directories(${CMAKE_PROJECT_NAME}
              PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn01
              PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
      )
      
      # Specifies libraries CMake should link to your target library. You
      # can link libraries from various origins, such as libraries defined in this
      # build script, prebuilt third-party libraries, or Android system libraries.
      target_link_libraries(${CMAKE_PROJECT_NAME}
          # List libraries link to the target library
          android
          log
          OpenSLES
      )
      
      # Second library (openslLearn2)
      file(GLOB_RECURSE LEARN02_SOURCES CONFIGURE_DEPENDS
              "src/learn02/*.cpp"
              "src/learn02/*.c"
              "src/sqlite/*.cpp"
              "src/sqlite/*.c"
      )
      set(LIBRARY_NAME2 ${CMAKE_PROJECT_NAME}2)
      message("LIBRARY_NAME2: ${LIBRARY_NAME2}")
      add_library(${LIBRARY_NAME2} SHARED ${LEARN02_SOURCES})  # a different source set
      target_include_directories(${LIBRARY_NAME2}
              PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/sqlite
              PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/learn02
              PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include/logging
      )
      find_package (oboe REQUIRED CONFIG)
      target_link_libraries(${LIBRARY_NAME2}
              android
              log
              aaudio
              oboe::oboe
              camera2ndk
              mediandk
      )
      

      Header

      //
      // Created by 29051 on 2025/10/25.
      //
      
      #ifndef OPENSL_LEARN_CAMERA_HPP
      #define OPENSL_LEARN_CAMERA_HPP
      
      extern "C" {
      #include <camera/NdkCameraManager.h>
      #include <media/NdkImageReader.h>
      }
      
      #include <string>
      #include <fstream>
      
      #include "logging.hpp"
      
      class NDKCamera {
      private:
          int mWidth;
          int wHeight;
          ACameraManager *aCameraManager = nullptr;
          ACameraDevice *device = nullptr;
          ACameraCaptureSession *session = nullptr;
          AImageReader *aImageReader = nullptr;
          ACaptureSessionOutputContainer *aCaptureSessionOutputContainer = nullptr;
          ACaptureSessionOutput *sessionOutput = nullptr;
          std::string yuvPath;
          std::ofstream *yuvStream = nullptr;
      public:
          NDKCamera(int width, int height, std::string yuvPath);
          ~NDKCamera();
          /**
           * Print this camera's capabilities (supported resolutions and fps ranges).
           */
          void printCameraCapabilities(const char * cameraId);
      };
      
      #endif //OPENSL_LEARN_CAMERA_HPP
      

      Source file

      //
      // Created by 29051 on 2025/10/25.
      //
      #include "NDKCamera.hpp"
      
      #include <utility>
      
      const char * const TAG = "NDKCamera";
      
      /**
       * CameraManager → CameraService → Camera HAL v3 → Sensor/Driver
       * @param width
       * @param height
       */
      NDKCamera::NDKCamera(int width, int height, std::string yuvPath) : mWidth(width), wHeight(height), yuvPath(std::move(yuvPath)) {
          logger::info(TAG, "width: %d, height: %d, yuvPath: %s", this -> mWidth, this -> wHeight, this -> yuvPath.c_str());
          this->yuvStream = new std::ofstream(this->yuvPath, std::ios::binary);
          if (!this->yuvStream->is_open()){
              logger::error(TAG, "failed to open the yuv output file");
              return;
          }
          aCameraManager = ACameraManager_create();
          if (aCameraManager == nullptr){
              logger::error(TAG, "aCameraManager is null");
              return;
          }
          ACameraIdList *cameraIdList = nullptr;
          camera_status_t status = ACameraManager_getCameraIdList(aCameraManager, &cameraIdList);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraManager_getCameraIdList failed");
              return;
          }
          if (cameraIdList->numCameras <= 0){
              logger::error(TAG, "this device has no cameras");
              return;
          }
          for(int i = 0; i < cameraIdList->numCameras; i ++ ){
              logger::info(TAG, "index: %d, cameraId: %s", i, cameraIdList->cameraIds[i]);
          }
          // NOTE: index 1 is hard-coded and assumes the device has at least two cameras
          // (typically 0 = back, 1 = front); a robust implementation should check numCameras first.
          const std::string cameraId = cameraIdList->cameraIds[1];
          ACameraManager_deleteCameraIdList(cameraIdList); // free the list once the id has been copied
          this->printCameraCapabilities(cameraId.c_str());
          ACameraDevice_StateCallbacks deviceStateCallbacks = {
                  .context = nullptr,
                  .onDisconnected = [](void*, ACameraDevice* aCameraDevice) -> void {},
                  .onError = [](void*, ACameraDevice* aCameraDevice, int errorCode) -> void {},
          };
          status = ACameraManager_openCamera(aCameraManager, cameraId.c_str(), &deviceStateCallbacks, &device);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraManager_openCamera failed");
              return;
          }
          media_status_t mediaStatus = AImageReader_new(width, height, AIMAGE_FORMAT_YUV_420_888, 4, &aImageReader);
          if (mediaStatus != AMEDIA_OK){
              logger::error(TAG, "AImageReader_new failed");
              return;
          }
          AImageReader_ImageListener imageListener = {
                  .context = this,
                  .onImageAvailable = [](void* context, AImageReader* reader) -> void {
                      AImage *image = nullptr;
                      media_status_t mediaStatus = AImageReader_acquireNextImage(reader, &image);
                      if (mediaStatus != AMEDIA_OK || image == nullptr){
                          logger::error(TAG, "failed to acquire the current yuv frame");
                          AImage_delete(image);
                          return;
                      }
                      int32_t width = 0, height = 0;
                      mediaStatus = AImage_getWidth(image, &width);
                      if (mediaStatus != AMEDIA_OK){
                          logger::error(TAG, "failed to get the frame width");
                          AImage_delete(image);
                          return;
                      }
                      mediaStatus = AImage_getHeight(image, &height);
                      if (mediaStatus != AMEDIA_OK){
                          logger::error(TAG, "failed to get the frame height");
                          AImage_delete(image);
                          return;
                      }
                      // ==========
                      const auto *ndkCamera = reinterpret_cast<NDKCamera*>(context);
                      for (int plane = 0; plane < 3; ++plane) {
                          uint8_t* planeData = nullptr;
                          int planeDataLen = 0;
                          if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
                              logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
                              AImage_delete(image);
                              return;
                          }
                          int rowStride = 0, pixelStride = 0;
                          AImage_getPlaneRowStride(image, plane, &rowStride);
                          AImage_getPlanePixelStride(image, plane, &pixelStride);
      
                          int planeWidth = (plane == 0) ? width : (width + 1) / 2;
                          int planeHeight = (plane == 0) ? height : (height + 1) / 2;
      
                          // Write row by row honoring pixelStride, so the output is contiguous Y, then U, then V
                          for (int y = 0; y < planeHeight; ++y) {
                              const uint8_t* rowPtr = planeData + y * rowStride;
                              if (pixelStride == 1) {
                                  // contiguous: write planeWidth bytes directly
                                  ndkCamera->yuvStream->write(reinterpret_cast<const char*>(rowPtr), planeWidth);
                              } else {
                                  // sparse: pick every pixelStride-th byte
                                  for (int x = 0; x < planeWidth; ++x) {
                                      ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
                                  }
                              }
                          }
                      }
                      AImage_delete(image);
                      logger::info(TAG, "yuv width: %d, height: %d", width, height);
                  },
          };
          AImageReader_setImageListener(aImageReader, &imageListener);
          ANativeWindow* window = nullptr;
          mediaStatus = AImageReader_getWindow(aImageReader, &window);
          if (mediaStatus != AMEDIA_OK){
              logger::error(TAG, "AImageReader_getWindow failed");
              return;
          }
          ACaptureRequest *request = nullptr;
          status = ACameraDevice_createCaptureRequest(device, TEMPLATE_PREVIEW, &request);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraDevice_createCaptureRequest failed");
              return;
          }
          // Set the target frame-rate range
          int32_t range[2] = {30, 30}; // lock to 30 fps
          ACaptureRequest_setEntry_i32(request,
                                       ACAMERA_CONTROL_AE_TARGET_FPS_RANGE,
                                       2, range);
          ACameraOutputTarget *aCameraOutputTarget = nullptr;
          status = ACameraOutputTarget_create(window, &aCameraOutputTarget);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraOutputTarget_create failed");
              return;
          }
          status = ACaptureRequest_addTarget(request, aCameraOutputTarget);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACaptureRequest_addTarget failed");
              return;
          }
          ACameraCaptureSession_stateCallbacks sessionStateCallbacks = {
                  .context = nullptr,
                  .onClosed = [](void* context, ACameraCaptureSession *session) -> void {
                      logger::info(TAG, "onClosed...");
                  },
                  .onReady = [](void* context, ACameraCaptureSession *session) -> void {
                      logger::info(TAG, "onReady...");
                  },
                  .onActive = [](void* context, ACameraCaptureSession *session) -> void {
                      logger::info(TAG, "onActive...");
                  },
          };
          ACameraCaptureSession_captureCallbacks captureCallbacks = {
                  .context = nullptr,
                  .onCaptureStarted = [](void* context, ACameraCaptureSession* session,
                                         const ACaptureRequest* request, int64_t timestamp) -> void {
                      logger::info(TAG, "onCaptureStarted timestamp: %lld", static_cast<long long>(timestamp));
                  },
                  .onCaptureProgressed = [](void* context, ACameraCaptureSession* session,
                                            ACaptureRequest* request, const ACameraMetadata* result) -> void {
                      logger::info(TAG, "onCaptureProgressed...");
                  },
                  .onCaptureCompleted = [](void* context, ACameraCaptureSession* session,
                                           ACaptureRequest* request, const ACameraMetadata* result) -> void {
                      ACameraMetadata_const_entry fpsEntry = {};
                      if (ACameraMetadata_getConstEntry(result,
                                                        ACAMERA_CONTROL_AE_TARGET_FPS_RANGE, &fpsEntry) == ACAMERA_OK) {
                          if (fpsEntry.count >= 2) {
                              int32_t minFps = fpsEntry.data.i32[0];
                              int32_t maxFps = fpsEntry.data.i32[1];
                              logger::info(TAG, "onCaptureCompleted current fps range: [%d, %d]", minFps, maxFps);
                          }
                      }
                  },
                  .onCaptureFailed = [](void* context, ACameraCaptureSession* session,
                                        ACaptureRequest* request, ACameraCaptureFailure* failure) -> void {
                      logger::info(TAG, "onCaptureFailed frameNumber: %lld, reason: %d, sequenceId: %d, wasImageCaptured: %d",
                                   static_cast<long long>(failure->frameNumber), failure->reason, failure->sequenceId, failure->wasImageCaptured);
                  },
                  .onCaptureSequenceCompleted = [](void* context, ACameraCaptureSession* session,
                                                   int sequenceId, int64_t frameNumber) -> void {
                      logger::info(TAG, "onCaptureSequenceCompleted sequenceId: %d, frameNumber: %lld", sequenceId, static_cast<long long>(frameNumber));
                  },
                  .onCaptureSequenceAborted = [](void* context, ACameraCaptureSession* session,
                                                 int sequenceId) -> void {
                      logger::info(TAG, "onCaptureSequenceAborted sequenceId: %d", sequenceId);
                  },
                  .onCaptureBufferLost = [](void* context, ACameraCaptureSession* session,
                                            ACaptureRequest* request, ACameraWindowType* window, int64_t frameNumber) -> void {
                      logger::info(TAG, "onCaptureBufferLost frameNumber: %lld", static_cast<long long>(frameNumber));
                  },
          };
      
          status = ACaptureSessionOutputContainer_create(&aCaptureSessionOutputContainer);
      
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACaptureSessionOutputContainer_create failed");
              return;
          }
          status = ACaptureSessionOutput_create(window, &sessionOutput);
      
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACaptureSessionOutput_create failed");
              return;
          }
          status = ACaptureSessionOutputContainer_add(aCaptureSessionOutputContainer, sessionOutput);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACaptureSessionOutputContainer_add failed");
              return;
          }
          status = ACameraDevice_createCaptureSession(device, aCaptureSessionOutputContainer, &sessionStateCallbacks, &session);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraDevice_createCaptureSession failed");
              return;
          }
      #if __ANDROID_API__ >= 33
          ACameraCaptureSession_captureCallbacksV2 captureCallbacksV2 = {
                  .context = nullptr,
                  .onCaptureStarted = [](void* context, ACameraCaptureSession* session,
                                         const ACaptureRequest* request, int64_t timestamp, int64_t frameNumber) -> void {
      
                  },
                  .onCaptureProgressed = [](void* context, ACameraCaptureSession* session,
                                            ACaptureRequest* request, const ACameraMetadata* result) -> void {
      
                  },
                  .onCaptureCompleted = [](void* context, ACameraCaptureSession* session,
                                           ACaptureRequest* request, const ACameraMetadata* result) -> void {
      
                  },
                  .onCaptureFailed = [](void* context, ACameraCaptureSession* session,
                                        ACaptureRequest* request, ACameraCaptureFailure* failure) -> void {
      
                  },
                  .onCaptureSequenceCompleted = [](void* context, ACameraCaptureSession* session,
                                                   int sequenceId, int64_t frameNumber) -> void {
      
                  },
                  .onCaptureSequenceAborted = [](void* context, ACameraCaptureSession* session,
                                                 int sequenceId) -> void {
      
                  },
                  .onCaptureBufferLost = [](void* context, ACameraCaptureSession* session,
                                            ACaptureRequest* request, ACameraWindowType* window, int64_t frameNumber) -> void {
      
                  },
          };
          status = ACameraCaptureSession_setRepeatingRequestV2(session, &captureCallbacksV2, 1, &request, nullptr);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraCaptureSession_setRepeatingRequestV2 failed");
              return;
          }
      #else
          status = ACameraCaptureSession_setRepeatingRequest(session, &captureCallbacks, 1, &request, nullptr);
          if (status != ACAMERA_OK){
              logger::error(TAG, "ACameraCaptureSession_setRepeatingRequest failed");
              return;
          }
      #endif
      }
      NDKCamera::~NDKCamera() {
          logger::info(TAG, "~NDKCamera...");
          if (this->aImageReader != nullptr){
              AImageReader_delete(this->aImageReader);
          }
          if (session != nullptr){
              ACameraCaptureSession_close(session);
          }
          if (device != nullptr){
              ACameraDevice_close(device);
          }
          if (aCameraManager != nullptr) {
              ACameraManager_delete(aCameraManager);
          }
          if (this->yuvStream != nullptr){
              this->yuvStream->close();
          }
          if (this->aCaptureSessionOutputContainer != nullptr){
              ACaptureSessionOutputContainer_free(this->aCaptureSessionOutputContainer);
          }
          if (this->sessionOutput != nullptr){
              ACaptureSessionOutput_free(this->sessionOutput);
          }
      }
      
      void NDKCamera::printCameraCapabilities(const char * const cameraId){
          ACameraMetadata *metadata = nullptr;
          camera_status_t status = ACameraManager_getCameraCharacteristics(this->aCameraManager, cameraId, &metadata);
          if(status != ACAMERA_OK){
              logger::error(TAG, "failed to get camera characteristics");
              return;
          }
          ACameraMetadata_const_entry entry = {};
          if (ACameraMetadata_getConstEntry(metadata, ACAMERA_SCALER_AVAILABLE_STREAM_CONFIGURATIONS, &entry) == ACAMERA_OK){
              logger::info(TAG, "supported resolutions:");
              for(uint32_t i = 0; i + 3 < entry.count; i += 4){
                  int32_t format = entry.data.i32[i + 0];
                  int32_t width = entry.data.i32[i + 1];
                  int32_t height = entry.data.i32[i + 2];
                  int32_t isInput = entry.data.i32[i + 3];
                  if (isInput == 0 && format == AIMAGE_FORMAT_YUV_420_888){
                      logger::info(TAG, "format: %d, width: %d, height: %d, isInput: %d", format, width, height, isInput);
                  }
              }
          }
          if (ACameraMetadata_getConstEntry(metadata, ACAMERA_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES, &entry) == ACAMERA_OK){
              logger::info(TAG, "supported fps ranges:");
              for (uint32_t i = 0; i + 1 < entry.count; i += 2) {
                  logger::info(TAG, "fps range: [%d, %d]", entry.data.i32[i], entry.data.i32[i + 1]);
              }
          }
          ACameraMetadata_free(metadata);
      }
      

      Exposing it to Kotlin

      extern "C"
      JNIEXPORT jlong JNICALL
      Java_io_github_opensllearn_utils_Utils_initCamera(JNIEnv *env, jobject, jint width, jint height, jstring pcmPath) {
          NDKCamera *ndkCamera = nullptr;
          try {
              jboolean isCopy = JNI_FALSE;
              const char * const pcmPathStr = env->GetStringUTFChars(pcmPath, &isCopy);
              ndkCamera = new NDKCamera(width, height, pcmPathStr);
              // ReleaseStringUTFChars must always be called, regardless of the isCopy flag
              env->ReleaseStringUTFChars(pcmPath, pcmPathStr);
          } catch (const std::exception &e) {
              delete ndkCamera;
              ndkCamera = nullptr;
              env->ThrowNew(env->FindClass("java/lang/RuntimeException"), e.what());
          }
          return reinterpret_cast<jlong>(ndkCamera);
      }
      extern "C"
      JNIEXPORT void JNICALL
      Java_io_github_opensllearn_utils_Utils_releaseCamera(JNIEnv*, jobject, jlong ptr) {
          const auto* const ndkCamera = reinterpret_cast<NDKCamera*>(ptr);
          delete ndkCamera;
      }
      

      That's it. If you later want to render the frames, you can pass a Surface into native code and use OpenGL: convert the yuv420p data to RGB first, then hand it to OpenGL. It is not very complicated.

      Core logic

      for (int plane = 0; plane < 3; ++plane) {
      	uint8_t* planeData = nullptr;
      	int planeDataLen = 0;
      	if (AImage_getPlaneData(image, plane, &planeData, &planeDataLen) != AMEDIA_OK) {
      		logger::error(TAG, "AImage_getPlaneData failed plane=%d", plane);
      		AImage_delete(image);
      		return;
      	}
      	int rowStride = 0, pixelStride = 0;
      	AImage_getPlaneRowStride(image, plane, &rowStride);
      	AImage_getPlanePixelStride(image, plane, &pixelStride);
      
      	int planeWidth = (plane == 0) ? width : (width + 1) / 2;
      	int planeHeight = (plane == 0) ? height : (height + 1) / 2;
      
      	// Write row by row honoring pixelStride, so the output is contiguous Y, then U, then V
      	for (int y = 0; y < planeHeight; ++y) {
      		const uint8_t* rowPtr = planeData + y * rowStride;
      		if (pixelStride == 1) {
      			// contiguous: write planeWidth bytes directly
      			ndkCamera->yuvStream->write(reinterpret_cast<const char*>(rowPtr), planeWidth);
      		} else {
      			// sparse: pick every pixelStride-th byte
      			for (int x = 0; x < planeWidth; ++x) {
      				ndkCamera->yuvStream->put(rowPtr[x * pixelStride]);
      			}
      		}
      	}
      }
      

      AIMAGE_FORMAT_YUV_420_888: the trailing 888 means Y, U and V each occupy 8 bits (one byte).
      This one format covers both the yuv420p and yuv420sp layouts described above.

      int32_t planes = 0;
      AImage_getNumberOfPlanes(image, &planes);
      

      AImage_getNumberOfPlanes returns the number of planes, typically 3 (RGB, YUV) or 4 (ARGB).
      AImage_getPlaneData(image, plane, &planeData, &planeDataLen) fetches the data of the corresponding plane.
      planeData is best thought of as a pointer to a 2D byte array, and planeDataLen is its total length with that 2D array flattened to 1D.
      For example:

      
      planeData
      |
      YYYY
      YYYY
      

      Or:

      planeData
      |
      UPUP
      

      Now for the climax, stay sharp! But first, an AI-generated joke.

      "Grinding on this code until the small hours, the soul-searching hits: what is all this YUV-format and AImage wrangling actually worth? If a rich benefactor stormed in right now, slapped me and said 'stop fighting those pixels', then dragged me off with 'we're going to the Maldives', I'd drag the whole project folder into the recycle bin without a second's hesitation!"

      The dream is over!
      AImage_getPlaneRowStride returns the number of bytes per row, and that can include invalid padding bytes.
      Like this:

      planeData
      |
      UPUP

      P is padding (invalid data), which is why the next function matters.
      AImage_getPlanePixelStride gives the distance, in bytes, between consecutive valid samples within a row.
      When it is greater than 1 you have to write the data out one byte at a time.
      Done.

      android.hardware.Camera source walkthrough

      This API is deprecated, but that doesn't stop us from reading its source.

      open->native_setup

      frameworks/base/core/jni/android_hardware_Camera.cpp

      // connect to camera service
      static jint android_hardware_Camera_native_setup(JNIEnv *env, jobject thiz, jobject weak_this,
                                                       jint cameraId, jint rotationOverride,
                                                       jboolean forceSlowJpegMode,
                                                       jobject jClientAttributionParcel,
                                                       jint devicePolicy) {
          AttributionSourceState clientAttribution;
          if (!attributionSourceStateForJavaParcel(env, jClientAttributionParcel,
                                                   /* useContextAttributionSource= */ true,
                                                   clientAttribution)) {
              return -EACCES;
          }
      
          int targetSdkVersion = android_get_application_target_sdk_version();
          sp<Camera> camera = Camera::connect(cameraId, targetSdkVersion, rotationOverride,
                                              forceSlowJpegMode, clientAttribution, devicePolicy); // 1
          if (camera == NULL) {
              return -EACCES;
          }
          //...
      }
      

      The code at mark 1 connects to the CameraService.
      frameworks/av/camera/Camera.cpp

      sp<Camera> Camera::connect(int cameraId, int targetSdkVersion, int rotationOverride,
              bool forceSlowJpegMode, const AttributionSourceState& clientAttribution,
              int32_t devicePolicy)
      {
          return CameraBaseT::connect(cameraId, targetSdkVersion, rotationOverride,
                  forceSlowJpegMode, clientAttribution, devicePolicy);
      }
      

      frameworks/av/camera/CameraBase.cpp

      template <typename TCam, typename TCamTraits>
      sp<TCam> CameraBase<TCam, TCamTraits>::connect(int cameraId,
                                                     int targetSdkVersion, int rotationOverride,
                                                     bool forceSlowJpegMode,
                                                     const AttributionSourceState& clientAttribution,
                                                     int32_t devicePolicy)
      {
          ALOGV("%s: connect", __FUNCTION__);
          sp<TCam> c = new TCam(cameraId);
          sp<TCamCallbacks> cl = c;
          const sp<::android::hardware::ICameraService> cs = getCameraService();
      
          binder::Status ret;
          if (cs != nullptr) {
              TCamConnectService fnConnectService = TCamTraits::fnConnectService; // 1
              ALOGI("Connect camera (legacy API) - rotationOverride %d, forceSlowJpegMode %d",
                      rotationOverride, forceSlowJpegMode);
              ret = (cs.get()->*fnConnectService)(cl, cameraId, targetSdkVersion,
                      rotationOverride, forceSlowJpegMode, clientAttribution, devicePolicy,
                      /*out*/ &c->mCamera);
          }
          if (ret.isOk() && c->mCamera != nullptr) {
              IInterface::asBinder(c->mCamera)->linkToDeath(c);
              c->mStatus = NO_ERROR;
          } else {
              ALOGW("An error occurred while connecting to camera %d: %s", cameraId,
                      (cs == nullptr) ? "Service not available" : ret.toString8().c_str());
              c.clear();
          }
          return c;
      }
      // establish binder interface to camera service
      namespace {
          sp<::android::hardware::ICameraService> gCameraService;
          const char*               kCameraServiceName      = "media.camera";
          // ...
      }
      template <typename TCam, typename TCamTraits>
      const sp<::android::hardware::ICameraService> CameraBase<TCam, TCamTraits>::getCameraService()
      {
          Mutex::Autolock _l(gLock);
          if (gCameraService.get() == 0) {
              if (CameraUtils::isCameraServiceDisabled()) {
                  return gCameraService;
              }
      
              sp<IServiceManager> sm = defaultServiceManager();
              sp<IBinder> binder;
              binder = sm->waitForService(toString16(kCameraServiceName));
              if (binder == nullptr) {
                  return nullptr;
              }
              if (gDeathNotifier == NULL) {
                  gDeathNotifier = new DeathNotifier();
              }
              binder->linkToDeath(gDeathNotifier);
              gCameraService = interface_cast<::android::hardware::ICameraService>(binder);
          }
          ALOGE_IF(gCameraService == 0, "no CameraService!?");
          return gCameraService;
      }
      

      At mark 1:
      frameworks/av/camera/Camera.cpp

      CameraTraits<Camera>::TCamConnectService CameraTraits<Camera>::fnConnectService =
              &::android::hardware::ICameraService::connect;
      

      This is where the Binder IPC call from BpCameraService to BnCameraService happens; CameraService implements BnCameraService.
      frameworks/av/services/camera/libcameraservice/CameraService.h

      class CameraService :
          public BinderService<CameraService>,
          public virtual ::android::hardware::BnCameraService,
          public virtual IBinder::DeathRecipient,
          public virtual CameraProviderManager::StatusListener,
          public virtual IServiceManager::LocalRegistrationCallback,
          public AttributionAndPermissionUtilsEncapsulator
      {
          friend class BinderService<CameraService>;
          friend class CameraOfflineSessionClient;
          // ...
      }
      

      frameworks/av/services/camera/libcameraservice/CameraService.cpp

      Status CameraService::connect(
              const sp<ICameraClient>& cameraClient,
              int api1CameraId,
              int targetSdkVersion,
              int rotationOverride,
              bool forceSlowJpegMode,
              const AttributionSourceState& clientAttribution,
              int32_t devicePolicy,
              /*out*/
              sp<ICamera>* device) {
          ATRACE_CALL();
          Status ret = Status::ok();
      
          std::string cameraIdStr =
                  cameraIdIntToStr(api1CameraId, clientAttribution.deviceId, devicePolicy);
          if (cameraIdStr.empty()) {
              std::string msg = fmt::sprintf("Camera %d: Invalid camera id for device id %d",
                      api1CameraId, clientAttribution.deviceId);
              ALOGE("%s: %s", __FUNCTION__, msg.c_str());
              return STATUS_ERROR(CameraService::ERROR_ILLEGAL_ARGUMENT, msg.c_str());
          }
      
          std::string clientPackageNameMaybe = clientAttribution.packageName.value_or("");
          bool isNonSystemNdk = clientPackageNameMaybe.size() == 0;
      
          AttributionSourceState resolvedClientAttribution(clientAttribution);
          ret = resolveAttributionSource(resolvedClientAttribution, __FUNCTION__, cameraIdStr);
          if (!ret.isOk()) {
              logRejected(cameraIdStr, getCallingPid(),
                          clientAttribution.packageName.value_or(kUnknownPackageName),
                          toStdString(ret.toString8()));
              return ret;
          }
      
          const int clientPid = resolvedClientAttribution.pid;
          const int clientUid = resolvedClientAttribution.uid;
          const std::string& clientPackageName = *resolvedClientAttribution.packageName;
      
          logConnectionAttempt(clientPid, clientPackageName, cameraIdStr, API_1);
      
          sp<Client> client = nullptr;
          ret = connectHelper<ICameraClient, Client>(
                  cameraClient, cameraIdStr, api1CameraId, resolvedClientAttribution,
                  /*systemNativeClient*/ false, API_1,
                  /*shimUpdateOnly*/ false, /*oomScoreOffset*/ 0, targetSdkVersion, rotationOverride,
                  forceSlowJpegMode, cameraIdStr, isNonSystemNdk, /*sharedMode*/false,
                  /*isVendorClient*/ false, /*out*/ client); // 1
      
          if (!ret.isOk()) {
              logRejected(cameraIdStr, getCallingPid(),
                          clientAttribution.packageName.value_or(kUnknownPackageName),
                          toStdString(ret.toString8()));
              return ret;
          }
      
          *device = client;
      
          const sp<IServiceManager> sm(defaultServiceManager());
          const auto& mActivityManager = getActivityManager();
          if (mActivityManager) {
              mActivityManager->logFgsApiBegin(LOG_FGS_CAMERA_API,
                  getCallingUid(),
                  getCallingPid());
          }
      
          return ret;
      }
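Every failure path in connect() is reported through a binder::Status carrying an error code plus a formatted message (the STATUS_ERROR / STATUS_ERROR_FMT macros), so the Java/NDK caller gets a precise reason. A reduced sketch of that return-status convention, with a stand-in Status type and an illustrative error value rather than the real android::binder::Status:

```cpp
#include <cassert>
#include <string>

// Stand-in for binder::Status: 0 means OK, otherwise code + message.
struct Status {
    int code = 0;
    std::string message;
    bool isOk() const { return code == 0; }
    static Status ok() { return {}; }
    static Status error(int c, std::string msg) { return {c, std::move(msg)}; }
};

constexpr int ERROR_ILLEGAL_ARGUMENT = 3;  // value is illustrative

// Mirrors the first validation in connect(): reject an empty camera id
// before doing any real work, with a descriptive Status.
Status validateCameraId(const std::string& cameraIdStr) {
    if (cameraIdStr.empty()) {
        return Status::error(ERROR_ILLEGAL_ARGUMENT, "Invalid camera id");
    }
    return Status::ok();
}
```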
      template <class CALLBACK, class CLIENT>
      Status CameraService::connectHelper(const sp<CALLBACK>& cameraCb, const std::string& cameraId,
                                          int api1CameraId,
                                          const AttributionSourceState& clientAttribution,
                                          bool systemNativeClient, apiLevel effectiveApiLevel,
                                          bool shimUpdateOnly, int oomScoreOffset, int targetSdkVersion,
                                          int rotationOverride, bool forceSlowJpegMode,
                                          const std::string& originalCameraId, bool isNonSystemNdk,
                                          bool sharedMode, bool isVendorClient,
                                          /*out*/ sp<CLIENT>& device) {
          binder::Status ret = binder::Status::ok();
      
          nsecs_t openTimeNs = systemTime();
      
          sp<CLIENT> client = nullptr;
          int facing = -1;
          int orientation = 0;
      
          const std::string clientPackageName =
                  clientAttribution.packageName.value_or(kUnknownPackageName);
      
          {
              // Acquire mServiceLock and prevent other clients from connecting
              std::unique_ptr<AutoConditionLock> lock =
                      AutoConditionLock::waitAndAcquire(mServiceLockWrapper, DEFAULT_CONNECT_TIMEOUT_NS);
      
              if (lock == nullptr) {
                  ALOGE("CameraService::connect (PID %d) rejected (too many other clients connecting).",
                        clientAttribution.pid);
                  return STATUS_ERROR_FMT(
                          ERROR_MAX_CAMERAS_IN_USE,
                          "Cannot open camera %s for \"%s\" (PID %d): Too many other clients connecting",
                          cameraId.c_str(), clientPackageName.c_str(), clientAttribution.pid);
              }
      
              // Enforce client permissions and do basic validity checks
              if (!(ret = validateConnectLocked(cameraId, clientAttribution, sharedMode)).isOk()) {
                  return ret;
              }
      
              // Check the shim parameters after acquiring lock, if they have already been updated and
              // we were doing a shim update, return immediately
              if (shimUpdateOnly) {
                  auto cameraState = getCameraState(cameraId);
                  if (cameraState != nullptr) {
                      if (!cameraState->getShimParams().isEmpty()) return ret;
                  }
              }
      
              status_t err;
      
              sp<BasicClient> clientTmp = nullptr;
              std::shared_ptr<resource_policy::ClientDescriptor<std::string, sp<BasicClient>>> partial;
              if ((err = handleEvictionsLocked(
                           cameraId, clientAttribution.pid, effectiveApiLevel,
                           IInterface::asBinder(cameraCb),
                           clientAttribution.packageName.value_or(kUnknownPackageName), oomScoreOffset,
                           systemNativeClient, sharedMode, /*out*/ &clientTmp,
                           /*out*/ &partial)) != NO_ERROR) {
                  switch (err) {
                      case -ENODEV:
                          return STATUS_ERROR_FMT(ERROR_DISCONNECTED,
                                  "No camera device with ID \"%s\" currently available",
                                  cameraId.c_str());
                      case -EBUSY:
                          return STATUS_ERROR_FMT(ERROR_CAMERA_IN_USE,
                                  "Higher-priority client using camera, ID \"%s\" currently unavailable",
                                  cameraId.c_str());
                      case -EUSERS:
                          return STATUS_ERROR_FMT(ERROR_MAX_CAMERAS_IN_USE,
                                  "Too many cameras already open, cannot open camera \"%s\"",
                                  cameraId.c_str());
                      default:
                          return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                                  "Unexpected error %s (%d) opening camera \"%s\"",
                                  strerror(-err), err, cameraId.c_str());
                  }
              }
      
              if (clientTmp.get() != nullptr) {
                  // Handle special case for API1 MediaRecorder where the existing client is returned
                  device = static_cast<CLIENT*>(clientTmp.get());
                  return ret;
              }
      
              // give flashlight a chance to close devices if necessary.
              mFlashlight->prepareDeviceOpen(cameraId);
      
              int portraitRotation;
              auto deviceVersionAndTransport =
                      getDeviceVersion(cameraId, rotationOverride, /*out*/&portraitRotation,
                              /*out*/&facing, /*out*/&orientation);
              if (facing == -1) {
                  ALOGE("%s: Unable to get camera device \"%s\"  facing", __FUNCTION__, cameraId.c_str());
                  return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                          "Unable to get camera device \"%s\" facing", cameraId.c_str());
              }
      
              sp<BasicClient> tmp = nullptr;
              bool overrideForPerfClass = SessionConfigurationUtils::targetPerfClassPrimaryCamera(
                      mPerfClassPrimaryCameraIds, cameraId, targetSdkVersion);
      
              // Only use passed in clientPid to check permission. Use calling PID as the client PID
              // that's connected to camera service directly.
              if (!(ret = makeClient(this, cameraCb, clientAttribution, getCallingPid(),
                                     systemNativeClient, cameraId, api1CameraId, facing, orientation,
                                     getpid(), deviceVersionAndTransport, effectiveApiLevel,
                                     overrideForPerfClass, rotationOverride, forceSlowJpegMode,
                                     originalCameraId, sharedMode, isVendorClient,
                                     /*out*/ &tmp))
                           .isOk()) { // 2
                  return ret;
              }
              client = static_cast<CLIENT*>(tmp.get());
      
              LOG_ALWAYS_FATAL_IF(client.get() == nullptr, "%s: CameraService in invalid state",
                      __FUNCTION__);
      
              std::string monitorTags = isClientWatched(client.get()) ? mMonitorTags : std::string();
              err = client->initialize(mCameraProviderManager, monitorTags); // 3
              if (err != OK) {
                  ALOGE("%s: Could not initialize client from HAL.", __FUNCTION__);
                  // Errors could be from the HAL module open call or from AppOpsManager
                  mServiceLock.unlock();
                  client->disconnect();
                  mServiceLock.lock();
                  switch(err) {
                      case BAD_VALUE:
                          return STATUS_ERROR_FMT(ERROR_ILLEGAL_ARGUMENT,
                                  "Illegal argument to HAL module for camera \"%s\"", cameraId.c_str());
                      case -EBUSY:
                          return STATUS_ERROR_FMT(ERROR_CAMERA_IN_USE,
                                  "Camera \"%s\" is already open", cameraId.c_str());
                      case -EUSERS:
                          return STATUS_ERROR_FMT(ERROR_MAX_CAMERAS_IN_USE,
                                  "Too many cameras already open, cannot open camera \"%s\"",
                                  cameraId.c_str());
                      case PERMISSION_DENIED:
                          return STATUS_ERROR_FMT(ERROR_PERMISSION_DENIED,
                                  "No permission to open camera \"%s\"", cameraId.c_str());
                      case -EACCES:
                          return STATUS_ERROR_FMT(ERROR_DISABLED,
                                  "Camera \"%s\" disabled by policy", cameraId.c_str());
                      case -ENODEV:
                      default:
                          return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                                  "Failed to initialize camera \"%s\": %s (%d)", cameraId.c_str(),
                                  strerror(-err), err);
                  }
              }
      
              // Update shim paremeters for legacy clients
              if (effectiveApiLevel == API_1) {
                  // Assume we have always received a Client subclass for API1
                  sp<Client> shimClient = reinterpret_cast<Client*>(client.get());
                  String8 rawParams = shimClient->getParameters();
                  CameraParameters params(rawParams);
      
                  auto cameraState = getCameraState(cameraId);
                  if (cameraState != nullptr) {
                      cameraState->setShimParams(params);
                  } else {
                      ALOGE("%s: Cannot update shim parameters for camera %s, no such device exists.",
                              __FUNCTION__, cameraId.c_str());
                  }
              }
      
              // Enable/disable camera service watchdog
              client->setCameraServiceWatchdog(mCameraServiceWatchdogEnabled);
      
              CameraMetadata chars;
              bool rotateAndCropSupported = true;
              err = mCameraProviderManager->getCameraCharacteristics(cameraId, overrideForPerfClass,
                      &chars, rotationOverride);
              if (err == OK) {
                  auto availableRotateCropEntry = chars.find(
                          ANDROID_SCALER_AVAILABLE_ROTATE_AND_CROP_MODES);
                  if (availableRotateCropEntry.count <= 1) {
                      rotateAndCropSupported = false;
                  }
              } else {
                  ALOGE("%s: Unable to query static metadata for camera %s: %s (%d)", __FUNCTION__,
                          cameraId.c_str(), strerror(-err), err);
              }
      
              if (rotateAndCropSupported) {
                  // Set rotate-and-crop override behavior
                  if (mOverrideRotateAndCropMode != ANDROID_SCALER_ROTATE_AND_CROP_AUTO) {
                      client->setRotateAndCropOverride(mOverrideRotateAndCropMode);
                  } else if (rotationOverride != hardware::ICameraService::ROTATION_OVERRIDE_NONE &&
                          portraitRotation != 0) {
                      uint8_t rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_AUTO;
                      switch (portraitRotation) {
                          case 90:
                              rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_90;
                              break;
                          case 180:
                              rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_180;
                              break;
                          case 270:
                              rotateAndCropMode = ANDROID_SCALER_ROTATE_AND_CROP_270;
                              break;
                          default:
                              ALOGE("Unexpected portrait rotation: %d", portraitRotation);
                              break;
                      }
                      // Here we're communicating to the client the chosen rotate
                      // and crop mode to send to the HAL
                      client->setRotateAndCropOverride(rotateAndCropMode);
                  } else {
                      client->setRotateAndCropOverride(
                              mCameraServiceProxyWrapper->getRotateAndCropOverride(
                                      clientPackageName, facing,
                                      multiuser_get_user_id(clientAttribution.uid)));
                  }
              }
      
              bool autoframingSupported = true;
              auto availableAutoframingEntry = chars.find(ANDROID_CONTROL_AUTOFRAMING_AVAILABLE);
              if ((availableAutoframingEntry.count == 1) && (availableAutoframingEntry.data.u8[0] ==
                          ANDROID_CONTROL_AUTOFRAMING_AVAILABLE_FALSE)) {
                  autoframingSupported = false;
              }
      
              if (autoframingSupported) {
                  // Set autoframing override behaviour
                  if (mOverrideAutoframingMode != ANDROID_CONTROL_AUTOFRAMING_AUTO) {
                      client->setAutoframingOverride(mOverrideAutoframingMode);
                  } else {
                      client->setAutoframingOverride(
                          mCameraServiceProxyWrapper->getAutoframingOverride(
                              clientPackageName));
                  }
              }
      
              bool isCameraPrivacyEnabled;
              if (flags::camera_privacy_allowlist()) {
                  // Set camera muting behavior.
                  isCameraPrivacyEnabled =
                          this->isCameraPrivacyEnabled(toString16(client->getPackageName()), cameraId,
                                                       clientAttribution.pid, clientAttribution.uid);
              } else {
                  isCameraPrivacyEnabled =
                          mSensorPrivacyPolicy->isCameraPrivacyEnabled();
              }
      
              if (client->supportsCameraMute()) {
                  client->setCameraMute(
                          mOverrideCameraMuteMode || isCameraPrivacyEnabled);
              } else if (isCameraPrivacyEnabled) {
                  // no camera mute supported, but privacy is on! => disconnect
                  ALOGI("Camera mute not supported for package: %s, camera id: %s",
                          client->getPackageName().c_str(), cameraId.c_str());
                  // Do not hold mServiceLock while disconnecting clients, but
                  // retain the condition blocking other clients from connecting
                  // in mServiceLockWrapper if held.
                  mServiceLock.unlock();
                  // Clear caller identity temporarily so client disconnect PID
                  // checks work correctly
                  int64_t token = clearCallingIdentity();
                  // Note AppOp to trigger the "Unblock" dialog
                  client->noteAppOp();
                  client->disconnect();
                  restoreCallingIdentity(token);
                  // Reacquire mServiceLock
                  mServiceLock.lock();
      
                  return STATUS_ERROR_FMT(ERROR_DISABLED,
                          "Camera \"%s\" disabled due to camera mute", cameraId.c_str());
              }
      
              if (shimUpdateOnly) {
                  // If only updating legacy shim parameters, immediately disconnect client
                  mServiceLock.unlock();
                  client->disconnect();
                  mServiceLock.lock();
              } else {
                  // Otherwise, add client to active clients list
                  finishConnectLocked(client, partial, oomScoreOffset, systemNativeClient);
              }
      
              client->setImageDumpMask(mImageDumpMask);
              client->setStreamUseCaseOverrides(mStreamUseCaseOverrides);
              client->setZoomOverride(mZoomOverrideValue);
          } // lock is destroyed, allow further connect calls
      
          // Important: release the mutex here so the client can call back into the service from its
          // destructor (can be at the end of the call)
          device = client;
      
          int32_t openLatencyMs = ns2ms(systemTime() - openTimeNs);
          mCameraServiceProxyWrapper->logOpen(cameraId, facing, clientPackageName,
                  effectiveApiLevel, isNonSystemNdk, openLatencyMs);
      
          {
              Mutex::Autolock lock(mInjectionParametersLock);
              if (cameraId == mInjectionInternalCamId && mInjectionInitPending) {
                  mInjectionInitPending = false;
                  status_t res = NO_ERROR;
                  auto clientDescriptor = mActiveClientManager.get(mInjectionInternalCamId);
                  if (clientDescriptor != nullptr) {
                      sp<BasicClient> clientSp = clientDescriptor->getValue();
                      res = checkIfInjectionCameraIsPresent(mInjectionExternalCamId, clientSp);
                      if(res != OK) {
                          return STATUS_ERROR_FMT(ERROR_DISCONNECTED,
                                  "No camera device with ID \"%s\" currently available",
                                  mInjectionExternalCamId.c_str());
                      }
                      res = clientSp->injectCamera(mInjectionExternalCamId, mCameraProviderManager);
                      if (res != OK) {
                          mInjectionStatusListener->notifyInjectionError(mInjectionExternalCamId, res);
                      }
                  } else {
                      ALOGE("%s: Internal camera ID = %s 's client does not exist!",
                              __FUNCTION__, mInjectionInternalCamId.c_str());
                      res = NO_INIT;
                      mInjectionStatusListener->notifyInjectionError(mInjectionExternalCamId, res);
                  }
              }
          }
      
          return ret;
      }
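Note how connectHelper() begins: it takes the service lock with a timeout (AutoConditionLock::waitAndAcquire with DEFAULT_CONNECT_TIMEOUT_NS) and turns a timeout into ERROR_MAX_CAMERAS_IN_USE instead of blocking the caller forever. Roughly the same behavior can be sketched with std::timed_mutex (a sketch, not the framework's lock wrapper):

```cpp
#include <cassert>
#include <chrono>
#include <future>
#include <mutex>

enum class ConnectResult { kOk, kTooManyConnecting };

// Bounded wait for the service lock: fail fast when other clients are
// mid-connect, just like the lock == nullptr branch in connectHelper().
ConnectResult tryConnect(std::timed_mutex& serviceLock,
                         std::chrono::milliseconds timeout) {
    std::unique_lock<std::timed_mutex> lock(serviceLock, std::defer_lock);
    if (!lock.try_lock_for(timeout)) {
        // "too many other clients connecting" path
        return ConnectResult::kTooManyConnecting;
    }
    // ... validateConnectLocked / handleEvictionsLocked / makeClient here ...
    return ConnectResult::kOk;
}
```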
      Status CameraService::makeClient(
              const sp<CameraService>& cameraService, const sp<IInterface>& cameraCb,
              const AttributionSourceState& clientAttribution, int callingPid, bool systemNativeClient,
              const std::string& cameraId, int api1CameraId, int facing, int sensorOrientation,
              int servicePid, std::pair<int, IPCTransport> deviceVersionAndTransport,
              apiLevel effectiveApiLevel, bool overrideForPerfClass, int rotationOverride,
              bool forceSlowJpegMode, const std::string& originalCameraId, bool sharedMode,
              bool isVendorClient,
              /*out*/sp<BasicClient>* client) {
          // For HIDL devices
          if (deviceVersionAndTransport.second == IPCTransport::HIDL) {
              // Create CameraClient based on device version reported by the HAL.
              int deviceVersion = deviceVersionAndTransport.first;
              switch(deviceVersion) {
                  case CAMERA_DEVICE_API_VERSION_1_0:
                      ALOGE("Camera using old HAL version: %d", deviceVersion);
                      return STATUS_ERROR_FMT(ERROR_DEPRECATED_HAL,
                              "Camera device \"%s\" HAL version %d no longer supported",
                              cameraId.c_str(), deviceVersion);
                      break;
                  case CAMERA_DEVICE_API_VERSION_3_0:
                  case CAMERA_DEVICE_API_VERSION_3_1:
                  case CAMERA_DEVICE_API_VERSION_3_2:
                  case CAMERA_DEVICE_API_VERSION_3_3:
                  case CAMERA_DEVICE_API_VERSION_3_4:
                  case CAMERA_DEVICE_API_VERSION_3_5:
                  case CAMERA_DEVICE_API_VERSION_3_6:
                  case CAMERA_DEVICE_API_VERSION_3_7:
                      break;
                  default:
                      // Should not be reachable
                      ALOGE("Unknown camera device HAL version: %d", deviceVersion);
                      return STATUS_ERROR_FMT(ERROR_INVALID_OPERATION,
                              "Camera device \"%s\" has unknown HAL version %d",
                              cameraId.c_str(), deviceVersion);
              }
          }
          if (effectiveApiLevel == API_1) { // Camera1 API route
              sp<ICameraClient> tmp = static_cast<ICameraClient*>(cameraCb.get());
              *client = new Camera2Client(cameraService, tmp, cameraService->mCameraServiceProxyWrapper,
                                          cameraService->mAttributionAndPermissionUtils,
                                          clientAttribution, callingPid, cameraId, api1CameraId, facing,
                                          sensorOrientation, servicePid, overrideForPerfClass,
                                          rotationOverride, forceSlowJpegMode, /*sharedMode*/false);
              ALOGI("%s: Camera1 API (legacy), rotationOverride %d, forceSlowJpegMode %d",
                      __FUNCTION__, rotationOverride, forceSlowJpegMode);
          } else { // Camera2 API route
              sp<hardware::camera2::ICameraDeviceCallbacks> tmp =
                      static_cast<hardware::camera2::ICameraDeviceCallbacks*>(cameraCb.get());
              *client = new CameraDeviceClient(
                      cameraService, tmp, cameraService->mCameraServiceProxyWrapper,
                      cameraService->mAttributionAndPermissionUtils, clientAttribution, callingPid,
                      systemNativeClient, cameraId, facing, sensorOrientation, servicePid,
                      overrideForPerfClass, rotationOverride, originalCameraId, sharedMode,
                      isVendorClient);
              ALOGI("%s: Camera2 API, rotationOverride %d", __FUNCTION__, rotationOverride);
          }
          return Status::ok();
      }
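makeClient() is essentially a factory keyed on the effective API level: API_1 callers get a Camera2Client (the legacy compatibility shim), everything else gets a CameraDeviceClient. A minimal sketch of that dispatch, with stand-in class names:

```cpp
#include <cassert>
#include <memory>
#include <string>

enum class ApiLevel { API_1, API_2 };

struct BasicClient {
    virtual ~BasicClient() = default;
    virtual std::string kind() const = 0;
};
struct Camera1StyleClient : BasicClient {   // plays the role of Camera2Client
    std::string kind() const override { return "api1"; }
};
struct Camera2StyleClient : BasicClient {   // plays the role of CameraDeviceClient
    std::string kind() const override { return "api2"; }
};

// Reduced form of the API-level dispatch in CameraService::makeClient().
std::unique_ptr<BasicClient> makeClient(ApiLevel effectiveApiLevel) {
    if (effectiveApiLevel == ApiLevel::API_1) {
        return std::make_unique<Camera1StyleClient>();
    }
    return std::make_unique<Camera2StyleClient>();
}
```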
      

      frameworks/av/services/camera/libcameraservice/api2/CameraDeviceClient.cpp
The constructor and initialize function of CameraDeviceClient are as follows:

      CameraDeviceClient::CameraDeviceClient(
              const sp<CameraService>& cameraService,
              const sp<hardware::camera2::ICameraDeviceCallbacks>& remoteCallback,
              std::shared_ptr<CameraServiceProxyWrapper> cameraServiceProxyWrapper,
              std::shared_ptr<AttributionAndPermissionUtils> attributionAndPermissionUtils,
              const AttributionSourceState& clientAttribution, int callingPid, bool systemNativeClient,
              const std::string& cameraId, int cameraFacing, int sensorOrientation, int servicePid,
              bool overrideForPerfClass, int rotationOverride, const std::string& originalCameraId,
              bool sharedMode, bool isVendorClient)
          : Camera2ClientBase(cameraService, remoteCallback, cameraServiceProxyWrapper,
                              attributionAndPermissionUtils, clientAttribution, callingPid,
                              systemNativeClient, cameraId, /*API1 camera ID*/ -1, cameraFacing,
                              sensorOrientation, servicePid, overrideForPerfClass, rotationOverride,
                              sharedMode),
            mInputStream(),
            mStreamingRequestId(REQUEST_ID_NONE),
            mRequestIdCounter(0),
            mOverrideForPerfClass(overrideForPerfClass),
            mOriginalCameraId(originalCameraId),
            mIsVendorClient(isVendorClient) {
          ATRACE_CALL();
          ALOGI("CameraDeviceClient %s: Opened", cameraId.c_str());
      }
      status_t CameraDeviceClient::initialize(sp<CameraProviderManager> manager,
              const std::string& monitorTags) {
          return initializeImpl(manager, monitorTags);
      }
      
      template<typename TProviderPtr>
      status_t CameraDeviceClient::initializeImpl(TProviderPtr providerPtr,
              const std::string& monitorTags) {
          ATRACE_CALL();
          status_t res;
      
          res = Camera2ClientBase::initialize(providerPtr, monitorTags); // 1
          if (res != OK) {
              return res;
          }
      
    mFrameProcessor = new FrameProcessorBase(mDevice); // frame data processor
          std::string threadName = std::string("CDU-") + mCameraIdStr + "-FrameProc";
          res = mFrameProcessor->run(threadName.c_str());
          if (res != OK) {
              ALOGE("%s: Unable to start frame processor thread: %s (%d)",
                      __FUNCTION__, strerror(-res), res);
              return res;
          }
      
          mFrameProcessor->registerListener(camera2::FrameProcessorBase::FRAME_PROCESSOR_LISTENER_MIN_ID,
                                            camera2::FrameProcessorBase::FRAME_PROCESSOR_LISTENER_MAX_ID,
                                            /*listener*/this,
                                            /*sendPartials*/true);
      
          const CameraMetadata &deviceInfo = mDevice->info();
          camera_metadata_ro_entry_t physicalKeysEntry = deviceInfo.find(
                  ANDROID_REQUEST_AVAILABLE_PHYSICAL_CAMERA_REQUEST_KEYS);
          if (physicalKeysEntry.count > 0) {
              mSupportedPhysicalRequestKeys.insert(mSupportedPhysicalRequestKeys.begin(),
                      physicalKeysEntry.data.i32,
                      physicalKeysEntry.data.i32 + physicalKeysEntry.count);
          }
      
          auto entry = deviceInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
          mDynamicProfileMap.emplace(
                  ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD,
                  ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD);
          if (entry.count > 0) {
              const auto it = std::find(entry.data.u8, entry.data.u8 + entry.count,
                      ANDROID_REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT);
              if (it != entry.data.u8 + entry.count) {
                  entry = deviceInfo.find(ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP);
            if ((entry.count > 0) && ((entry.count % 3) == 0)) {
                      int64_t standardBitmap =
                              ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD;
                      for (size_t i = 0; i < entry.count; i += 3) {
                          if (entry.data.i64[i] !=
                                  ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD) {
                              mDynamicProfileMap.emplace(entry.data.i64[i], entry.data.i64[i+1]);
                              if ((entry.data.i64[i+1] == 0) || (entry.data.i64[i+1] &
                                      ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD)) {
                                  standardBitmap |= entry.data.i64[i];
                              }
                          } else {
                              ALOGE("%s: Device %s includes unexpected profile entry: 0x%" PRIx64 "!",
                                      __FUNCTION__, mCameraIdStr.c_str(), entry.data.i64[i]);
                          }
                      }
                      mDynamicProfileMap[ANDROID_REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES_MAP_STANDARD] =
                              standardBitmap;
                  } else {
                      ALOGE("%s: Device %s supports 10-bit output but doesn't include a dynamic range"
                              " profile map!", __FUNCTION__, mCameraIdStr.c_str());
                  }
              }
          }
      
          mProviderManager = providerPtr;
          // Cache physical camera ids corresponding to this device and also the high
          // resolution sensors in this device + physical camera ids
          mProviderManager->isLogicalCamera(mCameraIdStr, &mPhysicalCameraIds);
          if (supportsUltraHighResolutionCapture(mCameraIdStr)) {
              mHighResolutionSensors.insert(mCameraIdStr);
          }
          for (auto &physicalId : mPhysicalCameraIds) {
              if (supportsUltraHighResolutionCapture(physicalId)) {
                  mHighResolutionSensors.insert(physicalId);
              }
          }
          int32_t resultMQSize =
                  property_get_int32("ro.vendor.camera.res.fmq.size", /*default*/METADATA_QUEUE_SIZE);
          res = CreateMetadataQueue(&mResultMetadataQueue, resultMQSize);
          if (res != OK) {
              ALOGE("%s: Creating result metadata queue failed: %s(%d)", __FUNCTION__,
                  strerror(-res), res);
              return res;
          }
          return OK;
      }
      

A CameraProviderManager object acts as the client side of the HIDL camera provider service; the corresponding server side is ICameraProvider, which can be listed with the lshal command.
      frameworks/av/services/camera/libcameraservice/common/Camera2ClientBase.cpp

      template <typename TClientBase>
      status_t Camera2ClientBase<TClientBase>::initialize(sp<CameraProviderManager> manager,
              const std::string& monitorTags) {
          return initializeImpl(manager, monitorTags);
      }
      
      template <typename TClientBase>
      template <typename TProviderPtr>
      status_t Camera2ClientBase<TClientBase>::initializeImpl(TProviderPtr providerPtr,
              const std::string& monitorTags) {
          ATRACE_CALL();
          ALOGV("%s: Initializing client for camera %s", __FUNCTION__,
                TClientBase::mCameraIdStr.c_str());
          status_t res;
      
          IPCTransport providerTransport = IPCTransport::INVALID;
          res = providerPtr->getCameraIdIPCTransport(TClientBase::mCameraIdStr,
                  &providerTransport);
          if (res != OK) {
              return res;
          }
          switch (providerTransport) {
              case IPCTransport::HIDL:
                  mDevice =
                          new HidlCamera3Device(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  break;
              case IPCTransport::AIDL:
                  if (flags::camera_multi_client() && TClientBase::mSharedMode) {
                      mDevice = AidlCamera3SharedDevice::getInstance(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  } else {
                      mDevice =
                          new AidlCamera3Device(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  }
                  break;
              default:
                  ALOGE("%s Invalid transport for camera id %s", __FUNCTION__,
                          TClientBase::mCameraIdStr.c_str());
                  return NO_INIT;
          }
          if (mDevice == NULL) {
              ALOGE("%s: Camera %s: No device connected",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str());
              return NO_INIT;
          }
      
          // Notify camera opening (check op if check_full_attribution_source_chain flag is off).
          res = TClientBase::notifyCameraOpening();
          if (res != OK) {
              TClientBase::notifyCameraClosing();
              return res;
          }
      
          res = mDevice->initialize(providerPtr, monitorTags); // 1
          if (res != OK) {
              ALOGE("%s: Camera %s: unable to initialize device: %s (%d)",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
              TClientBase::notifyCameraClosing();
              return res;
          }
      
          wp<NotificationListener> weakThis(this);
          res = mDevice->setNotifyCallback(weakThis);
          if (res != OK) {
              ALOGE("%s: Camera %s: Unable to set notify callback: %s (%d)",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
              return res;
          }
      
          return OK;
      }
      

Our path goes through AIDL rather than HIDL, so mDevice is an AidlCamera3Device.
      frameworks/av/services/camera/libcameraservice/device3/aidl/AidlCamera3Device.cpp

      
      AidlCamera3Device::AidlCamera3Device(
              std::shared_ptr<CameraServiceProxyWrapper>& cameraServiceProxyWrapper,
              std::shared_ptr<AttributionAndPermissionUtils> attributionAndPermissionUtils,
              const std::string& id, bool overrideForPerfClass, int rotationOverride,
              bool legacyClient) :
              Camera3Device(cameraServiceProxyWrapper, attributionAndPermissionUtils, id,
                      overrideForPerfClass, rotationOverride, legacyClient) {
          mCallbacks = ndk::SharedRefBase::make<AidlCameraDeviceCallbacks>(this);
      }
      
      status_t AidlCamera3Device::initialize(sp<CameraProviderManager> manager,
              const std::string& monitorTags) {
          ATRACE_CALL();
          Mutex::Autolock il(mInterfaceLock);
          Mutex::Autolock l(mLock);
      
          ALOGV("%s: Initializing AIDL device for camera %s", __FUNCTION__, mId.c_str());
          if (mStatus != STATUS_UNINITIALIZED) {
              CLOGE("Already initialized!");
              return INVALID_OPERATION;
          }
          if (manager == nullptr) return INVALID_OPERATION;
      
          std::shared_ptr<camera::device::ICameraDeviceSession> session;
          ATRACE_BEGIN("CameraHal::openSession");
          status_t res = manager->openAidlSession(mId, mCallbacks,
                  /*out*/ &session); // 1
          ATRACE_END();
          if (res != OK) {
              SET_ERR_L("Could not open camera session: %s (%d)", strerror(-res), res);
              return res;
          }
          if (session == nullptr) {
            SET_ERR("Session iface returned is null");
            return INVALID_OPERATION;
          }
          res = manager->getCameraCharacteristics(mId, mOverrideForPerfClass, &mDeviceInfo,
                  mRotationOverride); // 2
          if (res != OK) {
              SET_ERR_L("Could not retrieve camera characteristics: %s (%d)", strerror(-res), res);
              session->close();
              return res;
          }
          mSupportNativeZoomRatio = manager->supportNativeZoomRatio(mId);
          mIsCompositeJpegRDisabled = manager->isCompositeJpegRDisabled(mId);
      
          std::vector<std::string> physicalCameraIds;
          bool isLogical = manager->isLogicalCamera(mId, &physicalCameraIds);
          if (isLogical) {
              for (auto& physicalId : physicalCameraIds) {
                  // Do not override characteristics for physical cameras
                  res = manager->getCameraCharacteristics(
                          physicalId, /*overrideForPerfClass*/false, &mPhysicalDeviceInfoMap[physicalId],
                          mRotationOverride);
                  if (res != OK) {
                      SET_ERR_L("Could not retrieve camera %s characteristics: %s (%d)",
                              physicalId.c_str(), strerror(-res), res);
                      session->close();
                      return res;
                  }
      
                  bool usePrecorrectArray =
                          DistortionMapper::isDistortionSupported(mPhysicalDeviceInfoMap[physicalId]);
                  if (usePrecorrectArray) {
                      res = mDistortionMappers[physicalId].setupStaticInfo(
                              mPhysicalDeviceInfoMap[physicalId]);
                      if (res != OK) {
                          SET_ERR_L("Unable to read camera %s's calibration fields for distortion "
                                  "correction", physicalId.c_str());
                          session->close();
                          return res;
                      }
                  }
      
                  mZoomRatioMappers[physicalId] = ZoomRatioMapper(
                          &mPhysicalDeviceInfoMap[physicalId],
                          mSupportNativeZoomRatio, usePrecorrectArray);
      
                  if (SessionConfigurationUtils::supportsUltraHighResolutionCapture(
                          mPhysicalDeviceInfoMap[physicalId])) {
                      mUHRCropAndMeteringRegionMappers[physicalId] =
                              UHRCropAndMeteringRegionMapper(mPhysicalDeviceInfoMap[physicalId],
                                      usePrecorrectArray);
                  }
              }
          }
      
          std::shared_ptr<AidlRequestMetadataQueue> queue;
          ::aidl::android::hardware::common::fmq::MQDescriptor<
                  int8_t, ::aidl::android::hardware::common::fmq::SynchronizedReadWrite> desc;
      
          ::ndk::ScopedAStatus requestQueueRet = session->getCaptureRequestMetadataQueue(&desc);
          if (!requestQueueRet.isOk()) {
              ALOGE("Transaction error when getting result metadata queue from camera session: %s",
                      requestQueueRet.getMessage());
              return AidlProviderInfo::mapToStatusT(requestQueueRet);
          }
          queue = std::make_unique<AidlRequestMetadataQueue>(desc);
          if (!queue->isValid() || queue->availableToWrite() <= 0) {
              ALOGE("HAL returns empty result metadata fmq, not use it");
              queue = nullptr;
              // Don't use resQueue onwards.
          }
      
          std::unique_ptr<AidlResultMetadataQueue>& resQueue = mResultMetadataQueue;
          ::aidl::android::hardware::common::fmq::MQDescriptor<
              int8_t, ::aidl::android::hardware::common::fmq::SynchronizedReadWrite> resDesc;
          ::ndk::ScopedAStatus resultQueueRet = session->getCaptureResultMetadataQueue(&resDesc);
          if (!resultQueueRet.isOk()) {
              ALOGE("Transaction error when getting result metadata queue from camera session: %s",
                      resultQueueRet.getMessage());
              return AidlProviderInfo::mapToStatusT(resultQueueRet);
          }
          resQueue = std::make_unique<AidlResultMetadataQueue>(resDesc);
          if (!resQueue->isValid() || resQueue->availableToWrite() <= 0) {
              ALOGE("HAL returns empty result metadata fmq, not use it");
              resQueue = nullptr;
              // Don't use resQueue onwards.
          }
      
          camera_metadata_entry bufMgrMode =
                  mDeviceInfo.find(ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION);
          if (bufMgrMode.count > 0) {
              mUseHalBufManager = (bufMgrMode.data.u8[0] ==
                      ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION_HIDL_DEVICE_3_5);
              mSessionHalBufManager = (bufMgrMode.data.u8[0] ==
                      ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION_SESSION_CONFIGURABLE);
          }
      
          camera_metadata_entry_t capabilities = mDeviceInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
          for (size_t i = 0; i < capabilities.count; i++) {
              uint8_t capability = capabilities.data.u8[i];
              if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_OFFLINE_PROCESSING) {
                  mSupportOfflineProcessing = true;
              }
          }
      
          mInterface =
                  new AidlHalInterface(session, queue, mUseHalBufManager, mSupportOfflineProcessing,
                          mSessionHalBufManager);
      
          std::string providerType;
          mVendorTagId = manager->getProviderTagIdLocked(mId);
          mTagMonitor.initialize(mVendorTagId);
          if (!monitorTags.empty()) {
              mTagMonitor.parseTagsToMonitor(monitorTags);
          }
      
          for (size_t i = 0; i < capabilities.count; i++) {
              uint8_t capability = capabilities.data.u8[i];
              if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_MONOCHROME) {
                  mNeedFixupMonochromeTags = true;
              }
          }
      
          // batch size limit is applied to the device with camera device version larger than 3.2 which is
          // AIDL v2
          hardware::hidl_version maxVersion{0, 0};
          IPCTransport transport = IPCTransport::AIDL;
          res = manager->getHighestSupportedVersion(mId, &maxVersion, &transport);
          if (res != OK) {
              ALOGE("%s: Error in getting camera device version id: %s (%d)", __FUNCTION__,
                    strerror(-res), res);
              return res;
          }
          int deviceVersion = HARDWARE_DEVICE_API_VERSION(maxVersion.get_major(), maxVersion.get_minor());
      
          mBatchSizeLimitEnabled = (deviceVersion >= CAMERA_DEVICE_API_VERSION_1_2);
      
          camera_metadata_entry readoutSupported = mDeviceInfo.find(ANDROID_SENSOR_READOUT_TIMESTAMP);
          if (readoutSupported.count == 0) {
              ALOGW("%s: Could not find value corresponding to ANDROID_SENSOR_READOUT_TIMESTAMP. "
                    "Assuming true.", __FUNCTION__);
              mSensorReadoutTimestampSupported = true;
          } else {
              mSensorReadoutTimestampSupported =
                      readoutSupported.data.u8[0] == ANDROID_SENSOR_READOUT_TIMESTAMP_HARDWARE;
          }
      
          return initializeCommonLocked(manager); // 3
      }
      

At marker // 1 above: manager->openAidlSession(mId, mCallbacks, /*out*/ &session).
      frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp

      
      status_t CameraProviderManager::openAidlSession(const std::string &id,
              const std::shared_ptr<
                      aidl::android::hardware::camera::device::ICameraDeviceCallback>& callback,
              /*out*/
              std::shared_ptr<aidl::android::hardware::camera::device::ICameraDeviceSession> *session) {
      
          std::lock_guard<std::mutex> lock(mInterfaceMutex);
      
          auto deviceInfo = findDeviceInfoLocked(id);
          if (deviceInfo == nullptr) return NAME_NOT_FOUND;
      
          auto *aidlDeviceInfo3 = static_cast<AidlProviderInfo::AidlDeviceInfo3*>(deviceInfo);
          sp<ProviderInfo> parentProvider = deviceInfo->mParentProvider.promote();
          if (parentProvider == nullptr) {
              return DEAD_OBJECT;
          }
          auto provider =
                  static_cast<AidlProviderInfo *>(parentProvider.get())->startProviderInterface();
          if (provider == nullptr) {
              return DEAD_OBJECT;
          }
          std::shared_ptr<HalCameraProvider> halCameraProvider =
                  std::make_shared<AidlHalCameraProvider>(provider, provider->descriptor);
          saveRef(DeviceMode::CAMERA, id, halCameraProvider);
      
          auto interface = aidlDeviceInfo3->startDeviceInterface(); // 1
          if (interface == nullptr) {
              removeRef(DeviceMode::CAMERA, id);
              return DEAD_OBJECT;
          }
      
          auto ret = interface->open(callback, session); // 2
          if (!ret.isOk()) {
              removeRef(DeviceMode::CAMERA, id);
              ALOGE("%s: Transaction error opening a session for camera device %s: %s",
                      __FUNCTION__, id.c_str(), ret.getMessage());
              return AidlProviderInfo::mapToStatusT(ret);
          }
          return OK;
      }
      

Next, the startDeviceInterface source:
      frameworks/av/services/camera/libcameraservice/common/aidl/AidlProviderInfo.cpp

      
      std::shared_ptr<aidl::android::hardware::camera::device::ICameraDevice>
      AidlProviderInfo::AidlDeviceInfo3::startDeviceInterface() {
          Mutex::Autolock l(mDeviceAvailableLock);
          std::shared_ptr<camera::device::ICameraDevice> device;
          ATRACE_CALL();
          if (mSavedInterface == nullptr) {
              sp<AidlProviderInfo> parentProvider =
                      static_cast<AidlProviderInfo *>(mParentProvider.promote().get());
              if (parentProvider != nullptr) {
                  // Wait for lazy HALs to confirm device availability
                  if (parentProvider->isExternalLazyHAL() && !mIsDeviceAvailable) {
                      ALOGV("%s: Wait for external device to become available %s",
                            __FUNCTION__,
                            mId.c_str());
      
                      auto res = mDeviceAvailableSignal.waitRelative(mDeviceAvailableLock,
                                                               kDeviceAvailableTimeout);
                      if (res != OK) {
                          ALOGE("%s: Failed waiting for device to become available",
                                __FUNCTION__);
                          return nullptr;
                      }
                  }
      
                  device = parentProvider->startDeviceInterface(mName); // 1
              }
          } else {
              device = mSavedInterface;
          }
          return device;
      }
      
      std::shared_ptr<camera::device::ICameraDevice>
      AidlProviderInfo::startDeviceInterface(const std::string &name) {
          ::ndk::ScopedAStatus status;
          std::shared_ptr<camera::device::ICameraDevice> cameraInterface;
          const std::shared_ptr<ICameraProvider> interface = startProviderInterface(); // 1
          if (interface == nullptr) {
              return nullptr;
          }
          status = interface->getCameraDeviceInterface(name, &cameraInterface); // 2
          if (!status.isOk()) {
              ALOGE("%s: Transaction error trying to obtain interface for camera device %s: %s",
                      __FUNCTION__, name.c_str(), status.getMessage());
              return nullptr;
          }
          return cameraInterface;
      }
      
      const std::shared_ptr<ICameraProvider> AidlProviderInfo::startProviderInterface() {
          ATRACE_CALL();
          ALOGV("Request to start camera provider: %s", mProviderName.c_str());
          if (mSavedInterface != nullptr) {
              return mSavedInterface;
          }
      
          if (!kEnableLazyHal) {
              ALOGE("Bad provider state! Should not be here on a non-lazy HAL!");
              return nullptr;
          }
      
          auto interface = mActiveInterface.lock();
          if (interface != nullptr) {
              ALOGV("Camera provider (%s) already in use. Re-using instance.", mProviderName.c_str());
              return interface;
          }
      
          // Try to get service without starting
          interface = ICameraProvider::fromBinder(
                  ndk::SpAIBinder(AServiceManager_checkService(mProviderName.c_str())));
          if (interface != nullptr) {
              // Service is already running. Cache and return.
              mActiveInterface = interface;
              return interface;
          }
      
          ALOGV("Camera provider actually needs restart, calling getService(%s)", mProviderName.c_str());
          interface = mManager->mAidlServiceProxy->getService(mProviderName); // 1
      
          if (interface == nullptr) {
              ALOGE("%s: %s service not started", __FUNCTION__, mProviderName.c_str());
              return nullptr;
          }
      
          // Set all devices as ENUMERATING, provider should update status
          // to PRESENT after initializing.
          // This avoids failing getCameraDeviceInterface_V3_x before devices
          // are ready.
          for (auto& device : mDevices) {
            device->mIsDeviceAvailable = false;
          }
      
          interface->setCallback(mCallbacks);
          auto link = AIBinder_linkToDeath(interface->asBinder().get(), mDeathRecipient.get(),
                  this);
          if (link != STATUS_OK) {
              ALOGW("%s: Unable to link to provider '%s' death notifications",
                      __FUNCTION__, mProviderName.c_str());
              mManager->removeProvider(std::string(mProviderInstance));
              return nullptr;
          }
      
          // Send current device state
          interface->notifyDeviceStateChange(mDeviceState);
          // Cache interface to return early for future calls.
          mActiveInterface = interface;
      
          return interface;
      }
      

Any class that inherits RefBase and is managed by strong smart pointers (sp<>) has its onFirstRef callback invoked when the first strong reference to it is taken.
      frameworks/av/services/camera/libcameraservice/CameraService.cpp

      status_t CameraService::enumerateProviders() {
          status_t res;
      
          std::vector<std::string> deviceIds;
          std::unordered_map<std::string, std::set<std::string>> unavailPhysicalIds;
          {
              Mutex::Autolock l(mServiceLock);
      
              if (nullptr == mCameraProviderManager.get()) {
                  mCameraProviderManager = new CameraProviderManager();
                  res = mCameraProviderManager->initialize(this); // 1
                  if (res != OK) {
                      ALOGE("%s: Unable to initialize camera provider manager: %s (%d)",
                              __FUNCTION__, strerror(-res), res);
                      logServiceError("Unable to initialize camera provider manager",
                              ERROR_DISCONNECTED);
                      return res;
                  }
              }
      
              // Setup vendor tags before we call get_camera_info the first time
              // because HAL might need to setup static vendor keys in get_camera_info
              // TODO: maybe put this into CameraProviderManager::initialize()?
              mCameraProviderManager->setUpVendorTags();
      
              if (nullptr == mFlashlight.get()) {
                  mFlashlight = new CameraFlashlight(mCameraProviderManager, this);
              }
      
              res = mFlashlight->findFlashUnits();
              if (res != OK) {
                  ALOGE("Failed to enumerate flash units: %s (%d)", strerror(-res), res);
              }
      
              deviceIds = mCameraProviderManager->getCameraDeviceIds(&unavailPhysicalIds);
          }
      
          for (auto& cameraId : deviceIds) {
              if (getCameraState(cameraId) == nullptr) {
                  onDeviceStatusChanged(cameraId, CameraDeviceStatus::PRESENT);
              }
              if (unavailPhysicalIds.count(cameraId) > 0) {
                  for (const auto& physicalId : unavailPhysicalIds[cameraId]) {
                      onDeviceStatusChanged(cameraId, physicalId, CameraDeviceStatus::NOT_PRESENT);
                  }
              }
          }
      
    // Derive primary rear/front cameras, and filter their characteristics.
          // This needs to be done after all cameras are enumerated and camera ids are sorted.
          if (SessionConfigurationUtils::IS_PERF_CLASS) {
              // Assume internal cameras are advertised from the same
              // provider. If multiple providers are registered at different time,
              // and each provider contains multiple internal color cameras, the current
              // logic may filter the characteristics of more than one front/rear color
              // cameras.
              Mutex::Autolock l(mServiceLock);
              filterSPerfClassCharacteristicsLocked();
          }
      
          return OK;
      }
      

mCameraProviderManager->initialize(this) relies on default arguments: the defaults are declared in the header file, and the default proxy instances are defined in CameraProviderManager.cpp.
      frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.h

      class CameraProviderManager : virtual public hidl::manager::V1_0::IServiceNotification,
              public virtual IServiceManager::LocalRegistrationCallback {
      public:
          // needs to be made friend strict since HidlProviderInfo needs to inherit
          // from CameraProviderManager::ProviderInfo which isn't a public member.
          friend struct HidlProviderInfo;
          friend struct AidlProviderInfo;
          ~CameraProviderManager();
      
          // Tiny proxy for the static methods in a HIDL interface that communicate with the hardware
          // service manager, to be replacable in unit tests with a fake.
          struct HidlServiceInteractionProxy {
              virtual bool registerForNotifications(
                      const std::string &serviceName,
                      const sp<hidl::manager::V1_0::IServiceNotification>
                      &notification) = 0;
              // Will not wait for service to start if it's not already running
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> tryGetService(
                      const std::string &serviceName) = 0;
              // Will block for service if it exists but isn't running
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> getService(
                      const std::string &serviceName) = 0;
              virtual hardware::hidl_vec<hardware::hidl_string> listServices() = 0;
              virtual ~HidlServiceInteractionProxy() {}
          };
      
          // Standard use case - call into the normal generated static methods which invoke
          // the real hardware service manager
          struct HidlServiceInteractionProxyImpl : public HidlServiceInteractionProxy {
              virtual bool registerForNotifications(
                      const std::string &serviceName,
                      const sp<hidl::manager::V1_0::IServiceNotification>
                      &notification) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::registerForNotifications(
                          serviceName, notification);
              }
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> tryGetService(
                      const std::string &serviceName) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::tryGetService(serviceName);
              }
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> getService(
                      const std::string &serviceName) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::getService(serviceName);
              }
      
              virtual hardware::hidl_vec<hardware::hidl_string> listServices() override;
          };
      
          // Proxy to inject fake services in test.
          class AidlServiceInteractionProxy {
            public:
              // Returns the Aidl service with the given serviceName. Will wait indefinitely
              // for the service to come up if not running.
              virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
              getService(const std::string& serviceName) = 0;
      
              // Attempts to get an already running AIDL service of the given serviceName.
              // Returns nullptr immediately if service is not running.
              virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
              tryGetService(const std::string& serviceName) = 0;
      
              virtual ~AidlServiceInteractionProxy() = default;
          };
      
          // Standard use case - call into the normal static methods which invoke
          // the real service manager
          class AidlServiceInteractionProxyImpl : public AidlServiceInteractionProxy {
            public:
              virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
              getService(const std::string& serviceName) override;
      
              virtual std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
              tryGetService(const std::string& serviceName) override;
          };
      
          /**
           * Listener interface for device/torch status changes
           */
          struct StatusListener : virtual public RefBase {
              ~StatusListener() {}
      
              virtual void onDeviceStatusChanged(const std::string &cameraId,
                      CameraDeviceStatus newStatus) = 0;
              virtual void onDeviceStatusChanged(const std::string &cameraId,
                      const std::string &physicalCameraId,
                      CameraDeviceStatus newStatus) = 0;
              virtual void onTorchStatusChanged(const std::string &cameraId,
                      TorchModeStatus newStatus,
                      SystemCameraKind kind) = 0;
              virtual void onTorchStatusChanged(const std::string &cameraId,
                      TorchModeStatus newStatus) = 0;
              virtual void onNewProviderRegistered() = 0;
          };
      
          /**
           * Represents the mode a camera device is currently in
           */
          enum class DeviceMode {
              TORCH,
              CAMERA
          };
      
          /**
           * Initialize the manager and give it a status listener; optionally accepts a service
           * interaction proxy.
           *
           * The default proxy communicates via the hardware service manager; alternate proxies can be
           * used for testing. The lifetime of the proxy must exceed the lifetime of the manager.
           */
          status_t initialize(wp<StatusListener> listener,
                              HidlServiceInteractionProxy* hidlProxy = &sHidlServiceInteractionProxy,
                              AidlServiceInteractionProxy* aidlProxy = &sAidlServiceInteractionProxy);
      	// ...
      }
      

The implementation is in frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp:

      CameraProviderManager::HidlServiceInteractionProxyImpl
      CameraProviderManager::sHidlServiceInteractionProxy{};
      CameraProviderManager::AidlServiceInteractionProxyImpl
      CameraProviderManager::sAidlServiceInteractionProxy{};
      std::shared_ptr<aidl::android::hardware::camera::provider::ICameraProvider>
      CameraProviderManager::AidlServiceInteractionProxyImpl::getService(
              const std::string& serviceName) {
          using aidl::android::hardware::camera::provider::ICameraProvider;
      
          AIBinder* binder = nullptr;
          binder = AServiceManager_waitForService(serviceName.c_str()); // "android.hardware.camera.provider.ICameraProvider"
      
          if (binder == nullptr) {
              ALOGE("%s: AIDL Camera provider HAL '%s' is not actually available, despite waiting "
                    "indefinitely?", __FUNCTION__, serviceName.c_str());
              return nullptr;
          }
          std::shared_ptr<ICameraProvider> interface =
                  ICameraProvider::fromBinder(ndk::SpAIBinder(binder));
      
          return interface;
      }
      
      status_t CameraProviderManager::tryToAddAidlProvidersLocked() {
          const char * aidlHalServiceDescriptor =
                  aidl::android::hardware::camera::provider::ICameraProvider::descriptor;
          auto sm = defaultServiceManager();
          auto aidlProviders = sm->getDeclaredInstances(
                  String16(aidlHalServiceDescriptor));
      
          if (isVirtualCameraHalEnabled()) {
              // Virtual Camera provider is not declared in the VINTF manifest so we
              // manually add it if the binary is present.
              aidlProviders.push_back(String16(kVirtualProviderName.c_str()));
          }
      
          for (const auto &aidlInstance : aidlProviders) {
              std::string aidlServiceName =
                      getFullAidlProviderName(toStdString(aidlInstance));
              auto res = sm->registerForNotifications(String16(aidlServiceName.c_str()), this);
              if (res != OK) {
                  ALOGE("%s Unable to register for notifications with AIDL service manager",
                          __FUNCTION__);
                  return res;
              }
              addAidlProviderLocked(aidlServiceName);
          }
          return OK;
      }
      
      

startProviderInterface -> getCameraDeviceInterface:
Find the android.hardware.camera.provider.ICameraProvider service, then call its getCameraDeviceInterface method.
This step is again IPC: from ICameraProvider.aidl the build generates BpCameraProvider (the client proxy) and BnCameraProvider (the server stub).
So we only need to find who implements BnCameraProvider: it is AidlCameraProvider, which implements getCameraDeviceInterface.
      hardware/google/camera/common/hal/aidl_service/aidl_camera_provider.cc

      
      ScopedAStatus AidlCameraProvider::getCameraDeviceInterface(
          const std::string& camera_device_name,
          std::shared_ptr<ICameraDevice>* device) {
        std::unique_ptr<CameraDevice> google_camera_device;
        if (device == nullptr) {
          ALOGE("%s: device is nullptr. ", __FUNCTION__);
          return ScopedAStatus::fromServiceSpecificError(
              static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
        }
      
        // Parse camera_device_name.
        std::string camera_id, device_version;
      
        bool match = ParseDeviceName(camera_device_name, &device_version, &camera_id);
        if (!match) {
          ALOGE("%s: Device name parse fail. ", __FUNCTION__);
          return ScopedAStatus::fromServiceSpecificError(
              static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
        }
      
        int camera_id_int = atoi(camera_id.c_str());
        status_t res = google_camera_provider_->CreateCameraDevice(
            camera_id_int, &google_camera_device);
        if (res != OK) {
          ALOGE("%s: Creating CameraDevice failed: %s(%d)", __FUNCTION__,
                strerror(-res), res);
          return aidl_utils::ConvertToAidlReturn(res);
        }
      
        *device = device::implementation::AidlCameraDevice::Create(
            std::move(google_camera_device)); // 1
        if (*device == nullptr) {
          ALOGE("%s: Creating AidlCameraDevice failed", __FUNCTION__);
          return ScopedAStatus::fromServiceSpecificError(
              static_cast<int32_t>(Status::INTERNAL_ERROR));
        }
      
      #ifdef __ANDROID_APEX__
        available_camera_ids_.erase(camera_id_int);
        if (!camera_device_initialized_ && available_camera_ids_.empty()) {
          camera_device_initialized_ = true;
      
          std::string ready_property_name = "vendor.camera.hal.ready.count";
          int ready_count = property_get_int32(ready_property_name.c_str(), 0);
          property_set(ready_property_name.c_str(),
                       std::to_string(++ready_count).c_str());
          ALOGI(
              "AidlCameraProvider::getCameraDeviceInterface() first time ready "
              "count: %d ",
              ready_count);
        }
      #endif
        return ScopedAStatus::ok();
      }
      

      frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp

      status_t CameraProviderManager::openAidlSession(const std::string &id,
              const std::shared_ptr<
                      aidl::android::hardware::camera::device::ICameraDeviceCallback>& callback,
              /*out*/
              std::shared_ptr<aidl::android::hardware::camera::device::ICameraDeviceSession> *session) {
      
          std::lock_guard<std::mutex> lock(mInterfaceMutex);
      
          auto deviceInfo = findDeviceInfoLocked(id);
          if (deviceInfo == nullptr) return NAME_NOT_FOUND;
      
          auto *aidlDeviceInfo3 = static_cast<AidlProviderInfo::AidlDeviceInfo3*>(deviceInfo);
          sp<ProviderInfo> parentProvider = deviceInfo->mParentProvider.promote();
          if (parentProvider == nullptr) {
              return DEAD_OBJECT;
          }
          auto provider =
                  static_cast<AidlProviderInfo *>(parentProvider.get())->startProviderInterface();
          if (provider == nullptr) {
              return DEAD_OBJECT;
          }
          std::shared_ptr<HalCameraProvider> halCameraProvider =
                  std::make_shared<AidlHalCameraProvider>(provider, provider->descriptor);
          saveRef(DeviceMode::CAMERA, id, halCameraProvider);
      
          auto interface = aidlDeviceInfo3->startDeviceInterface();
          if (interface == nullptr) {
              removeRef(DeviceMode::CAMERA, id);
              return DEAD_OBJECT;
          }
      
          auto ret = interface->open(callback, session); // 1
          if (!ret.isOk()) {
              removeRef(DeviceMode::CAMERA, id);
              ALOGE("%s: Transaction error opening a session for camera device %s: %s",
                      __FUNCTION__, id.c_str(), ret.getMessage());
              return AidlProviderInfo::mapToStatusT(ret);
          }
          return OK;
      }
      

In summary, what is returned at code mark 1 is the AidlCameraDevice created by AidlCameraDevice::Create, which passes straight through to the HAL over AIDL.
      hardware/google/camera/common/hal/aidl_service/aidl_camera_device.cc

      
      std::shared_ptr<AidlCameraDevice> AidlCameraDevice::Create(
          std::unique_ptr<CameraDevice> google_camera_device) {
        auto device = ndk::SharedRefBase::make<AidlCameraDevice>();
        if (device == nullptr) {
          ALOGE("%s: Cannot create a AidlCameraDevice.", __FUNCTION__);
          return nullptr;
        }
      
        status_t res = device->Initialize(std::move(google_camera_device));
        if (res != OK) {
          ALOGE("%s: Initializing AidlCameraDevice failed: %s(%d)", __FUNCTION__,
                strerror(-res), res);
          return nullptr;
        }
      
        return device;
      }
      status_t AidlCameraDevice::Initialize(
          std::unique_ptr<CameraDevice> google_camera_device) {
        if (google_camera_device == nullptr) {
          ALOGE("%s: google_camera_device is nullptr.", __FUNCTION__);
          return BAD_VALUE;
        }
      
        camera_id_ = google_camera_device->GetPublicCameraId();
        google_camera_device_ = std::move(google_camera_device);
        aidl_profiler_ = google_camera_hal::AidlProfiler::Create(camera_id_);
        if (aidl_profiler_ == nullptr) {
          ALOGE("%s: Failed to create AidlProfiler.", __FUNCTION__);
          return UNKNOWN_ERROR;
        }
        return OK;
      }
      

Next, look at the open method of AidlCameraDevice.
      hardware/google/camera/common/hal/aidl_service/aidl_camera_device.cc

      ScopedAStatus AidlCameraDevice::open(
          const std::shared_ptr<ICameraDeviceCallback>& callback,
          std::shared_ptr<ICameraDeviceSession>* session_ret) {
        if (session_ret == nullptr) {
          return ScopedAStatus::fromServiceSpecificError(
              static_cast<int32_t>(Status::ILLEGAL_ARGUMENT));
        }
        *session_ret = nullptr;
        auto profiler = aidl_profiler_->MakeScopedProfiler(
            google_camera_hal::EventType::kOpen,
            google_camera_device_->GetProfiler(camera_id_,
                                               aidl_profiler_->GetLatencyFlag()),
            google_camera_device_->GetProfiler(camera_id_,
                                               aidl_profiler_->GetFpsFlag()));
      
        std::unique_ptr<google_camera_hal::CameraDeviceSession> session;
        status_t res = google_camera_device_->CreateCameraDeviceSession(&session);
        if (res != OK || session == nullptr) {
          ALOGE("%s: Creating CameraDeviceSession failed: %s(%d)", __FUNCTION__,
                strerror(-res), res);
          return aidl_utils::ConvertToAidlReturn(res);
        }
      
        auto aidl_session = AidlCameraDeviceSession::Create(
            callback, std::move(session), aidl_profiler_);
        if (aidl_session == nullptr) {
          ALOGE("%s: Creating AidlCameraDeviceSession failed.", __FUNCTION__);
          return aidl_utils::ConvertToAidlReturn(res);
        }
        *session_ret = aidl_session;
        return ScopedAStatus::ok();
      }
      
HidlProviderInfo::getIPCTransport() returns IPCTransport::HIDL; AidlProviderInfo::getIPCTransport() returns IPCTransport::AIDL.

Whether the AIDL provider or the HIDL provider is used is decided by how the camera HAL is implemented: the vendor chooses to build the HAL against either the AIDL or the HIDL interface. The framework adapts to either kind of HAL automatically, so the upper layers never need to care about the transport. That said, the AIDL provider is the recommended path.
      frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

      status_t Camera3Device::initializeCommonLocked(sp<CameraProviderManager> manager) {
      
          /** Start up status tracker thread */
          mStatusTracker = new StatusTracker(this);
          status_t res = mStatusTracker->run((std::string("C3Dev-") + mId + "-Status").c_str());
          if (res != OK) {
              SET_ERR_L("Unable to start status tracking thread: %s (%d)",
                      strerror(-res), res);
              mInterface->close();
              mStatusTracker.clear();
              return res;
          }
      
          /** Register in-flight map to the status tracker */
          mInFlightStatusId = mStatusTracker->addComponent("InflightRequests");
      
          /** Create buffer manager */
          mBufferManager = new Camera3BufferManager();
      
          Vector<int32_t> sessionParamKeys;
          camera_metadata_entry_t sessionKeysEntry = mDeviceInfo.find(
                  ANDROID_REQUEST_AVAILABLE_SESSION_KEYS);
          if (sessionKeysEntry.count > 0) {
              sessionParamKeys.insertArrayAt(sessionKeysEntry.data.i32, 0, sessionKeysEntry.count);
          }
      
          camera_metadata_entry_t availableTestPatternModes = mDeviceInfo.find(
                  ANDROID_SENSOR_AVAILABLE_TEST_PATTERN_MODES);
          for (size_t i = 0; i < availableTestPatternModes.count; i++) {
              if (availableTestPatternModes.data.i32[i] ==
                      ANDROID_SENSOR_TEST_PATTERN_MODE_SOLID_COLOR) {
                  mSupportCameraMute = true;
                  mSupportTestPatternSolidColor = true;
                  break;
              } else if (availableTestPatternModes.data.i32[i] ==
                      ANDROID_SENSOR_TEST_PATTERN_MODE_BLACK) {
                  mSupportCameraMute = true;
                  mSupportTestPatternSolidColor = false;
              }
          }
      
          camera_metadata_entry_t availableSettingsOverrides = mDeviceInfo.find(
                  ANDROID_CONTROL_AVAILABLE_SETTINGS_OVERRIDES);
          for (size_t i = 0; i < availableSettingsOverrides.count; i++) {
              if (availableSettingsOverrides.data.i32[i] ==
                      ANDROID_CONTROL_SETTINGS_OVERRIDE_ZOOM) {
                  mSupportZoomOverride = true;
                  break;
              }
          }
      
          /** Start up request queue thread */
          mRequestThread = createNewRequestThread(
                  this, mStatusTracker, mInterface, sessionParamKeys,
                  mUseHalBufManager, mSupportCameraMute, mRotationOverride,
                  mSupportZoomOverride);
          res = mRequestThread->run((std::string("C3Dev-") + mId + "-ReqQueue").c_str());
          if (res != OK) {
              SET_ERR_L("Unable to start request queue thread: %s (%d)",
                      strerror(-res), res);
              mInterface->close();
              mRequestThread.clear();
              return res;
          }
      
          setCameraMuteLocked(mCameraMuteInitial);
      
          mPreparerThread = new PreparerThread();
      
          internalUpdateStatusLocked(STATUS_UNCONFIGURED);
          mNextStreamId = 0;
          mFakeStreamId = NO_STREAM;
          mNeedConfig = true;
          mPauseStateNotify = false;
          mIsInputStreamMultiResolution = false;
      
          // Measure the clock domain offset between camera and video/hw_composer
          mTimestampOffset = getMonoToBoottimeOffset();
          camera_metadata_entry timestampSource =
                  mDeviceInfo.find(ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE);
          if (timestampSource.count > 0 && timestampSource.data.u8[0] ==
                  ANDROID_SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME) {
              mDeviceTimeBaseIsRealtime = true;
          }
      
          // Will the HAL be sending in early partial result metadata?
          camera_metadata_entry partialResultsCount =
                  mDeviceInfo.find(ANDROID_REQUEST_PARTIAL_RESULT_COUNT);
          if (partialResultsCount.count > 0) {
              mNumPartialResults = partialResultsCount.data.i32[0];
              mUsePartialResult = (mNumPartialResults > 1);
          }
      
          bool usePrecorrectArray = DistortionMapper::isDistortionSupported(mDeviceInfo);
          if (usePrecorrectArray) {
              res = mDistortionMappers[mId].setupStaticInfo(mDeviceInfo);
              if (res != OK) {
                  SET_ERR_L("Unable to read necessary calibration fields for distortion correction");
                  return res;
              }
          }
      
          mZoomRatioMappers[mId] = ZoomRatioMapper(&mDeviceInfo,
                  mSupportNativeZoomRatio, usePrecorrectArray);
      
          if (SessionConfigurationUtils::supportsUltraHighResolutionCapture(mDeviceInfo)) {
              mUHRCropAndMeteringRegionMappers[mId] =
                      UHRCropAndMeteringRegionMapper(mDeviceInfo, usePrecorrectArray);
          }
      
          if (RotateAndCropMapper::isNeeded(&mDeviceInfo)) {
              mRotateAndCropMappers.emplace(mId, &mDeviceInfo);
          }
      
          // Hidl/AidlCamera3DeviceInjectionMethods
          mInjectionMethods = createCamera3DeviceInjectionMethods(this);
      
          /** Start watchdog thread */
          mCameraServiceWatchdog = new CameraServiceWatchdog(
                  manager->getProviderPids(), mId, mCameraServiceProxyWrapper);
          res = mCameraServiceWatchdog->run("CameraServiceWatchdog");
          if (res != OK) {
              SET_ERR_L("Unable to start camera service watchdog thread: %s (%d)",
                      strerror(-res), res);
              return res;
          }
      
          mSupportsExtensionKeys = areExtensionKeysSupported(mDeviceInfo);
      
          return OK;
      }
      

Next, we analyze HidlCamera3Device. The device object is created in Camera2ClientBase::initializeImpl according to the provider's IPC transport:

      
      template <typename TClientBase>
      template <typename TProviderPtr>
      status_t Camera2ClientBase<TClientBase>::initializeImpl(TProviderPtr providerPtr,
              const std::string& monitorTags) {
          ATRACE_CALL();
          ALOGV("%s: Initializing client for camera %s", __FUNCTION__,
                TClientBase::mCameraIdStr.c_str());
          status_t res;
      
          IPCTransport providerTransport = IPCTransport::INVALID;
          res = providerPtr->getCameraIdIPCTransport(TClientBase::mCameraIdStr,
                  &providerTransport);
          if (res != OK) {
              return res;
          }
          switch (providerTransport) {
              case IPCTransport::HIDL:
                  mDevice =
                          new HidlCamera3Device(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  break;
              case IPCTransport::AIDL:
                  if (flags::camera_multi_client() && TClientBase::mSharedMode) {
                      mDevice = AidlCamera3SharedDevice::getInstance(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  } else {
                      mDevice =
                          new AidlCamera3Device(mCameraServiceProxyWrapper,
                                  TClientBase::mAttributionAndPermissionUtils,
                                  TClientBase::mCameraIdStr, mOverrideForPerfClass,
                                  TClientBase::mRotationOverride, mLegacyClient);
                  }
                  break;
              default:
                  ALOGE("%s Invalid transport for camera id %s", __FUNCTION__,
                          TClientBase::mCameraIdStr.c_str());
                  return NO_INIT;
          }
          if (mDevice == NULL) {
              ALOGE("%s: Camera %s: No device connected",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str());
              return NO_INIT;
          }
      
          // Notify camera opening (check op if check_full_attribution_source_chain flag is off).
          res = TClientBase::notifyCameraOpening();
          if (res != OK) {
              TClientBase::notifyCameraClosing();
              return res;
          }
      
          res = mDevice->initialize(providerPtr, monitorTags);
          if (res != OK) {
              ALOGE("%s: Camera %s: unable to initialize device: %s (%d)",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
              TClientBase::notifyCameraClosing();
              return res;
          }
      
          wp<NotificationListener> weakThis(this);
          res = mDevice->setNotifyCallback(weakThis);
          if (res != OK) {
              ALOGE("%s: Camera %s: Unable to set notify callback: %s (%d)",
                      __FUNCTION__, TClientBase::mCameraIdStr.c_str(), strerror(-res), res);
              return res;
          }
      
          return OK;
      }
      
      
      
      status_t HidlCamera3Device::initialize(sp<CameraProviderManager> manager,
              const std::string& monitorTags) {
          ATRACE_CALL();
          Mutex::Autolock il(mInterfaceLock);
          Mutex::Autolock l(mLock);
      
          ALOGV("%s: Initializing HIDL device for camera %s", __FUNCTION__, mId.c_str());
          if (mStatus != STATUS_UNINITIALIZED) {
              CLOGE("Already initialized!");
              return INVALID_OPERATION;
          }
          if (manager == nullptr) return INVALID_OPERATION;
      
          sp<ICameraDeviceSession> session;
          ATRACE_BEGIN("CameraHal::openSession");
          status_t res = manager->openHidlSession(mId, this,
                  /*out*/ &session); // 1
          ATRACE_END();
          if (res != OK) {
              SET_ERR_L("Could not open camera session: %s (%d)", strerror(-res), res);
              return res;
          }
      
          res = manager->getCameraCharacteristics(mId, mOverrideForPerfClass, &mDeviceInfo,
                  hardware::ICameraService::ROTATION_OVERRIDE_NONE);
          if (res != OK) {
              SET_ERR_L("Could not retrieve camera characteristics: %s (%d)", strerror(-res), res);
              session->close();
              return res;
          }
          mSupportNativeZoomRatio = manager->supportNativeZoomRatio(mId);
      
          std::vector<std::string> physicalCameraIds;
          bool isLogical = manager->isLogicalCamera(mId, &physicalCameraIds);
          if (isLogical) {
              for (auto& physicalId : physicalCameraIds) {
                  // Do not override characteristics for physical cameras
                  res = manager->getCameraCharacteristics(
                          physicalId, /*overrideForPerfClass*/false, &mPhysicalDeviceInfoMap[physicalId],
                          hardware::ICameraService::ROTATION_OVERRIDE_NONE);
                  if (res != OK) {
                      SET_ERR_L("Could not retrieve camera %s characteristics: %s (%d)",
                              physicalId.c_str(), strerror(-res), res);
                      session->close();
                      return res;
                  }
      
                  bool usePrecorrectArray =
                          DistortionMapper::isDistortionSupported(mPhysicalDeviceInfoMap[physicalId]);
                  if (usePrecorrectArray) {
                      res = mDistortionMappers[physicalId].setupStaticInfo(
                              mPhysicalDeviceInfoMap[physicalId]);
                      if (res != OK) {
                          SET_ERR_L("Unable to read camera %s's calibration fields for distortion "
                                  "correction", physicalId.c_str());
                          session->close();
                          return res;
                      }
                  }
      
                  mZoomRatioMappers[physicalId] = ZoomRatioMapper(
                          &mPhysicalDeviceInfoMap[physicalId],
                          mSupportNativeZoomRatio, usePrecorrectArray);
      
                  if (SessionConfigurationUtils::supportsUltraHighResolutionCapture(
                          mPhysicalDeviceInfoMap[physicalId])) {
                      mUHRCropAndMeteringRegionMappers[physicalId] =
                              UHRCropAndMeteringRegionMapper(mPhysicalDeviceInfoMap[physicalId],
                                      usePrecorrectArray);
                  }
              }
          }
      
          std::shared_ptr<RequestMetadataQueue> queue;
          auto requestQueueRet = session->getCaptureRequestMetadataQueue(
              [&queue](const auto& descriptor) {
                  queue = std::make_shared<RequestMetadataQueue>(descriptor);
                  if (!queue->isValid() || queue->availableToWrite() <= 0) {
                      ALOGE("HAL returns empty request metadata fmq, not use it");
                      queue = nullptr;
                      // don't use the queue onwards.
                  }
              });
          if (!requestQueueRet.isOk()) {
              ALOGE("Transaction error when getting request metadata fmq: %s, not use it",
                      requestQueueRet.description().c_str());
              return DEAD_OBJECT;
          }
      
          std::unique_ptr<ResultMetadataQueue>& resQueue = mResultMetadataQueue;
          auto resultQueueRet = session->getCaptureResultMetadataQueue(
              [&resQueue](const auto& descriptor) {
                  resQueue = std::make_unique<ResultMetadataQueue>(descriptor);
                  if (!resQueue->isValid() || resQueue->availableToWrite() <= 0) {
                      ALOGE("HAL returns empty result metadata fmq, not use it");
                      resQueue = nullptr;
                      // Don't use the resQueue onwards.
                  }
              });
          if (!resultQueueRet.isOk()) {
              ALOGE("Transaction error when getting result metadata queue from camera session: %s",
                      resultQueueRet.description().c_str());
              return DEAD_OBJECT;
          }
          IF_ALOGV() {
              session->interfaceChain([](
                  ::android::hardware::hidl_vec<::android::hardware::hidl_string> interfaceChain) {
                      ALOGV("Session interface chain:");
                      for (const auto& iface : interfaceChain) {
                          ALOGV("  %s", iface.c_str());
                      }
                  });
          }
      
          camera_metadata_entry bufMgrMode =
                  mDeviceInfo.find(ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION);
          if (bufMgrMode.count > 0) {
               mUseHalBufManager = (bufMgrMode.data.u8[0] ==
                  ANDROID_INFO_SUPPORTED_BUFFER_MANAGEMENT_VERSION_HIDL_DEVICE_3_5);
          }
      
          camera_metadata_entry_t capabilities = mDeviceInfo.find(ANDROID_REQUEST_AVAILABLE_CAPABILITIES);
          for (size_t i = 0; i < capabilities.count; i++) {
              uint8_t capability = capabilities.data.u8[i];
              if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_OFFLINE_PROCESSING) {
                  mSupportOfflineProcessing = true;
              }
          }
      
          mInterface = new HidlHalInterface(session, queue, mUseHalBufManager, mSupportOfflineProcessing);
      
          std::string providerType;
          mVendorTagId = manager->getProviderTagIdLocked(mId);
          mTagMonitor.initialize(mVendorTagId);
          if (!monitorTags.empty()) {
              mTagMonitor.parseTagsToMonitor(monitorTags);
          }
      
          // Metadata tags needs fixup for monochrome camera device version less
          // than 3.5.
          hardware::hidl_version maxVersion{0,0};
          IPCTransport transport = IPCTransport::HIDL;
          res = manager->getHighestSupportedVersion(mId, &maxVersion, &transport);
          if (res != OK) {
              ALOGE("%s: Error in getting camera device version id: %s (%d)",
                      __FUNCTION__, strerror(-res), res);
              return res;
          }
          int deviceVersion = HARDWARE_DEVICE_API_VERSION(
                  maxVersion.get_major(), maxVersion.get_minor());
      
          bool isMonochrome = false;
          for (size_t i = 0; i < capabilities.count; i++) {
              uint8_t capability = capabilities.data.u8[i];
              if (capability == ANDROID_REQUEST_AVAILABLE_CAPABILITIES_MONOCHROME) {
                  isMonochrome = true;
              }
          }
          mNeedFixupMonochromeTags = (isMonochrome && deviceVersion < CAMERA_DEVICE_API_VERSION_3_5);
      
          return initializeCommonLocked(manager);
      }
      

The steps in between are the same as before; the difference on the HIDL path is the service interaction proxy:
      frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.h

       // Standard use case - call into the normal generated static methods which invoke
          // the real hardware service manager
          struct HidlServiceInteractionProxyImpl : public HidlServiceInteractionProxy {
              virtual bool registerForNotifications(
                      const std::string &serviceName,
                      const sp<hidl::manager::V1_0::IServiceNotification>
                      &notification) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::registerForNotifications(
                          serviceName, notification);
              }
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> tryGetService(
                      const std::string &serviceName) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::tryGetService(serviceName);
              }
              virtual sp<hardware::camera::provider::V2_4::ICameraProvider> getService(
                      const std::string &serviceName) override {
                  return hardware::camera::provider::V2_4::ICameraProvider::getService(serviceName);
              }
      
              virtual hardware::hidl_vec<hardware::hidl_string> listServices() override;
          };
      

This is the ICameraProvider HIDL interface mentioned below; the build generates the following header:
      out/soong/.intermediates/hardware/interfaces/camera/provider/2.4/android.hardware.camera.provider@2.4_genc++_headers/gen/android/hardware/camera/provider/2.4/ICameraProvider.h。

       /**
           * This gets the service of this type with the specified instance name. If the
           * service is not in the VINTF manifest on a Trebilized device, this will return
           * nullptr. If the service is not available, this will wait for the service to
           * become available. If the service is a lazy service, this will start the service
           * and return when it becomes available. If getStub is true, this will try to
           * return an unwrapped passthrough implementation in the same process. This is
           * useful when getting an implementation from the same partition/compilation group.
           */
          static ::android::sp<ICameraProvider> getService(const std::string &serviceName="default", bool getStub=false);
      

The implementation:
      out/soong/.intermediates/hardware/interfaces/camera/provider/2.4/android.hardware.camera.provider@2.4_genc++/gen/android/hardware/camera/provider/2.4/CameraProviderAll.cpp

      ::android::sp<ICameraProvider> ICameraProvider::getService(const std::string &serviceName, const bool getStub) {
          return ::android::hardware::details::getServiceInternal<BpHwCameraProvider>(serviceName, true, getStub);
      }
      

The implementation of getServiceInternal:
      system/libhidl/transport/include/hidl/HidlTransportSupport.h

      
      template <typename BpType, typename IType = typename BpType::Pure,
                typename = std::enable_if_t<std::is_same<i_tag, typename IType::_hidl_tag>::value>,
                typename = std::enable_if_t<std::is_same<bphw_tag, typename BpType::_hidl_tag>::value>>
      sp<IType> getServiceInternal(const std::string& instance, bool retry, bool getStub) {
          using ::android::hidl::base::V1_0::IBase;
      
          sp<IBase> base = getRawServiceInternal(IType::descriptor, instance, retry, getStub);
      
          if (base == nullptr) {
              return nullptr;
          }
      
          if (base->isRemote()) {
              // getRawServiceInternal guarantees we get the proper class
              return sp<IType>(new BpType(getOrCreateCachedBinder(base.get())));
          }
      
          return IType::castFrom(base);
      }
      

      system/libhidl/transport/ServiceManagement.cpp

      
      sp<::android::hidl::base::V1_0::IBase> getRawServiceInternal(const std::string& descriptor,
                                                                   const std::string& instance,
                                                                   bool retry, bool getStub) {
          using Transport = IServiceManager1_0::Transport;
          sp<Waiter> waiter;
      
          sp<IServiceManager1_1> sm;
          Transport transport = Transport::EMPTY;
          if (kIsRecovery) {
              transport = Transport::PASSTHROUGH;
          } else {
              sm = defaultServiceManager1_1();
              if (sm == nullptr) {
                  ALOGE("getService: defaultServiceManager() is null");
                  return nullptr;
              }
      
              Return<Transport> transportRet = sm->getTransport(descriptor, instance);
      
              if (!transportRet.isOk()) {
                  ALOGE("getService: defaultServiceManager()->getTransport returns %s",
                        transportRet.description().c_str());
                  return nullptr;
              }
              transport = transportRet;
          }
      
          const bool vintfHwbinder = (transport == Transport::HWBINDER);
          const bool vintfPassthru = (transport == Transport::PASSTHROUGH);
          const bool trebleTestingOverride = isTrebleTestingOverride();
          const bool allowLegacy = !kEnforceVintfManifest || (trebleTestingOverride && isDebuggable());
          const bool vintfLegacy = (transport == Transport::EMPTY) && allowLegacy;
      
          if (!kEnforceVintfManifest) {
              ALOGE("getService: Potential race detected. The VINTF manifest is not being enforced. If "
                    "a HAL server has a delay in starting and it is not in the manifest, it will not be "
                    "retrieved. Please make sure all HALs on this device are in the VINTF manifest and "
                    "enable PRODUCT_ENFORCE_VINTF_MANIFEST on this device (this is also enabled by "
                    "PRODUCT_FULL_TREBLE). PRODUCT_ENFORCE_VINTF_MANIFEST will ensure that no race "
                    "condition is possible here.");
              sleep(1);
          }
      
          for (int tries = 0; !getStub && (vintfHwbinder || vintfLegacy); tries++) {
              if (waiter == nullptr && tries > 0) {
                  waiter = new Waiter(descriptor, instance, sm);
              }
              if (waiter != nullptr) {
                  waiter->reset();  // don't reorder this -- see comments on reset()
              }
              Return<sp<IBase>> ret = sm->get(descriptor, instance);
              if (!ret.isOk()) {
                  ALOGE("getService: defaultServiceManager()->get returns %s for %s/%s.",
                        ret.description().c_str(), descriptor.c_str(), instance.c_str());
                  break;
              }
              sp<IBase> base = ret;
              if (base != nullptr) {
                  Return<bool> canCastRet =
                      details::canCastInterface(base.get(), descriptor.c_str(), true /* emitError */);
      
                  if (canCastRet.isOk() && canCastRet) {
                      if (waiter != nullptr) {
                          waiter->done();
                      }
                      return base; // still needs to be wrapped by Bp class.
                  }
      
                  if (!handleCastError(canCastRet, descriptor, instance)) break;
              }
      
              // In case of legacy or we were not asked to retry, don't.
              if (vintfLegacy || !retry) break;
      
              if (waiter != nullptr) {
                  ALOGI("getService: Trying again for %s/%s...", descriptor.c_str(), instance.c_str());
                  waiter->wait(true /* timeout */);
              }
          }
      
          if (waiter != nullptr) {
              waiter->done();
          }
      
          if (getStub || vintfPassthru || vintfLegacy) {
              const sp<IServiceManager1_0> pm = getPassthroughServiceManager();
              if (pm != nullptr) {
                  sp<IBase> base = pm->get(descriptor, instance).withDefault(nullptr);
                  if (!getStub || trebleTestingOverride) {
                      base = wrapPassthrough(base);
                  }
                  return base;
              }
          }
      
          return nullptr;
      }
      

Next, let's look at hidlDeviceInfo3->startDeviceInterface():

      sp<hardware::camera::device::V3_2::ICameraDevice>
      HidlProviderInfo::HidlDeviceInfo3::startDeviceInterface() {
          Mutex::Autolock l(mDeviceAvailableLock);
          sp<hardware::camera::device::V3_2::ICameraDevice> device;
          ATRACE_CALL();
          if (mSavedInterface == nullptr) {
              sp<HidlProviderInfo> parentProvider =
                      static_cast<HidlProviderInfo *>(mParentProvider.promote().get());
              if (parentProvider != nullptr) {
                  // Wait for lazy HALs to confirm device availability
                  if (parentProvider->isExternalLazyHAL() && !mIsDeviceAvailable) {
                      ALOGV("%s: Wait for external device to become available %s",
                            __FUNCTION__,
                            mId.c_str());
      
                      auto res = mDeviceAvailableSignal.waitRelative(mDeviceAvailableLock,
                                                               kDeviceAvailableTimeout);
                      if (res != OK) {
                          ALOGE("%s: Failed waiting for device to become available",
                                __FUNCTION__);
                          return nullptr;
                      }
                  }
      
                  device = parentProvider->startDeviceInterface(mName);
              }
          } else {
              device = (hardware::camera::device::V3_2::ICameraDevice *) mSavedInterface.get();
          }
          return device;
      }
      
      

The value of mName is the concatenation of ICameraProvider::descriptor and the cameraId.

      
      AidlProviderInfo::AidlProviderInfo(
                  const std::string &providerName,
                  const std::string &providerInstance,
                  CameraProviderManager *manager) :
                  CameraProviderManager::ProviderInfo(providerName, providerInstance, manager) {}
      
      status_t AidlProviderInfo::initializeAidlProvider(
              std::shared_ptr<ICameraProvider>& interface, int64_t currentDeviceState) {
      
          using aidl::android::hardware::camera::provider::ICameraProvider;
          std::string parsedProviderName =
                      mProviderName.substr(std::string(ICameraProvider::descriptor).size() + 1);
      
          status_t res = parseProviderName(parsedProviderName, &mType, &mId);
          if (res != OK) {
              ALOGE("%s: Invalid provider name, ignoring", __FUNCTION__);
              return BAD_VALUE;
          }
          ALOGI("Connecting to new camera provider: %s, isRemote? %d",
                  mProviderName.c_str(), interface->isRemote());
      
          // cameraDeviceStatusChange callbacks may be called (and causing new devices added)
          // before setCallback returns
          mCallbacks =
                  ndk::SharedRefBase::make<AidlProviderCallbacks>(this);
          ndk::ScopedAStatus status =
                  interface->setCallback(mCallbacks);
          if (!status.isOk()) {
              ALOGE("%s: Transaction error setting up callbacks with camera provider '%s': %s",
                      __FUNCTION__, mProviderName.c_str(), status.getMessage());
              return mapToStatusT(status);
          }
      
          mDeathRecipient = ndk::ScopedAIBinder_DeathRecipient(AIBinder_DeathRecipient_new(binderDied));
      
          if (!vd_flags::virtual_camera_service_discovery() || interface->isRemote()) {
              binder_status_t link =
                      AIBinder_linkToDeath(interface->asBinder().get(), mDeathRecipient.get(), this);
              if (link != STATUS_OK) {
                  ALOGW("%s: Unable to link to provider '%s' death notifications (%d)", __FUNCTION__,
                        mProviderName.c_str(), link);
                  return DEAD_OBJECT;
              }
          }
      
          if (!kEnableLazyHal) {
              // Save HAL reference indefinitely
              mSavedInterface = interface;
          } else {
              mActiveInterface = interface;
          }
      
          ALOGV("%s: Setting device state for %s: 0x%" PRIx64,
                  __FUNCTION__, mProviderName.c_str(), mDeviceState);
          notifyDeviceStateChange(currentDeviceState);
      
          res = setUpVendorTags();
          if (res != OK) {
              ALOGE("%s: Unable to set up vendor tags from provider '%s'",
                      __FUNCTION__, mProviderName.c_str());
              return res;
           }
      
          // Get initial list of camera devices, if any
          std::vector<std::string> devices;
          std::vector<std::string> retDevices;
          status = interface->getCameraIdList(&retDevices);
          if (!status.isOk()) {
              ALOGE("%s: Transaction error in getting camera ID list from provider '%s': %s",
                      __FUNCTION__, mProviderName.c_str(), status.getMessage());
              return mapToStatusT(status);
          }
      
          for (auto& name : retDevices) {
              uint16_t major, minor;
              std::string type, id;
              status_t res = parseDeviceName(name, &major, &minor, &type, &id);
              if (res != OK) {
                  ALOGE("%s: Error parsing deviceName: %s: %d", __FUNCTION__, name.c_str(), res);
                  return res;
              } else {
                  devices.push_back(name);
                  mProviderPublicCameraIds.push_back(id);
              }
          }
      
          // Get list of concurrent streaming camera device combinations
          res = getConcurrentCameraIdsInternalLocked(interface);
          if (res != OK) {
              return res;
          }
      
          mSetTorchModeSupported = true;
      
          mIsRemote = interface->isRemote();
      
          initializeProviderInfoCommon(devices);
          return OK;
      }
      

We now hold a hardware::camera::device::V3_2::ICameraDevice *, which is in fact the generated header out/soong/.intermediates/hardware/interfaces/camera/device/3.2/android.hardware.camera.device@3.2_genc++_headers/gen/android/hardware/camera/device/3.2/ICameraDevice.h.
The open method is then invoked via IPC.
out/soong/.intermediates/hardware/interfaces/camera/device/3.2/android.hardware.camera.device@3.2_genc++_headers/gen/android/hardware/camera/device/3.2/BnHwCameraDevice.h

       static ::android::status_t _hidl_open(
                  ::android::hidl::base::V1_0::BnHwBase* _hidl_this,
                  const ::android::hardware::Parcel &_hidl_data,
                  ::android::hardware::Parcel *_hidl_reply,
                  TransactCallback _hidl_cb);
      

      out/soong/.intermediates/hardware/interfaces/camera/device/3.2/android.hardware.camera.device@3.2_genc++/gen/android/hardware/camera/device/3.2/CameraDeviceAll.cpp

      ::android::hardware::Return<void> BpHwCameraDevice::_hidl_open(::android::hardware::IInterface *_hidl_this, ::android::hardware::details::HidlInstrumentor *_hidl_this_instrumentor, const ::android::sp<::android::hardware::camera::device::V3_2::ICameraDeviceCallback>& callback, open_cb _hidl_cb) {
          #ifdef __ANDROID_DEBUGGABLE__
          bool mEnableInstrumentation = _hidl_this_instrumentor->isInstrumentationEnabled();
          const auto &mInstrumentationCallbacks = _hidl_this_instrumentor->getInstrumentationCallbacks();
          #else
          (void) _hidl_this_instrumentor;
          #endif // __ANDROID_DEBUGGABLE__
          ::android::ScopedTrace PASTE(___tracer, __LINE__) (ATRACE_TAG_HAL, "HIDL::ICameraDevice::open::client");
          #ifdef __ANDROID_DEBUGGABLE__
          if (UNLIKELY(mEnableInstrumentation)) {
              std::vector<void *> _hidl_args;
              _hidl_args.push_back((void *)&callback);
              for (const auto &callback: mInstrumentationCallbacks) {
                  callback(InstrumentationEvent::CLIENT_API_ENTRY, "android.hardware.camera.device", "3.2", "ICameraDevice", "open", &_hidl_args);
              }
          }
          #endif // __ANDROID_DEBUGGABLE__
      
          ::android::hardware::Parcel _hidl_data;
          ::android::hardware::Parcel _hidl_reply;
          ::android::status_t _hidl_err;
          ::android::status_t _hidl_transact_err;
          ::android::hardware::Status _hidl_status;
      
          _hidl_err = _hidl_data.writeInterfaceToken(BpHwCameraDevice::descriptor);
          if (_hidl_err != ::android::OK) { goto _hidl_error; }
      
          if (callback == nullptr) {
              _hidl_err = _hidl_data.writeStrongBinder(nullptr);
          } else {
              ::android::sp<::android::hardware::IBinder> _hidl_binder = ::android::hardware::getOrCreateCachedBinder(callback.get());
              if (_hidl_binder.get() != nullptr) {
                  _hidl_err = _hidl_data.writeStrongBinder(_hidl_binder);
              } else {
                  _hidl_err = ::android::UNKNOWN_ERROR;
              }
          }
          if (_hidl_err != ::android::OK) { goto _hidl_error; }
      
          ::android::hardware::ProcessState::self()->startThreadPool();
          _hidl_transact_err = ::android::hardware::IInterface::asBinder(_hidl_this)->transact(4 /* open */, _hidl_data, &_hidl_reply, 0 /* flags */, [&] (::android::hardware::Parcel& _hidl_reply) {
              ::android::hardware::camera::common::V1_0::Status _hidl_out_status;
              ::android::sp<::android::hardware::camera::device::V3_2::ICameraDeviceSession> _hidl_out_session;
      
      
              _hidl_err = ::android::hardware::readFromParcel(&_hidl_status, _hidl_reply);
              if (_hidl_err != ::android::OK) { return; }
      
              if (!_hidl_status.isOk()) { return; }
      
              _hidl_err = _hidl_reply.readUint32((uint32_t *)&_hidl_out_status);
              if (_hidl_err != ::android::OK) { return; }
      
              {
                  ::android::sp<::android::hardware::IBinder> _hidl_binder;
                  _hidl_err = _hidl_reply.readNullableStrongBinder(&_hidl_binder);
                  if (_hidl_err != ::android::OK) { return; }
      
                  _hidl_out_session = ::android::hardware::fromBinder<::android::hardware::camera::device::V3_2::ICameraDeviceSession,::android::hardware::camera::device::V3_2::BpHwCameraDeviceSession,::android::hardware::camera::device::V3_2::BnHwCameraDeviceSession>(_hidl_binder);
              }
      
              _hidl_cb(_hidl_out_status, _hidl_out_session);
      
              #ifdef __ANDROID_DEBUGGABLE__
              if (UNLIKELY(mEnableInstrumentation)) {
                  std::vector<void *> _hidl_args;
                  _hidl_args.push_back((void *)&_hidl_out_status);
                  _hidl_args.push_back((void *)&_hidl_out_session);
                  for (const auto &callback: mInstrumentationCallbacks) {
                      callback(InstrumentationEvent::CLIENT_API_EXIT, "android.hardware.camera.device", "3.2", "ICameraDevice", "open", &_hidl_args);
                  }
              }
              #endif // __ANDROID_DEBUGGABLE__
      
          });
          if (_hidl_transact_err != ::android::OK) {
              _hidl_err = _hidl_transact_err;
              goto _hidl_error;
          }
      
          if (!_hidl_status.isOk()) { return _hidl_status; }
          return ::android::hardware::Return<void>();
      
      _hidl_error:
          _hidl_status.setFromStatusT(_hidl_err);
          return ::android::hardware::Return<void>(_hidl_status);
      }
      ::android::status_t BnHwCameraDevice::_hidl_open(
              ::android::hidl::base::V1_0::BnHwBase* _hidl_this,
              const ::android::hardware::Parcel &_hidl_data,
              ::android::hardware::Parcel *_hidl_reply,
              TransactCallback _hidl_cb) {
          #ifdef __ANDROID_DEBUGGABLE__
          bool mEnableInstrumentation = _hidl_this->isInstrumentationEnabled();
          const auto &mInstrumentationCallbacks = _hidl_this->getInstrumentationCallbacks();
          #endif // __ANDROID_DEBUGGABLE__
      
          ::android::status_t _hidl_err = ::android::OK;
          if (!_hidl_data.enforceInterface(BnHwCameraDevice::Pure::descriptor)) {
              _hidl_err = ::android::BAD_TYPE;
              return _hidl_err;
          }
      
          ::android::sp<::android::hardware::camera::device::V3_2::ICameraDeviceCallback> callback;
      
          {
              ::android::sp<::android::hardware::IBinder> _hidl_binder;
              _hidl_err = _hidl_data.readNullableStrongBinder(&_hidl_binder);
              if (_hidl_err != ::android::OK) { return _hidl_err; }
      
              callback = ::android::hardware::fromBinder<::android::hardware::camera::device::V3_2::ICameraDeviceCallback,::android::hardware::camera::device::V3_2::BpHwCameraDeviceCallback,::android::hardware::camera::device::V3_2::BnHwCameraDeviceCallback>(_hidl_binder);
          }
      
          atrace_begin(ATRACE_TAG_HAL, "HIDL::ICameraDevice::open::server");
          #ifdef __ANDROID_DEBUGGABLE__
          if (UNLIKELY(mEnableInstrumentation)) {
              std::vector<void *> _hidl_args;
              _hidl_args.push_back((void *)&callback);
              for (const auto &callback: mInstrumentationCallbacks) {
                  callback(InstrumentationEvent::SERVER_API_ENTRY, "android.hardware.camera.device", "3.2", "ICameraDevice", "open", &_hidl_args);
              }
          }
          #endif // __ANDROID_DEBUGGABLE__
      
          bool _hidl_callbackCalled = false;
      
          ::android::hardware::Return<void> _hidl_ret = static_cast<ICameraDevice*>(_hidl_this->getImpl().get())->open(callback, [&](const auto &_hidl_out_status, const auto &_hidl_out_session) {
              if (_hidl_callbackCalled) {
                  LOG_ALWAYS_FATAL("open: _hidl_cb called a second time, but must be called once.");
              }
              _hidl_callbackCalled = true;
      
              ::android::hardware::writeToParcel(::android::hardware::Status::ok(), _hidl_reply);
      
              _hidl_err = _hidl_reply->writeUint32((uint32_t)_hidl_out_status);
              if (_hidl_err != ::android::OK) { goto _hidl_error; }
      
              if (_hidl_out_session == nullptr) {
                  _hidl_err = _hidl_reply->writeStrongBinder(nullptr);
              } else {
                  ::android::sp<::android::hardware::IBinder> _hidl_binder = ::android::hardware::getOrCreateCachedBinder(_hidl_out_session.get());
                  if (_hidl_binder.get() != nullptr) {
                      _hidl_err = _hidl_reply->writeStrongBinder(_hidl_binder);
                  } else {
                      _hidl_err = ::android::UNKNOWN_ERROR;
                  }
              }
              if (_hidl_err != ::android::OK) { goto _hidl_error; }
      
          _hidl_error:
              atrace_end(ATRACE_TAG_HAL);
              #ifdef __ANDROID_DEBUGGABLE__
              if (UNLIKELY(mEnableInstrumentation)) {
                  std::vector<void *> _hidl_args;
                  _hidl_args.push_back((void *)&_hidl_out_status);
                  _hidl_args.push_back((void *)&_hidl_out_session);
                  for (const auto &callback: mInstrumentationCallbacks) {
                      callback(InstrumentationEvent::SERVER_API_EXIT, "android.hardware.camera.device", "3.2", "ICameraDevice", "open", &_hidl_args);
                  }
              }
              #endif // __ANDROID_DEBUGGABLE__
      
              if (_hidl_err != ::android::OK) { return; }
              _hidl_cb(*_hidl_reply);
          });
      
          _hidl_ret.assertOk();
          if (!_hidl_callbackCalled) {
              LOG_ALWAYS_FATAL("open: _hidl_cb not called, but must be called once.");
          }
      
          return _hidl_err;
      }
      

Because TrampolineDeviceInterface_3_2 in hardware/interfaces/camera/device/3.2/default/CameraDevice_3_2.h implements the ICameraDevice interface, it also implements the open function.

      /*
       * The camera device HAL implementation is opened lazily (via the open call)
       */
      struct CameraDevice : public virtual RefBase {
          // Called by provider HAL. Provider HAL must ensure the uniqueness of
          // CameraDevice object per cameraId, or there could be multiple CameraDevice
          // trying to access the same physical camera.
          // Also, provider will have to keep track of all CameraDevice objects in
          // order to notify CameraDevice when the underlying camera is detached
          CameraDevice(sp<CameraModule> module,
                       const std::string& cameraId,
                       const SortedVector<std::pair<std::string, std::string>>& cameraDeviceNames);
          virtual ~CameraDevice();
      
          // Retrieve the HIDL interface, split into its own class to avoid inheritance issues when
          // dealing with minor version revs and simultaneous implementation and interface inheritance
          virtual sp<ICameraDevice> getInterface() {
              return new TrampolineDeviceInterface_3_2(this);
          }
      
          // Caller must use this method to check if CameraDevice ctor failed
          bool isInitFailed() { return mInitFail; }
          // Used by provider HAL to signal external camera disconnected
          void setConnectionStatus(bool connected);
      
          /* Methods from ::android::hardware::camera::device::V3_2::ICameraDevice follow. */
          // The following method can be called without opening the actual camera device
          Return<void> getResourceCost(ICameraDevice::getResourceCost_cb _hidl_cb);
          Return<void> getCameraCharacteristics(ICameraDevice::getCameraCharacteristics_cb _hidl_cb);
          Return<Status> setTorchMode(TorchMode mode);
      
          // Open the device HAL and also return a default capture session
          Return<void> open(const sp<ICameraDeviceCallback>& callback, ICameraDevice::open_cb _hidl_cb);
      
      
          // Forward the dump call to the opened session, or do nothing
          Return<void> dumpState(const ::android::hardware::hidl_handle& fd);
          /* End of Methods from ::android::hardware::camera::device::V3_2::ICameraDevice */
      
      protected:
      
          // Overridden by child implementations for returning different versions of CameraDeviceSession
          virtual sp<CameraDeviceSession> createSession(camera3_device_t*,
                  const camera_metadata_t* deviceInfo,
                  const sp<ICameraDeviceCallback>&);
      
          const sp<CameraModule> mModule;
          const std::string mCameraId;
          // const after ctor
          int   mCameraIdInt;
          int   mDeviceVersion;
          bool  mInitFail = false;
          // Set by provider (when external camera is connected/disconnected)
          bool  mDisconnected;
          wp<CameraDeviceSession> mSession = nullptr;
      
          const SortedVector<std::pair<std::string, std::string>>& mCameraDeviceNames;
      
          // gating access to mSession and mDisconnected
          mutable Mutex mLock;
      
          // convert conventional HAL status to HIDL Status
          static Status getHidlStatus(int);
      
          Status initStatus() const;
      
      private:
          struct TrampolineDeviceInterface_3_2 : public ICameraDevice {
              TrampolineDeviceInterface_3_2(sp<CameraDevice> parent) :
                  mParent(parent) {}
      
              virtual Return<void> getResourceCost(V3_2::ICameraDevice::getResourceCost_cb _hidl_cb)
                      override {
                  return mParent->getResourceCost(_hidl_cb);
              }
      
              virtual Return<void> getCameraCharacteristics(
                      V3_2::ICameraDevice::getCameraCharacteristics_cb _hidl_cb) override {
                  return mParent->getCameraCharacteristics(_hidl_cb);
              }
      
              virtual Return<Status> setTorchMode(TorchMode mode) override {
                  return mParent->setTorchMode(mode);
              }
      
              virtual Return<void> open(const sp<V3_2::ICameraDeviceCallback>& callback,
                      V3_2::ICameraDevice::open_cb _hidl_cb) override {
                  return mParent->open(callback, _hidl_cb);
              }
      
              virtual Return<void> dumpState(const hidl_handle& fd) override {
                  return mParent->dumpState(fd);
              }
      
          private:
              sp<CameraDevice> mParent;
          };
      
      };
      

mParent is in fact the CameraDevice itself.
      hardware/interfaces/camera/device/3.2/default/CameraDevice.cpp

      Return<void> CameraDevice::open(const sp<ICameraDeviceCallback>& callback,
              ICameraDevice::open_cb _hidl_cb)  {
          Status status = initStatus();
          sp<CameraDeviceSession> session = nullptr;
      
          if (callback == nullptr) {
              ALOGE("%s: cannot open camera %s. callback is null!",
                      __FUNCTION__, mCameraId.c_str());
              _hidl_cb(Status::ILLEGAL_ARGUMENT, nullptr);
              return Void();
          }
      
          if (status != Status::OK) {
              // Provider will never pass initFailed device to client, so
              // this must be a disconnected camera
              ALOGE("%s: cannot open camera %s. camera is disconnected!",
                      __FUNCTION__, mCameraId.c_str());
              _hidl_cb(Status::CAMERA_DISCONNECTED, nullptr);
              return Void();
          } else {
              mLock.lock();
      
              ALOGV("%s: Initializing device for camera %d", __FUNCTION__, mCameraIdInt);
              session = mSession.promote();
              if (session != nullptr && !session->isClosed()) {
                  ALOGE("%s: cannot open an already opened camera!", __FUNCTION__);
                  mLock.unlock();
                  _hidl_cb(Status::CAMERA_IN_USE, nullptr);
                  return Void();
              }
      
              /** Open HAL device */
              status_t res;
              camera3_device_t *device;
      
              ATRACE_BEGIN("camera3->open");
              res = mModule->open(mCameraId.c_str(),
                      reinterpret_cast<hw_device_t**>(&device)); // 1
              ATRACE_END();
      
              if (res != OK) {
                  ALOGE("%s: cannot open camera %s!", __FUNCTION__, mCameraId.c_str());
                  mLock.unlock();
                  _hidl_cb(getHidlStatus(res), nullptr);
                  return Void();
              }
      
              /** Cross-check device version */
              if (device->common.version < CAMERA_DEVICE_API_VERSION_3_2) {
                  ALOGE("%s: Could not open camera: "
                          "Camera device should be at least %x, reports %x instead",
                          __FUNCTION__,
                          CAMERA_DEVICE_API_VERSION_3_2,
                          device->common.version);
                  device->common.close(&device->common);
                  mLock.unlock();
                  _hidl_cb(Status::ILLEGAL_ARGUMENT, nullptr);
                  return Void();
              }
      
              struct camera_info info;
              res = mModule->getCameraInfo(mCameraIdInt, &info);
              if (res != OK) {
                  ALOGE("%s: Could not open camera: getCameraInfo failed", __FUNCTION__);
                  device->common.close(&device->common);
                  mLock.unlock();
                  _hidl_cb(Status::ILLEGAL_ARGUMENT, nullptr);
                  return Void();
              }
      
              session = createSession(
                      device, info.static_camera_characteristics, callback);
              if (session == nullptr) {
                  ALOGE("%s: camera device session allocation failed", __FUNCTION__);
                  mLock.unlock();
                  _hidl_cb(Status::INTERNAL_ERROR, nullptr);
                  return Void();
              }
              if (session->isInitFailed()) {
                  ALOGE("%s: camera device session init failed", __FUNCTION__);
                  session = nullptr;
                  mLock.unlock();
                  _hidl_cb(Status::INTERNAL_ERROR, nullptr);
                  return Void();
              }
              mSession = session;
      
              IF_ALOGV() {
                  session->getInterface()->interfaceChain([](
                      ::android::hardware::hidl_vec<::android::hardware::hidl_string> interfaceChain) {
                          ALOGV("Session interface chain:");
                          for (const auto& iface : interfaceChain) {
                              ALOGV("  %s", iface.c_str());
                          }
                      });
              }
              mLock.unlock();
          }
          _hidl_cb(status, session->getInterface());
          return Void();
      }
      

At the point marked // 1, the call truly leaves the HIDL layer and enters the HAL.

CameraManager source code analysis

Sample code

      package edu.tyut.ffmpeglearn.manager
      
      import android.Manifest
      import android.content.Context
      import android.graphics.Camera
      import android.graphics.ImageFormat
      import android.hardware.camera2.CameraCaptureSession
      import android.hardware.camera2.CameraCharacteristics
      import android.hardware.camera2.CameraDevice
      import android.hardware.camera2.CameraManager
      import android.hardware.camera2.CaptureFailure
      import android.hardware.camera2.CaptureRequest
      import android.hardware.camera2.params.OutputConfiguration
      import android.hardware.camera2.params.SessionConfiguration
      import android.hardware.camera2.params.StreamConfigurationMap
      import android.media.Image
      import android.media.ImageReader
      import android.net.Uri
      import android.os.Build
      import android.os.Environment
      import android.os.Handler
      import android.os.HandlerThread
      import android.util.Log
      import android.util.Range
      import android.util.Size
      import android.view.Surface
      import androidx.annotation.RequiresPermission
      import androidx.core.content.FileProvider
      import java.io.File
      import java.util.concurrent.Executor
      
      private const val TAG: String = "CaptureManager"
      
      /**
 * ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 -framerate 30 yuv420p.yuv
       */
      internal class CaptureManager internal constructor(
          private val context: Context,
      ) {
          private var lastTimestamp = 0L
          private var frameCount = 0
          private val cameraManager: CameraManager by lazy {
              context.getSystemService<CameraManager>(CameraManager::class.java)
          }
      
          private val cameraThread = HandlerThread("CameraThread").apply { start() }
          private val cameraHandler = Handler(cameraThread.looper)
          private val executor: Executor = Executor { runnable ->
              cameraHandler.post(runnable)
          }
      
          private var mCaptureSession: CameraCaptureSession? = null
          private var mCameraDevice: CameraDevice? = null
          private var mImageReader: ImageReader? = null
      
          private val yuv420pUri: Uri = FileProvider.getUriForFile(
              context, "${context.packageName}.provider", File(
                  Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS),
                  "yuv420p.yuv"
              )
          )
      
          private val outputStream = context.contentResolver.openOutputStream(yuv420pUri)
      
      
          @RequiresPermission(Manifest.permission.CAMERA)
          // @RequiresApi(Build.VERSION_CODES.VANILLA_ICE_CREAM)
          internal fun open() {
              // val cameraId: String = cameraManager.cameraIdList.firstOrNull() ?: "0"
              val cameraId: String = "0"
              val cameraCharacteristics: CameraCharacteristics =
                  cameraManager.getCameraCharacteristics(cameraId)
              val streamConfigurationMap: StreamConfigurationMap? =
                  cameraCharacteristics[CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP]
        // The LEGACY level means the camera2 API is emulated on top of the old camera HAL
              val isSupportLegacy: Boolean =
                  cameraCharacteristics[CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL] == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY
              val fpsRanges: Array<Range<Int>>? =
                  cameraCharacteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES)
              fpsRanges?.forEach {
                  Log.i(TAG, "Supported FPS range: ${it.lower} - ${it.upper}")
              }
              val hardwareLevel = cameraCharacteristics.get(
                  CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL
              )
              val outputSizes: Array<Size>? =
                  streamConfigurationMap?.getOutputSizes(ImageFormat.YUV_420_888)
              Log.i(
                  TAG,
                  "open -> isSupportLegacy: $isSupportLegacy, hardwareLevel: $hardwareLevel, outputSizes: ${outputSizes?.joinToString()}"
              )
              streamConfigurationMap?.getHighSpeedVideoSizes()?.forEach {
                  val ranges = streamConfigurationMap.getHighSpeedVideoFpsRangesFor(it)
                  Log.i(TAG, "open -> size: $it, fps: ${ranges.joinToString()}")
              }
              val pixelsSize: Size =
                  outputSizes?.firstOrNull { it.width == 1280 && it.height == 720 } ?: Size(
                      1280,
                      720
                  ) // 1440 1080
              Log.i(TAG, "open -> pixelsSize: $pixelsSize")
              val imageReader =
                  ImageReader.newInstance(pixelsSize.width, pixelsSize.height, ImageFormat.YUV_420_888, 3)
              this.mImageReader = imageReader
      
              imageReader.setOnImageAvailableListener({ imageReader: ImageReader? ->
                  imageReader?.acquireLatestImage()?.use { image ->
                      Log.i(
                          TAG,
                          "open -> Available image width: ${image.width}, height: ${image.height}"
                      )
                      val byteArray: ByteArray = this@CaptureManager.yuv420ToYuv420p(image)
                      outputStream?.write(byteArray)
                      outputStream?.flush()
                  }
              }, cameraHandler)
      
              // imageReader.setOnImageAvailableListener({ reader ->
              //     val currentTimestamp = System.currentTimeMillis()
              //     frameCount++
              //     if (lastTimestamp == 0L) {
              //         lastTimestamp = currentTimestamp
              //     } else {
              //         val diff = currentTimestamp - lastTimestamp
              //         if (diff >= 1000) {
              //             val actualFps = frameCount * 1000 / diff
              //             Log.i(TAG, "Actual capture FPS: $actualFps")
              //             frameCount = 0
              //             lastTimestamp = currentTimestamp
              //         }
              //     }
              //     val image = reader.acquireLatestImage()
              //     image?.close()
              // }, cameraHandler)
              
              cameraManager.openCamera(cameraId, object : CameraDevice.StateCallback() {
                  override fun onDisconnected(camera: CameraDevice) {
                      Log.i(TAG, "onDisconnected...")
                  }
      
                  override fun onError(camera: CameraDevice, error: Int) {
                      Log.i(TAG, "onError -> error: $error")
                  }
      
                  override fun onOpened(camera: CameraDevice) {
                      this@CaptureManager.mCameraDevice = camera
      
                      val captureRequestBuilder: CaptureRequest.Builder =
                          camera.createCaptureRequest(if (isSupportLegacy) CameraDevice.TEMPLATE_RECORD else CameraDevice.TEMPLATE_PREVIEW)
                      captureRequestBuilder.addTarget(imageReader.surface)
                      fpsRanges?.firstOrNull { it.lower == 30 && it.upper == 30 }
                          ?.let { fpsRange: Range<Int> ->
                              Log.i(TAG, "onOpened -> fpsRange: $fpsRange")
                              captureRequestBuilder.set<Range<Int>>(
                                  CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE,
                                fpsRange // lock AE to a fixed 30fps range
                              )
                          }
                      captureRequestBuilder.set(
                          CaptureRequest.CONTROL_AE_MODE,
                          CaptureRequest.CONTROL_AE_MODE_ON
                      )
                      createSession(
                          camera = camera,
                          imageReader = imageReader,
                          captureRequestBuilder = captureRequestBuilder
                      )
                  }
      
                  override fun onClosed(camera: CameraDevice) {
                      super.onClosed(camera)
                      Log.i(TAG, "onClosed...")
                      // cameraHandler.post {
                      //     cameraHandler.removeCallbacksAndMessages(null)
                      //     val quitSafely: Boolean = cameraThread.quitSafely()
                      //     Log.i(TAG, "release -> quitSafely: $quitSafely")
                      // }
                  }
              }, cameraHandler)
          }
      
          // @RequiresApi(Build.VERSION_CODES.VANILLA_ICE_CREAM)
          private fun createSession(
              camera: CameraDevice,
              imageReader: ImageReader,
              captureRequestBuilder: CaptureRequest.Builder
          ) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
            // OutputConfiguration requires API 24+, so only build it on this path
            val outputConfiguration = OutputConfiguration(imageReader.surface)
                  val sessionConfiguration = SessionConfiguration(
                      SessionConfiguration.SESSION_REGULAR,
                      listOf<OutputConfiguration>(outputConfiguration),
                      executor,
                      object : CameraCaptureSession.StateCallback() {
                          override fun onConfigureFailed(session: CameraCaptureSession) {
                              Log.i(TAG, "onConfigureFailed...")
                          }
      
                          override fun onConfigured(session: CameraCaptureSession) {
                              this@CaptureManager.mCaptureSession = session
                              session.setRepeatingRequest(
                                  captureRequestBuilder.build(),
                                  object : CameraCaptureSession.CaptureCallback() {
                                      override fun onCaptureFailed(
                                          session: CameraCaptureSession,
                                          request: CaptureRequest,
                                          failure: CaptureFailure
                                      ) {
                                          Log.i(TAG, "onCaptureFailed -> failure...")
                                      }
                                  },
                                  cameraHandler
                              )
                          }
                      })
                  camera.createCaptureSession(sessionConfiguration)
              } else {
                  @Suppress("DEPRECATION")
                  camera.createCaptureSession(
                      listOf<Surface>(imageReader.surface),
                      object : CameraCaptureSession.StateCallback() {
                          override fun onConfigureFailed(session: CameraCaptureSession) {
                              Log.i(TAG, "onConfigureFailed...")
                          }
      
                          override fun onConfigured(session: CameraCaptureSession) {
                              this@CaptureManager.mCaptureSession = session
                              session.setRepeatingRequest(
                                  captureRequestBuilder.build(),
                                  object : CameraCaptureSession.CaptureCallback() {
                                      override fun onCaptureFailed(
                                          session: CameraCaptureSession,
                                          request: CaptureRequest,
                                          failure: CaptureFailure
                                      ) {
                                          Log.i(TAG, "onCaptureFailed -> failure...")
                                      }
                                  },
                                  cameraHandler
                              )
                          }
                      },
                      cameraHandler
                  )
              }
          }
      
      
          fun yuv420ToNv21(image: Image): ByteArray {
              val width = image.width
              val height = image.height
              val ySize = width * height
              val uvSize = width * height / 2
              val out = ByteArray(ySize + uvSize)
      
              val yPlane = image.planes[0]
              val uPlane = image.planes[1]
              val vPlane = image.planes[2]
      
        // Copy the Y plane row by row (rowStride may exceed width)
              var pos = 0
              for (row in 0 until height) {
                  yPlane.buffer.position(row * yPlane.rowStride)
                  yPlane.buffer.get(out, pos, width)
                  pos += width
              }
      
        // Copy chroma as interleaved VU (NV21 order: V first, then U)
              for (row in 0 until height / 2) {
                  for (col in 0 until width / 2) {
                      val uIndex = row * uPlane.rowStride + col * uPlane.pixelStride
                      val vIndex = row * vPlane.rowStride + col * vPlane.pixelStride
                      out[pos++] = vPlane.buffer.get(vIndex) // V
                      out[pos++] = uPlane.buffer.get(uIndex) // U
                  }
              }
      
              return out
          }
      
          internal fun release() {
        // Stop the repeating request, detach the listener, then close everything
        try {
            mCaptureSession?.stopRepeating()
            mCaptureSession?.abortCaptures()

            // Clear the listener first so no late callback fires against a closed reader
            mImageReader?.setOnImageAvailableListener(null, null)

            mCaptureSession?.close()
            mImageReader?.close()
            mCameraDevice?.close()

            outputStream?.close()
      
                  mCaptureSession = null
                  mImageReader = null
                  mCameraDevice = null
      
                  cameraHandler.removeCallbacksAndMessages(null)
                  val quitSafely: Boolean = cameraThread.quitSafely()
                  Log.i(TAG, "release -> quitSafely: $quitSafely")
                  cameraThread.join()
      
              } catch (e: Exception) {
                  Log.e(TAG, "release -> error: ${e.message}", e)
              }
          }
      
          fun yuv420ToYuv420p(image: Image): ByteArray {
      
              val width = image.width
              val height = image.height
      
              val ySize = width * height
      
        val uvSize = width * height / 4  // each U/V plane is (width/2) * (height/2) bytes
      
              val out = ByteArray(ySize + uvSize * 2)
      
              val yPlane = image.planes[0]
              val uPlane = image.planes[1]
              val vPlane = image.planes[2]
      
              var pos = 0
      
        // Copy the Y plane row by row (rowStride may exceed width)
              val yBuffer = yPlane.buffer
              for (row in 0 until height) {
                  yBuffer.position(row * yPlane.rowStride)
                  yBuffer.get(out, pos, width)
                  pos += width
              }
      
        // Copy the U plane, honoring rowStride and pixelStride
              val uBuffer = uPlane.buffer
              for (row in 0 until height / 2) {
                  for (col in 0 until width / 2) {
                      val index = row * uPlane.rowStride + col * uPlane.pixelStride
                      out[pos++] = uBuffer.get(index)
                  }
              }
      
        // Copy the V plane the same way
              val vBuffer = vPlane.buffer
              for (row in 0 until height / 2) {
                  for (col in 0 until width / 2) {
                      val index = row * vPlane.rowStride + col * vPlane.pixelStride
                      out[pos++] = vBuffer.get(index)
                  }
              }
      
              return out
          }
      }
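A quick way to sanity-check the buffers these converters produce is to compute the expected I420 sizes and plane offsets up front. This is a minimal sketch in plain Kotlin; `I420Layout` and `i420Layout` are names of my own, not part of `CaptureManager`, and it assumes even dimensions (the common case for 4:2:0):

```kotlin
// For an I420 (yuv420p) frame: Y is width*height bytes, U and V are a quarter each.
data class I420Layout(
    val ySize: Int, val uvSize: Int, val frameSize: Int,
    val uOffset: Int, val vOffset: Int
)

fun i420Layout(width: Int, height: Int): I420Layout {
    require(width % 2 == 0 && height % 2 == 0) { "4:2:0 needs even dimensions" }
    val ySize = width * height
    val uvSize = ySize / 4
    return I420Layout(ySize, uvSize, ySize + 2 * uvSize, ySize, ySize + uvSize)
}
```

For 640x480 this gives ySize 307200, uvSize 76800, and frameSize 460800, so a dump of N frames should be exactly N * 460800 bytes. A raw dump can then be viewed with `ffplay -f rawvideo -pixel_format yuv420p -video_size 640x480 out.yuv`.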
      
      

On the native side, the binder IPC call lands in CameraService::connectDevice():
frameworks/av/services/camera/libcameraservice/CameraService.cpp

      Status CameraService::connectDevice(
              const sp<hardware::camera2::ICameraDeviceCallbacks>& cameraCb,
              const std::string& unresolvedCameraId,
              int oomScoreOffset, int targetSdkVersion,
              int rotationOverride, const AttributionSourceState& clientAttribution, int32_t devicePolicy,
              bool sharedMode,
              /*out*/sp<hardware::camera2::ICameraDeviceUser>* device) {
          return connectDeviceImpl(cameraCb, unresolvedCameraId, oomScoreOffset, targetSdkVersion,
                  rotationOverride, clientAttribution, devicePolicy, sharedMode,
                  /*isVendorClient*/false, device);
      }
      
posted @ 2025-10-25 19:05  愛情丶眨眼而去