camera2: How Data Travels from the Framework to the App

Author: 梧叶已秋声 | Published 2022-09-24 17:57

    The data flows as: camera sensor -> driver layer -> HAL -> framework -> APP.

    1. Understanding the HAL

    A brief introduction to the HAL

    https://blog.csdn.net/u011037593/article/details/115498167
    Android's HAL (Hardware Abstraction Layer) is a set of programs built on top of the Linux kernel. It is not part of the kernel; it runs in user space. The HAL hides hardware differences and exposes a uniform hardware-operation interface to the layers above, decoupling applications from the underlying hardware: applications need not care about the specific hardware, and when the hardware changes, the vendor only needs to supply a new HAL implementation that follows the HAL interface specification, leaving application software untouched.
    A traditional Linux driver contains both the register-access code and the business logic. Android's HAL splits the driver in two: the register-access code stays in the Linux kernel, while the business logic is pulled up into the HAL.
    As the Android system architecture diagram below shows, the HAL sits below the Android system services and above the Linux kernel. It contains many per-device HAL subsystems, such as the Camera HAL, Audio HAL, Graphics HAL, and so on.

    (figure: Android system architecture diagram)

    The evolution of the HAL

    Source: https://www.jianshu.com/p/8e29c3d9b27a

    The history of the HAL layer falls roughly into three periods:
    1. Legacy HAL: before Android 8.0, HALs were built as .so libraries and dynamically linked directly into the various frameworks services.
    2. Passthrough HAL: a compatibility mode for legacy HALs. The old HAL is still shipped as a dynamic library, but a binder service links against it (via hw_get_module); clients talk to the binder service over IPC and thus reach the legacy HAL indirectly.
    3. Binderized HAL: the HAL and its clients run in different processes. The HAL is written as a binder service, and clients such as the frameworks act as binder clients, calling the interface across processes via IPC. This is Google's end-state design.
    Historically, then, the HAL layer has become more and more isolated, converging on an AIDL-style design. Google therefore settled on the AIDL approach: define a HAL file, and hardware vendors and the system only need to care about the interfaces exposed in it.

    The early architecture looked like this:


    https://blog.csdn.net/JansonZhe/article/details/47272501

    The current architecture is closer to AIDL.

    2. Camera Hal

    Starting with Android 13, the camera HAL interface is developed using AIDL.

    https://source.android.google.cn/docs/core/camera/camera3
    Android's camera hardware abstraction layer (HAL) connects the higher-level camera framework APIs in android.hardware.camera2 to the underlying camera driver and hardware. Starting with Android 13, camera HAL interface development uses AIDL. Android 8.0 introduced Treble, which moved the camera HAL API onto a stable interface defined by the HAL Interface Description Language (HIDL). If you have previously developed camera HAL modules and drivers for Android 7.0 and lower, be aware of significant changes in the camera pipeline.

    The official model from the app down to the camera hardware looks like this:


    https://source.android.google.cn/docs/core/camera/camera3_requests_hal

    https://blog.csdn.net/TaylorPotter/article/details/105387109
    The Camera HAL3 subsystem
    1. The app issues requests to the camera subsystem; one request corresponds to one set of results. A request contains all configuration information, including resolution and pixel format; manual sensor, lens, and flash controls; 3A operating modes; RAW-to-YUV processing controls; statistics generation; and so on. Multiple requests can be in flight at once, and submitting a request never blocks. Requests are always processed in the order received.
    2. As the diagram shows, a request carries a Surface as its data container. It is handed to cameraserver in the framework, wrapped into a Camera3OutputStream instance, packaged into a HAL request within a CameraCaptureSession, and passed down to the HAL. Once the HAL has produced data, it returns it to CameraServer: a CaptureResult notifies the framework, and cameraserver puts the HAL's data into the Surface container of the stream. Those Surfaces come from app-layer components that wrap a Surface, which is how the app receives the camera subsystem's data.
    3. HAL3 passes events and data via CaptureRequest and CaptureResult; each request maps to exactly one result.
    4. Of course, this is the stock Android HAL3 definition; the interfaces are fixed, but each chip vendor implements them differently.
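    To make the one-request/one-result, in-order contract concrete, here is a tiny standalone C++ sketch (all names are hypothetical, not AOSP code): submission only queues and never blocks, and each request is answered by exactly one result keyed by its frame number.

```cpp
#include <cassert>
#include <cstdint>
#include <queue>
#include <string>
#include <vector>

// Hypothetical, heavily simplified model of the HAL3 contract: every
// CaptureRequest is answered by exactly one CaptureResult, and requests
// are processed strictly in submission order.
struct CaptureRequest {
    uint32_t frameNumber;   // pairs a request with its result
    std::string settings;   // stands in for metadata (resolution, 3A mode, ...)
};

struct CaptureResult {
    uint32_t frameNumber;
    std::string imageData;  // stands in for the filled output buffers
};

class CameraPipeline {
public:
    // Submitting never blocks: the request is only queued.
    void submit(CaptureRequest req) { mPending.push(std::move(req)); }

    // The "HAL" drains the queue in FIFO order, one result per request.
    std::vector<CaptureResult> drain() {
        std::vector<CaptureResult> results;
        while (!mPending.empty()) {
            const CaptureRequest& req = mPending.front();
            results.push_back({req.frameNumber, "frame-" + req.settings});
            mPending.pop();
        }
        return results;
    }

private:
    std::queue<CaptureRequest> mPending;
};
```

    A real pipeline is asynchronous with multiple requests in flight; this only illustrates the one-request/one-result, in-order rule.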

    The HAL3 interface definitions live at:
    http://androidxref.com/9.0.0_r3/xref/hardware/interfaces/camera/

    An overview of the camera HAL:

    https://source.android.google.cn/docs/core/camera/camera3_requests_hal

    3. Data flow from the HAL to the APP

    After the camera is opened, how does the data reach the app layer?
    Next, let's follow the code to see how image data captured by the camera (MTK captures through V4L2; Qualcomm has its own framework; the driver layer is not covered here) travels through the HAL to the app.
    After camera2 is opened, the sequence diagram for createCaptureSession looks like this:

    https://www.jianshu.com/p/26d4b781a14d

    This creates a Camera3Device (Camera3Device.cpp).

    Camera3Device::processCaptureResult is a callback: it is defined in Camera3Device.cpp (native code, not Java) and invoked from the HAL side.

    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/hardware/interfaces/camera/device/3.2/ICameraDeviceCallback.hal

    package android.hardware.camera.device@3.2;
      
    import android.hardware.camera.common@1.0::types;
      
    interface ICameraDeviceCallback {
        processCaptureResult(vec<CaptureResult> results);
        notify(vec<NotifyMsg> msgs);
    };
    

    Camera3Device inherits from ICameraDeviceCallback.
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.h

    /**
     * CameraDevice for HAL devices with version CAMERA_DEVICE_API_VERSION_3_0 or higher.
     */
    class Camera3Device :
                public CameraDeviceBase,
                virtual public hardware::camera::device::V3_4::ICameraDeviceCallback,
                private camera3_callback_ops {
        ...
    };
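
    The pattern at work here can be sketched in a few lines of standalone C++ (FakeHal, FakeCamera3Device, and the other names are made up for illustration, not the real classes): the service-side device class inherits the callback interface and hands this to the HAL when opening, so the HAL can later push results back up through it.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Illustrative stand-in for a capture result.
struct FakeCaptureResult { uint32_t frameNumber; };

// The callback interface the "HAL" knows about.
class IDeviceCallback {
public:
    virtual ~IDeviceCallback() = default;
    virtual void processCaptureResult(const std::vector<FakeCaptureResult>& results) = 0;
};

// Stands in for the HAL side: it only holds the callback pointer.
class FakeHal {
public:
    void open(IDeviceCallback* cb) { mCallback = cb; }
    void produceFrame(uint32_t frameNumber) {
        if (mCallback) mCallback->processCaptureResult({{frameNumber}});
    }
private:
    IDeviceCallback* mCallback = nullptr;
};

// Stands in for Camera3Device: it *is* the callback, and registers
// itself with the HAL at initialization time.
class FakeCamera3Device : public IDeviceCallback {
public:
    void initialize(FakeHal& hal) { hal.open(this); }   // "this" is the callback
    void processCaptureResult(const std::vector<FakeCaptureResult>& results) override {
        for (const auto& r : results) mReceived.push_back(r.frameNumber);
    }
    std::vector<uint32_t> mReceived;
};
```

    This is the same shape as Camera3Device::initialize passing this into openSession below.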
    

    processCaptureResult is defined here:
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

        /**
         *
         * Camera HAL device callback methods
         */
    
      void Camera3Device::processCaptureResult(const camera3_capture_result *result){
        ATRACE_CALL();
    
        status_t res;
    
        uint32_t frameNumber = result->frame_number;
        ...
      }
    

    Source: https://blog.csdn.net/u012596975/article/details/107137523
    This file defines the ICameraDeviceCallback interface class, which Camera3Device in the camera service inherits and implements, and which is registered with the Provider via the ICameraDevice::open method. Its main interface is:
    processCaptureResult: whenever image data is produced, this method is called to deliver the data and metadata up to the Camera Service.

    https://blog.csdn.net/sinat_22657459/article/details/92410687
    Camera3Device connects to the HAL in its initialize method; when obtaining the session it passes itself to the HAL as the callback class, so the HAL later calls back into Camera3Device::processCaptureResult.

    Let's take a quick look at what happens inside Camera3Device::initialize.
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

    status_t Camera3Device::initialize(sp<CameraProviderManager> manager, const String8& monitorTags) {
        ATRACE_CALL();
    ...
    
        sp<ICameraDeviceSession> session;
        ATRACE_BEGIN("CameraHal::openSession");
        status_t res = manager->openSession(mId.string(), this,
                /*out*/ &session);
    ...
    }
    

    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.cpp#279

    status_t CameraProviderManager::openSession(const std::string &id,
            const sp<hardware::camera::device::V3_2::ICameraDeviceCallback>& callback,
            /*out*/
            sp<hardware::camera::device::V3_2::ICameraDeviceSession> *session) {
    ...
        auto deviceInfo = findDeviceInfoLocked(id,
                /*minVersion*/ {3,0}, /*maxVersion*/ {4,0});
    ...
       auto *deviceInfo3 = static_cast<ProviderInfo::DeviceInfo3*>(deviceInfo);
    
        hardware::Return<void> ret;
        ret = deviceInfo3->mInterface->open(callback, [&status, &session]
                (Status s, const sp<device::V3_2::ICameraDeviceSession>& cameraSession) {
                    status = s;
                    if (status == Status::OK) {
                        *session = cameraSession;
                    }
                });
    ...
    }
    

    openSession is passed this, i.e. the Camera3Device itself; since Camera3Device inherits ICameraDeviceCallback, openSession effectively receives the callback interface. It then calls deviceInfo3->mInterface->open.

    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/common/CameraProviderManager.h#346

            // HALv3-specific camera fields, including the actual device interface
            struct DeviceInfo3 : public DeviceInfo {
                typedef hardware::camera::device::V3_2::ICameraDevice InterfaceT;
                const sp<InterfaceT> mInterface;
      ...
            };
    

    HIDL is similar to AIDL: with AIDL you define a .aidl file, generate .java files, and communicate through them; here a .hal file is compiled into generated sources instead.
    The InterfaceT here is ICameraDevice.
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/hardware/interfaces/camera/device/3.2/ICameraDevice.hal

        open(ICameraDeviceCallback callback) generates
                (Status status, ICameraDeviceSession session);
    

    When the system is built, ICameraDevice.hal is compiled into ICameraDevice.h.
    You can locate it with find: find ./out -name ICameraDevice.h
    Open ICameraDevice.h:

    //  ICameraDevice.h
    using open_cb = std::function<void(::android::hardware::camera::common::V1_0::Status status, const ::android::sp<ICameraDeviceSession>& session)>;
     
    virtual ::android::hardware::Return<void> open(const ::android::sp<ICameraDeviceCallback>& callback, open_cb _hidl_cb) = 0;
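
    The generated signature shows the HIDL "generates" calling convention: results are returned through an extra completion callback rather than a return value. A minimal standalone sketch of that convention (openDevice, Status, and Session are illustrative stand-ins, not the generated types):

```cpp
#include <cassert>
#include <functional>
#include <memory>

// Illustrative stand-ins for the generated types.
enum class Status { OK, INTERNAL_ERROR };
struct Session {};

// Mirrors the generated open_cb alias: the callee reports (status, session)
// by invoking this callback synchronously.
using open_cb = std::function<void(Status, const std::shared_ptr<Session>&)>;

// Stand-in for the generated ICameraDevice::open(callback, _hidl_cb).
void openDevice(open_cb _hidl_cb) {
    // A real implementation would open the HAL device first.
    _hidl_cb(Status::OK, std::make_shared<Session>());
}
```

    The caller retrieves the outputs by capturing locals by reference in the lambda, exactly as CameraProviderManager::openSession does above with [&status, &session].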
    

    The open(...) declared in ICameraDevice.h is implemented in CameraDevice.cpp.
    That leads to mModule->open; mModule is passed in at construction time (mModule = module).
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/hardware/interfaces/camera/device/3.2/default/CameraDevice.cpp#210

    CameraDevice::CameraDevice(
        sp<CameraModule> module, const std::string& cameraId,
        const SortedVector<std::pair<std::string, std::string>>& cameraDeviceNames) :
            mModule(module),
            mCameraId(cameraId),
            mDisconnected(false),
            mCameraDeviceNames(cameraDeviceNames) {
        ....
      }
    ...
    Return<void> CameraDevice::open(const sp<ICameraDeviceCallback>& callback, open_cb _hidl_cb){
    ...
            res = mModule->open(mCameraId.c_str(),
                    reinterpret_cast<hw_device_t**>(&device));
    ...
    }
    ....
    
    

    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/hardware/interfaces/camera/device/3.2/default/CameraDevice_3_2.h

        const sp<CameraModule> mModule;
    

    Now let's look at CameraModule.
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/hardware/interfaces/camera/common/1.0/default/CameraModule.cpp

    int CameraModule::open(const char* id, struct hw_device_t** device) {
        int res;
        ATRACE_BEGIN("camera_module->open");
        res = filterOpenErrorCode(mModule->common.methods->open(&mModule->common, id, device));
        ATRACE_END();
        return res;
    }
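
    The forwarding call above is classic hw_module_t-style dispatch: a struct of function pointers that the vendor library fills in. A self-contained sketch of that shape (these structs are illustrative stand-ins, not the real libhardware types):

```cpp
#include <cassert>
#include <cstring>

// Stand-in for hw_module_methods_t: a table of vendor-provided entry points.
struct fake_module_methods_t {
    int (*open)(const char* id, void** device);
};

// Stand-in for hw_module_t: the module carries its method table.
struct fake_hw_module_t {
    const fake_module_methods_t* methods;
};

// A "vendor" implementation: in this sketch only camera id "0" exists.
static int vendor_open(const char* id, void** device) {
    static int fake_device = 0xCA;
    if (std::strcmp(id, "0") != 0) return -1;
    *device = &fake_device;
    return 0;
}

static const fake_module_methods_t kVendorMethods = { vendor_open };
static const fake_hw_module_t kVendorModule = { &kVendorMethods };

// Mirrors the forwarding in CameraModule::open: the framework side never
// knows which vendor function it is calling.
int module_open(const fake_hw_module_t* module, const char* id, void** device) {
    return module->methods->open(id, device);
}
```

    In real libhardware the types are hw_module_t and hw_module_methods_t; the sketch only mirrors the shape of the indirect call.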
    

    Where this open function pointer leads depends on how each vendor wrote its HAL; it differs from vendor to vendor. Since that code generally cannot be published, few people post it; at most they share the general approach.
    For a quick reference, this article surveys several vendors' HALs:
    Android 8.0 system source analysis: the openCamera (HAL) startup path

    For example, device/google/marlin/camera/QCamera2/stack/mm-camera-interface/src/mm_camera_interface.c:

    int32_t mm_camera_open(mm_camera_obj_t *my_obj)
    {
        char dev_name[MM_CAMERA_DEV_NAME_LEN];
        int32_t rc = 0;
        int8_t n_try=MM_CAMERA_DEV_OPEN_TRIES;
        uint8_t sleep_msec=MM_CAMERA_DEV_OPEN_RETRY_SLEEP;
        int cam_idx = 0;
        const char *dev_name_value = NULL;
        int l_errno = 0;
     
        LOGD("begin\n");
     
        if (NULL == my_obj) {
            goto on_error;
        }
        dev_name_value = mm_camera_util_get_dev_name(my_obj->my_hdl);
        if (NULL == dev_name_value) {
            goto on_error;
        }
        snprintf(dev_name, sizeof(dev_name), "/dev/%s",
                 dev_name_value);
        sscanf(dev_name, "/dev/video%d", &cam_idx);
        LOGD("dev name = %s, cam_idx = %d", dev_name, cam_idx);
     
        do{
            n_try--;
            errno = 0;
            my_obj->ctrl_fd = open(dev_name, O_RDWR | O_NONBLOCK);
            l_errno = errno;
            LOGD("ctrl_fd = %d, errno == %d", my_obj->ctrl_fd, l_errno);
            if((my_obj->ctrl_fd >= 0) || (errno != EIO && errno != ETIMEDOUT) || (n_try <= 0 )) {
                break;
            }
            LOGE("Failed with %s error, retrying after %d milli-seconds",
                  strerror(errno), sleep_msec);
            usleep(sleep_msec * 1000U);
        }while (n_try > 0);
     
        if (my_obj->ctrl_fd < 0) {
            LOGE("cannot open control fd of '%s' (%s)\n",
                      dev_name, strerror(l_errno));
            if (l_errno == EBUSY)
                rc = -EUSERS;
            else
                rc = -1;
            goto on_error;
        } else {
            mm_camera_get_session_id(my_obj, &my_obj->sessionid);
            LOGH("Camera Opened id = %d sessionid = %d", cam_idx, my_obj->sessionid);
        }
     
    #ifdef DAEMON_PRESENT
        /* open domain socket*/
        n_try = MM_CAMERA_DEV_OPEN_TRIES;
        do {
            n_try--;
            my_obj->ds_fd = mm_camera_socket_create(cam_idx, MM_CAMERA_SOCK_TYPE_UDP);
            l_errno = errno;
            LOGD("ds_fd = %d, errno = %d", my_obj->ds_fd, l_errno);
            if((my_obj->ds_fd >= 0) || (n_try <= 0 )) {
                LOGD("opened, break out while loop");
                break;
            }
            LOGD("failed with I/O error retrying after %d milli-seconds",
                  sleep_msec);
            usleep(sleep_msec * 1000U);
        } while (n_try > 0);
     
        if (my_obj->ds_fd < 0) {
            LOGE("cannot open domain socket fd of '%s'(%s)\n",
                      dev_name, strerror(l_errno));
            rc = -1;
            goto on_error;
        }
    #else /* DAEMON_PRESENT */
        cam_status_t cam_status;
        cam_status = mm_camera_module_open_session(my_obj->sessionid,
                mm_camera_module_event_handler);
        if (cam_status < 0) {
            LOGE("Failed to open session");
            if (cam_status == CAM_STATUS_BUSY) {
                rc = -EUSERS;
            } else {
                rc = -1;
            }
            goto on_error;
        }
    #endif /* DAEMON_PRESENT */
     
        pthread_mutex_init(&my_obj->msg_lock, NULL);
        pthread_mutex_init(&my_obj->cb_lock, NULL);
        pthread_mutex_init(&my_obj->evt_lock, NULL);
        PTHREAD_COND_INIT(&my_obj->evt_cond);
     
        LOGD("Launch evt Thread in Cam Open");
        snprintf(my_obj->evt_thread.threadName, THREAD_NAME_SIZE, "CAM_Dispatch");
        mm_camera_cmd_thread_launch(&my_obj->evt_thread,
                                    mm_camera_dispatch_app_event,
                                    (void *)my_obj);
     
        /* launch event poll thread
         * we will add evt fd into event poll thread upon user first register for evt */
        LOGD("Launch evt Poll Thread in Cam Open");
        snprintf(my_obj->evt_poll_thread.threadName, THREAD_NAME_SIZE, "CAM_evntPoll");
        mm_camera_poll_thread_launch(&my_obj->evt_poll_thread,
                                     MM_CAMERA_POLL_TYPE_EVT);
        mm_camera_evt_sub(my_obj, TRUE);
     
        /* unlock cam_lock, we need release global intf_lock in camera_open(),
         * in order not block operation of other Camera in dual camera use case.*/
        pthread_mutex_unlock(&my_obj->cam_lock);
        LOGD("end (rc = %d)\n", rc);
        return rc;
     
    on_error:
     
        if (NULL == dev_name_value) {
            LOGE("Invalid device name\n");
            rc = -1;
        }
     
        if (NULL == my_obj) {
            LOGE("Invalid camera object\n");
            rc = -1;
        } else {
            if (my_obj->ctrl_fd >= 0) {
                close(my_obj->ctrl_fd);
                my_obj->ctrl_fd = -1;
            }
    #ifdef DAEMON_PRESENT
            if (my_obj->ds_fd >= 0) {
                mm_camera_socket_close(my_obj->ds_fd);
                my_obj->ds_fd = -1;
            }
    #endif
        }
     
        /* unlock cam_lock, we need release global intf_lock in camera_open(),
         * in order not block operation of other Camera in dual camera use case.*/
        pthread_mutex_unlock(&my_obj->cam_lock);
        return rc;
    }
    
    

    Here the HAL opens the device node (/dev/videoN), i.e. it calls into the driver to handle the camera open.

    Now back to Camera3Device::processCaptureResult. The key part is the for loop, which calls processOneCaptureResultLocked on each result for further processing.
    http://aosp.opersys.com/xref/android-9.0.0_r61/xref/frameworks/av/services/camera/libcameraservice/device3/Camera3Device.cpp

    // Only one processCaptureResult should be called at a time, so
    // the locks won't block. The locks are present here simply to enforce this.
    hardware::Return<void> Camera3Device::processCaptureResult(
            const hardware::hidl_vec<
                    hardware::camera::device::V3_2::CaptureResult>& results) {
    ...
        for (const auto& result : results) {
            processOneCaptureResultLocked(result, noPhysMetadata);
        }
    ...
    }
    
    
    void Camera3Device::processOneCaptureResultLocked(
            const hardware::camera::device::V3_2::CaptureResult& result,
            const hardware::hidl_vec<
                    hardware::camera::device::V3_4::PhysicalCameraMetadata> physicalCameraMetadatas) {
    // Wrap the returned result in a camera3_capture_result
       camera3_capture_result r;
    ...
    
        // Read and validate the result metadata.
        hardware::camera::device::V3_2::CameraMetadata resultMetadata;
        res = readOneCameraMetadataLocked(result.fmqResultSize, resultMetadata, result.result);
        if (res != OK) {
            ALOGE("%s: Frame %d: Failed to read capture result metadata",
                    __FUNCTION__, result.frameNumber);
            return;
        }
        r.result = reinterpret_cast<const camera_metadata_t*>(resultMetadata.data());
    
    ...
    
        std::vector<camera3_stream_buffer_t> outputBuffers(result.outputBuffers.size());
        std::vector<buffer_handle_t> outputBufferHandles(result.outputBuffers.size());
    // Loop over the StreamBuffers in result.outputBuffers and map each to an output stream
        for (size_t i = 0; i < result.outputBuffers.size(); i++) {
            auto& bDst = outputBuffers[i];
            const StreamBuffer &bSrc = result.outputBuffers[i];
    
            ssize_t idx = mOutputStreams.indexOfKey(bSrc.streamId);
            if (idx == NAME_NOT_FOUND) {
                ALOGE("%s: Frame %d: Buffer %zu: Invalid output stream id %d",
                        __FUNCTION__, result.frameNumber, i, bSrc.streamId);
                return;
            }
            bDst.stream = mOutputStreams.valueAt(idx)->asHalStream();
    
            buffer_handle_t *buffer;
            // Take the buffer produced by the HAL; this pointer refers to the data captured by the camera
            res = mInterface->popInflightBuffer(result.frameNumber, bSrc.streamId, &buffer);
            ...
            bDst.buffer = buffer;
            ...
        }
    ...
    
    // Call the overload below to process this frame's result
        processCaptureResult(&r);
    }
    
    
    
    
    /**
     * Camera HAL device callback methods
     */
    
    void Camera3Device::processCaptureResult(const camera3_capture_result *result) {
        ATRACE_CALL();
    ...
    
            // If shutter event isn't received yet, append the output buffers to
            // the in-flight request. Otherwise, return the output buffers to
            // streams.
            if (shutterTimestamp == 0) {
                request.pendingOutputBuffers.appendArray(result->output_buffers,
                    result->num_output_buffers);
            } else {
    // returnOutputBuffers ends up calling BufferQueue's queueBuffer
                returnOutputBuffers(result->output_buffers,
                    result->num_output_buffers, shutterTimestamp);
            }
    
            if (result->result != NULL && !isPartialResult) {
                for (uint32_t i = 0; i < result->num_physcam_metadata; i++) {
                    CameraMetadata physicalMetadata;
                    physicalMetadata.append(result->physcam_metadata[i]);
                    request.physicalMetadatas.push_back({String16(result->physcam_ids[i]),
                            physicalMetadata});
                }
                if (shutterTimestamp == 0) {
                    request.pendingMetadata = result->result;
                    request.collectedPartialResult = collectedPartialResult;
               } else if (request.hasCallback) {
                    CameraMetadata metadata;
                    metadata = result->result;
    // sendCaptureResult sends the result up to the APP
                    sendCaptureResult(metadata, request.resultExtras,
                        collectedPartialResult, frameNumber,
                        hasInputBufferInRequest, request.physicalMetadatas);
                }
            }
    ...
    }
    

    This calls returnOutputBuffers and sendCaptureResult.
    returnOutputBuffers hands buffer ownership to the BufferQueue consumer (a Surface-owning class such as ImageReader), while sendCaptureResult returns the metadata.
    As shown below:


    https://blog.csdn.net/u012596975/article/details/107137110

    Source: https://blog.csdn.net/u012596975/article/details/107137156
    Once metadata is generated, the Camera Provider delivers it to the Camera Service through ICameraDeviceCallback's processCaptureResult_3_4 method, implemented by Camera3Device::processCaptureResult_3_4. Through a chain of calls, sendCaptureResult puts the result into mResultQueue and notifies the FrameProcessorBase thread, which takes the result out and sends it to CameraDeviceClient; from there the remote CameraDeviceCallbacks proxy's onResultReceived method passes it up to the framework layer and finally to the app for processing.
    Image data initially follows a similar path into Camera3Device, but returnOutputBuffers then hands it to Camera3OutputStream. The stream, acting as the producer in the BufferQueue producer-consumer model, calls the queue method to notify the consumer to consume the buffer; the consumer is an app-side Surface-owning class such as ImageReader. The app can then take the image data out for post-processing.

    Let's review BufferQueue.
    BufferQueue never copies buffers; they are always passed by handle.
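
    That handle-passing contract can be modeled in a few dozen lines (ToyBufferQueue is a made-up class, far simpler than the real one): the queue owns a fixed pool of buffers, and producer and consumer exchange only slot indices, never the pixel data itself.

```cpp
#include <array>
#include <cassert>
#include <cstring>
#include <deque>

// Toy model of the BufferQueue contract: a fixed pool of buffers owned by
// the queue; the producer and consumer only ever trade slot indices (the
// "handles"), so buffer contents are never copied.
class ToyBufferQueue {
public:
    static constexpr int kSlots = 3;

    // Producer side: grab a free slot, fill it, then queue it.
    int dequeueBuffer() {
        int slot = mFree.front();
        mFree.pop_front();
        return slot;
    }
    char* buffer(int slot) { return mStorage[slot].data(); }
    void queueBuffer(int slot) { mFilled.push_back(slot); }

    // Consumer side: acquire a filled slot, read it, then release it
    // back to the free pool.
    int acquireBuffer() {
        int slot = mFilled.front();
        mFilled.pop_front();
        return slot;
    }
    void releaseBuffer(int slot) { mFree.push_back(slot); }

private:
    std::array<std::array<char, 16>, kSlots> mStorage{};
    std::deque<int> mFree{0, 1, 2};
    std::deque<int> mFilled;
};
```

    The real BufferQueue adds fences, slot ownership states, and asynchronous listeners, but the dequeue -> fill -> queue -> acquire -> release cycle is the same.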


    https://source.android.google.cn/docs/core/graphics/arch-bq-gralloc
    https://blog.csdn.net/yangwen123/article/details/16863377 https://www.cnblogs.com/roger-yu/p/16029867.html

    The sequence diagram after returnOutputBuffers is called:


    https://www.jianshu.com/p/26d4b781a14d

    The Surface producer / SurfaceFlinger consumer relationship:

    https://www.jianshu.com/p/26d4b781a14d

    Overall framework flow:


    https://www.jianshu.com/p/26d4b781a14d

    References:
    In-depth Understanding of the Android Camera Architecture, Part 2
    In-depth Understanding of the Android Camera Architecture, Part 3
    In-depth Understanding of the Android Camera Architecture, Part 10

    Official documentation

    Android Camera Notes (1): Camera Android Architecture (based on Q)
    Android Camera Notes (3): MTK Camera MtkCam3 Architecture

    Introduction to the Android HAL (1)
    Some Thoughts on Android's HAL
    Re-learning Android: HAL-layer Initialization in SurfaceFlinger

    BufferQueue Explained: Principles
    Android-Fk: BufferQueue Study Notes

    In-depth Understanding of the Android Camera Architecture, Part 5
    Android Camera2 Framework + HAL + Surface Overall Data Flow

    Android 8.0 system source analysis: Camera processCaptureResult result delivery

    Android Camera HAL3 under the Treble Framework, Part 1
    Android 8.0 system source analysis: the openCamera (HAL) startup path
    Android Camera API2 Study: Framework & HAL

    Source: https://www.haomeiwen.com/subject/hzwoortx.html