Android System Running Mechanism: Looper and Choreographer


Author: 壹零二肆 | Published 2022-01-09 21:53

    Contents:
    1 MessageQueue next()
    2 Vsync
    3 Choreographer doFrame
    4 input


    An operating system can be modeled as an endless loop, and Android is no exception: once a process is created, it settles into an infinite loop.

    The two most important concepts in how the system runs are input and output.

    • Input: key events, touch events, mouse events, trackball events
    • Output: how Choreographer orchestrates the drawing of a frame

    In Android, this back-and-forth cycle of input and output is driven by the message mechanism of the Looper.

    Inside the Looper's loop, MessageQueue.next() pulls the next message and processes it: input events are handled, output is produced, and the round trip of interacting with the user is completed.
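    As a minimal illustration of this message-driven loop (a sketch using only the public Handler/HandlerThread API; the class name and log tag are made up), any thread becomes such a loop once a Looper is attached to it:

    import android.os.Handler;
    import android.os.HandlerThread;
    import android.os.Message;
    import android.util.Log;

    // Minimal sketch of the message-driven loop: a HandlerThread is just a Thread
    // that calls Looper.prepare() and Looper.loop(), so every piece of work reaches
    // it as a Message pulled out of its MessageQueue by next().
    public class MessageLoopDemo {
        public static void start() {
            HandlerThread worker = new HandlerThread("looper-demo");
            worker.start();
            Handler handler = new Handler(worker.getLooper()) {
                @Override
                public void handleMessage(Message msg) {
                    Log.d("MessageLoopDemo", "got message what=" + msg.what);
                }
            };
            handler.sendEmptyMessage(1);  // enqueued, later returned by MessageQueue.next()
            handler.post(() -> Log.d("MessageLoopDemo", "a Runnable is delivered as a Message too"));
        }
    }

    The main thread works the same way; the only difference is that its Looper is prepared for it by the framework during process startup.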

    The main thread

    Over the lifetime of the application, messages are continuously produced into the MessageQueue, from both the Java layer and the native layer.

    The most central method is MessageQueue.next(): it returns due Java-layer messages first, and when no Java message is ready it blocks in nativePollOnce(), which processes native messages and listens for events on the registered file descriptors.
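    One way to watch this stream of main-thread messages is Looper's public message-logging hook; a small sketch (the class name and log tag are illustrative):

    import android.os.Looper;
    import android.util.Log;

    // Looper.loop() prints a ">>>>> Dispatching ..." / "<<<<< Finished ..." pair
    // around every message it dispatches, which makes the message-driven nature
    // of the main thread directly visible in logcat.
    public final class MainLooperSpy {
        public static void install() {
            Looper.getMainLooper().setMessageLogging(x -> Log.d("MainLooperSpy", x));
        }
    }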

    From the display's point of view, the app does not have to produce a new frame for every refresh; the frame rate only has to satisfy the persistence of vision of the human eye.

    At roughly 24 Hz, with frames delivered one after another, the eye already perceives the picture as smooth.

    So all we need to do is keep pace with that rate and perform drawing whenever the UI actually needs to be updated.

    How is each frame drawn on a steady cadence? Android's answer is to drive drawing with the Vsync signal.

    The Vsync signal fires at the display's refresh rate, typically 60 Hz, i.e. roughly every 16.6667 ms a Vsync pulse tells the system it may compose the next frame.

    Listening to the display's refresh and generating the Vsync signal is not something the application layer can do on its own; the system calls back over JNI into Choreographer's Vsync listener, carrying this crucial signal from the native layer up to the Java layer.
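    Even though the app cannot produce Vsync itself, it can observe the Vsync-driven frame ticks through the public Choreographer API. A small sketch (class name and log tag are made up; it must run on a thread that has a Looper, normally the main thread):

    import android.util.Log;
    import android.view.Choreographer;

    // Observes the Vsync-driven frame ticks from the app layer. Choreographer
    // invokes doFrame() on the next frame, passing the frame's Vsync timestamp.
    public final class VsyncObserver implements Choreographer.FrameCallback {
        public void start() {
            Choreographer.getInstance().postFrameCallback(this);
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            Log.d("VsyncObserver", "frame at " + frameTimeNanos + " ns");
            Choreographer.getInstance().postFrameCallback(this); // re-register to keep observing
        }
    }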

    In general, both input events and the Vsync signal are first captured in native code and then handed over JNI to the Java layer, where the business logic runs.

    What drives all of this is the key method of MessageQueue: next().


    1 MessageQueue next()

    The main logic of next() splits into a Java part and a native part.

    On the Java side it pulls the next due message from the Java MessageQueue and returns it for execution; when nothing is pending, the registered IdleHandlers get their chance to run (see the sketch below).
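    As a small illustration (class name and log tag are made up; Looper.getQueue() requires API 23+), an IdleHandler added to the main queue runs exactly when next() finds no Java message ready:

    import android.os.Looper;
    import android.os.MessageQueue;
    import android.util.Log;

    // Runs once when MessageQueue.next() finds no Java message due to execute.
    public final class IdleLogger {
        public static void install() {
            Looper.getMainLooper().getQueue().addIdleHandler(new MessageQueue.IdleHandler() {
                @Override
                public boolean queueIdle() {
                    Log.d("IdleLogger", "main queue is idle");
                    return false; // one-shot; return true to stay registered
                }
            });
        }
    }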

    When the Java layer has no message ready, next() falls into the native pollOnce@Looper.

    next@MessageQueue → native messages:

    In the native Looper, each fd being watched is wrapped as a request; epoll_wait pairs the events on those fds with their requests and wraps them as responses to be handled, and handling a response invokes handleEvent() on the callback registered for that fd.
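    The same fd-watching machinery is exposed to Java code through MessageQueue (API 23+). A small sketch, with a pipe created purely for illustration (class name and log tag are made up):

    import android.os.Looper;
    import android.os.MessageQueue;
    import android.os.ParcelFileDescriptor;
    import android.util.Log;

    import java.io.IOException;

    // Registers a file descriptor with the Looper's underlying epoll loop.
    // When the fd becomes readable, the callback runs on the Looper's thread,
    // just like the framework's own Vsync and input fds.
    public final class FdWatcherDemo {
        public static void install() throws IOException {
            ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe(); // [0] = read end, [1] = write end
            MessageQueue queue = Looper.getMainLooper().getQueue();
            queue.addOnFileDescriptorEventListener(
                    pipe[0].getFileDescriptor(),
                    MessageQueue.OnFileDescriptorEventListener.EVENT_INPUT,
                    (fd, events) -> {
                        Log.d("FdWatcherDemo", "fd readable, events=" + events);
                        return MessageQueue.OnFileDescriptorEventListener.EVENT_INPUT; // keep listening
                    });
            // Writing to pipe[1] from any thread now wakes the loop and fires the callback.
        }
    }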

        /**
         * Waits for events to be available, with optional timeout in milliseconds.
         * Invokes callbacks for all file descriptors on which an event occurred.
         *
         * If the timeout is zero, returns immediately without blocking.
         * If the timeout is negative, waits indefinitely until an event appears.
         *
         * Returns POLL_WAKE if the poll was awoken using wake() before
         * the timeout expired and no callbacks were invoked and no other file
         * descriptors were ready.
         *
         * Returns POLL_CALLBACK if one or more callbacks were invoked.
         *
         * Returns POLL_TIMEOUT if there was no data before the given
         * timeout expired.
         *
         * Returns POLL_ERROR if an error occurred.
         *
         * Returns a value >= 0 containing an identifier if its file descriptor has data
         * and it has no callback function (requiring the caller here to handle it).
         * In this (and only this) case outFd, outEvents and outData will contain the poll
         * events and data associated with the fd, otherwise they will be set to NULL.
         *
         * This method does not return until it has finished invoking the appropriate callbacks
         * for all file descriptors that were signalled.
         */
        int pollOnce(int timeoutMillis, int* outFd, int* outEvents, void** outData);
    

    What the native pollOnce mainly does:

    • epoll_wait waits on the registered fds and wraps each signalled fd into a response
    • processes all pending native messages
    • handles each response by invoking its handleEvent callback; in many cases handleEvent calls back up into the Java layer

    Both the Vsync signal and input events are delivered through exactly this mechanism.


    2 Vsync

    The events picked up by epoll_wait are wrapped into responses and handled inside pollOnce / pollInner.

    For display events, this is where dispatchVsync crosses from native back into the Java layer.

    native:

    int DisplayEventDispatcher::handleEvent(int, int events, void*) {
        if (events & (Looper::EVENT_ERROR | Looper::EVENT_HANGUP)) {
            ALOGE("Display event receiver pipe was closed or an error occurred.  "
                  "events=0x%x",
                  events);
            return 0; // remove the callback
        }
    ......
            dispatchVsync(vsyncTimestamp, vsyncDisplayId, vsyncCount, vsyncEventData);
        }
    

    java:

        // Called from native code.
        @SuppressWarnings("unused")
        @UnsupportedAppUsage
        private void dispatchVsync(long timestampNanos, long physicalDisplayId, int frame) {
            onVsync(timestampNanos, physicalDisplayId, frame);
        }
    
        private final class FrameDisplayEventReceiver extends DisplayEventReceiver
                implements Runnable {
    
            @Override
            public void onVsync(long timestampNanos, long physicalDisplayId, int frame) {
            
                mTimestampNanos = timestampNanos;
                mFrame = frame;
                Message msg = Message.obtain(mHandler, this);
                msg.setAsynchronous(true);
                mHandler.sendMessageAtTime(msg, timestampNanos / TimeUtils.NANOS_PER_MS);
            }
    
            @Override
            public void run() {
                mHavePendingVsync = false;
                doFrame(mTimestampNanos, mFrame);
            }
        }
    

    When the Vsync signal arrives, Choreographer executes doFrame.
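    A detail worth noting in the onVsync code above: the frame message is posted with setAsynchronous(true), so it is not held back by the sync barrier that ViewRootImpl places on the main queue in scheduleTraversals. Application code can post asynchronous messages too; a small sketch assuming API 28+ (class name is made up):

    import android.os.Handler;
    import android.os.Looper;

    // Asynchronous messages are exempt from sync barriers on the MessageQueue,
    // which is how Choreographer's frame message gets through while ordinary
    // messages are stalled behind a pending traversal.
    public final class AsyncPostDemo {
        public static void post(Runnable work) {
            Handler asyncHandler = Handler.createAsync(Looper.getMainLooper()); // API 28+
            asyncHandler.post(work); // delivered even if a sync barrier is currently set
        }
    }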

    Almost all of the important work at the application layer happens inside doFrame.


    3 Choreographer doFrame

    First, let's look at what doFrame executes:

            try {
                Trace.traceBegin(Trace.TRACE_TAG_VIEW, "Choreographer#doFrame");
                AnimationUtils.lockAnimationClock(frameTimeNanos / TimeUtils.NANOS_PER_MS);
    
                mFrameInfo.markInputHandlingStart();
                doCallbacks(Choreographer.CALLBACK_INPUT, frameTimeNanos);
    
                mFrameInfo.markAnimationsStart();
                doCallbacks(Choreographer.CALLBACK_ANIMATION, frameTimeNanos);
                doCallbacks(Choreographer.CALLBACK_INSETS_ANIMATION, frameTimeNanos);
    
                mFrameInfo.markPerformTraversalsStart();
                doCallbacks(Choreographer.CALLBACK_TRAVERSAL, frameTimeNanos);
    
                doCallbacks(Choreographer.CALLBACK_COMMIT, frameTimeNanos);
            } finally {
                AnimationUtils.unlockAnimationClock();
                Trace.traceEnd(Trace.TRACE_TAG_VIEW);
            }
    

    The core work of the UI thread lives in these few calls:

    The callback processing above runs, in order, the input, animation, and traversal stages, which are the key phases of producing a frame.

    This cycle nominally repeats every 16.6 ms; in practice, delays in any stage can push a frame past its deadline and cause jank or dropped frames.
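    A simple way to see this cadence, and any missed deadlines, from app code is to compare successive frame timestamps in a Choreographer.FrameCallback. A rough sketch; the 17 ms threshold is an assumption for a 60 Hz display, and the class name and log tag are made up:

    import android.util.Log;
    import android.view.Choreographer;

    // Logs whenever the gap between two Vsync-timestamped frames exceeds one
    // 60 Hz frame interval, i.e. when at least one frame deadline was missed.
    public final class FrameDropLogger implements Choreographer.FrameCallback {
        private static final long JANK_THRESHOLD_NANOS = 17_000_000L; // ~1 frame at 60 Hz (assumed)
        private long lastFrameTimeNanos;

        public void start() {
            Choreographer.getInstance().postFrameCallback(this);
        }

        @Override
        public void doFrame(long frameTimeNanos) {
            if (lastFrameTimeNanos != 0) {
                long deltaNanos = frameTimeNanos - lastFrameTimeNanos;
                if (deltaNanos > JANK_THRESHOLD_NANOS) {
                    Log.w("FrameDropLogger", "frame took " + deltaNanos / 1_000_000 + " ms, frames were likely dropped");
                }
            }
            lastFrameTimeNanos = frameTimeNanos;
            Choreographer.getInstance().postFrameCallback(this);
        }
    }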


    4 input

    The overall flow of an input event is very similar to Vsync.

    On the native side, handleEvent in NativeInputEventReceiver processes the event and, depending on the event type, crosses over JNI into the Java layer, where WindowInputEventReceiver dispatches and consumes it.

    native:

    int NativeInputEventReceiver::handleEvent(int receiveFd, int events, void* data) {
        // Allowed return values of this function as documented in LooperCallback::handleEvent
        constexpr int REMOVE_CALLBACK = 0;
        constexpr int KEEP_CALLBACK = 1;
    
    
        if (events & ALOOPER_EVENT_INPUT) {
            JNIEnv* env = AndroidRuntime::getJNIEnv();
            status_t status = consumeEvents(env, false /*consumeBatches*/, -1, nullptr);
            mMessageQueue->raiseAndClearException(env, "handleReceiveCallback");
            return status == OK || status == NO_MEMORY ? KEEP_CALLBACK : REMOVE_CALLBACK;
        }
        ......
    }
    
    
    status_t NativeInputEventReceiver::consumeEvents(JNIEnv* env,
            bool consumeBatches, nsecs_t frameTime, bool* outConsumedBatch) {
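        // (declarations of receiverObj and skipCallbacks elided in this excerpt)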
     
        for (;;) {
            uint32_t seq;
            InputEvent* inputEvent;
    
            status_t status = mInputConsumer.consume(&mInputEventFactory,
                    consumeBatches, frameTime, &seq, &inputEvent);
            if (status != OK && status != WOULD_BLOCK) {
                ALOGE("channel '%s' ~ Failed to consume input event.  status=%s(%d)",
                      getInputChannelName().c_str(), statusToString(status).c_str(), status);
                return status;
            }
    
            if (status == WOULD_BLOCK) {
                if (!skipCallbacks && !mBatchedInputEventPending && mInputConsumer.hasPendingBatch()) {
                    // There is a pending batch.  Come back later.
                    if (!receiverObj.get()) {
                        receiverObj.reset(jniGetReferent(env, mReceiverWeakGlobal));
                        if (!receiverObj.get()) {
                            ALOGW("channel '%s' ~ Receiver object was finalized "
                                  "without being disposed.",
                                  getInputChannelName().c_str());
                            return DEAD_OBJECT;
                        }
                    }
    
                    mBatchedInputEventPending = true;
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Dispatching batched input event pending notification.",
                              getInputChannelName().c_str());
                    }
    
                    env->CallVoidMethod(receiverObj.get(),
                                        gInputEventReceiverClassInfo.onBatchedInputEventPending,
                                        mInputConsumer.getPendingBatchSource());
                    if (env->ExceptionCheck()) {
                        ALOGE("Exception dispatching batched input events.");
                        mBatchedInputEventPending = false; // try again later
                    }
                }
                return OK;
            }
            assert(inputEvent);
    
            if (!skipCallbacks) {
                if (!receiverObj.get()) {
                    receiverObj.reset(jniGetReferent(env, mReceiverWeakGlobal));
                    if (!receiverObj.get()) {
                        ALOGW("channel '%s' ~ Receiver object was finalized "
                                "without being disposed.", getInputChannelName().c_str());
                        return DEAD_OBJECT;
                    }
                }
    
                jobject inputEventObj;
                switch (inputEvent->getType()) {
                case AINPUT_EVENT_TYPE_KEY:
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received key event.", getInputChannelName().c_str());
                    }
                    inputEventObj = android_view_KeyEvent_fromNative(env,
                            static_cast<KeyEvent*>(inputEvent));
                    break;
    
                case AINPUT_EVENT_TYPE_MOTION: {
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received motion event.", getInputChannelName().c_str());
                    }
                    MotionEvent* motionEvent = static_cast<MotionEvent*>(inputEvent);
                    if ((motionEvent->getAction() & AMOTION_EVENT_ACTION_MOVE) && outConsumedBatch) {
                        *outConsumedBatch = true;
                    }
                    inputEventObj = android_view_MotionEvent_obtainAsCopy(env, motionEvent);
                    break;
                }
                case AINPUT_EVENT_TYPE_FOCUS: {
                    FocusEvent* focusEvent = static_cast<FocusEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received focus event: hasFocus=%s, inTouchMode=%s.",
                              getInputChannelName().c_str(), toString(focusEvent->getHasFocus()),
                              toString(focusEvent->getInTouchMode()));
                    }
                    env->CallVoidMethod(receiverObj.get(), gInputEventReceiverClassInfo.onFocusEvent,
                                        jboolean(focusEvent->getHasFocus()),
                                        jboolean(focusEvent->getInTouchMode()));
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
                case AINPUT_EVENT_TYPE_CAPTURE: {
                    const CaptureEvent* captureEvent = static_cast<CaptureEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received capture event: pointerCaptureEnabled=%s",
                              getInputChannelName().c_str(),
                              toString(captureEvent->getPointerCaptureEnabled()));
                    }
                    env->CallVoidMethod(receiverObj.get(),
                                        gInputEventReceiverClassInfo.onPointerCaptureEvent,
                                        jboolean(captureEvent->getPointerCaptureEnabled()));
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
                case AINPUT_EVENT_TYPE_DRAG: {
                    const DragEvent* dragEvent = static_cast<DragEvent*>(inputEvent);
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Received drag event: isExiting=%s",
                              getInputChannelName().c_str(), toString(dragEvent->isExiting()));
                    }
                    env->CallVoidMethod(receiverObj.get(), gInputEventReceiverClassInfo.onDragEvent,
                                        jboolean(dragEvent->isExiting()), dragEvent->getX(),
                                        dragEvent->getY());
                    finishInputEvent(seq, true /* handled */);
                    continue;
                }
    
                default:
                    assert(false); // InputConsumer should prevent this from ever happening
                    inputEventObj = nullptr;
                }
    
                if (inputEventObj) {
                    if (kDebugDispatchCycle) {
                        ALOGD("channel '%s' ~ Dispatching input event.", getInputChannelName().c_str());
                    }
                    env->CallVoidMethod(receiverObj.get(),
                            gInputEventReceiverClassInfo.dispatchInputEvent, seq, inputEventObj);
                    if (env->ExceptionCheck()) {
                        ALOGE("Exception dispatching input event.");
                        skipCallbacks = true;
                    }
                    env->DeleteLocalRef(inputEventObj);
                } else {
                    ALOGW("channel '%s' ~ Failed to obtain event object.",
                            getInputChannelName().c_str());
                    skipCallbacks = true;
                }
            }
        }
    }
    

    java:

        final class WindowInputEventReceiver extends InputEventReceiver {
            public WindowInputEventReceiver(InputChannel inputChannel, Looper looper) {
                super(inputChannel, looper);
            }
    
            @Override
            public void onInputEvent(InputEvent event) {
                Trace.traceBegin(Trace.TRACE_TAG_VIEW, "processInputEventForCompatibility");
                List<InputEvent> processedEvents;
                try {
                    processedEvents =
                        mInputCompatProcessor.processInputEventForCompatibility(event);
                } finally {
                    Trace.traceEnd(Trace.TRACE_TAG_VIEW);
                }
                if (processedEvents != null) {
                    if (processedEvents.isEmpty()) {
                        // InputEvent consumed by mInputCompatProcessor
                        finishInputEvent(event, true);
                    } else {
                        for (int i = 0; i < processedEvents.size(); i++) {
                            enqueueInputEvent(
                                    processedEvents.get(i), this,
                                    QueuedInputEvent.FLAG_MODIFIED_FOR_COMPATIBILITY, true);
                        }
                    }
                } else {
                    enqueueInputEvent(event, this, 0, true);
                }
            }
    
            @Override
            public void onBatchedInputEventPending(int source) {
                // mStopped: There will be no more choreographer callbacks if we are stopped,
                // so we must consume all input immediately to prevent ANR
                final boolean unbuffered = mUnbufferedInputDispatch
                        || (source & mUnbufferedInputSource) != SOURCE_CLASS_NONE
                        || mStopped;
                if (unbuffered) {
                    if (mConsumeBatchedInputScheduled) {
                        unscheduleConsumeBatchedInput();
                    }
                    // Consume event immediately if unbuffered input dispatch has been requested.
                    consumeBatchedInputEvents(-1);
                    return;
                }
                scheduleConsumeBatchedInput();
            }
    
            @Override
            public void onFocusEvent(boolean hasFocus, boolean inTouchMode) {
                windowFocusChanged(hasFocus, inTouchMode);
            }
    
            @Override
            public void dispose() {
                unscheduleConsumeBatchedInput();
                super.dispose();
            }
        }
    

    The processing flow of an input event:

    The incoming event is handed to deliverInputEvent.

    The delivered input event then arrives at the InputStage pipeline.

    InputStage is a chain of responsibility that dispatches and consumes these InputEvents (a sketch of the pattern follows below).
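    ViewRootImpl chains several InputStage subclasses together; each stage either consumes the event or forwards it to the next one. The following is a stripped-down sketch of the pattern only; the stage names and the String event type are invented and are not the framework classes:

    // Chain-of-responsibility sketch modelled on ViewRootImpl's InputStage.
    // Each stage consumes the event or passes it on to the next stage.
    abstract class Stage {
        private final Stage next;

        Stage(Stage next) { this.next = next; }

        final void deliver(String event) {
            if (!onProcess(event) && next != null) {
                next.deliver(event); // not consumed here, forward down the chain
            }
        }

        /** Returns true if this stage consumed the event. */
        abstract boolean onProcess(String event);
    }

    final class ImeStage extends Stage {
        ImeStage(Stage next) { super(next); }
        @Override boolean onProcess(String event) { return event.startsWith("ime:"); }
    }

    final class ViewPostImeStage extends Stage {
        ViewPostImeStage(Stage next) { super(next); }
        @Override boolean onProcess(String event) { return true; } // last stop: dispatch into the view tree
    }

    // Usage: new ImeStage(new ViewPostImeStage(null)).deliver("touch:down");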

    Taking a small scroll of a RecyclerView as an example, the overall sequence is as follows:

    native post input callback:

    A Vsync signal arrives, doFrame runs, and execution reaches the input stage.


    do input callback:

    The touch event is consumed, and the RecyclerView lays out some ViewHolders, calling onBindViewHolder along the way.

    When the fill performed during the scroll finishes, a view property of the RecyclerView changes, which triggers an invalidate.

    The invalidate propagates along the hardware-accelerated path all the way up to ViewRootImpl, which posts the traversal callback to the Choreographer; it runs once doFrame reaches the traversal stage.

    scroll → scheduleTraversals:

    ViewRootImpl then executes performTraversals, decides whether a fresh layout pass is currently needed, and runs the layout, draw, and related passes (a small view-level illustration follows below).
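    To watch a traversal reach your own code, a custom View can log the measure, layout, and draw passes (a small sketch; the class name and log tag are made up). After a requestLayout() the next traversal runs measure, layout, and draw; after a plain invalidate() only the drawing side is refreshed:

    import android.content.Context;
    import android.graphics.Canvas;
    import android.util.Log;
    import android.view.View;

    // Logs each phase of the traversal as it reaches this view.
    public class TraversalLoggingView extends View {
        public TraversalLoggingView(Context context) {
            super(context);
        }

        @Override
        protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
            Log.d("Traversal", "onMeasure");
            super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        }

        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            Log.d("Traversal", "onLayout changed=" + changed);
            super.onLayout(changed, left, top, right, bottom);
        }

        @Override
        protected void onDraw(Canvas canvas) {
            Log.d("Traversal", "onDraw");
            super.onDraw(canvas);
        }
    }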


    Once the whole sequence from input to traversal has finished, hardware rendering syncs the work to the GPU and a frame is composed.

    The frame is then handed to SurfaceFlinger for display.

    SurfaceFlinger runs in a system process; each application process acts as a client and hands its rendered surfaces over to SurfaceFlinger via IPC, and SurfaceFlinger puts them on screen.

