Recording Video While Analyzing Frames in Real Time with Camera2

Author: Dale_Dawson | Published 2021-12-21 20:04

    Preface

        One of our project requirements was to analyze the faces in a video in real time while it is being recorded. With camera1, the PreviewCallback's onPreviewFrame method delivers every preview frame, but as soon as recording starts, onPreviewFrame is no longer called. Many posts claim that camera2 can satisfy this requirement, but I couldn't find a ready-made example, so after some tinkering I got it working and am writing it down here.

    // camera is the opened android.hardware.Camera instance
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            // TODO: process the frame data here
        }
    });
    

    Implementation

        Without further ado, let's get to it. The key to implementing this requirement is the ImageReader class.

    mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
                        ImageFormat.YUV_420_888, 2);
    

    Create the ImageReader instance:
    ImageReader ir = ImageReader.newInstance(int width, int height, int format, int maxImages);
    Parameters:
    width: default width of the produced images, in pixels
    height: default height of the produced images, in pixels
    format: format of the produced images
    maxImages: the maximum number of images the user wants to access simultaneously

    The main operations on ImageReader:

    • getSurface() // returns the Surface that can be used to produce images for this ImageReader
    • acquireLatestImage() // acquires the latest image from the ImageReader's queue, dropping older ones
    • acquireNextImage() // acquires the next image from the ImageReader's queue
    • getMaxImages() // the maximum number of images that can be acquired at the same time
    • getWidth() // width of each image, in pixels
    • getHeight() // height of each image, in pixels
    • getImageFormat() // format of the images
    • close() // releases all resources associated with this ImageReader; remember to call it when done
    Set the listener (a minimal usage sketch follows the snippet below):
    mImageReader.setOnImageAvailableListener(
                        new OnImageAvailableListenerImpl(), mBackgroundHandler);
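
    As a minimal usage sketch (assuming the background Handler mBackgroundHandler from the full source below), a listener grabs the newest frame, processes it, and always closes it so the ImageReader queue does not fill up:

    mImageReader.setOnImageAvailableListener(reader -> {
        Image image = reader.acquireLatestImage(); // drops any older queued frames
        if (image == null) {
            return;
        }
        try {
            // analyze the frame here
        } finally {
            image.close(); // mandatory: otherwise maxImages is exhausted and the stream stalls
        }
    }, mBackgroundHandler);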
    

    I'm using a USB monocular camera here, so the image format I use is ImageFormat.YUV_420_888, and in the end I need NV21 data to detect the faces in the frame. The method below processes the Image and produces an NV21 byte array.

    private class OnImageAvailableListenerImpl implements ImageReader.OnImageAvailableListener {
            private byte[] y;
            private byte[] u;
            private byte[] v;
            private byte[] nv21;
            private ReentrantLock lock = new ReentrantLock();
            private Object mImageReaderLock = 1;//1 available,0 unAvailable
    
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = reader.acquireNextImage();
                if (image == null) {
                    return;
                }
                synchronized (mImageReaderLock) {
                    if (!mImageReaderLock.equals(1)) {
                        Logger.v(TAG, "--- image not available,just return!!!");
                        image.close();
                        return;
                    }
                    if (ImageFormat.YUV_420_888 == image.getFormat()) {
                        Image.Plane[] planes = image.getPlanes();
                        lock.lock();
                        if (y == null) {
                            y = new byte[planes[0].getBuffer().limit() - planes[0].getBuffer().position()];
                            u = new byte[planes[1].getBuffer().limit() - planes[1].getBuffer().position()];
                            v = new byte[planes[2].getBuffer().limit() - planes[2].getBuffer().position()];
                        }
                        if (image.getPlanes()[0].getBuffer().remaining() == y.length) {
                            planes[0].getBuffer().get(y);
                            planes[1].getBuffer().get(u);
                            planes[2].getBuffer().get(v);
                            if (nv21 == null) {
                                nv21 = new byte[planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2];
                            }
                            // If the row stride changed and the buffer no longer fits, release the
                            // lock and close the image before bailing out, otherwise both would leak.
                            if (nv21.length != planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2) {
                                lock.unlock();
                                image.close();
                                return;
                            }
                            // The returned data is planar YUV422 (U/V each half the size of Y)
                            if (y.length / u.length == 2) {
                                ImageUtil.yuv422ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                            }
                            // The returned data is planar YUV420 (U/V each a quarter the size of Y)
                            else if (y.length / u.length == 4) {
                                ImageUtil.yuv420ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                            }
                            // Run the face-analysis algorithm and draw the face info
                            dealWithFace(nv21);
                        }
                        lock.unlock();
                    }
                }
                image.close(); // this must not be left out
            }
        }
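
    The article does not show ImageUtil, so the following is only a hedged sketch of what yuv420ToYuv420sp might look like, assuming tightly packed planes (pixelStride == 1), which is what the size checks above imply; the real ImageUtil may differ:

    public final class ImageUtil {
        // NV21 layout: the full Y plane (stride * height bytes) followed by interleaved V/U pairs.
        public static void yuv420ToYuv420sp(byte[] y, byte[] u, byte[] v,
                                            byte[] nv21, int stride, int height) {
            System.arraycopy(y, 0, nv21, 0, Math.min(y.length, stride * height));
            int offset = stride * height;
            for (int i = 0; i < u.length && i < v.length && offset + 1 < nv21.length; i++) {
                nv21[offset++] = v[i]; // V comes first in NV21
                nv21[offset++] = u[i];
            }
        }
    }

    yuv422ToYuv420sp would additionally drop every other chroma row, since planar YUV422 carries twice as many U/V samples vertically as YUV420.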
    

    Implementing this requirement amounts to three operations: preview, recording, and frame analysis. So we add three Surfaces as targets of mPreviewBuilder, then start recording and refresh the preview.

     mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                List<Surface> surfaces = new ArrayList<>();
                // Set up Surface for the camera preview
                Surface previewSurface = new Surface(texture);
                surfaces.add(previewSurface);
                mPreviewBuilder.addTarget(previewSurface);
                // Set up Surface for the MediaRecorder
                Surface recorderSurface = mMediaRecorder.getSurface();
                surfaces.add(recorderSurface);
                mPreviewBuilder.addTarget(recorderSurface);
                // Surface for frame analysis
                Surface imageSurface = mImageReader.getSurface();
                surfaces.add(imageSurface);
                mPreviewBuilder.addTarget(imageSurface);
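
    The excerpt above only builds the request; below is a minimal sketch of how the capture session is then created and started (mirroring startRecordingVideo() in the full source further down, where mBackgroundHandler, mMediaRecorder and mPreviewSession are fields):

    mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession session) {
            mPreviewSession = session;
            try {
                // One repeating request drives the preview, the recorder and the ImageReader together
                mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
            mMediaRecorder.start();
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            Logger.e(TAG, "createCaptureSession failed");
        }
    }, mBackgroundHandler);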
    

    That's it for the walkthrough; the full source code follows.
    To be continued...

    Source code

    public class MonitorActivity extends BaseActivity implements ViewTreeObserver.OnGlobalLayoutListener {
        private static final int SENSOR_ORIENTATION_DEFAULT_DEGREES = 90;
        private static final int SENSOR_ORIENTATION_INVERSE_DEGREES = 180;
        private static final SparseIntArray DEFAULT_ORIENTATIONS = new SparseIntArray();
        private static final SparseIntArray INVERSE_ORIENTATIONS = new SparseIntArray();
    
        private static final String TAG = "MonitorActivity";
    
        static {
            DEFAULT_ORIENTATIONS.append(Surface.ROTATION_0, 90);
            DEFAULT_ORIENTATIONS.append(Surface.ROTATION_90, 0);
            DEFAULT_ORIENTATIONS.append(Surface.ROTATION_180, 270);
            DEFAULT_ORIENTATIONS.append(Surface.ROTATION_270, 180);
        }
    
        static {
            INVERSE_ORIENTATIONS.append(Surface.ROTATION_0, 270);
            INVERSE_ORIENTATIONS.append(Surface.ROTATION_90, 180);
            INVERSE_ORIENTATIONS.append(Surface.ROTATION_180, 90);
            INVERSE_ORIENTATIONS.append(Surface.ROTATION_270, 0);
        }
    
        private AutoFitTextureView mTextureView;
        private TextView mButtonVideo;
        private CameraDevice mCameraDevice;
        private CameraCaptureSession mPreviewSession;
        private TextureView.SurfaceTextureListener mSurfaceTextureListener
                = new TextureView.SurfaceTextureListener() {
    
            @Override
            public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture,
                                                  int width, int height) {
                openCamera(width, height);
            }
    
            @Override
            public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture,
                                                    int width, int height) {
                configureTransform(width, height);
            }
    
            @Override
            public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
                return true;
            }
    
            @Override
            public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) {
            }
    
        };
        private Size mPreviewSize;
        private Size mVideoSize;
        private MediaRecorder mMediaRecorder;
        private boolean mIsRecordingVideo;
        /**
         * ImageReader used for frame analysis
         */
        private ImageReader mImageReader;
        private HandlerThread mBackgroundThread;
        private Handler mBackgroundHandler;
        private Semaphore mCameraOpenCloseLock = new Semaphore(1);
        private TextView textRecordTick;
        private RelativeLayout timeRl;
        private ImageView ivClose;
        public static long mRecordingBegin;
        public static boolean mRecording;
    
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_video);
            mButtonVideo = findViewById(R.id.iv_start_stop);
            textRecordTick = findViewById(R.id.tv_time);
            timeRl = findViewById(R.id.rl_time);
            ivClose = findViewById(R.id.iv_close);
            mTextureView = findViewById(R.id.textureView);
            mButtonVideo.setText("开启监控");
            mButtonVideo.setCompoundDrawablesWithIntrinsicBounds(null, null, null, getResources().getDrawable(R.drawable.icon_start));
            mButtonVideo.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    if (mIsRecordingVideo) {
                        stopRecordingVideo();
                        onStopRecord();
                    } else {
                        startRecordingVideo();
                        ivClose.setVisibility(View.GONE);
                        timeRl.setVisibility(View.VISIBLE);
                        mButtonVideo.setText("");
                        onStartRecord();
                    }
                }
            });
            ivClose.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    finish();
                }
            });
            startBackgroundThread();
            if (mTextureView.isAvailable()) {
                openCamera(mTextureView.getWidth(), mTextureView.getHeight());
            } else {
                mTextureView.setSurfaceTextureListener(mSurfaceTextureListener);
            }
            // Lock the activity to the orientation it was launched in
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LOCKED);
        }
    
    
        private void onStartRecord() {
            // Recording has started; remember the current time
            mRecording = true;
            mRecordingBegin = System.currentTimeMillis();
    
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    textRecordTick.setVisibility(View.VISIBLE);
                    textRecordTick.removeCallbacks(mRecordTickRunnable);
                    textRecordTick.post(mRecordTickRunnable);
                }
            });
        }
    
        private void onStopRecord() {
            mRecording = false;
            mRecordingBegin = 0;
    
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    textRecordTick.setVisibility(View.INVISIBLE);
                    textRecordTick.removeCallbacks(mRecordTickRunnable);
                }
            });
        }
    
        private long exceptionTime = 0;
        private List<Long> exceptionList = new ArrayList<>();
        // Runnable that updates the elapsed recording time once per second
        private Runnable mRecordTickRunnable = new Runnable() {
            @Override
            public void run() {
                long duration = System.currentTimeMillis() - mRecordingBegin;
                duration /= 1000;
                exceptionTime = duration;
                textRecordTick.setText(String.format("%02d:%02d:%02d", duration / 3600, duration % 3600 / 60, duration % 60));
                textRecordTick.removeCallbacks(this);
                textRecordTick.postDelayed(this, 1000);
            }
        };
    
        @Override
        public void onResume() {
            super.onResume();
        }
    
        @Override
        public void onPause() {
            super.onPause();
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
            closeCamera();
            stopBackgroundThread();
        }
    
    
        private CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
    
            @Override
            public void onOpened(@NonNull CameraDevice cameraDevice) {
                mCameraDevice = cameraDevice;
                startPreview();
                mCameraOpenCloseLock.release();
                if (null != mTextureView) {
                    configureTransform(mTextureView.getWidth(), mTextureView.getHeight());
                }
            }
    
            @Override
            public void onDisconnected(@NonNull CameraDevice cameraDevice) {
                mCameraOpenCloseLock.release();
                cameraDevice.close();
                mCameraDevice = null;
            }
    
            @Override
            public void onError(@NonNull CameraDevice cameraDevice, int error) {
                mCameraOpenCloseLock.release();
                cameraDevice.close();
                mCameraDevice = null;
                Activity activity = MonitorActivity.this;
                if (null != activity) {
                    activity.finish();
                }
            }
    
        };
        private Integer mSensorOrientation;
        private String mNextVideoAbsolutePath;
        private CaptureRequest.Builder mPreviewBuilder;
    
        private static Size chooseVideoSize(Size[] choices) {
            for (Size size : choices) {
                if (size.getWidth() == size.getHeight() * 4 / 3 && size.getWidth() <= 1080) {
                    return size;
                }
            }
            Logger.e(TAG, "Couldn't find any suitable video size");
            return choices[choices.length - 1];
        }
    
        private static Size chooseOptimalSize(Size[] choices, int width, int height, Size aspectRatio) {
            // Collect the supported resolutions that are at least as big as the preview Surface
            List<Size> bigEnough = new ArrayList<>();
            int w = aspectRatio.getWidth();
            int h = aspectRatio.getHeight();
            for (Size option : choices) {
                if (option.getHeight() == option.getWidth() * h / w &&
                        option.getWidth() >= width && option.getHeight() >= height) {
                    bigEnough.add(option);
                }
            }
    
            // Pick the smallest of those, assuming we found any
            if (bigEnough.size() > 0) {
                return Collections.min(bigEnough, new CompareSizesByArea());
            } else {
                Logger.e(TAG, "Couldn't find any suitable preview size");
                return choices[0];
            }
        }
    
        private void startBackgroundThread() {
            mBackgroundThread = new HandlerThread("CameraBackground");
            mBackgroundThread.start();
            mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
        }
    
        private void stopBackgroundThread() {
            mBackgroundThread.quitSafely();
            try {
                mBackgroundThread.join();
                mBackgroundThread = null;
                mBackgroundHandler = null;
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    
        @SuppressWarnings("MissingPermission")
        private void openCamera(int width, int height) {
            final Activity activity = MonitorActivity.this;
            if (null == activity || activity.isFinishing()) {
                return;
            }
            CameraManager manager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
            try {
                Logger.d(TAG, "tryAcquire");
                if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                    throw new RuntimeException("Time out waiting to lock camera opening.");
                }
                String cameraId = manager.getCameraIdList()[0];
    
                // Choose the sizes for camera preview and video recording
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                StreamConfigurationMap map = characteristics
                        .get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                mSensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
                if (map == null) {
                    throw new RuntimeException("Cannot get available preview/video sizes");
                }
                mVideoSize = chooseVideoSize(map.getOutputSizes(MediaRecorder.class));
                mPreviewSize = chooseOptimalSize(map.getOutputSizes(SurfaceTexture.class),
                        width, height, mVideoSize);
                mImageReader = ImageReader.newInstance(mPreviewSize.getWidth(), mPreviewSize.getHeight(),
                        ImageFormat.YUV_420_888, 2);
                mImageReader.setOnImageAvailableListener(
                        new OnImageAvailableListenerImpl(), mBackgroundHandler);
                int orientation = getResources().getConfiguration().orientation;
                if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                    mTextureView.setAspectRatio(1920, 1200);
                } else {
                    mTextureView.setAspectRatio(mPreviewSize.getHeight(), mPreviewSize.getWidth());
                }
                configureTransform(width, height);
                mMediaRecorder = new MediaRecorder();
                manager.openCamera(cameraId, mStateCallback, null);
            } catch (CameraAccessException e) {
                Toast.makeText(activity, "Cannot access the camera.", Toast.LENGTH_SHORT).show();
                activity.finish();
            } catch (NullPointerException e) {
                // Currently an NPE is thrown when the Camera2API is used but not supported on the
                // device this code runs.
                e.printStackTrace();
            } catch (InterruptedException e) {
                throw new RuntimeException("Interrupted while trying to lock camera opening.");
            }
        }
    
        private void closeCamera() {
            try {
                mCameraOpenCloseLock.acquire();
                closePreviewSession();
                if (null != mCameraDevice) {
                    mCameraDevice.close();
                    mCameraDevice = null;
                }
                if (null != mMediaRecorder) {
                    mMediaRecorder.release();
                    mMediaRecorder = null;
                }
            } catch (InterruptedException e) {
                throw new RuntimeException("Interrupted while trying to lock camera closing.");
            } finally {
                mCameraOpenCloseLock.release();
            }
        }
    
        /**
         * Start the camera preview.
         */
        private void startPreview() {
            if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
                return;
            }
            try {
                closePreviewSession();
                SurfaceTexture texture = mTextureView.getSurfaceTexture();
                assert texture != null;
                texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
                mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
    
                Surface previewSurface = new Surface(texture);
                mPreviewBuilder.addTarget(previewSurface);
    //            mPreviewBuilder.addTarget(mImageReader.getSurface()); // uncomment to also analyze frames during plain preview
    
                mCameraDevice.createCaptureSession(Arrays.asList(previewSurface),
                        new CameraCaptureSession.StateCallback() {
    
                            @Override
                            public void onConfigured(@NonNull CameraCaptureSession session) {
                                mPreviewSession = session;
                                updatePreview();
                            }
    
                            @Override
                            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
                                Activity activity = MonitorActivity.this;
                                if (null != activity) {
                                    Toast.makeText(activity, "Failed", Toast.LENGTH_SHORT).show();
                                }
                            }
                        }, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    
        /**
         * Update the camera preview. {@link #startPreview()} needs to be called in advance.
         */
        private void updatePreview() {
            if (null == mCameraDevice) {
                return;
            }
            try {
                setUpCaptureRequestBuilder(mPreviewBuilder);
                mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), null, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    
        private void setUpCaptureRequestBuilder(CaptureRequest.Builder builder) {
            builder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
        }
    
        private void configureTransform(int viewWidth, int viewHeight) {
            Activity activity = MonitorActivity.this;
            if (null == mTextureView || null == mPreviewSize || null == activity) {
                return;
            }
            int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
            Matrix matrix = new Matrix();
            RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
            RectF bufferRect = new RectF(0, 0, mPreviewSize.getHeight(), mPreviewSize.getWidth());
            float centerX = viewRect.centerX();
            float centerY = viewRect.centerY();
            if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
                bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
                matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
                float scale = Math.max(
                        (float) viewHeight / mPreviewSize.getHeight(),
                        (float) viewWidth / mPreviewSize.getWidth());
                matrix.postScale(scale, scale, centerX, centerY);
                matrix.postRotate(90 * (rotation - 2), centerX, centerY);
            }
            mTextureView.setTransform(matrix);
        }
    
        private void setUpMediaRecorder() throws IOException {
            final Activity activity = MonitorActivity.this;
            if (null == activity) {
                return;
            }
            mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
            mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
            mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            if (mNextVideoAbsolutePath == null || mNextVideoAbsolutePath.isEmpty()) {
                mNextVideoAbsolutePath = getSystemFilePath(MonitorActivity.this);
            }
            mMediaRecorder.setOutputFile(mNextVideoAbsolutePath);
            mMediaRecorder.setVideoEncodingBitRate(10000000);
            mMediaRecorder.setVideoFrameRate(30);
            mMediaRecorder.setVideoSize(mVideoSize.getWidth(), mVideoSize.getHeight());
            mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
            int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
            switch (mSensorOrientation) {
                case SENSOR_ORIENTATION_DEFAULT_DEGREES:
                    mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
                    break;
                case SENSOR_ORIENTATION_INVERSE_DEGREES:
                    mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
                    break;
            }
            mMediaRecorder.prepare();
        }
    
        private void startRecordingVideo() {
            if (null == mCameraDevice || !mTextureView.isAvailable() || null == mPreviewSize) {
                return;
            }
            try {
                closePreviewSession();
                setUpMediaRecorder();
                SurfaceTexture texture = mTextureView.getSurfaceTexture();
                assert texture != null;
                texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
                mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                List<Surface> surfaces = new ArrayList<>();
    
                // Set up Surface for the camera preview
                Surface previewSurface = new Surface(texture);
                surfaces.add(previewSurface);
                mPreviewBuilder.addTarget(previewSurface);
    
                // Set up Surface for the MediaRecorder
                Surface recorderSurface = mMediaRecorder.getSurface();
                surfaces.add(recorderSurface);
                mPreviewBuilder.addTarget(recorderSurface);
                // Surface for frame analysis
                Surface imageSurface = mImageReader.getSurface();
                surfaces.add(imageSurface);
                mPreviewBuilder.addTarget(imageSurface);
                // Start a capture session
                // Once the session starts, we can update the UI and start recording
                mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
    
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                        mPreviewSession = cameraCaptureSession;
                        updatePreview();
                        MonitorActivity.this.runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                // UI
                                mButtonVideo.setCompoundDrawablesWithIntrinsicBounds(null, null, null, getResources().getDrawable(R.drawable.icon_stop));
                                mIsRecordingVideo = true;
    
                                // Start recording
                                mMediaRecorder.start();
                            }
                        });
                    }
    
                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                        Activity activity = MonitorActivity.this;
                        if (null != activity) {
                            Toast.makeText(activity, "Failed", Toast.LENGTH_SHORT).show();
                        }
                    }
                }, mBackgroundHandler);
            } catch (CameraAccessException | IOException e) {
                e.printStackTrace();
            }
    
        }
    
        private void closePreviewSession() {
            if (mPreviewSession != null) {
                mPreviewSession.close();
                mPreviewSession = null;
            }
        }
    
        private void stopRecordingVideo() {
            // UI
            mIsRecordingVideo = false;
            mButtonVideo.setCompoundDrawablesWithIntrinsicBounds(null, null, null, getResources().getDrawable(R.drawable.icon_start)); // keep the icon at the bottom, matching onCreate()
    //        startPreview();
            // Stop recording
            mMediaRecorder.stop();
            mMediaRecorder.reset();
    
            Activity activity = this;
            if (null != activity) {
    //            Toast.makeText(activity, "Video saved: " + mNextVideoAbsolutePath,
    //                    Toast.LENGTH_SHORT).show();
                Logger.d(TAG, "Video saved: " + mNextVideoAbsolutePath);
            }
        }
    
    
        /**
         * Compares two {@code Size}s based on their areas.
         */
        static class CompareSizesByArea implements Comparator<Size> {
    
            @Override
            public int compare(Size lhs, Size rhs) {
                // We cast here to ensure the multiplications won't overflow
                return Long.signum((long) lhs.getWidth() * lhs.getHeight() -
                        (long) rhs.getWidth() * rhs.getHeight());
            }
    
    
        }
    
        private class OnImageAvailableListenerImpl implements ImageReader.OnImageAvailableListener {
            private byte[] y;
            private byte[] u;
            private byte[] v;
            private byte[] nv21;
            private ReentrantLock lock = new ReentrantLock();
            private Object mImageReaderLock = 1;//1 available,0 unAvailable
    
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = reader.acquireNextImage();
    //            if (image == null) {
    //                return;
    //            }
    //            byte[] data = TengineKitSdk.getInstance().yuv2nv21camera2(image);
    //            detect(data);
                if (image == null) {
                    return;
                }
    
                synchronized (mImageReaderLock) {
                    if (!mImageReaderLock.equals(1)) {
                        Logger.v(TAG, "--- image not available,just return!!!");
                        image.close();
                        return;
                    }
                    if (ImageFormat.YUV_420_888 == image.getFormat()) {
                        Image.Plane[] planes = image.getPlanes();
    
                        lock.lock();
                        if (y == null) {
                            y = new byte[planes[0].getBuffer().limit() - planes[0].getBuffer().position()];
                            u = new byte[planes[1].getBuffer().limit() - planes[1].getBuffer().position()];
                            v = new byte[planes[2].getBuffer().limit() - planes[2].getBuffer().position()];
                        }
    
                        if (image.getPlanes()[0].getBuffer().remaining() == y.length) {
                            planes[0].getBuffer().get(y);
                            planes[1].getBuffer().get(u);
                            planes[2].getBuffer().get(v);
    
                            if (nv21 == null) {
                                nv21 = new byte[planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2];
                            }

                            // If the row stride changed and the buffer no longer fits, release the
                            // lock and close the image before bailing out, otherwise both would leak.
                            if (nv21.length != planes[0].getRowStride() * mPreviewSize.getHeight() * 3 / 2) {
                                lock.unlock();
                                image.close();
                                return;
                            }
    
                            // The returned data is planar YUV422 (U/V each half the size of Y)
                            if (y.length / u.length == 2) {
                                ImageUtil.yuv422ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                            }
                            // The returned data is planar YUV420 (U/V each a quarter the size of Y)
                            else if (y.length / u.length == 4) {
                                ImageUtil.yuv420ToYuv420sp(y, u, v, nv21, planes[0].getRowStride(), mPreviewSize.getHeight());
                            }

                            // Run the ArcSoft face-analysis algorithm and draw the face info
                            dealWithFace(nv21);
                        }
                        lock.unlock();
                    }
                }
    //            image.close();
    //            dealWithFace(data);
                image.close();
            }
        }
    
