spydroid-ipcamera Source Code Analysis (4): VideoStream

Author: 管弦_ | Published 2017-05-31 17:12

    The VideoStream Class

    The VideoStream class is the base class for video streams. Like AudioStream, it extends MediaStream and encapsulates the basic operations on a video stream. Compared with the audio side, handling video is considerably more involved and touches on a wider range of topics, so we will spend more space on the basics of video streaming.

    Let's first look at a few methods that implement the basic camera operations.

        /**
         * Opens the camera in a new Looper thread so that the preview callback is not called from the main thread
         * If an exception is thrown in this Looper thread, we bring it back into the main thread.
         * @throws RuntimeException Might happen if another app is already using the camera.
         */
        private void openCamera() throws RuntimeException {
            final Semaphore lock = new Semaphore(0);
            final RuntimeException[] exception = new RuntimeException[1];
            mCameraThread = new Thread(new Runnable() {
                @Override
                public void run() {
                    Looper.prepare();
                    mCameraLooper = Looper.myLooper();
                    try {
                        mCamera = Camera.open(mCameraId);
                    } catch (RuntimeException e) {
                        exception[0] = e;
                    } finally {
                        lock.release();
                        Looper.loop();
                    }
                }
            });
            mCameraThread.start();
            lock.acquireUninterruptibly();
            if (exception[0] != null) throw new CameraInUseException(exception[0].getMessage());
        }
    

    openCamera() opens the camera. Camera.open(mCameraId) is a time-consuming call, so it runs on a dedicated thread (even though Session's start() is already executed off the main thread). The Looper keeps that thread alive so that camera callbacks, such as the preview callback, are dispatched on it rather than on the main thread, and the Semaphore blocks the calling thread until Camera.open() has returned, which also allows any RuntimeException thrown while opening the camera to be rethrown on the caller's side.
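    As an aside, the same hand-off can be written with HandlerThread, which hides the Looper.prepare()/Looper.loop() boilerplate. The sketch below is purely illustrative and is not part of the project's source; the class name CameraOpener and the way the opened camera is returned are made up for the example.

        import android.hardware.Camera;
        import android.os.Handler;
        import android.os.HandlerThread;
        import java.util.concurrent.Semaphore;

        // Illustration only: the same pattern as openCamera(), written with HandlerThread.
        public class CameraOpener {
            public static Camera openOnLooperThread(final int cameraId) {
                final HandlerThread thread = new HandlerThread("CameraThread");
                thread.start();
                final Semaphore lock = new Semaphore(0);
                final Camera[] camera = new Camera[1];
                final RuntimeException[] error = new RuntimeException[1];
                new Handler(thread.getLooper()).post(new Runnable() {
                    @Override
                    public void run() {
                        try {
                            // Callbacks registered on this camera will later be dispatched on this thread
                            camera[0] = Camera.open(cameraId);
                        } catch (RuntimeException e) {
                            error[0] = e;
                        } finally {
                            lock.release(); // wake up the calling thread
                        }
                    }
                });
                lock.acquireUninterruptibly();        // block until Camera.open() has returned
                if (error[0] != null) throw error[0]; // rethrow on the calling thread
                return camera[0];
            }
        }

    The next excerpt is taken from createCamera().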

        if (mCamera == null) {
            openCamera();

            ...

            try {
                if (mMode == MODE_MEDIACODEC_API_2) {
                    mSurfaceView.startGLThread();
                    mCamera.setPreviewTexture(mSurfaceView.getSurfaceTexture());
                } else {
                    mCamera.setPreviewDisplay(mSurfaceView.getHolder());
                }
            } catch (IOException e) {
                throw new InvalidSurfaceException("Invalid surface !");
            }

            ...
        }
    

    createCamera() first calls openCamera(), then attaches the camera preview to the preview widget: when mMode == MODE_MEDIACODEC_API_2 the preview is sent to a SurfaceTexture via setPreviewTexture(), otherwise it goes to the SurfaceView's SurfaceHolder via setPreviewDisplay(). We will come back to the MODE_MEDIACODEC_API_2 case later.

       protected synchronized void updateCamera() throws RuntimeException {
            if (mPreviewStarted) {
                mPreviewStarted = false;
                mCamera.stopPreview();
            }
    
            Parameters parameters = mCamera.getParameters();
            mQuality = VideoQuality.determineClosestSupportedResolution(parameters, mQuality);
            int[] max = VideoQuality.determineMaximumSupportedFramerate(parameters);
            parameters.setPreviewFormat(mCameraImageFormat);
            parameters.setPreviewSize(mQuality.resX, mQuality.resY);
            parameters.setPreviewFpsRange(max[0], max[1]);
    
            try {
                mCamera.setParameters(parameters);
                mCamera.setDisplayOrientation(mOrientation);
                mCamera.startPreview();
                mPreviewStarted = true;
            } catch (RuntimeException e) {
                destroyCamera();
                throw e;
            }
        }
    

    updateCamera() configures the camera parameters: the raw preview format, the preview resolution (clamped to the closest supported one), the preview frame-rate range (the maximum supported range) and the display orientation. The preview is stopped before the parameters are applied and restarted afterwards; if applying them fails, the camera is destroyed and the exception is rethrown. A sketch of how the "closest supported resolution" selection might work follows below.
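    VideoQuality.determineClosestSupportedResolution() and determineMaximumSupportedFramerate() are helpers from the project's VideoQuality class and are not shown here. As a rough idea of what the former does, the sketch below simply picks the supported preview size closest to the requested one; it is an illustration under that assumption, not the actual implementation.

        import android.hardware.Camera;
        import java.util.List;

        // Sketch only: choose the supported preview size whose pixel count is
        // closest to the requested resolution.
        static Camera.Size closestSupportedSize(Camera.Parameters parameters, int reqX, int reqY) {
            List<Camera.Size> sizes = parameters.getSupportedPreviewSizes();
            Camera.Size best = sizes.get(0);
            int bestDiff = Integer.MAX_VALUE;
            for (Camera.Size size : sizes) {
                int diff = Math.abs(size.width * size.height - reqX * reqY);
                if (diff < bestDiff) {
                    bestDiff = diff;
                    best = size;
                }
            }
            return best;
        }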

        protected synchronized void destroyCamera() {
            if (mCamera != null) {
                if (mStreaming) super.stop();
                lockCamera();
                mCamera.stopPreview();
                try {
                    mCamera.release();
                } catch (Exception e) {
                    Log.e(TAG,e.getMessage()!=null?e.getMessage():"unknown error");
                }
                mCamera = null;
                mCameraLooper.quit();
                mUnlocked = false;
                mPreviewStarted = false;
            }   
        }
    

    destroyCamera() stops the preview and releases the camera. The call to mCameraLooper.quit() makes Looper.loop() in openCamera()'s thread return, so that thread can terminate; the next call to openCamera() will simply spawn a new one. Note that lockCamera() is called first: if the camera has been handed to a MediaRecorder with unlockCamera(), it must be locked by our process again before it can be stopped and released. A sketch of these two helpers follows below.
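    lockCamera() and unlockCamera(), which appear in destroyCamera() and encodeWithMediaRecorder(), are not shown in these excerpts. Judging from the mUnlocked flag and the standard Camera.lock()/Camera.unlock() API, they presumably look roughly like the sketch below; this is an assumption for illustration, not the original source.

        // Sketch only: give up / re-acquire the camera lock around MediaRecorder use.
        protected void unlockCamera() {
            if (!mUnlocked) {
                try {
                    // After unlock(), another media process (the MediaRecorder) may use the camera
                    mCamera.unlock();
                } catch (Exception e) {
                    Log.e(TAG, "Could not unlock the camera: " + e.getMessage());
                }
                mUnlocked = true;
            }
        }

        protected void lockCamera() {
            if (mUnlocked) {
                try {
                    // Re-lock the camera so that our own process can stop and release it
                    mCamera.lock();
                } catch (Exception e) {
                    Log.e(TAG, "Could not lock the camera: " + e.getMessage());
                }
                mUnlocked = false;
            }
        }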

        /**
         * Video encoding is done by a MediaRecorder.
         */
        protected void encodeWithMediaRecorder() throws IOException {
    
            Log.d(TAG,"Video encoded using the MediaRecorder API");
    
            // We need a local socket to forward data output by the camera to the packetizer
            createSockets();
    
            // Reopens the camera if needed
            destroyCamera();
            createCamera();
    
            // The camera must be unlocked before the MediaRecorder can use it
            unlockCamera();
    
            try {
                mMediaRecorder = new MediaRecorder();
                mMediaRecorder.setCamera(mCamera);
                mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
                mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
                mMediaRecorder.setVideoEncoder(mVideoEncoder);
                mMediaRecorder.setPreviewDisplay(mSurfaceView.getHolder().getSurface());
                mMediaRecorder.setVideoSize(mRequestedQuality.resX,mRequestedQuality.resY);
                mMediaRecorder.setVideoFrameRate(mRequestedQuality.framerate);
    
                // The bandwidth actually consumed is often above what was requested 
                mMediaRecorder.setVideoEncodingBitRate((int)(mRequestedQuality.bitrate*0.8));
    
                // We write the output of the camera in a local socket instead of a file !
                // This one little trick makes streaming feasible quite simply: data from the camera
                // can then be manipulated at the other end of the socket
                mMediaRecorder.setOutputFile(mSender.getFileDescriptor());
    
                mMediaRecorder.prepare();
                mMediaRecorder.start();
    
            } catch (Exception e) {
                throw new ConfNotSupportedException(e.getMessage());
            }
    
            // This will skip the MPEG4 header; if this step fails we can't stream anything :(
            InputStream is = mReceiver.getInputStream();
            try {
                byte buffer[] = new byte[4];
                // Skip all atoms preceding mdat atom
                while (!Thread.interrupted()) {
                    while (is.read() != 'm');
                    is.read(buffer,0,3);
                    if (buffer[0] == 'd' && buffer[1] == 'a' && buffer[2] == 't') break;
                }
            } catch (IOException e) {
                Log.e(TAG,"Couldn't skip mp4 header :/");
                stop();
                throw e;
            }
    
            // The packetizer encapsulates the bit stream in an RTP stream and send it over the network
            mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
            mPacketizer.setInputStream(mReceiver.getInputStream());
            mPacketizer.start();
    
            mStreaming = true;
    
        }
    

    VideoStream's override of encodeWithMediaRecorder() is very similar to AudioStream's. The overall flow: create the local sockets, restart the camera if needed, unlock the camera (the MediaRecorder cannot use it otherwise), configure the MediaRecorder (video source, output format, encoder, preview surface, resolution, frame rate, bit rate, output file descriptor), start recording, then loop over the incoming stream to skip every MP4 atom that precedes the mdat atom so that only the actual media data is left, and finally hand the remaining bit stream to the packetizer, which wraps it in RTP and sends it over the network. A sketch of the local-socket trick follows below.
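    The local sockets come from createSockets(), which is defined higher up in the stream hierarchy (presumably in MediaStream) and is not shown here. The idea, sketched below with Android's LocalSocket API (an assumption about the implementation, with a made-up socket name), is that whatever the MediaRecorder writes into mSender's file descriptor can be read back from mReceiver's InputStream:

        import android.net.LocalServerSocket;
        import android.net.LocalSocket;
        import android.net.LocalSocketAddress;
        import java.io.IOException;

        // Sketch only: a connected pair of local sockets acting as an in-process pipe.
        void createLocalSocketPair() throws IOException {
            LocalServerSocket server = new LocalServerSocket("streaming-socket-sketch");
            LocalSocket receiver = new LocalSocket();
            receiver.connect(new LocalSocketAddress("streaming-socket-sketch"));
            LocalSocket sender = server.accept();
            // mMediaRecorder.setOutputFile(sender.getFileDescriptor()) writes into the pipe...
            // ...and the packetizer reads the same bytes back from receiver.getInputStream().
        }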

        /**
         * Video encoding is done by a MediaCodec.
         */
        protected void encodeWithMediaCodec() throws RuntimeException, IOException {
            if (mMode == MODE_MEDIACODEC_API_2) {
                // Uses the method MediaCodec.createInputSurface to feed the encoder
                encodeWithMediaCodecMethod2();
            } else {
                // Uses dequeueInputBuffer to feed the encoder
                encodeWithMediaCodecMethod1();
            }
        }
    

    encodeWithMediaCodec() supports two approaches. The first is close to AudioStream's encodeWithMediaCodec(): grab the raw frames and feed them into the MediaCodec's input buffers yourself. The second uses createInputSurface() to make a Surface the data source. They achieve the same result, but the first keeps the data flow under your control: you decide when a frame is handed to the encoder and when the camera may deliver the next one. With the second, frames go from the camera to the encoder through the surface without ever passing through your code, so that control is lost.

        /**
         * Video encoding is done by a MediaCodec.
         */
        @SuppressLint("NewApi")
        protected void encodeWithMediaCodecMethod1() throws RuntimeException, IOException {
    
            Log.d(TAG,"Video encoded using the MediaCodec API with a buffer");
    
            // Updates the parameters of the camera if needed
            createCamera();
            updateCamera();
    
            // Estimates the framerate of the camera
            measureFramerate();
    
            // Starts the preview if needed
            if (!mPreviewStarted) {
                try {
                    mCamera.startPreview();
                    mPreviewStarted = true;
                } catch (RuntimeException e) {
                    destroyCamera();
                    throw e;
                }
            }
    
            // EncoderDebugger detects and works around known encoder quirks so that we end up with a working configuration
            EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
            final NV21Convertor convertor = debugger.getNV21Convertor();
    
            // Configure the encoder: resolution, bit rate, frame rate, color format, I-frame interval
            mMediaCodec = MediaCodec.createByCodecName(debugger.getEncoderName());
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mQuality.resX, mQuality.resY);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mQuality.bitrate);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mQuality.framerate); 
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,debugger.getEncoderColorFormat());
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mMediaCodec.start();
            // Preview callback invoked for every frame delivered by the camera
            Camera.PreviewCallback callback = new Camera.PreviewCallback() {
                long now = System.nanoTime()/1000, oldnow = now, i=0;
                ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    // data holds the raw bytes of this preview frame
                    oldnow = now;
                    now = System.nanoTime()/1000;
                    if (i++>3) {
                        i = 0;
                        //Log.d(TAG,"Measured: "+1000000L/(now-oldnow)+" fps.");
                    }
                    try {
                        // Grab a free input buffer from the encoder (waits up to 500 ms)
                        int bufferIndex = mMediaCodec.dequeueInputBuffer(500000);
                        if (bufferIndex>=0) {
                            inputBuffers[bufferIndex].clear();
                            // Convert the raw NV21 frame into the color format the encoder expects
                            convertor.convert(data, inputBuffers[bufferIndex]);
                            // Queue the filled buffer so the encoder can encode it
                            mMediaCodec.queueInputBuffer(bufferIndex, 0, inputBuffers[bufferIndex].position(), now, 0);
                        } else {
                            Log.e(TAG,"No buffer available !");
                        }
                    } finally {
                        // Return the buffer to the camera so the next frame can be delivered (the controllability mentioned above)
                        mCamera.addCallbackBuffer(data);
                    }               
                }
            };
            // Hand the camera a few reusable buffers for its preview callbacks
            for (int i=0;i<10;i++) mCamera.addCallbackBuffer(new byte[convertor.getBufferSize()]);
            // Register the preview callback on the camera
            mCamera.setPreviewCallbackWithBuffer(callback);
            
            // The packetizer encapsulates the bit stream in an RTP stream and send it over the network
            mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
            mPacketizer.setInputStream(new MediaCodecInputStream(mMediaCodec));
            mPacketizer.start();
    
            mStreaming = true;
    
        }
    

    The flow of encodeWithMediaCodecMethod1(): open and configure the camera if needed, estimate the frame rate, start the preview if needed, let EncoderDebugger probe for encoder quirks (the format and encoding issues it deals with go too deep to cover here), create and configure the MediaCodec (resolution, bit rate, frame rate, color format, I-frame interval), register a preview callback on the camera, convert each raw frame inside that callback and queue it into the MediaCodec, and finally let the packetizer wrap the encoded stream in RTP and send it out. The packetizer reads the encoder's output through a MediaCodecInputStream; a rough sketch of what such a wrapper does follows below.
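    MediaCodecInputStream is a class from the project that presumably exposes the encoder's output as a plain InputStream for the packetizer. The sketch below only illustrates the principle (draining the encoder's output buffers on demand) using the same deprecated getOutputBuffers() style of API as the code above; the class name is made up and this is not the actual implementation.

        import android.media.MediaCodec;
        import java.io.IOException;
        import java.io.InputStream;
        import java.nio.ByteBuffer;

        // Sketch only: expose a MediaCodec encoder's output as an InputStream.
        class EncodedStreamSketch extends InputStream {
            private final MediaCodec mCodec;
            private final MediaCodec.BufferInfo mInfo = new MediaCodec.BufferInfo();
            private ByteBuffer mCurrent;
            private int mIndex = -1;

            EncodedStreamSketch(MediaCodec codec) { mCodec = codec; }

            @Override
            public int read(byte[] buffer, int offset, int length) throws IOException {
                while (mCurrent == null || !mCurrent.hasRemaining()) {
                    if (mIndex >= 0) {
                        // Done with the previous chunk of encoded data
                        mCodec.releaseOutputBuffer(mIndex, false);
                        mIndex = -1;
                    }
                    // Wait (up to 500 ms) for the encoder to produce some output
                    mIndex = mCodec.dequeueOutputBuffer(mInfo, 500000);
                    if (mIndex < 0) continue; // no output yet, or format/buffers changed
                    mCurrent = mCodec.getOutputBuffers()[mIndex];
                    mCurrent.position(mInfo.offset);
                    mCurrent.limit(mInfo.offset + mInfo.size);
                }
                int len = Math.min(length, mCurrent.remaining());
                mCurrent.get(buffer, offset, len);
                return len;
            }

            @Override
            public int read() throws IOException {
                byte[] one = new byte[1];
                return read(one, 0, 1) > 0 ? (one[0] & 0xFF) : -1;
            }
        }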

        /**
         * Video encoding is done by a MediaCodec.
         * But here we will use the buffer-to-surface method
         */
        @SuppressLint({ "InlinedApi", "NewApi" })   
        protected void encodeWithMediaCodecMethod2() throws RuntimeException, IOException {
    
            Log.d(TAG,"Video encoded using the MediaCodec API with a surface");
    
            // Updates the parameters of the camera if needed
            createCamera();
            updateCamera();
    
            // Estimates the framerate of the camera
            measureFramerate();
    
            EncoderDebugger debugger = EncoderDebugger.debug(mSettings, mQuality.resX, mQuality.resY);
    
            mMediaCodec = MediaCodec.createByCodecName(debugger.getEncoderName());
            MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", mQuality.resX, mQuality.resY);
            mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, mQuality.bitrate);
            mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, mQuality.framerate); 
            mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
            mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
            mMediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            // The Surface created by the codec replaces the input buffers: the SurfaceView renders the camera preview into it, feeding the encoder directly
            Surface surface = mMediaCodec.createInputSurface();
            ((SurfaceView)mSurfaceView).addMediaCodecSurface(surface);
            mMediaCodec.start();
    
            // The packetizer encapsulates the bit stream in an RTP stream and send it over the network
            mPacketizer.setDestination(mDestination, mRtpPort, mRtcpPort);
            mPacketizer.setInputStream(new MediaCodecInputStream(mMediaCodec));
            mPacketizer.start();
    
            mStreaming = true;
    
        }
    

    encodeWithMediaCodecMethod2() follows much the same flow; the difference is that a Surface created by the codec serves as the data source, so the raw frames never pass through our code. This is simpler, but it gives up the per-frame control that the first method has.

    That covers the internals of the VideoStream class and the basic camera operations, along with an overview of the path a video stream takes from capture to encoding. In the next part we will look at VideoStream's subclasses and the concrete video encoding formats.
