Rendering video with OpenGL ES + MediaPlayer, plus a filter

Author: 小码哥_WS | Published 2017-05-27 10:10

    I've previously written about playing video with SurfaceView and TextureView + MediaPlayer, and about decoding AVI with FFmpeg and rendering to a SurfaceView. Today, let's play video with OpenGL ES + MediaPlayer. I once spent close to a year on a camera development team; unfortunately I wasn't blogging back then, so I didn't leave much behind to share.

    Here's what the effect looks like:

    (Screenshot: video playing through the OpenGL ES renderer)

    The black-and-white (grayscale) effect is implemented with an OpenGL shader, using the standard CRT luma weights: gray = 0.299 R + 0.587 G + 0.114 B.

    (Screenshot: the same video with the grayscale filter applied)

    Now let's look at the implementation:

    If you've ever done texture mapping with OpenGL, this will be much easier to follow. Unlike a still image, video has to refresh constantly: each time a new frame arrives, we update the texture and redraw. Playing video with OpenGL is really just mapping the video onto the screen as a texture.
    If OpenGL is new to you, first read: 学openGL必知道的图形学知识 (graphics fundamentals you need for OpenGL)

    1. Write the vertex and fragment shaders first (that's my habit; you can also write them later as needed)

    Vertex shader:

    attribute vec4 aPosition; // vertex position
    attribute vec4 aTexCoord; // (s, t) texture coordinate
    varying vec2 vTexCoord;
    uniform mat4 uMatrix;
    uniform mat4 uSTMatrix;
    void main() {
        vTexCoord = (uSTMatrix * aTexCoord).xy;
        gl_Position = uMatrix * aPosition;
    }
    

    Fragment shader:

    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES sTexture;
    void main() {
        gl_FragColor=texture2D(sTexture, vTexCoord);
    }
    

    If the shading language is unfamiliar, see: http://blog.csdn.net/king1425/article/details/71425556

    samplerExternalOES replaces the sampler2D we'd use for a still image; it works together with SurfaceTexture to handle texture updates and format conversion.

    2. MediaPlayer output

    Initialize MediaPlayer in the GLVideoRenderer constructor:

    mediaPlayer = new MediaPlayer();
    try {
        mediaPlayer.setDataSource(context, Uri.parse(videoPath));
    } catch (IOException e) {
        e.printStackTrace();
    }
    mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    mediaPlayer.setLooping(true);

    mediaPlayer.setOnVideoSizeChangedListener(this);

    In onSurfaceCreated we point MediaPlayer's output at a SurfaceTexture:
    we create a Surface from the SurfaceTexture and hand it to MediaPlayer as its output surface.
    SurfaceTexture's job is to pull new frames from a video stream or camera stream; the new data is fetched by calling updateTexImage.
    Note that MediaPlayer's output is usually not RGB (it's typically YUV), while GLSurfaceView needs RGB to display correctly.
    So first, change the texture-creation code in onSurfaceCreated to this:

    int[] textures = new int[1];
    GLES20.glGenTextures(1, textures, 0);

    textureId = textures[0];
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
    ShaderUtils.checkGlError("glBindTexture mTextureID");

    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
            GLES20.GL_NEAREST);
    GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
            GLES20.GL_LINEAR);

    What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
    As mentioned, the decoder's output format is YUV (probably YUV420sp). This extension texture converts YUV to RGB automatically, so we don't have to write that conversion ourselves.
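    If you want to be defensive, here is a minimal sketch (my addition, not from the original flow) of checking for the extension at runtime; it assumes it runs on the GL thread once the context exists, e.g. at the top of onSurfaceCreated:

    // Assumption: must run on the GL thread after the GL context is created.
    String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
    boolean hasExternalOES = extensions != null
            && extensions.contains("GL_OES_EGL_image_external");
    Log.d(TAG, "GL_OES_EGL_image_external supported: " + hasExternalOES);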

    Then add the following at the end of onSurfaceCreated:

    surfaceTexture = new SurfaceTexture(textureId);
    surfaceTexture.setOnFrameAvailableListener(this); // listen for new frames

    Surface surface = new Surface(surfaceTexture);
    mediaPlayer.setSurface(surface);
    surface.release();

    if (!playerPrepared) {
        try {
            mediaPlayer.prepare();
            mediaPlayer.start();
            playerPrepared = true;
        } catch (IOException t) {
            Log.e(TAG, "media player prepare failed");
        }
    }
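    One caveat: the synchronous prepare() call blocks the GL thread until the data source is ready. A sketch of a non-blocking alternative, assuming you're happy to start playback from the prepared callback:

    // Sketch: asynchronous prepare, so onSurfaceCreated returns immediately.
    if (!playerPrepared) {
        mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                playerPrepared = true;
                mp.start();
            }
        });
        mediaPlayer.prepareAsync();
    }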
    


    In onDrawFrame:

    synchronized (this) {
        if (updateSurface) {
            surfaceTexture.updateTexImage();              // fetch the new frame
            surfaceTexture.getTransformMatrix(mSTMatrix); // align texture coords with the new frame
            updateSurface = false;
        }
    }
    

    When a new frame arrives, updateTexImage updates the texture. getTransformMatrix is there so the new texture lines up correctly with the texture coordinate system; mSTMatrix is declared exactly like projectionMatrix (a float[16]).

    private final float[] vertexData = {
            1f, -1f, 0f,
            -1f, -1f, 0f,
            1f, 1f, 0f,
            -1f, 1f, 0f
    };

    private final float[] textureVertexData = {
            1f, 0f,
            0f, 0f,
            1f, 1f,
            0f, 1f
    };
    

    vertexData holds the clip-space positions of the quad we draw into the viewport; textureVertexData holds the matching video texture coordinates, as spelled out below.
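    To make the pairing explicit, here is how the four vertices map to texture coordinates when drawn as a strip (a comment-only sketch; the mapping is derived from the arrays above, not stated in the original):

    // index: position (x, y)       -> texture coord (s, t)
    // 0:     ( 1, -1) bottom-right -> (1, 0)
    // 1:     (-1, -1) bottom-left  -> (0, 0)
    // 2:     ( 1,  1) top-right    -> (1, 1)
    // 3:     (-1,  1) top-left     -> (0, 1)
    // GL_TRIANGLE_STRIP turns these into triangles (0,1,2) and (1,2,3),
    // which together cover the whole quad.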

    Next we wire these coordinates up to the shader. First, look up the attribute and uniform locations in onSurfaceCreated:

    aPositionLocation = GLES20.glGetAttribLocation(programId, "aPosition");
    uMatrixLocation = GLES20.glGetUniformLocation(programId, "uMatrix");
    uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
    uTextureSamplerLocation = GLES20.glGetUniformLocation(programId, "sTexture");
    aTextureCoordLocation = GLES20.glGetAttribLocation(programId, "aTexCoord");
    

    Then feed everything in during onDrawFrame:

    GLES20.glUseProgram(programId);
    GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, projectionMatrix, 0);
    GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);

    vertexBuffer.position(0);
    GLES20.glEnableVertexAttribArray(aPositionLocation);
    GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
            12, vertexBuffer);

    textureVertexBuffer.position(0);
    GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
    GLES20.glVertexAttribPointer(aTextureCoordLocation, 2, GLES20.GL_FLOAT, false,
            8, textureVertexBuffer);

    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);

    GLES20.glUniform1i(uTextureSamplerLocation, 0);
    GLES20.glViewport(0, 0, screenWidth, screenHeight);
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    

    The complete GLVideoRenderer source:

    package com.ws.openglvideoplayer;
    
    import android.content.Context;
    import android.graphics.SurfaceTexture;
    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;
    import android.opengl.GLSurfaceView;
    import android.opengl.Matrix;
    import android.util.Log;
    import android.view.Surface;
    
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.nio.FloatBuffer;
    
    import javax.microedition.khronos.egl.EGLConfig;
    import javax.microedition.khronos.opengles.GL10;
    
    /**
     * Created by Shuo.Wang on 2017/3/19.
     */
    
    public class GLVideoRenderer implements GLSurfaceView.Renderer
            , SurfaceTexture.OnFrameAvailableListener, MediaPlayer.OnVideoSizeChangedListener  {
    
    
        private static final String TAG = "GLRenderer";
        private Context context;
        private int aPositionLocation;
        private int programId;
        private FloatBuffer vertexBuffer;
        private final float[] vertexData = {
                1f,-1f,0f,
                -1f,-1f,0f,
                1f,1f,0f,
                -1f,1f,0f
        };
    
        private final float[] projectionMatrix=new float[16];
        private int uMatrixLocation;
    
        private final float[] textureVertexData = {
                1f,0f,
                0f,0f,
                1f,1f,
                0f,1f
        };
        private FloatBuffer textureVertexBuffer;
        private int uTextureSamplerLocation;
        private int aTextureCoordLocation;
        private int textureId;
    
        private SurfaceTexture surfaceTexture;
        private MediaPlayer mediaPlayer;
        private float[] mSTMatrix = new float[16];
        private int uSTMMatrixHandle;
    
        private boolean updateSurface;
        private boolean playerPrepared;
        private int screenWidth,screenHeight;
        public GLVideoRenderer(Context context,String videoPath) {
            this.context = context;
            playerPrepared=false;
            synchronized(this) {
                updateSurface = false;
            }
            vertexBuffer = ByteBuffer.allocateDirect(vertexData.length * 4)
                    .order(ByteOrder.nativeOrder())
                    .asFloatBuffer()
                    .put(vertexData);
            vertexBuffer.position(0);
    
            textureVertexBuffer = ByteBuffer.allocateDirect(textureVertexData.length * 4)
                    .order(ByteOrder.nativeOrder())
                    .asFloatBuffer()
                    .put(textureVertexData);
            textureVertexBuffer.position(0);
    
            mediaPlayer=new MediaPlayer();
            try{
                mediaPlayer.setDataSource(context, Uri.parse(videoPath));
            }catch (IOException e){
                e.printStackTrace();
            }
            mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            mediaPlayer.setLooping(true);
    
            mediaPlayer.setOnVideoSizeChangedListener(this);
        }
    
        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {
            String vertexShader = ShaderUtils.readRawTextFile(context, R.raw.simple_vertex_shader);
            String fragmentShader= ShaderUtils.readRawTextFile(context, R.raw.simple_fragment_shader);
            programId=ShaderUtils.createProgram(vertexShader,fragmentShader);
            aPositionLocation= GLES20.glGetAttribLocation(programId,"aPosition");
    
            uMatrixLocation=GLES20.glGetUniformLocation(programId,"uMatrix");
            uSTMMatrixHandle = GLES20.glGetUniformLocation(programId, "uSTMatrix");
            uTextureSamplerLocation=GLES20.glGetUniformLocation(programId,"sTexture");
            aTextureCoordLocation=GLES20.glGetAttribLocation(programId,"aTexCoord");
    
    
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
    
            textureId = textures[0];
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            ShaderUtils.checkGlError("glBindTexture mTextureID");
        /* What is GLES11Ext.GL_TEXTURE_EXTERNAL_OES for?
           As noted above, the decoder's output is YUV (probably YUV420sp); this extension
           texture converts YUV to RGB automatically, so we don't write that code ourselves. */
            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER,
                    GLES20.GL_NEAREST);
            GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER,
                    GLES20.GL_LINEAR);
    
            surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this); // listen for new frames
    
            Surface surface = new Surface(surfaceTexture);
            mediaPlayer.setSurface(surface);
            surface.release();
    
        if (!playerPrepared) {
            try {
                mediaPlayer.prepare();
                mediaPlayer.start();
                playerPrepared = true;
            } catch (IOException t) {
                Log.e(TAG, "media player prepare failed");
            }
        }
        }
    
        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
            Log.d(TAG, "onSurfaceChanged: "+width+" "+height);
            screenWidth=width; screenHeight=height;
        }
    
        @Override
        public void onDrawFrame(GL10 gl) {
            GLES20.glClear( GLES20.GL_DEPTH_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT);
            synchronized (this){
                if (updateSurface){
                surfaceTexture.updateTexImage();              // fetch the new frame
                surfaceTexture.getTransformMatrix(mSTMatrix); // align texture coords with the new frame
                    updateSurface = false;
                }
            }
            GLES20.glUseProgram(programId);
            GLES20.glUniformMatrix4fv(uMatrixLocation,1,false,projectionMatrix,0);
            GLES20.glUniformMatrix4fv(uSTMMatrixHandle, 1, false, mSTMatrix, 0);
    
            vertexBuffer.position(0);
            GLES20.glEnableVertexAttribArray(aPositionLocation);
            GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false,
                    12, vertexBuffer);
    
            textureVertexBuffer.position(0);
            GLES20.glEnableVertexAttribArray(aTextureCoordLocation);
            GLES20.glVertexAttribPointer(aTextureCoordLocation,2,GLES20.GL_FLOAT,false,8,textureVertexBuffer);
    
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,textureId);
    
            GLES20.glUniform1i(uTextureSamplerLocation,0);
            GLES20.glViewport(0,0,screenWidth,screenHeight);
            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        }
    
        @Override
        synchronized public void onFrameAvailable(SurfaceTexture surface) {
            updateSurface = true;
        }
    
        @Override
        public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
            Log.d(TAG, "onVideoSizeChanged: "+width+" "+height);
            updateProjection(width,height);
        }
    
        private void updateProjection(int videoWidth, int videoHeight){
            float screenRatio=(float)screenWidth/screenHeight;
            float videoRatio=(float)videoWidth/videoHeight;
            if (videoRatio>screenRatio){
                Matrix.orthoM(projectionMatrix,0,-1f,1f,-videoRatio/screenRatio,videoRatio/screenRatio,-1f,1f);
            }else Matrix.orthoM(projectionMatrix,0,-screenRatio/videoRatio,screenRatio/videoRatio,-1f,1f,-1f,1f);
        }
    
        public MediaPlayer getMediaPlayer() {
            return mediaPlayer;
        }
    }
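    The listing above calls into a ShaderUtils helper that the article never shows. Here is a minimal sketch of what it could look like, following the standard GLES20 compile/link boilerplate; the method names match the calls above, but the bodies are my assumption:

    package com.ws.openglvideoplayer;

    import android.content.Context;
    import android.opengl.GLES20;
    import android.util.Log;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class ShaderUtils {
        private static final String TAG = "ShaderUtils";

        // Read a raw text resource (e.g. R.raw.simple_vertex_shader) into a String.
        public static String readRawTextFile(Context context, int resId) {
            StringBuilder sb = new StringBuilder();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(context.getResources().openRawResource(resId)))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    sb.append(line).append('\n');
                }
            } catch (IOException e) {
                Log.e(TAG, "readRawTextFile failed", e);
            }
            return sb.toString();
        }

        // Compile one shader stage; returns 0 on failure.
        private static int compileShader(int type, String source) {
            int shader = GLES20.glCreateShader(type);
            GLES20.glShaderSource(shader, source);
            GLES20.glCompileShader(shader);
            int[] status = new int[1];
            GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, status, 0);
            if (status[0] == 0) {
                Log.e(TAG, "compile error: " + GLES20.glGetShaderInfoLog(shader));
                GLES20.glDeleteShader(shader);
                return 0;
            }
            return shader;
        }

        // Link a program from the two shader sources; returns 0 on failure.
        public static int createProgram(String vertexSource, String fragmentSource) {
            int vertexShader = compileShader(GLES20.GL_VERTEX_SHADER, vertexSource);
            int fragmentShader = compileShader(GLES20.GL_FRAGMENT_SHADER, fragmentSource);
            int program = GLES20.glCreateProgram();
            GLES20.glAttachShader(program, vertexShader);
            GLES20.glAttachShader(program, fragmentShader);
            GLES20.glLinkProgram(program);
            int[] status = new int[1];
            GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, status, 0);
            if (status[0] == 0) {
                Log.e(TAG, "link error: " + GLES20.glGetProgramInfoLog(program));
                GLES20.glDeleteProgram(program);
                return 0;
            }
            return program;
        }

        // Log any pending GL errors with a caller-supplied label.
        public static void checkGlError(String op) {
            int error;
            while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
                Log.e(TAG, op + ": glError 0x" + Integer.toHexString(error));
            }
        }
    }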
    
    
    
    
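    The article also doesn't show how the renderer gets attached to a view. A hypothetical wiring sketch (the Activity name and video path are placeholders of mine):

    package com.ws.openglvideoplayer;

    import android.app.Activity;
    import android.opengl.GLSurfaceView;
    import android.os.Bundle;

    public class MainActivity extends Activity {
        private GLSurfaceView glSurfaceView;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            glSurfaceView = new GLSurfaceView(this);
            // The shaders above are written for GLES 2.0, so request a 2.0 context.
            glSurfaceView.setEGLContextClientVersion(2);
            GLVideoRenderer renderer =
                    new GLVideoRenderer(this, "/sdcard/test.mp4"); // placeholder path
            glSurfaceView.setRenderer(renderer);
            setContentView(glSurfaceView);
        }
    }

    Lifecycle handling is omitted here; in a real app you would forward onPause/onResume to the GLSurfaceView and release the MediaPlayer when done.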

    To get the filter effect shown above, all we need is the 0.299/0.587/0.114 CRT grayscale model. (You can find plenty of other effects online; this is just a starting point, and a sepia variant follows the screenshot below.)
    Only the fragment shader changes:

    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES sTexture;
    void main() {
        //gl_FragColor = texture2D(sTexture, vTexCoord);
        vec3 centralColor = texture2D(sTexture, vTexCoord).rgb;
        float gray = 0.299 * centralColor.r + 0.587 * centralColor.g + 0.114 * centralColor.b;
        // vec4(gray) alone would also set alpha to the gray value; keep alpha at 1.0
        gl_FragColor = vec4(vec3(gray), 1.0);
    }
    
    (Screenshot: the grayscale filter applied to the video)
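    As one more starting point, here is a sepia variant of the same fragment shader. This is my addition, using a commonly cited sepia weight matrix rather than anything from the original article:

    #extension GL_OES_EGL_image_external : require
    precision mediump float;
    varying vec2 vTexCoord;
    uniform samplerExternalOES sTexture;
    void main() {
        vec3 c = texture2D(sTexture, vTexCoord).rgb;
        // Common sepia weights; min() clamps bright pixels to avoid overflow.
        float r = dot(c, vec3(0.393, 0.769, 0.189));
        float g = dot(c, vec3(0.349, 0.686, 0.168));
        float b = dot(c, vec3(0.272, 0.534, 0.131));
        gl_FragColor = vec4(min(r, 1.0), min(g, 1.0), min(b, 1.0), 1.0);
    }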

    That wraps it up: we've rendered video with OpenGL ES + MediaPlayer and applied a filter. A future post will cover how panoramic (360°) video works and how to implement it. Stay tuned.
