Porting FFmpeg into the Android Source Tree

Author: Ed_Lannister | Published 2019-01-29 17:10

This article is a hands-on walkthrough of building the project in Android Studio (AS) and then splitting it out into the Android source tree.
Thanks to Lei Xiaohua ("雷神") for his blog posts. This article is supplementary, filling in details that come up during the actual process; read the link below first and come back here if you hit problems.

https://blog.csdn.net/leixiaohua1020/article/details/47008825

1. Building the FFmpeg libraries
Following the link above works as-is; if you hit a C compiler error, check whether the NDK version you are using is the problem.
I used android-ndk-r14b and ffmpeg-4.0.3. The build script below enables the full feature set:

#!/bin/bash
NDK=/home/edward/bin/ndk/android-ndk-r14b
SYSROOT=$NDK/platforms/android-21/arch-arm/
CPU=armv7-a
TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
PREFIX=$(pwd)/android/$CPU
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "

function build_android
{
./configure \
--prefix=$PREFIX \
--enable-neon \
--enable-hwaccels \
--enable-shared \
--enable-jni \
--enable-mediacodec \
--enable-decoder=h264_mediacodec \
--enable-gpl \
--enable-ffmpeg \
--enable-small \
--disable-static \
--disable-doc \
--disable-ffplay \
--disable-ffprobe \
--enable-avdevice \
--disable-symver \
--disable-stripping \
--cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
--nm=$TOOLCHAIN/bin/arm-linux-androideabi-nm \
--target-os=android \
--enable-runtime-cpudetect \
--disable-asm \
--arch=arm \
--cpu=armv7-a \
--enable-cross-compile \
--sysroot=$SYSROOT \
--extra-cflags="-Os -fpic $OPTIMIZE_CFLAGS" \
--extra-ldflags="$ADDI_LDFLAGS" \
$ADDITIONAL_CONFIGURE_FLAG
}
build_android

After make install, an armv7-a directory is generated under android/ in the same tree, containing the executables, headers, shared libraries, and a share folder.
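For reference, the end-to-end build flow can be sketched as follows. This is a sketch, not a fixed recipe: `build_android.sh` is an assumed name for the configure script above, and all paths must match your own setup.

```shell
# Assumed flow; build_android.sh is the configure script shown above.
cd ffmpeg-4.0.3
./build_android.sh        # runs ./configure with the flags listed above
make -j4
make install              # installs into ./android/armv7-a

# Quick sanity check that the produced libraries really target 32-bit ARM:
file android/armv7-a/lib/libavcodec.so
# expect: ELF 32-bit LSB shared object, ARM, EABI5 ...
```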
2. Create an Android Studio project (doing an AS project first because AS has good code inspection, which makes writing and building the code much easier; the structure will be split out into the source tree later). Create a blank AS project.



3. Create a jni folder under the AS app demo root and copy the header directory and shared libraries built in step 1 into it.


4. Write the layout file, the values files, and the main Java file.
activity_main.xml
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context=".MainActivity" >

    <TextView
        android:id="@+id/text_label1"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentLeft="true"
        android:layout_alignParentTop="true" 
        android:text="@string/input_bitstream"/>

    <EditText
        android:id="@+id/input_url"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/text_label1"
        android:layout_below="@+id/text_label1"
        android:ems="100"
        android:inputType="textUri"
        android:text="@string/sintel_mp4"
        android:textColor="#ff8c00" >

        <requestFocus />
    </EditText>

    <TextView
        android:id="@+id/text_label2"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/input_url"
        android:layout_below="@+id/input_url"
        android:text="@string/out_put_raw"/>

    <EditText
        android:id="@+id/output_url"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/text_label2"
        android:layout_below="@+id/text_label2"
        android:ems="100"
        android:inputType="textUri"
        android:text="@string/sintel_yuv"
        android:textColor="#ff8c00" >

    </EditText>

    <Button
        android:id="@+id/button_start"
        style="?android:attr/buttonStyleSmall"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/output_url"
        android:layout_below="@+id/output_url"
        android:text="@string/btn_start"/>

    <TextView
        android:id="@+id/text_info"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/button_start"
        android:layout_alignParentBottom="true"
        android:layout_alignRight="@+id/output_url"
        android:layout_below="@+id/button_start"
        android:text="@string/text_view"/>

</RelativeLayout>

strings.xml

<?xml version="1.0" encoding="utf-8"?>
<resources>

    <string name="app_name">SFFmpegAndroidDecoder</string>
    <string name="app_author">Lei Xiaohua</string>
    <string name="label_url">URL</string>
    <string name="btn_start">Start</string>
    <string name="input_bitstream">Input bitstream</string>
    <string name="sintel_mp4">sintel.mp4</string>
    <string name="out_put_raw">Output raw data</string>
    <string name="sintel_yuv">sintel.yuv</string>
    <string name="text_view">TextView</string>
    <string name="title_activity_sdl">SDLActivity</string>
    <string name="action_settings">Settings</string>
    <string name="hello_world">Hello world!</string>

</resources>

Create a new dimens.xml:

<resources>

    <!-- Default screen margins, per the Android Design guidelines. -->
    <dimen name="activity_horizontal_margin">16dp</dimen>
    <dimen name="activity_vertical_margin">16dp</dimen>

</resources>

Create the menu folder and its file:

<menu xmlns:android="http://schemas.android.com/apk/res/android" >

    <item
        android:id="@+id/action_settings"
        android:orderInCategory="100"
        android:title="@string/action_settings"/>

</menu>

MainActivity.java

package com.qiyi.ffmpegdecoderdemo;

import android.os.Bundle;
import android.os.Environment;
import android.app.Activity;
import android.text.Editable;
import android.util.Log;
import android.view.Menu;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.EditText;
import android.widget.TextView;

public class MainActivity extends Activity {



    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        Button startButton = (Button) this.findViewById(R.id.button_start);
        final EditText urlEdittext_input= (EditText) this.findViewById(R.id.input_url);
        final EditText urlEdittext_output= (EditText) this.findViewById(R.id.output_url);

        startButton.setOnClickListener(new OnClickListener() {
            public void onClick(View arg0){

                String folderurl=Environment.getExternalStorageDirectory().getPath();

                String urltext_input=urlEdittext_input.getText().toString();
                String inputurl=folderurl+"/"+urltext_input;

                String urltext_output=urlEdittext_output.getText().toString();
                String outputurl=folderurl+"/"+urltext_output;

                Log.i("inputurl",inputurl);
                Log.i("outputurl",outputurl);

                decode(inputurl,outputurl);

            }
        });
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    //JNI
    public native int decode(String inputurl, String outputurl);

    static{
        System.loadLibrary("avutil");
        System.loadLibrary("swresample");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("swscale");
        System.loadLibrary("postproc");
        System.loadLibrary("avfilter");
        System.loadLibrary("avdevice");
        System.loadLibrary("ffmpegdecoder");
    }
}

5. Generate the JNI header from MainActivity.java.


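This step can also be done from the command line. A sketch, assuming the default AS source layout; all paths here are illustrative, and compilation must have already produced the .class files (including the generated R class) for javah to find:

```shell
# JDK 8 and earlier: javah reads the compiled class.
# Point -classpath at the directory holding the compiled .class files.
javah -d jni -classpath app/build/intermediates/classes/debug \
    com.qiyi.ffmpegdecoderdemo.MainActivity

# JDK 10+ removed javah; javac -h writes the header while compiling.
# android.jar (and the project's generated sources) must be on the classpath
# for the framework imports to resolve.
javac -h jni -classpath "$ANDROID_HOME/platforms/android-21/android.jar" \
    app/src/main/java/com/qiyi/ffmpegdecoderdemo/MainActivity.java
```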

The result is as follows:

/* DO NOT EDIT THIS FILE - it is machine generated */
#include <jni.h>
/* Header for class com_qiyi_ffmpegdecoderdemo_MainActivity */

#ifndef _Included_com_qiyi_ffmpegdecoderdemo_MainActivity
#define _Included_com_qiyi_ffmpegdecoderdemo_MainActivity
#ifdef __cplusplus
extern "C" {
#endif
/*
 * Class:     com_qiyi_ffmpegdecoderdemo_MainActivity
 * Method:    decode
 * Signature: (Ljava/lang/String;Ljava/lang/String;)I
 */
JNIEXPORT jint JNICALL Java_com_qiyi_ffmpegdecoderdemo_MainActivity_decode
  (JNIEnv *, jobject, jstring, jstring);

#ifdef __cplusplus
}
#endif
#endif

Write the matching JNI source file from this header: copy the generated header into the jni directory, create a C file with the same base name, and replace the interface function with your demo's implementation.


#include <stdio.h>
#include <string.h>
#include <time.h>

#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"
#include "libswscale/swscale.h"
#include "libavutil/imgutils.h"
#include "libavutil/log.h"

#ifdef ANDROID
#include <jni.h>
#include <android/log.h>
#define LOGE(format, ...)  __android_log_print(ANDROID_LOG_ERROR, "(>_<)", format, ##__VA_ARGS__)
#define LOGI(format, ...)  __android_log_print(ANDROID_LOG_INFO,  "(^_^)", format, ##__VA_ARGS__)
#else
#define LOGE(format, ...)  printf("(>_<) " format "\n", ##__VA_ARGS__)
#define LOGI(format, ...)  printf("(^_^) " format "\n", ##__VA_ARGS__)
#endif


//Output FFmpeg's av_log()
void custom_log(void *ptr, int level, const char* fmt, va_list vl){
    FILE *fp=fopen("/storage/emulated/0/av_log.txt","a+");
    if(fp){
        vfprintf(fp,fmt,vl);
        fflush(fp);
        fclose(fp);
    }
}

JNIEXPORT jint JNICALL Java_com_qiyi_ffmpegdecoderdemo_MainActivity_decode
  (JNIEnv *env, jobject obj, jstring input_jstr, jstring output_jstr)
{
    AVFormatContext *pFormatCtx;
    int             i, videoindex;
    AVCodecContext  *pCodecCtx;
    AVCodec         *pCodec;
    AVFrame *pFrame,*pFrameYUV;
    uint8_t *out_buffer;
    AVPacket *packet;
    int y_size;
    int ret, got_picture;
    struct SwsContext *img_convert_ctx;
    FILE *fp_yuv;
    int frame_cnt;
    clock_t time_start, time_finish;
    double  time_duration = 0.0;

    char input_str[500]={0};
    char output_str[500]={0};
    char info[1000]={0};
    // Copy the Java strings, then release the JNI references to avoid a leak.
    const char *in_tmp  = (*env)->GetStringUTFChars(env, input_jstr,  NULL);
    const char *out_tmp = (*env)->GetStringUTFChars(env, output_jstr, NULL);
    snprintf(input_str,  sizeof(input_str),  "%s", in_tmp);
    snprintf(output_str, sizeof(output_str), "%s", out_tmp);
    (*env)->ReleaseStringUTFChars(env, input_jstr,  in_tmp);
    (*env)->ReleaseStringUTFChars(env, output_jstr, out_tmp);

    //FFmpeg av_log() callback
    av_log_set_callback(custom_log);

    av_register_all();
    avformat_network_init();
    pFormatCtx = avformat_alloc_context();

    if(avformat_open_input(&pFormatCtx,input_str,NULL,NULL)!=0){
        LOGE("Couldn't open input stream.\n");
        return -1;
    }
    if(avformat_find_stream_info(pFormatCtx,NULL)<0){
        LOGE("Couldn't find stream information.\n");
        return -1;
    }
    videoindex=-1;
    for(i=0; i<pFormatCtx->nb_streams; i++)
        if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO){
            videoindex=i;
            break;
        }
    if(videoindex==-1){
        LOGE("Couldn't find a video stream.\n");
        return -1;
    }
    pCodecCtx=pFormatCtx->streams[videoindex]->codec;
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    if(pCodec==NULL){
        LOGE("Couldn't find Codec.\n");
        return -1;
    }
    if(avcodec_open2(pCodecCtx, pCodec,NULL)<0){
        LOGE("Couldn't open codec.\n");
        return -1;
    }

    pFrame=av_frame_alloc();
    pFrameYUV=av_frame_alloc();
    out_buffer=(unsigned char *)av_malloc(av_image_get_buffer_size(AV_PIX_FMT_YUV420P,  pCodecCtx->width, pCodecCtx->height,1));
    av_image_fill_arrays(pFrameYUV->data, pFrameYUV->linesize,out_buffer,
        AV_PIX_FMT_YUV420P,pCodecCtx->width, pCodecCtx->height,1);


    packet=(AVPacket *)av_malloc(sizeof(AVPacket));

    img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt,
    pCodecCtx->width, pCodecCtx->height, AV_PIX_FMT_YUV420P, SWS_BICUBIC, NULL, NULL, NULL);


    sprintf(info,                "[Input     ]%s\n", input_str);
    sprintf(info + strlen(info), "[Output    ]%s\n", output_str);
    sprintf(info + strlen(info), "[Format    ]%s\n", pFormatCtx->iformat->name);
    sprintf(info + strlen(info), "[Codec     ]%s\n", pCodecCtx->codec->name);
    sprintf(info + strlen(info), "[Resolution]%dx%d\n", pCodecCtx->width, pCodecCtx->height);


    fp_yuv = fopen(output_str, "wb+");
    if(fp_yuv == NULL){
        LOGE("Cannot open output file.\n");
        return -1;
    }

    frame_cnt=0;
    time_start = clock();

    while(av_read_frame(pFormatCtx, packet)>=0){
        if(packet->stream_index==videoindex){
            ret = avcodec_decode_video2(pCodecCtx, pFrame, &got_picture, packet);
            if(ret < 0){
                LOGE("Decode Error.\n");
                return -1;
            }
            if(got_picture){
                sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize, 0, pCodecCtx->height,
                    pFrameYUV->data, pFrameYUV->linesize);

                y_size=pCodecCtx->width*pCodecCtx->height;
                fwrite(pFrameYUV->data[0],1,y_size,fp_yuv);    //Y
                fwrite(pFrameYUV->data[1],1,y_size/4,fp_yuv);  //U
                fwrite(pFrameYUV->data[2],1,y_size/4,fp_yuv);  //V
                //Output info
                char pictype_str[10]={0};
                switch(pFrame->pict_type){
                    case AV_PICTURE_TYPE_I:sprintf(pictype_str,"I");break;
                    case AV_PICTURE_TYPE_P:sprintf(pictype_str,"P");break;
                    case AV_PICTURE_TYPE_B:sprintf(pictype_str,"B");break;
                    default:sprintf(pictype_str,"Other");break;
                }
                LOGI("Frame Index: %5d. Type:%s",frame_cnt,pictype_str);
                frame_cnt++;
            }
        }
        av_packet_unref(packet);
    }
    //flush decoder
    //FIX: Flush Frames remained in Codec
    while (1) {
        ret = avcodec_decode_video2(pCodecCtx, pFrame, &got_picture, packet);
        if (ret < 0)
            break;
        if (!got_picture)
            break;
        sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize, 0, pCodecCtx->height,
            pFrameYUV->data, pFrameYUV->linesize);
        int y_size=pCodecCtx->width*pCodecCtx->height;
        fwrite(pFrameYUV->data[0],1,y_size,fp_yuv);    //Y
        fwrite(pFrameYUV->data[1],1,y_size/4,fp_yuv);  //U
        fwrite(pFrameYUV->data[2],1,y_size/4,fp_yuv);  //V
        //Output info
        char pictype_str[10]={0};
        switch(pFrame->pict_type){
            case AV_PICTURE_TYPE_I:sprintf(pictype_str,"I");break;
            case AV_PICTURE_TYPE_P:sprintf(pictype_str,"P");break;
            case AV_PICTURE_TYPE_B:sprintf(pictype_str,"B");break;
            default:sprintf(pictype_str,"Other");break;
        }
        LOGI("Frame Index: %5d. Type:%s",frame_cnt,pictype_str);
        frame_cnt++;
    }
    time_finish = clock();
    time_duration = (double)(time_finish - time_start) / CLOCKS_PER_SEC * 1000.0;

    sprintf(info + strlen(info), "[Time      ]%fms\n", time_duration);
    sprintf(info + strlen(info), "[Count     ]%d\n", frame_cnt);

    sws_freeContext(img_convert_ctx);

    fclose(fp_yuv);

    av_frame_free(&pFrameYUV);
    av_frame_free(&pFrame);
    avcodec_close(pCodecCtx);
    avformat_close_input(&pFormatCtx);

    return 0;
}
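As a sanity check on the output file, YUV420P stores 12 bits per pixel, matching the three fwrite calls above (a full-size Y plane plus quarter-size U and V planes), so the .yuv file should grow by width × height × 3/2 bytes per decoded frame. A quick arithmetic sketch, with illustrative numbers not taken from this article:

```shell
# YUV420P: Y plane is W*H bytes; U and V planes are W*H/4 bytes each.
WIDTH=1280; HEIGHT=720; FRAMES=100   # example values only
BYTES_PER_FRAME=$(( WIDTH * HEIGHT * 3 / 2 ))
TOTAL=$(( BYTES_PER_FRAME * FRAMES ))
echo "$BYTES_PER_FRAME bytes per frame, $TOTAL bytes expected"
# Compare with the real file on the device:  stat -c %s /storage/emulated/0/sintel.yuv
```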

6. Write the Android.mk and Application.mk for the native module. The module name must match the library name loaded in MainActivity (ffmpegdecoder); point the source file entry at the JNI file written above.
Android.mk

LOCAL_PATH := $(call my-dir)

# FFmpeg library
include $(CLEAR_VARS)
LOCAL_MODULE := avcodec
LOCAL_SRC_FILES := libavcodec.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avdevice
LOCAL_SRC_FILES := libavdevice.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avfilter
LOCAL_SRC_FILES := libavfilter.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avformat
LOCAL_SRC_FILES := libavformat.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := avutil
LOCAL_SRC_FILES := libavutil.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := postproc
LOCAL_SRC_FILES := libpostproc.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swresample
LOCAL_SRC_FILES := libswresample.so
include $(PREBUILT_SHARED_LIBRARY)

include $(CLEAR_VARS)
LOCAL_MODULE := swscale
LOCAL_SRC_FILES := libswscale.so
include $(PREBUILT_SHARED_LIBRARY)

# Program
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpegdecoder
LOCAL_SRC_FILES :=com_qiyi_ffmpegdecoderdemo_MainActivity.c
LOCAL_C_INCLUDES += $(LOCAL_PATH)/include
LOCAL_LDLIBS := -llog -lz
LOCAL_SHARED_LIBRARIES := avcodec avdevice avfilter avformat avutil postproc swresample swscale
include $(BUILD_SHARED_LIBRARY)

Application.mk

APP_ABI :=armeabi
APP_PLATFORM := android-14

7. Run the ndk-build command from the jni directory (add the NDK to your environment variables when installing it). A libs folder is generated at the same level, containing the usable Android shared libraries produced by the NDK's cross toolchain.
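A sketch of this step from the command line; the NDK path is the one from the build script in step 1, and the jni location must be adjusted to wherever your folder from step 3 lives:

```shell
# ndk-build lives in the NDK root; put it on PATH (path from step 1).
export PATH="$PATH:/home/edward/bin/ndk/android-ndk-r14b"

cd app/jni          # the directory holding Android.mk and Application.mk
ndk-build

# The output lands next to jni/:
ls ../libs/armeabi
# libavcodec.so ... libffmpegdecoder.so
```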



8. Now for the Android platform step: copy the whole AS project into the Android source tree,
and write an Android.mk for the Android source build system:

#
#  Copyright (C) 2015 Google, Inc.
#
#  Licensed under the Apache License, Version 2.0 (the "License");
#  you may not use this file except in compliance with the License.
#  You may obtain a copy of the License at:
#
#  http://www.apache.org/licenses/LICENSE-2.0
#
#  Unless required by applicable law or agreed to in writing, software
#  distributed under the License is distributed on an "AS IS" BASIS,
#  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#  See the License for the specific language governing permissions and
#  limitations under the License.
#

LOCAL_PATH:= $(call my-dir)
include $(CLEAR_VARS)

LOCAL_MODULE_TAGS := optional

LOCAL_SRC_FILES := \
    $(call all-java-files-under, src)

LOCAL_PACKAGE_NAME := ffmpegdecoder

LOCAL_PROGUARD_ENABLED := disabled

LOCAL_STATIC_JAVA_LIBRARIES := \
     android-support-v4 \
     android-support-v7-recyclerview \
     android-support-v7-preference \
     android-support-v7-appcompat

LOCAL_CERTIFICATE := platform
include $(BUILD_PACKAGE)

Move src, res, and AndroidManifest.xml to the project root; AndroidManifest.xml must declare the required permissions:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.qiyi.ffmpegdecoderdemo">
    
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.RECORD_VIDEO" />
    <uses-permission android:name="android.permission.INTERNET"/>

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

Edit styles.xml to remove the color attributes, and check that the style reference has also been removed from AndroidManifest.xml.
styles.xml

<resources>
</resources>

strings.xml

<resources>
    <string name="app_name">FFmpegDecoderDemo</string>
    <string name="label_url">URL</string>
    <string name="btn_start">Start</string>
    <string name="title_activity_sdl">SDLActivity</string>
    <string name="action_settings">Settings</string>
    <string name="input_bitstream">Input bitstream</string>
    <string name="sintel_mp4">sintel.mp4</string>
    <string name="out_put_raw">Output raw data</string>
    <string name="sintel_yuv">sintel.yuv</string>
    <string name="hello_world">Hello world!</string>
</resources>

Because the compiled shared libraries are not yet packaged into the APK, the app may crash on launch, with the log showing that the FFmpeg libraries cannot be found. This can be solved by adding the .so files to the APK by hand:

1) Put the third-party .so files under lib/armeabi in the app directory.

2) Build the APK with mmm; this APK does not contain a lib directory.

3) Use the aapt command to add the .so files under lib/armeabi, e.g.: ./aapt add ../../../out/target/product/dt307sq/system/app/ClientAgent.apk lib/armeabi/libnative-backendservice-jni.so

aapt lives under out/host/linux-x86/bin in the source tree and is a very capable tool. Be sure to include the lib/armeabi/ prefix: the APK creates internal directories from the names you pass.

4) Finally, sign it, e.g.: java -jar signapk.jar platform.x509.pem platform.pk8 ../../../out/target/product/dt307sq/system/app/ClientAgent.apk ClientAgentSign.apk

The resulting APK has system-level permissions because it is built inside the source tree, while the JNI shared libraries it calls were built outside with the Android NDK.
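Steps 1) to 4) can be sketched as one sequence. Every path here (device name dt307sq, APK name, key locations) comes from the article's example and must be adapted; aapt stores each entry under the path you pass, so run it from a staging directory that contains lib/armeabi/:

```shell
# Run from a staging directory that contains lib/armeabi/*.so.
AAPT=../../../out/host/linux-x86/bin/aapt      # example relative path
APK=../../../out/target/product/dt307sq/system/app/ClientAgent.apk

# Add each library; the lib/armeabi/ prefix becomes its path inside the APK.
for so in lib/armeabi/*.so; do
    "$AAPT" add "$APK" "$so"
done

# Re-sign with the platform keys so the APK keeps system-level permissions.
java -jar signapk.jar platform.x509.pem platform.pk8 "$APK" ClientAgentSign.apk
```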

9. Minor issues
a) The mp4 may be placed at a different path. If you capture a log like the one below but the file still cannot be opened, it means that

String folderurl=Environment.getExternalStorageDirectory().getPath();

is not returning the actual sdcard directory; just adjust the path in MainActivity.

01-14 13:52:39.183  3805  3805 I inputurl: /storage/emulated/0/sintel.mp4
01-14 13:52:39.183  3805  3805 I outputurl: /storage/emulated/0/sintel.yuv

b) If everything builds fine but tapping the button does nothing, check whether the app has been granted the required permissions; on some platforms you must enable them in Settings, as declaring the external-storage permission in the manifest alone is not enough.
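For debugging, a dangerous permission that is declared in the manifest can also be granted from a host shell instead of walking through Settings. A sketch, assuming Android 6.0+ and the package name used in this demo:

```shell
# Grant the storage permission to the demo app over adb (Android 6.0+).
adb shell pm grant com.qiyi.ffmpegdecoderdemo android.permission.WRITE_EXTERNAL_STORAGE
```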
10. Shortcomings
a) With a better-written mk file, the shared libraries would not need to be packed into the APK via aapt.
b) The packaged libraries default to the 32-bit lib directory rather than 64-bit lib64, so the lib directory name under data/app/<the app>/ still has to be changed by hand.


      Title: Porting FFmpeg into the Android Source Tree

      Link: https://www.haomeiwen.com/subject/shitsqtx.html