"本文转载自:[yanbixing123]的Android MultiMedia框架完全解析 - MediaPlayer的C/S架构与Binder机制实现"
1. Overview
Android makes heavy use of the C/S (client/server) architecture for interaction between the application layer and the underlying services, and the Binder mechanism is everywhere. MediaPlayer is no exception: at runtime it is split into a Client part and a Server part that live in different processes and communicate through Binder. Taking setDataSource() as an example, this article walks through how the two sides are wired together. The architecture is shown below:
(architecture diagram)
(1) From a functional point of view, the top layer is the Java-level MediaPlayer API, which calls through JNI into the C++ layer where the IPC takes place; at the very bottom are the concrete player implementations (StagefrightPlayer, NuPlayer, and so on).
(2) The C++ layer is the key piece and the heart of the C/S architecture: the C++ MediaPlayer talks through the BpMediaPlayerService proxy object, which communicates over IPC with the remote MediaPlayerService (BnMediaPlayerService) on the server side. (Bpxxx is only a proxy shell; the real work is done in Bnxxx. Here "p" stands for proxy and "n" for native.)
(3) When the server receives a request from a client, MediaPlayerService creates a session for that client process, i.e. it news a MediaPlayerService::Client object to interact with it; that object then decides, based on the type of the requested source, which of the concrete players at the bottom to create. (Many of those players are supplied by the chip vendors, and each vendor's implementation differs.)
2. The setDataSource function
Let's continue with setDataSource. The previous article covered the steps from the Java layer down to JNI.
static void
android_media_MediaPlayer_setDataSourceFD(JNIEnv *env, jobject thiz, jobject fileDescriptor, jlong offset, jlong length)
{
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL ) {
        jniThrowException(env, "java/lang/IllegalStateException", NULL);
        return;
    }
    if (fileDescriptor == NULL) {
        jniThrowException(env, "java/lang/IllegalArgumentException", NULL);
        return;
    }
    int fd = jniGetFDFromFileDescriptor(env, fileDescriptor);
    ALOGV("setDataSourceFD: fd %d", fd);
    process_media_player_call( env, thiz, mp->setDataSource(fd, offset, length), "java/io/IOException", "setDataSourceFD failed." );
}
This lives in frameworks/base/media/jni/android_media_MediaPlayer.cpp. From there we jump into frameworks/av/media/libmedia/mediaplayer.cpp:
status_t MediaPlayer::setDataSource(int fd, int64_t offset, int64_t length)
{
    ALOGV("setDataSource(%d, %" PRId64 ", %" PRId64 ")", fd, offset, length);
    status_t err = UNKNOWN_ERROR;
    const sp<IMediaPlayerService> service(getMediaPlayerService());
    if (service != 0) {
        sp<IMediaPlayer> player(service->create(this, mAudioSessionId));
        if ((NO_ERROR != doSetRetransmitEndpoint(player)) ||
            (NO_ERROR != player->setDataSource(fd, offset, length))) {
            player.clear();
        }
        err = attachNewPlayer(player);
    }
    return err;
}
Here MediaPlayer::setDataSource first obtains the MediaPlayerService service (which was registered with ServiceManager), then asks that service to create a player session; the concrete player that does the real work lives on the server side behind this session.
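As a side note, getMediaPlayerService() is implemented in IMediaDeathNotifier.cpp (MediaPlayer inherits from IMediaDeathNotifier). A simplified sketch of what it does is shown below; the retry loop and the death-notification registration in the real code are omitted:

// Simplified sketch of IMediaDeathNotifier::getMediaPlayerService()
// (retry loop and DeathNotifier registration omitted).
sp<IServiceManager> sm = defaultServiceManager();
// Look up the service that MediaPlayerService registered as "media.player".
sp<IBinder> binder = sm->getService(String16("media.player"));
// Wrap the remote handle in a BpMediaPlayerService proxy.
sp<IMediaPlayerService> service = interface_cast<IMediaPlayerService>(binder);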
2.1 Obtaining the remote service via Binder
The service returned by getMediaPlayerService() is a BpMediaPlayerService, which talks over Binder to the corresponding BnMediaPlayerService in the MediaPlayerService process. BpMediaPlayerService::create merely sends the CREATE command across Binder; the real work is done on the Bn side. The sending code is in frameworks/av/media/libmedia/IMediaPlayerService.cpp:
// in class BpMediaPlayerService
virtual sp<IMediaPlayer> create(
        const sp<IMediaPlayerClient>& client, audio_session_t audioSessionId) {
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeStrongBinder(IInterface::asBinder(client));
    data.writeInt32(audioSessionId);
    remote()->transact(CREATE, data, &reply);
    return interface_cast<IMediaPlayer>(reply.readStrongBinder());
}
Here, Parcel is the data container used for Binder communication: data carries what the Bp side sends to the Bn side, and reply carries what the Bn side sends back. remote()->transact() then ships the CREATE command over to the Bn side.
On the server side, BnMediaPlayerService receives this command in onTransact() and returns the result through reply. The code is also in frameworks/av/media/libmedia/IMediaPlayerService.cpp:
status_t BnMediaPlayerService::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch (code) {
        case CREATE: {
            CHECK_INTERFACE(IMediaPlayerService, data, reply);
            sp<IMediaPlayerClient> client =
                interface_cast<IMediaPlayerClient>(data.readStrongBinder());
            audio_session_t audioSessionId = (audio_session_t) data.readInt32();
            sp<IMediaPlayer> player = create(client, audioSessionId);
            reply->writeStrongBinder(IInterface::asBinder(player));
            return NO_ERROR;
        } break;
        ......
    }
}
Two anonymous Binder servers show up here: IMediaPlayerClient and IMediaPlayer.
The subclass of BnMediaPlayerService is MediaPlayerService, i.e. MediaPlayerService inherits from BnMediaPlayerService. First an IMediaPlayerClient is reconstructed from the Parcel; this interface is what MediaPlayerService later uses to talk back to the MediaPlayer. Then the create function of MediaPlayerService is called, which is where the client's command is finally carried out on the server. Once the server has finished, the result is written into the reply Parcel and sent back to the client over Binder, completing the round trip.
2.2 Creating a session on the server side
After MediaPlayer::setDataSource() has obtained the remote service, it calls MediaPlayerService's create method, located in frameworks/av/media/libmediaplayerservice/MediaPlayerService.cpp:
sp<IMediaPlayer> MediaPlayerService::create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId)
{
    pid_t pid = IPCThreadState::self()->getCallingPid();
    int32_t connId = android_atomic_inc(&mNextConnId);

    sp<Client> c = new Client(
            this, pid, connId, client, audioSessionId,
            IPCThreadState::self()->getCallingUid());

    ALOGV("Create new client(%d) from pid %d, uid %d, ", connId, pid,
         IPCThreadState::self()->getCallingUid());

    wp<Client> w = c;
    {
        Mutex::Autolock lock(mLock);
        mClients.add(w);
    }
    return c;
}
This function creates a MediaPlayerService::Client instance; in other words, MediaPlayerService creates one MediaPlayerService::Client per client application to serve it. So when mediaplayer.cpp runs setDataSource, all that has happened on the MediaPlayerService side at this point is the creation of a Client tied to that caller; calling setDataSource again would create a second Client. Note also the return type: the function is declared to return sp<IMediaPlayer>, yet it ends with return c, where c is an sp<Client>. This works because Client derives from BnMediaPlayer, so a Client is an IMediaPlayer, as sketched below.
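The following is a heavily trimmed sketch of the declarations in MediaPlayerService.h that make this conversion possible (member lists abridged; treat it as an illustration rather than the exact source):

class MediaPlayerService : public BnMediaPlayerService
{
    ......
    // One Client object is created per connected app; it is the Bn side
    // of the IMediaPlayer interface handed back to that app.
    class Client : public BnMediaPlayer {
        virtual status_t setDataSource(int fd, int64_t offset, int64_t length);
        virtual status_t start();
        virtual status_t stop();
        ......
    };
    ......
};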
2.3 The setDataSource flow after the reply comes back
Back in MediaPlayer::setDataSource in frameworks/av/media/libmedia/mediaplayer.cpp: the create call above has made MediaPlayerService create a corresponding Client and return it to the MediaPlayer, so execution continues with:
player->setDataSource(fd, offset, length)
From now on, this player corresponds to the Client that was just created, so every subsequent operation on player ends up in the matching method of MediaPlayerService::Client. Here we land in MediaPlayerService::Client::setDataSource (the snippet below shows the IDataSource overload; the fd-based overload follows the same three-step pattern):
status_t MediaPlayerService::Client::setDataSource(
        const sp<IDataSource> &source) {
    sp<DataSource> dataSource = CreateDataSourceFromIDataSource(source);
    player_type playerType = MediaPlayerFactory::getPlayerType(this, dataSource);
    sp<MediaPlayerBase> p = setDataSource_pre(playerType);
    if (p == NULL) {
        return NO_INIT;
    }
    // now set data source
    return mStatus = setDataSource_post(p, p->setDataSource(dataSource));
}
Three main steps are involved here: getPlayerType(), setDataSource_pre() and setDataSource_post(). They will be covered later; for now let's focus on the Binder mechanism in the media framework.
2.4 How the Binder mechanism is implemented
The Binder-related classes involved here are:
class BpMediaPlayerService: public BpInterface<IMediaPlayerService>
class BnMediaPlayerService: public BnInterface<IMediaPlayerService>
class IMediaPlayerService: public IInterface
IMediaPlayerService defines the service functions that MediaPlayerService can offer. The class is declared in IMediaPlayerService.h, where the DECLARE_META_INTERFACE macro declares the asInterface function that interface_cast relies on: interface_cast calls asInterface, and asInterface is what actually creates a BpMediaPlayerService. The body of asInterface is generated by the IMPLEMENT_META_INTERFACE macro.
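Roughly, the pieces fit together as sketched below (simplified from IInterface.h; namespace qualifiers and the descriptor string are left out):

// interface_cast simply forwards to the asInterface generated by the macros.
template<typename INTERFACE>
inline sp<INTERFACE> interface_cast(const sp<IBinder>& obj)
{
    return INTERFACE::asInterface(obj);
}

// Approximately what IMPLEMENT_META_INTERFACE(MediaPlayerService, ...) expands to:
sp<IMediaPlayerService> IMediaPlayerService::asInterface(const sp<IBinder>& obj)
{
    sp<IMediaPlayerService> intr;
    if (obj != NULL) {
        // If the binder object lives in this process, use it directly.
        intr = static_cast<IMediaPlayerService*>(
                obj->queryLocalInterface(descriptor).get());
        if (intr == NULL) {
            // Otherwise wrap the remote handle in a BpMediaPlayerService proxy.
            intr = new BpMediaPlayerService(obj);
        }
    }
    return intr;
}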
IMediaPlayerService.h contains the declarations of both IMediaPlayerService and BnMediaPlayerService. IMediaPlayerService lists the pure virtual functions that BpMediaPlayerService must implement, effectively serving as the header for IMediaPlayerService.cpp. BnMediaPlayerService declares only the onTransact function.
IMediaPlayerService.cpp contains the implementation of BpMediaPlayerService and of BnMediaPlayerService::onTransact.
Here is the relevant code in IMediaPlayerService.h:
class IMediaPlayerService: public IInterface
{
public:
    DECLARE_META_INTERFACE(MediaPlayerService);

    virtual sp<IMediaRecorder> createMediaRecorder(const String16 &opPackageName) = 0;
    virtual sp<IMediaMetadataRetriever> createMetadataRetriever() = 0;
    virtual sp<IMediaPlayer> create(const sp<IMediaPlayerClient>& client,
            audio_session_t audioSessionId = AUDIO_SESSION_ALLOCATE) = 0;
    virtual sp<IMediaCodecList> getCodecList() const = 0;

    // Connects to a remote display.
    // 'iface' specifies the address of the local interface on which to listen for
    // a connection from the remote display as an ip address and port number
    // of the form "x.x.x.x:y". The media server should call back into the provided remote
    // display client when display connection, disconnection or errors occur.
    // The assumption is that at most one remote display will be connected to the
    // provided interface at a time.
    virtual sp<IRemoteDisplay> listenForRemoteDisplay(const String16 &opPackageName,
            const sp<IRemoteDisplayClient>& client, const String8& iface) = 0;

    // codecs and audio devices usage tracking for the battery app
    enum BatteryDataBits {
        // tracking audio codec
        kBatteryDataTrackAudio          = 0x1,
        // tracking video codec
        kBatteryDataTrackVideo          = 0x2,
        // codec is started, otherwise codec is paused
        kBatteryDataCodecStarted        = 0x4,
        // tracking decoder (for media player),
        // otherwise tracking encoder (for media recorder)
        kBatteryDataTrackDecoder        = 0x8,
        // start to play an audio on an audio device
        kBatteryDataAudioFlingerStart   = 0x10,
        // stop/pause the audio playback
        kBatteryDataAudioFlingerStop    = 0x20,
        // audio is rounted to speaker
        kBatteryDataSpeakerOn           = 0x40,
        // audio is rounted to devices other than speaker
        kBatteryDataOtherAudioDeviceOn  = 0x80,
    };

    virtual void addBatteryData(uint32_t params) = 0;
    virtual status_t pullBatteryData(Parcel* reply) = 0;
};

// ----------------------------------------------------------------------------

class BnMediaPlayerService: public BnInterface<IMediaPlayerService>
{
public:
    virtual status_t    onTransact( uint32_t code,
                                    const Parcel& data,
                                    Parcel* reply,
                                    uint32_t flags = 0);
};
IMediaPlayerService.cpp declares BpMediaPlayerService directly, inheriting from BpInterface<IMediaPlayerService>, and implements the Bp side there. It also contains the implementation of BnMediaPlayerService::onTransact.
Take the create function that returns sp<IMediaPlayer> as an example (frameworks/av/media/libmedia/IMediaPlayerService.cpp):
virtual sp<IMediaPlayer> create(
        const sp<IMediaPlayerClient>& client, audio_session_t audioSessionId) {
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayerService::getInterfaceDescriptor());
    data.writeStrongBinder(IInterface::asBinder(client));
    data.writeInt32(audioSessionId);
    remote()->transact(CREATE, data, &reply);
    return interface_cast<IMediaPlayer>(reply.readStrongBinder());
}
As we know from Binder, the Bp side is only a proxy and the real implementation lives on the Bn side. That is exactly what this file shows: the Bp side just fills in the data Parcel and hands the real request (the CREATE command) to the Bn side via remote()->transact(). The Bn side is essentially one onTransact function that dispatches on the command code:
status_t BnMediaPlayerService::onTransact(
    uint32_t code, const Parcel& data, Parcel* reply, uint32_t flags)
{
    switch (code) {
        case CREATE: {
            CHECK_INTERFACE(IMediaPlayerService, data, reply);
            sp<IMediaPlayerClient> client =
                interface_cast<IMediaPlayerClient>(data.readStrongBinder());
            audio_session_t audioSessionId = (audio_session_t) data.readInt32();
            sp<IMediaPlayer> player = create(client, audioSessionId);
            reply->writeStrongBinder(IInterface::asBinder(player));
            return NO_ERROR;
        } break;
        ......
mediaplayer.cpp is the client; requests made by the client are ultimately fulfilled by the server, which is MediaPlayerService.cpp. The code above ends up calling create, implemented in MediaPlayerService.cpp:
sp<IMediaPlayer> MediaPlayerService::create(const sp<IMediaPlayerClient>& client,
        audio_session_t audioSessionId)
{
    pid_t pid = IPCThreadState::self()->getCallingPid();
    int32_t connId = android_atomic_inc(&mNextConnId);

    sp<Client> c = new Client(
            this, pid, connId, client, audioSessionId,
            IPCThreadState::self()->getCallingUid());

    ALOGV("Create new client(%d) from pid %d, uid %d, ", connId, pid,
         IPCThreadState::self()->getCallingUid());

    wp<Client> w = c;
    {
        Mutex::Autolock lock(mLock);
        mClients.add(w);
    }
    return c;
}
The create function creates a MediaPlayerService::Client instance, and subsequent requests from that client are dispatched to this instance (the transactions arrive on mediaserver's Binder threads). In other words, MediaPlayerService creates one MediaPlayerService::Client per client to serve it.
2.5 Binder servers
(1) The Bp side
Look again at the create function in frameworks/av/media/libmedia/IMediaPlayerService.cpp. On the Bp side it executes:
data.writeStrongBinder(IInterface::asBinder(client));
This effectively publishes an anonymous Binder server of type IMediaPlayerClient: IInterface::asBinder(client) yields the IBinder behind the client (a BnMediaPlayerClient living in the app process), and writing it into the Parcel lets the receiving process obtain a handle to it, which it will wrap in a BpMediaPlayerClient. That is how the Bp and Bn ends get connected; we won't walk through that code in detail here. The key point is that this Binder is never registered with ServiceManager, so it cannot be looked up by name and is reachable only by processes that receive its handle over an existing Binder connection.
(2) The Bn side
At this point the IMediaPlayerClient connection is only half set up. The rest happens in BnMediaPlayerService::onTransact() in frameworks/av/media/libmedia/IMediaPlayerService.cpp, this time on the Bn side:
sp<IMediaPlayerClient> client =
    interface_cast<IMediaPlayerClient>(data.readStrongBinder());
Now the server side really has its IMediaPlayerClient: interface_cast wraps the received handle in a BpMediaPlayerClient that MediaPlayerService can call back through. Next comes the code that creates the player. The create call takes this client as an argument, and then the same anonymous-Binder trick described above is used again, this time to publish an IMediaPlayer back to the caller:
sp<IMediaPlayer> player = create(client, audioSessionId);
reply->writeStrongBinder(IInterface::asBinder(player));
(3) The relationship between these Binder servers
First, IMediaPlayerService:
class IMediaPlayerService: public IInterface —— IMediaPlayerService.h
class BnMediaPlayerService: public BnInterface<IMediaPlayerService> —— IMediaPlayerService.h
class BpMediaPlayerService: public BpInterface<IMediaPlayerService> —— IMediaPlayerService.cpp
MediaPlayerService is a subclass of BnMediaPlayerService, and it has a lot on its plate: it handles IMediaPlayer, IMediaMetadataRetriever, IMediaRecorder, IOMX, IMediaCodecList and more. Internally it therefore delegates the work. For IMediaPlayer, once create has run, there is one MediaPlayerService::Client instance per client, and that client's requests are handed to it.
You can see this in the logs:
01-01 00:06:02.373 230 535 V MediaPlayerService: Client(1) constructor
01-01 00:06:02.373 230 535 V MediaPlayerService: Create new client(1) from pid 1841, uid 10036,
Each client corresponds to one Client inside MediaPlayerService, each with its own connId.
Besides that, an IMediaPlayerClient is also brought up (it is passed into MediaPlayerService::create):
class IMediaPlayerClient: public IInterface —— IMediaPlayerClient.h
class BnMediaPlayerClient: public BnInterface<IMediaPlayerClient> —— IMediaPlayerClient.h
class BpMediaPlayerClient: public BpInterface<IMediaPlayerClient> —— IMediaPlayerClient.cpp
IMediaPlayerClient carries the callbacks for a client: for a given MediaPlayer, MediaPlayerService uses IMediaPlayerClient to notify that MediaPlayer.
To see it in action, trace the flow behind a log line such as NuPlayerDriver: notifyListener_l(0xb5f17080), (5, 1024, 768). IMediaPlayerClient is responsible only for this notify path; it is essentially a thin wrapper around the MediaPlayer used for notifications.
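A condensed sketch of that callback direction, based on MediaPlayerService::Client::notify (locking and the listener bookkeeping in the real code are omitted, and the member name mClient is quoted from memory):

// Server side: MediaPlayerService::Client::notify (condensed).
// mClient is the IMediaPlayerClient received in CREATE, i.e. a
// BpMediaPlayerClient proxy pointing back at the app's MediaPlayer.
void MediaPlayerService::Client::notify(int msg, int ext1, int ext2, const Parcel *obj)
{
    sp<IMediaPlayerClient> c = mClient;
    if (c != NULL) {
        c->notify(msg, ext1, ext2, obj);   // crosses Binder back into the app process
    }
}

// Client side: MediaPlayer is a BnMediaPlayerClient, so the call lands in
// MediaPlayer::notify(), which forwards the event up to the registered
// listener (and ultimately to the Java callbacks through JNI).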
Innermost of all is IMediaPlayer:
class IMediaPlayer: public IInterface —— IMediaPlayer.h
class BnMediaPlayer: public BnInterface<IMediaPlayer> —— IMediaPlayer.h
class BpMediaPlayer: public BpInterface<IMediaPlayer> —— IMediaPlayer.cpp
This is the interface most directly tied to playback: the common MediaPlayer calls such as start, stop, setDataSource, pause and seekTo are all delivered to MediaPlayerService::Client through it.
Take start as an example: mediaplayer.cpp issues start ---> it reaches the Bp side of IMediaPlayer ---> crosses to the Bn side of IMediaPlayer ---> arrives at MediaPlayerService::Client::start ---> NuPlayerDriver::start ---> NuPlayer's kWhatStart, and playback proceeds from there.
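For reference, the Bp/Bn pair for start() in IMediaPlayer.cpp looks roughly like this (a sketch with error handling elided):

// Bp side (runs in the app process): pack the START command and send it.
status_t BpMediaPlayer::start()
{
    Parcel data, reply;
    data.writeInterfaceToken(IMediaPlayer::getInterfaceDescriptor());
    remote()->transact(START, data, &reply);
    return reply.readInt32();
}

// Bn side (runs in mediaserver): BnMediaPlayer::onTransact dispatches START
// to the local start(), i.e. MediaPlayerService::Client::start(), which then
// drives NuPlayerDriver::start().
case START: {
    CHECK_INTERFACE(IMediaPlayer, data, reply);
    reply->writeInt32(start());
    return NO_ERROR;
} break;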