1. First, create a new Empty Activity project in Android Studio.
The app uses the camera and the microphone, so declare the corresponding permissions in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
Then add the WebRTC dependency, plus the EasyPermissions library for runtime permission requests, to the module-level build.gradle:
implementation 'org.webrtc:google-webrtc:1.0.32006'
implementation 'pub.devrel:easypermissions:3.0.0'
Replace the default repositories with the Aliyun mirrors; otherwise the Google dependency packages cannot be downloaded from mainland China:
// google()
// mavenCentral()
maven { url 'https://maven.aliyun.com/repository/public' } // new
maven { url 'https://maven.aliyun.com/repository/google' } // new
maven { url 'https://maven.aliyun.com/repository/gradle-plugin' } // new
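For context, these mirror entries belong in every `repositories` block. A sketch assuming the classic project-level build.gradle layout (newer Gradle versions declare repositories once in settings.gradle under `dependencyResolutionManagement` instead):

```groovy
// Project-level build.gradle (sketch; adapt to your Gradle version)
buildscript {
    repositories {
        // Aliyun mirrors first, so the Android Gradle plugin resolves
        maven { url 'https://maven.aliyun.com/repository/gradle-plugin' }
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}

allprojects {
    repositories {
        // Mirrors of google() and mavenCentral() for module dependencies
        maven { url 'https://maven.aliyun.com/repository/google' }
        maven { url 'https://maven.aliyun.com/repository/public' }
    }
}
```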
Now sync the project, then add the WebRTC video view to the layout:
<org.webrtc.SurfaceViewRenderer
    android:id="@+id/localView"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintLeft_toLeftOf="parent"
    app:layout_constraintRight_toRightOf="parent"
    app:layout_constraintTop_toTopOf="parent" />
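The `app:layout_constraint*` attributes assume the renderer sits inside a ConstraintLayout, which is what an Empty Activity project generates by default. A minimal activity_main.xml as a whole might look like this (a sketch, not copied from the original project):

```xml
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <org.webrtc.SurfaceViewRenderer
        android:id="@+id/localView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintLeft_toLeftOf="parent"
        app:layout_constraintRight_toRightOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>
```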
Then write the following code in MainActivity:
package com.example.myappwebrtc01;

import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;

import android.Manifest;
import android.os.Bundle;

import org.webrtc.AudioSource;
import org.webrtc.AudioTrack;
import org.webrtc.Camera1Enumerator;
import org.webrtc.EglBase;
import org.webrtc.MediaConstraints;
import org.webrtc.PeerConnectionFactory;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.SurfaceViewRenderer;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoSource;
import org.webrtc.VideoTrack;

import pub.devrel.easypermissions.EasyPermissions;

public class MainActivity extends AppCompatActivity {

    String[] perms = {Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO};

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        if (!EasyPermissions.hasPermissions(MainActivity.this, perms)) {
            EasyPermissions.requestPermissions(
                    MainActivity.this,
                    "Camera and microphone permissions are required",
                    100,
                    perms
            );
        }

        // Create the PeerConnectionFactory
        PeerConnectionFactory.initialize(
                PeerConnectionFactory.InitializationOptions.builder(this).createInitializationOptions());
        PeerConnectionFactory peerConnectionFactory =
                PeerConnectionFactory.builder().createPeerConnectionFactory();

        // Create the AudioSource and AudioTrack
        AudioSource audioSource = peerConnectionFactory.createAudioSource(new MediaConstraints());
        AudioTrack audioTrack = peerConnectionFactory.createAudioTrack("101", audioSource);

        EglBase.Context eglBaseContext = EglBase.create().getEglBaseContext();
        SurfaceTextureHelper surfaceTextureHelper =
                SurfaceTextureHelper.create("CaptureThread", eglBaseContext);

        // Create the VideoCapturer; createCameraCapturer() returns null on a
        // device with no usable camera, so guard against that
        VideoCapturer videoCapturer = createCameraCapturer();
        if (videoCapturer == null) {
            return;
        }
        VideoSource videoSource = peerConnectionFactory.createVideoSource(videoCapturer.isScreencast());
        videoCapturer.initialize(surfaceTextureHelper, getApplicationContext(), videoSource.getCapturerObserver());
        videoCapturer.startCapture(480, 640, 30); // width, height, fps

        SurfaceViewRenderer localView = findViewById(R.id.localView);
        localView.setMirror(true); // mirror the front-camera preview
        localView.init(eglBaseContext, null);

        // Create the VideoTrack (id distinct from the audio track's) and
        // display it in localView
        VideoTrack videoTrack = peerConnectionFactory.createVideoTrack("102", videoSource);
        videoTrack.addSink(localView);
    }

    // Forward the system callback to EasyPermissions so its result handling runs
    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
                                           @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        EasyPermissions.onRequestPermissionsResult(requestCode, permissions, grantResults, this);
    }

    private VideoCapturer createCameraCapturer() {
        Camera1Enumerator enumerator = new Camera1Enumerator(false);
        final String[] deviceNames = enumerator.getDeviceNames();

        // First, try to find a front-facing camera
        for (String deviceName : deviceNames) {
            if (enumerator.isFrontFacing(deviceName)) {
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        // No front-facing camera found; fall back to any other camera
        for (String deviceName : deviceNames) {
            if (!enumerator.isFrontFacing(deviceName)) {
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);
                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }
        return null;
    }
}
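The two-pass selection in createCameraCapturer() (prefer a front-facing device, otherwise take whatever remains) is plain logic that can be isolated and exercised without Android. A toy sketch, where `CameraPicker`, `pick`, and the device-name strings are all illustrative names, not part of the WebRTC API:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Predicate;

public class CameraPicker {
    // Mirrors the strategy above: scan once for a preferred device
    // (e.g. front-facing), then fall back to any non-preferred one.
    static String pick(List<String> deviceNames, Predicate<String> isPreferred) {
        for (String name : deviceNames) {
            if (isPreferred.test(name)) {
                return name;
            }
        }
        for (String name : deviceNames) {
            if (!isPreferred.test(name)) {
                return name;
            }
        }
        return null; // no camera at all
    }

    public static void main(String[] args) {
        // Hypothetical names; real ones come from Camera1Enumerator.getDeviceNames()
        List<String> devices = Arrays.asList("Camera 0, Facing back", "Camera 1, Facing front");
        System.out.println(pick(devices, n -> n.contains("front")));
        System.out.println(pick(Arrays.asList("Camera 0, Facing back"), n -> n.contains("front")));
    }
}
```

In the real activity the fallback matters on tablets and external-camera devices that expose only a back-facing or unclassified camera.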
Finally, build and run.