Problem:
Agora RTC conflicts with ZFPlayer video playback. Symptoms:
1. While an Agora RTC video stream is playing, starting another video in ZFPlayer gets that video paused by Agora (presumably Agora polls for other players and pauses anything whose audio session category is not playAndRecord).
2. After switching the other player's category to playAndRecord to match Agora, it is no longer paused, but the audio route is switched to the receiver (earpiece), so the volume becomes very low.
We later switched to NELivePlayerController, and the problem moved elsewhere:
1. Video that previously played fine loses its audio as soon as someone takes the mic; the channel audio stays normal, and the video remains silent afterwards.
2. With someone already on the mic, starting a video works (both the video and the channel sound fine), but once the guest is taken off the mic the video goes silent while the channel stays normal.
3. When video playback starts, the mic stutters and a few seconds of audio are lost.
Debug output showed that both the channel and the video player were using: playAndRecord, default, defaultToSpeaker (category / mode / options).
Both taking and leaving the mic trigger a joinChannel call, so the guess is that whenever the channel is joined, Agora shuts down other audio playback sources.
Solution
Set the private parameter before joinChannel: agoraKit.setParameters("{\"che.audio.keep.audiosession\":true}")
When the player starts and stops, keep reusing the AVAudioSession configuration used inside the channel (see the sketch below).
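A minimal sketch of the fix, assuming an existing AgoraRtcEngineKit instance (the framework import name varies by SDK version, and the joinChannel call is left out because its signature also varies); only setParameters with the che.audio.keep.audiosession key, and the requirement to call it before joinChannel, come from the note above:
import AVFoundation
import AgoraRtcKit // older SDKs: import AgoraRtcEngineKit

func prepareAudioSessionAndJoin(_ agoraKit: AgoraRtcEngineKit) {
    // Private parameter: ask the engine to leave the shared AVAudioSession
    // alone instead of reconfiguring it on join/leave.
    agoraKit.setParameters("{\"che.audio.keep.audiosession\":true}")

    // Re-apply the configuration used inside the channel
    // (playAndRecord / default / defaultToSpeaker, as logged above).
    let session = AVAudioSession.sharedInstance()
    try? session.setCategory(AVAudioSessionCategoryPlayAndRecord,
                             mode: AVAudioSessionModeDefault,
                             options: [.defaultToSpeaker])
    try? session.setActive(true)

    // ...then call joinChannel as usual (signature depends on the SDK version).
}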
Background
Audio Session: the intermediary between the system and your app
AudioSession modes in detail
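As a quick reference, the three values printed while debugging (category / mode / options) can be inspected on the shared session like this:
import AVFoundation

// Dump the current audio session configuration; in the failing case above these
// were playAndRecord / default / defaultToSpeaker for both the channel and the player.
let session = AVAudioSession.sharedInstance()
print("category: \(session.category)")
print("mode: \(session.mode)")
print("options: \(session.categoryOptions.rawValue)")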
1. Observing interruptions (not yet verified)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleInterruption),
                                       name: .AVAudioSessionInterruption,
                                       object: AVAudioSession.sharedInstance())

@objc func handleInterruption(_ notification: Notification) {
    guard let info = notification.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSessionInterruptionType(rawValue: typeValue) else {
        return
    }
    if type == .began {
        // Interruption began: save state, update the user interface, pause playback if needed
    } else if type == .ended {
        guard let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt else {
            return
        }
        let options = AVAudioSessionInterruptionOptions(rawValue: optionsValue)
        if options.contains(.shouldResume) {
            // Interruption ended - playback should resume
        }
    }
}
2. Observing the media server
The media server provides audio and other multimedia functionality through a shared server process.
AVAudioSessionMediaServicesWereResetNotification: posted when the media server is reset
AVAudioSessionMediaServicesWereLostNotification: posted when the media server becomes unavailable
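A sketch of observing the reset notification, following the same pattern as the other observers in this note; the handler name is made up, and its body only reflects the general guidance that audio objects must be rebuilt after the media server restarts:
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleMediaServicesReset),
                                       name: .AVAudioSessionMediaServicesWereReset,
                                       object: AVAudioSession.sharedInstance())

@objc func handleMediaServicesReset(_ notification: Notification) {
    // After a reset, existing players/recorders/engines are orphaned:
    // re-create them and re-apply the audio session configuration.
}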
3. Observing route changes
Plugging or unplugging headphones, connecting or disconnecting a Bluetooth headset, and attaching or removing USB audio devices all cause route changes.
NotificationCenter.default.addObserver(self,
                                       selector: #selector(handleRouteChange),
                                       name: .AVAudioSessionRouteChange,
                                       object: AVAudioSession.sharedInstance())

@objc func handleRouteChange(_ notification: Notification) {
    guard let userInfo = notification.userInfo,
          let reasonValue = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
          let reason = AVAudioSessionRouteChangeReason(rawValue: reasonValue) else {
        return
    }
    switch reason {
    // When new audio hardware is attached, query the session's currentRoute to find out
    // where audio is now being output. It returns an AVAudioSessionRouteDescription
    // containing all of the session's inputs and outputs.
    case .newDeviceAvailable:
        let session = AVAudioSession.sharedInstance()
        for output in session.currentRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
            headphonesConnected = true  // Bool property assumed on the enclosing class
        }
    // When audio hardware is removed, the previous route is available from the notification's
    // userInfo. In both cases, inspect the outputs array, whose AVAudioSessionPortDescription
    // objects describe the audio outputs in full.
    case .oldDeviceUnavailable:
        if let previousRoute =
            userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
            for output in previousRoute.outputs where output.portType == AVAudioSessionPortHeadphones {
                headphonesConnected = false
            }
        }
    default: ()
    }
}
4. Configuration
All configuration must be done before the audio session is activated. If the session is already running, deactivate it first, change the configuration, then reactivate it; otherwise you get an error like:
AVAudioSession.mm:1079:-[AVAudioSession setActive:withOptions:error:]: Deactivating an audio session that has running I/O. All I/O should be stopped or paused prior to deactivating the audio session.
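A sketch of the order the error message implies (stop the running I/O, deactivate, reconfigure, reactivate); the AVPlayer parameter is just a stand-in for whatever is producing audio:
import AVFoundation

func switchCategoryToPlayback(stopping player: AVPlayer) {
    // Stop all running I/O before touching the session.
    player.pause()

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setActive(false)                             // deactivate first
        try session.setCategory(AVAudioSessionCategoryPlayback)  // change the configuration
        try session.setActive(true)                              // then reactivate
    } catch {
        print("Audio session reconfiguration failed: \(error)")
    }
}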
5. Selecting a microphone
A device may have several microphones (built-in and external), and iOS picks one automatically based on the current audio session mode. The mode determines the input's digital signal processing (DSP) and possibly the route; input routes are optimized for each mode's use case, and setting the mode may also change the route in use. A developer can also select the microphone manually:
// Preferred Mic = Front, Preferred Polar Pattern = Cardioid
let preferredMicOrientation = AVAudioSessionOrientationFront
let preferredPolarPattern = AVAudioSessionPolarPatternCardioid

// Retrieve your configured and activated audio session
let session = AVAudioSession.sharedInstance()

// Get available inputs
guard let inputs = session.availableInputs else { return }

// Find built-in mic
guard let builtInMic = inputs.first(where: {
    $0.portType == AVAudioSessionPortBuiltInMic
}) else { return }

// Find the data source at the specified orientation
guard let dataSource = builtInMic.dataSources?.first(where: {
    $0.orientation == preferredMicOrientation
}) else { return }

// Set data source's polar pattern
do {
    try dataSource.setPreferredPolarPattern(preferredPolarPattern)
} catch let error as NSError {
    print("Unable to set preferred polar pattern: \(error.localizedDescription)")
}

// Set the data source as the input's preferred data source
do {
    try builtInMic.setPreferredDataSource(dataSource)
} catch let error as NSError {
    print("Unable to set preferred data source: \(error.localizedDescription)")
}

// Set the built-in mic as the preferred input
// This call will be a no-op if already selected
do {
    try session.setPreferredInput(builtInMic)
} catch let error as NSError {
    print("Unable to set preferred input: \(error.localizedDescription)")
}

// Print Active Configuration
session.currentRoute.inputs.forEach { portDesc in
    print("Port: \(portDesc.portType)")
    if let ds = portDesc.selectedDataSource {
        print("Name: \(ds.dataSourceName)")
        print("Polar Pattern: \(ds.selectedPolarPattern ?? "[none]")")
    }
}
Running this code on an iPhone 6s produces the following console output:
Port: MicrophoneBuiltIn
Name: Front
Polar Pattern: Cardioid