This topic describes some of the MRTC interfaces that may be used with MPIDRSSDK and explains how to use them.
MRTC API
The following lists only some of the MRTC interfaces that may be used during the dual-recording process. For more details about MRTC, see the iOS MRTC usage documentation.
Using MRTC with MPIDRSSDK
Initialize the MRTC instance
MPIDRSSDK can be used to initialize an MRTC instance. After you obtain the instance, you can configure the audio/video call logic. The following is the demo configuration.
[MPIDRSSDK initRTCWithUserId:self.uid appId:AppId success:^(id _Nonnull responseObject) {
    self.artvcEgnine = responseObject;
    // Delegate for call-state callbacks
    self.artvcEgnine.delegate = self;
    // Video encoding resolution; the default is ARTVCVideoProfileType_640x360_15Fps.
    self.artvcEgnine.videoProfileType = ARTVCVideoProfileType_1280x720_30Fps;
    // Publish configuration for the call
    ARTVCPublishConfig *publishConfig = [[ARTVCPublishConfig alloc] init];
    publishConfig.videoProfile = self.artvcEgnine.videoProfileType;
    publishConfig.audioEnable = YES;
    publishConfig.videoEnable = YES;
    self.artvcEgnine.autoPublishConfig = publishConfig;
    // Publish the local stream automatically
    self.artvcEgnine.autoPublish = YES;
    // Subscribe configuration for the call
    ARTVCSubscribeOptions *subscribeOptions = [[ARTVCSubscribeOptions alloc] init];
    subscribeOptions.receiveAudio = YES;
    subscribeOptions.receiveVideo = YES;
    self.artvcEgnine.autoSubscribeOptions = subscribeOptions;
    // Subscribe to remote streams automatically
    self.artvcEgnine.autoSubscribe = YES;
    // Set to YES if local audio data should be delivered via callback
    self.artvcEgnine.enableAudioBufferOutput = YES;
    // Set to YES if local video data should be delivered via callback
    self.artvcEgnine.enableCameraRawSampleOutput = YES;
    // Audio playback mode
    self.artvcEgnine.expectedAudioPlayMode = ARTVCAudioPlayModeSpeaker;
    // When bandwidth is insufficient, prefer resolution (frame rate drops) or smoothness (resolution drops):
    // ARTVCDegradationPreferenceMAINTAIN_FRAMERATE  - smoothness first
    // ARTVCDegradationPreferenceMAINTAIN_RESOLUTION - resolution first
    // ARTVCDegradationPreferenceBALANCED            - automatic balance
    self.artvcEgnine.degradationPreference = ARTVCDegradationPreferenceMAINTAIN_FRAMERATE;
    // Start the camera preview. The front camera is used by default; pass YES to use the back camera.
    [self.artvcEgnine startCameraPreviewUsingBackCamera:NO];
    // Create a room when there is none yet...
    [IDRSSDK createRoom];
    // ...or join an existing room:
    // [IDRSSDK joinRoom:self.roomId token:self.rtoken];
} failure:^(NSError * _Nonnull error) {
    // Initialization failed; inspect error for details.
}];
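When the call ends, the session can be torn down with the same engine calls shown later in the connection-status callback; a minimal sketch:
// Minimal teardown sketch (same calls as in the status callback below).
[self.artvcEgnine stopCameraPreview]; // stop the local camera preview
[self.artvcEgnine leaveRoom];         // leave the current room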
MRTC delegate callbacks
The following lists only some commonly used callback APIs. For more APIs, see ARTVCEngineDelegate.
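All of these callbacks are delivered to the delegate assigned during initialization, so the receiving class must declare conformance to ARTVCEngineDelegate. A minimal sketch (DemoViewController is a hypothetical name used for illustration):
// Hypothetical view controller that receives MRTC callbacks.
// self.artvcEgnine is the instance returned by initRTCWithUserId:appId:success:failure:.
@interface DemoViewController : UIViewController <ARTVCEngineDelegate>
@end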
After the camera preview is started, this callback fires if the local feed has not been delivered before; it returns an ARTVCFeed object that can be used to associate the render view returned later.
- (void)didReceiveLocalFeed:(ARTVCFeed*)localFeed {
    switch (localFeed.feedType) {
        case ARTVCFeedTypeLocalFeedDefault:
            self.localFeed = localFeed;
            break;
        case ARTVCFeedTypeLocalFeedCustomVideo:
            self.customLocalFeed = localFeed;
            break;
        case ARTVCFeedTypeLocalFeedScreenCapture:
            self.screenLocalFeed = localFeed;
            break;
        default:
            break;
    }
}
Callback for the renderView associated with a local or remote feed.
Important: at this point the renderView has not necessarily rendered its first frame yet.
- (void)didVideoRenderViewInitialized:(UIView*)renderView forFeed:(ARTVCFeed*)feed {
    // Trigger UI layout here and add the renderView to the view hierarchy.
    [self.viewLock lock];
    [self.contentView addSubview:renderView];
    [self.viewLock unlock];
}
Callback for the first rendered video frame of a local or remote feed.
// The first video frame has been rendered.
- (void)didFirstVideoFrameRendered:(UIView*)renderView forFeed:(ARTVCFeed*)feed {
}
Callback when a feed stops rendering.
- (void)didVideoViewRenderStopped:(UIView*)renderView forFeed:(ARTVCFeed*)feed {
    // A good place to remove renderView from the view hierarchy.
}
When a room is created successfully, the room information is delivered in a callback.
- (void)didReceiveRoomInfo:(ARTVCRoomInfomation*)roomInfo {
    // Obtain the room ID and token:
    // roomInfo.roomId
    // roomInfo.rtoken
}
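The creator typically stores these values and passes them to the other party out of band so the peer can join the same room. A minimal sketch, assuming roomId and rtoken properties on the controller:
- (void)didReceiveRoomInfo:(ARTVCRoomInfomation*)roomInfo {
    // Save the credentials so another participant can join this room.
    self.roomId = roomInfo.roomId;
    self.rtoken = roomInfo.rtoken;
    // The peer then joins with the same values:
    // [IDRSSDK joinRoom:self.roomId token:self.rtoken];
}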
When room creation fails, an error is delivered in a callback.
- (void)didEncounterError:(NSError *)error forFeed:(ARTVCFeed*)feed {
    // error.code == ARTVCErrorCodeProtocolErrorCreateRoomFailed
}
When joining a room succeeds, there is a join-success callback as well as a callback listing the members already in the room.
- (void)didJoinroomSuccess {
}

- (void)didParticepantsEntered:(NSArray<ARTVCParticipantInfo*>*)participants {
}
Callback for errors during the call, such as failing to join a room or failing to publish a stream.
- (void)didEncounterError:(NSError *)error forFeed:(ARTVCFeed*)feed {
    // error.code == ARTVCErrorCodeProtocolErrorJoinRoomFailed
}
After a participant leaves the room, the other members receive a leave callback.
- (void)didParticepant:(ARTVCParticipantInfo*)participant leaveRoomWithReason:(ARTVCParticipantLeaveRoomReasonType)reason {
}
After a room is created or joined successfully, publishing and subscribing begin. During publishing/subscribing, the following status callbacks are delivered.
- (void)didConnectionStatusChangedTo:(ARTVCConnectionStatus)status forFeed:(ARTVCFeed*)feed {
    [self showToastWith:[NSString stringWithFormat:@"connection status:%d\nfeed:%@", status, feed] duration:1.0];
    if ((status == ARTVCConnectionStatusClosed) && [feed.uid isEqualToString:[self uid]]) {
        // The local connection is closed: stop the camera preview and leave the room.
        [self.artvcEgnine stopCameraPreview];
        [self.artvcEgnine leaveRoom];
    }
}
For the meaning of each status value, see the ARTVCConnectionStatus enumeration.
After a stream is published successfully, other room members receive a callback for the new feed.
- (void)didNewFeedAdded:(ARTVCFeed*)feed {
    // With autoSubscribe enabled, this feed is subscribed automatically;
    // its renderView arrives via didVideoRenderViewInitialized:forFeed:.
}
After a stream is unpublished, other room members receive a feed-removed callback.
- (void)didFeedRemoved:(ARTVCFeed*)feed {
}
Callback for local audio data.
- (void)didOutputAudioBuffer:(ARTVCAudioData*)audioData {
    if (audioData.audioBufferList->mBuffers[0].mData != NULL &&
        audioData.audioBufferList->mBuffers[0].mDataByteSize > 0) {
        pcm_frame_t pcmModelInput;
        pcmModelInput.len = audioData.audioBufferList->mBuffers[0].mDataByteSize;
        pcmModelInput.buf = (uint8_t*)audioData.audioBufferList->mBuffers[0].mData;
        pcmModelInput.sample_rate = audioData.sampleRate;
        pcm_frame_t pcmModelOutput;
        // Resample to 16 kHz before feeding the detector.
        pcm_resample_16k(&pcmModelInput, &pcmModelOutput);
        NSData *srcData = [NSData dataWithBytes:pcmModelOutput.buf length:pcmModelOutput.len];
        // Run detection on the audio data.
        [self.idrs feedAudioFrame:srcData];
    }
}
Callback for remote audio data.
It can be used to run detection on remote speech. The sample code below uses remote wake-word detection as an example.
- (void)didOutputRemoteMixedAudioBuffer:(ARTVCAudioData *)audioData {
    if (audioData.audioBufferList->mBuffers[0].mData != NULL &&
        audioData.audioBufferList->mBuffers[0].mDataByteSize > 0) {
        pcm_frame_t pcmModelInput;
        pcmModelInput.len = audioData.audioBufferList->mBuffers[0].mDataByteSize;
        pcmModelInput.buf = (uint8_t*)audioData.audioBufferList->mBuffers[0].mData;
        pcmModelInput.sample_rate = audioData.sampleRate;
        pcm_frame_t pcmModelOutput;
        // Resample to 16 kHz before feeding the detector.
        pcm_resample_16k(&pcmModelInput, &pcmModelOutput);
        NSData *srcData = [NSData dataWithBytes:pcmModelOutput.buf length:pcmModelOutput.len];
        // Run detection on the audio data.
        [self.idrs feedAudioFrame:srcData];
    }
}
Callback for local camera frame data.
It can be used to detect faces, gestures, signature types, ID cards, and so on. The code below uses face-feature detection as an example.
dispatch_queue_t testqueue = dispatch_queue_create("testQueue", NULL);

- (void)didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    dispatch_sync(testqueue, ^{
        @autoreleasepool {
            CVPixelBufferRef newBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            UIImage *image = [self.idrs getImageFromRPVideo:newBuffer];
            IDRSFaceDetectParam *detectParam = [[IDRSFaceDetectParam alloc] init];
            detectParam.dataType = IDRSFaceDetectInputTypePixelBuffer;
            detectParam.buffer = newBuffer;
            detectParam.inputAngle = 0;
            detectParam.outputAngle = 0;
            detectParam.faceNetType = 0;
            detectParam.supportFaceRecognition = false;
            detectParam.supportFaceLiveness = false;
            // Face tracking
            [self.idrs faceTrackFromVideo:detectParam faceDetectionCallback:^(NSError *error, NSArray<FaceDetectionOutput *> *faces) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.drawView.faceDetectView.detectResult = faces;
                });
            }];
        }
    });
}
When publishing a custom audio stream, MRTC invokes this callback automatically; send the audio data inside this method. A complete example appears in the TTS section below.
- (void)didCustomAudioDataNeeded {
    // [self.artvcEgnine sendCustomAudioData:data];
}
Screen sharing
The following is sample code for MRTC's built-in in-app screen sharing. If you need cross-app screen sharing, see How to share the screen across apps on iOS.
Start screen sharing
// Avoid a retain cycle inside the completion block.
__weak typeof(self) weakSelf = self;
ARTVCCreateScreenCaputurerParams* screenParams = [[ARTVCCreateScreenCaputurerParams alloc] init];
screenParams.provideRenderView = YES;
[self.artvcEgnine startScreenCaptureWithParams:screenParams complete:^(NSError* error) {
    if (error) {
        [weakSelf showToastWith:[NSString stringWithFormat:@"Error:%@", error] duration:1.0];
    } else {
        ARTVCPublishConfig* config = [[ARTVCPublishConfig alloc] init];
        config.videoSource = ARTVCVideoSourceType_Screen;
        config.audioEnable = NO;
        config.videoProfile = ARTVCVideoProfileType_ScreenRatio_1280_15Fps;
        config.tag = @"MPIDRS_ShareScreen";
        [weakSelf.artvcEgnine publish:config];
    }
}];
Stop screen sharing
- (void)stopScreenSharing {
    [self.artvcEgnine stopScreenCapture];
    ARTVCUnpublishConfig* config = [[ARTVCUnpublishConfig alloc] init];
    // The screen feed saved in didReceiveLocalFeed (ARTVCFeedTypeLocalFeedScreenCapture).
    config.feed = self.screenLocalFeed;
    [self.artvcEgnine unpublish:config complete:^(){
    }];
}
Push a TTS audio stream
Custom stream publishing (audio data only).
// Publish a custom MRTC stream used for TTS audio.
- (void)startCustomAudioCapture {
    ARTVCCreateCustomVideoCaputurerParams* params = [[ARTVCCreateCustomVideoCaputurerParams alloc] init];
    params.audioSourceType = ARTVCAudioSourceType_Custom;
    params.customAudioFrameFormat.sampleRate = 16000;
    params.customAudioFrameFormat.samplesPerChannel = 160;
    ARTVCPublishConfig* audioConfig = [[ARTVCPublishConfig alloc] init];
    audioConfig.videoSource = ARTVCVideoSourceType_Custom;
    audioConfig.videoEnable = YES;
    audioConfig.audioSource = ARTVCAudioSourceType_Custom;
    audioConfig.tag = @"customAudioFeed";
    self.audioConfig = audioConfig;
    self.customAudioCapturer = [_artvcEgnine createCustomVideoCapturer:params];
    _artvcEgnine.autoPublish = NO;
    [_artvcEgnine publish:self.audioConfig];
}
It is recommended to start TTS synthesis and playback after the custom stream has been published successfully.
- (void)didConnectionStatusChangedTo:(ARTVCConnectionStatus)status forFeed:(ARTVCFeed*)feed {
    if (status == ARTVCConnectionStatusConnected && [feed isEqual:self.customLocalFeed]) {
        NSString *string = @"盛先生您好,被保險人于本附加合同生效(或最后復效)之日起一百八十日內";
        self.customAudioData = [[NSMutableData alloc] init];
        self.customAudioDataIndex = 0;
        self.ttsPlaying = YES;
        [self.idrs setTTSParam:@"extend_font_name" value:@"xiaoyun"];
        [self.idrs setTTSParam:@"speed_level" value:@"1"];
        [self.idrs startTTSWithText:string];
        [self.idrs getTTSParam:@"speed_level"];
    }
}
Obtain the synthesized audio data in the TTS delegate callback.
- (void)onNuiTtsUserdataCallback:(NSString *)info infoLen:(int)info_len buffer:(char *)buffer len:(int)len taskId:(NSString *)task_id {
    NSLog(@"remote :: onNuiTtsUserdataCallback:%@ -- %d", info, info_len);
    if (buffer) {
        NSData *audioData = [NSData dataWithBytes:buffer length:len];
        [self.customAudioData appendData:audioData];
    }
}
Send the audio data in the MRTC delegate callback.
- (void)didCustomAudioDataNeeded {
    if (!self.ttsPlaying || (self.customAudioDataIndex + 320 > (int)[self.customAudioData length])) {
        [self stopPublishCustomAudio];
        return;
    }
    NSRange range = NSMakeRange(self.customAudioDataIndex, 320);
    NSData *data = [self.customAudioData subdataWithRange:range];
    // Send the audio data to MRTC.
    [self.artvcEgnine sendCustomAudioData:data];
    self.customAudioDataIndex += 320;
}
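stopPublishCustomAudio is not shown in the demo. A minimal sketch, assuming it follows the same unpublish pattern used for screen sharing above:
// Hypothetical helper: stop TTS playback and unpublish the custom audio feed.
- (void)stopPublishCustomAudio {
    self.ttsPlaying = NO;
    ARTVCUnpublishConfig* config = [[ARTVCUnpublishConfig alloc] init];
    config.feed = self.customLocalFeed; // feed saved in didReceiveLocalFeed
    [self.artvcEgnine unpublish:config complete:^(){
    }];
}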
Start server-side recording
The recording functions take effect only after both MPIDRSSDK and the MRTC instance have been initialized successfully.
Start remote recording.
Each time a server-side recording task is started, the recording callback returns a recording ID, which can be used to stop or modify that recording task.
MPRemoteRecordInfo *recordInfo = [[MPRemoteRecordInfo alloc] init];
recordInfo.roomId = self.roomId;
// Watermark ID from the console
//recordInfo.waterMarkId = self.watermarkId;
recordInfo.tagFilter = tagPrefix;
recordInfo.userTag = self.uid;
recordInfo.recordType = MPRemoteRecordTypeBegin;
// Choose single-stream or mixed-stream recording according to your business needs.
recordInfo.fileSuffix = MPRemoteRecordFileSingle;
// If the recording requires a specific stream layout, customize it via MPRemoteRecordInfo:
// recordInfo.tagPositions = tagModelArray;
// If the recording requires a client-defined watermark, customize it via MPRemoteRecordInfo:
//recordInfo.overlaps = customOverlaps;
[MPIDRSSDK executeRemoteRecord:recordInfo waterMarkHandler:^(NSError * _Nonnull error) {
}];
Modify the recording configuration.
For a recording task that has already started, the watermark and stream layout can be modified.
MPRemoteRecordInfo *recordInfo = [[MPRemoteRecordInfo alloc] init];
recordInfo.roomId = self.roomId;
recordInfo.recordType = MPRemoteRecordTypeChange;
// 1. Recording task ID
recordInfo.recordId = recordId;
// 2. Stream layout
recordInfo.tagPositions = tagModelArray;
// 3. Watermark
recordInfo.overlaps = customOverlaps;
[MPIDRSSDK executeRemoteRecord:recordInfo waterMarkHandler:^(NSError * _Nonnull error) {
}];
MRTC recording callback.
- (void)didReceiveCustomSignalingResponse:(NSDictionary *)dictionary {
    id opcmdObject = [dictionary objectForKey:@"opcmd"];
    if ([opcmdObject isKindOfClass:[NSNumber class]]) {
        int opcmd = [opcmdObject intValue];
        switch (opcmd) {
            case MPRemoteRecordTypeBeginResponse: {
                self.startTime = [NSDate date];
                // The recording ID returned by the callback
                self.recordId = [dictionary objectForKey:@"recordId"];
                if ([[dictionary objectForKey:@"msg"] isEqualToString:@"SUCCESS"]) {
                    NSLog(@"Recording started successfully");
                } else {
                    NSLog(@"Failed to start recording");
                }
            } break;
            case MPRemoteRecordTypeChangeResponse: {
                if ([[dictionary valueForKey:@"msg"] isEqualToString:@"SUCCESS"]) {
                    NSLog(@"Recording configuration modified successfully");
                } else {
                    NSLog(@"Failed to modify recording configuration");
                }
            } break;
            case MPRemoteRecordTypeStopResponse: {
                if ([[dictionary valueForKey:@"msg"] isEqualToString:@"SUCCESS"]) {
                    NSLog(@"Recording stopped successfully");
                } else {
                    NSLog(@"Failed to stop recording");
                }
            } break;
            default:
                break;
        }
    }
}
Stop a specific server-side recording task.
[MPIDRSSDK stopRemoteRecord:@"<recording ID>"];
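In practice, the argument is the recordId saved in the MPRemoteRecordTypeBeginResponse branch of the recording callback above:
// Stop the task whose ID was delivered in didReceiveCustomSignalingResponse:.
[MPIDRSSDK stopRemoteRecord:self.recordId];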
Upload the recording artifacts.
Note: if recording is started multiple times, each recording must be uploaded separately.
self.duration = [[NSDate date] timeIntervalSince1970] - [self.startTime timeIntervalSince1970];
IDRSUploadManagerParam *param = [[IDRSUploadManagerParam alloc] init];
param.duration = self.duration;
param.appId = AppId;
param.ak = Ak;
param.sk = Sk;
param.type = IDRSRecordRemote;
param.recordAt = self.startTime;
param.roomId = self.roomId;
[IDRSUploadManager uploadFileWithParam:param success:^(id _Nonnull responseObject) {
    [self showToastWith:responseObject duration:3];
} failure:^(NSError * _Nonnull error, IDRSUploadManagerParam * _Nonnull upLoadParam) {
    if (upLoadParam) {
        [self showToastWith:@"upload error" duration:3];
    }
}];