DingRTC allows you to push external audio and video streams. This topic describes how to push external audio and video streams.
Push an external video stream
Note
The SDK allows you to start publishing before you enable external video input. In this case, the published stream initially carries the locally captured video (camera or screen capture) by default, and switches to the external source once external input is enabled.
調用setExternalVideoSource注冊外部視頻接口回調。
// Obtain the DingRtcEngine instance.
DingRtcEngine mRtcEngine = DingRtcEngine.create(getApplicationContext(), "");
// Enable external video input for the camera track.
mRtcEngine.setExternalVideoSource(
        true,   // enable
        false,  // useTexture
        DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera,
        DingRtcEngine.DingRtcRenderMode.DingRtcRenderModeAuto);
Push the video data.
The API is declared as follows.
public abstract int pushExternalVideoFrame(DingRtcRawDataFrame rawDataFrame, DingRtcVideoTrack streameType);
The parameters are described below.

| Parameter | Type | Description |
| --- | --- | --- |
| rawDataFrame | DingRtcRawDataFrame | The frame data. |
| streameType | DingRtcVideoTrack | The video track type. |
A sample method is shown below.
private void pushVideoFrame() {
    final String yuvPath = "<your YUV file path>";
    if (TextUtils.isEmpty(yuvPath)) {
        // Select a YUV file first.
        return;
    }
    File yuvDataFile = new File(yuvPath);
    if (!yuvDataFile.exists()) {
        // The file does not exist.
        return;
    }
    // Sample value only; use the actual rotation of your stream.
    final int rotation = 90;
    // Sample values only; use the actual width and height of your stream.
    final int width = 480;
    final int height = 640;
    final int fps = 25;
    // Start pushing YUV data on a worker thread.
    new Thread() {
        @Override
        public void run() {
            RandomAccessFile raf = null;
            try {
                raf = new RandomAccessFile(new File(yuvPath), "r");
            } catch (FileNotFoundException e) {
                e.printStackTrace();
                return;
            }
            try {
                byte[] buffer = new byte[width * height * 3 / 2];
                // mIsStopPushVideo starts or stops the video push loop.
                while (!mIsStopPushVideo) {
                    long start = System.currentTimeMillis();
                    int len = raf.read(buffer);
                    if (len == -1) {
                        // End of file: loop back to the beginning.
                        raf.seek(0);
                        continue;
                    }
                    DingRtcEngine.DingRtcRawDataFrame rawDataFrame = new DingRtcEngine.DingRtcRawDataFrame();
                    rawDataFrame.format = DingRtcEngine.DingRtcVideoFormat.DingRtcVideoFormatI420; // I420 is supported.
                    rawDataFrame.frame = buffer;
                    rawDataFrame.width = width;
                    rawDataFrame.height = height;
                    rawDataFrame.lineSize[0] = width;     // Y-plane stride
                    rawDataFrame.lineSize[1] = width / 2; // U-plane stride
                    rawDataFrame.lineSize[2] = width / 2; // V-plane stride
                    rawDataFrame.lineSize[3] = 0;
                    rawDataFrame.rotation = rotation;
                    rawDataFrame.videoFrameLength = buffer.length;
                    rawDataFrame.timestamp = System.nanoTime() / 1000;
                    if (mRtcEngine != null) {
                        mRtcEngine.pushExternalVideoFrame(rawDataFrame,
                                DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera);
                    }
                    // Sleep to pace the loop at the target frame rate.
                    long used = System.currentTimeMillis() - start;
                    long delay = 1000 / fps - used;
                    if (delay < 0) {
                        delay = 0;
                    }
                    Thread.sleep(delay);
                }
            } catch (IOException | InterruptedException ex) {
                ex.printStackTrace();
            } finally {
                try {
                    raf.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }.start();
}
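The buffer sizing in the sample follows the I420 layout: a full-resolution Y plane followed by quarter-resolution U and V planes, so one frame occupies width × height × 3/2 bytes and the chroma strides are half the luma stride. A minimal sketch of that arithmetic (the helper class and method names are illustrative, not part of the SDK):

```java
// Computes the byte size and per-plane row strides of one I420 frame.
// Illustrative helper only; not part of the DingRTC SDK.
public final class I420Layout {
    // Total bytes: Y (w*h) + U (w/2 * h/2) + V (w/2 * h/2) = w*h*3/2.
    public static int frameSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Row strides for tightly packed Y, U, and V planes.
    public static int[] strides(int width) {
        return new int[]{width, width / 2, width / 2, 0};
    }

    public static void main(String[] args) {
        // The 480x640 portrait frame used in the sample above.
        System.out.println(frameSize(480, 640)); // bytes per frame
        System.out.println(strides(480)[1]);     // U-plane stride
    }
}
```

These values correspond to the `buffer` allocation and the `lineSize` assignments in the sample method.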
Call setExternalVideoSource again to unregister the external video input.
// Disable external video input for the camera track.
mRtcEngine.setExternalVideoSource(
        false,  // enable
        false,  // useTexture
        DingRtcEngine.DingRtcVideoTrack.DingRtcVideoTrackCamera,
        DingRtcEngine.DingRtcRenderMode.DingRtcRenderModeAuto);
The parameters of setExternalVideoSource are described below.

| Parameter | Type | Description |
| --- | --- | --- |
| enable | boolean | Whether to enable the external video input source. Valid values: true: enable. false (default): disable. |
| useTexture | boolean | Whether to use texture mode. Valid values: true: use. false (default): do not use. |
| type | DingRtcVideoTrack | The video track type. |
| renderMode | DingRtcRenderMode | The render mode. |
Push an external audio stream
Call setExternalAudioSource to enable external audio input.
//獲取DingRtcEngine實例 DingRtcEngine mRtcEngine = DingRtcEngine.create(getApplicationContext(),""); //設置開啟外部音頻輸入源 mRtcEngine.setExternalAudioSource(true, 44100, 1);
Call pushExternalAudioFrame to push audio data.
private void decodePCMRawData() {
    final String pcmPath = "/sdcard/123.pcm";
    if (TextUtils.isEmpty(pcmPath)) {
        ToastUtils.LongToast("Select a PCM file first.");
        return;
    }
    File pcmDataFile = new File(pcmPath);
    if (!pcmDataFile.exists()) {
        ToastUtils.LongToast(pcmPath + " does not exist.");
        return;
    }
    ToastUtils.LongToast("Pushing PCM data");
    new Thread() {
        @Override
        public void run() {
            RandomAccessFile raf = null;
            try {
                raf = new RandomAccessFile(new File(pcmPath), "r");
            } catch (FileNotFoundException e) {
                e.printStackTrace();
                return;
            }
            try {
                int sampleRate = 44100;
                int lengthMs = 10;                    // push 10 ms of audio per iteration
                int numChannels = 1;
                int bytesPerSample = 2 * numChannels; // 16-bit PCM
                // Use a heap buffer so that array() is supported.
                ByteBuffer byteBuffer = ByteBuffer.allocate(
                        sampleRate * lengthMs * numChannels * bytesPerSample / 1000);
                while (true) {
                    long start = System.currentTimeMillis();
                    int len = raf.read(byteBuffer.array());
                    if (len == -1) {
                        // End of file: loop back to the beginning.
                        raf.seek(0);
                        continue;
                    }
                    DingRtcEngine.DingRtcAudioFrame audioFrame = new DingRtcEngine.DingRtcAudioFrame();
                    audioFrame.data = byteBuffer;
                    audioFrame.numSamples = sampleRate * lengthMs / 1000;
                    audioFrame.bytesPerSample = bytesPerSample;
                    audioFrame.numChannels = numChannels;
                    audioFrame.samplesPerSec = sampleRate;
                    mRtcEngine.pushExternalAudioFrame(audioFrame);
                    byteBuffer.rewind();
                    // Sleep so that each iteration takes about 10 ms.
                    long end = System.currentTimeMillis();
                    long sleep = lengthMs - (end - start);
                    if (sleep > 0) {
                        SystemClock.sleep(sleep);
                    }
                }
            } catch (IOException ex) {
                ex.printStackTrace();
            } finally {
                try {
                    raf.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }.start();
}
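The buffer arithmetic above pushes audio in fixed 10 ms chunks: at 44100 Hz, mono, 16-bit, each chunk holds 441 samples per channel and 882 bytes. A minimal sketch of that calculation (the helper class and method names are illustrative, not part of the SDK):

```java
// Computes the per-chunk sample and byte counts for fixed-duration PCM pushes.
// Illustrative helper only; not part of the DingRTC SDK.
public final class PcmChunk {
    // Samples per channel contained in one chunk of the given duration.
    public static int samplesPerChunk(int sampleRate, int lengthMs) {
        return sampleRate * lengthMs / 1000;
    }

    // Total bytes in one chunk: samples * channels * 2 bytes (16-bit PCM).
    public static int bytesPerChunk(int sampleRate, int lengthMs, int numChannels) {
        return samplesPerChunk(sampleRate, lengthMs) * numChannels * 2;
    }

    public static void main(String[] args) {
        // The 44100 Hz mono configuration from the sample above.
        System.out.println(samplesPerChunk(44100, 10)); // samples per 10 ms chunk
        System.out.println(bytesPerChunk(44100, 10, 1)); // bytes per 10 ms chunk
    }
}
```

These values correspond to the buffer allocation and the `numSamples` and `bytesPerSample` fields in the sample method.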