Android WebRTC local video stream not displayed on Marshmallow, but works on Lollipop. Added libraries:
libjingle_peerconnection.jar (version: 1.7.0_101)
libjingle_peerconnection_so.so
Gradle dependency:
fi.vtt.nubomedia:utilities-android:1.0.1@aar
Tested on a Lenovo K3 Note (Android 6, Marshmallow).
The VideoCapturer is retrieved with the following code:
The custom VideoRenderGui is updated with the following code:
Before calling PeerConnectionFactory.createVideoSource, we already pass the appropriate EGL context to PeerConnectionFactory.setVideoHwAccelerationOptions. Here is the code:
factory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());
localMS = factory.createLocalMediaStream("ARDAMS");
if (pcParams.videoCallEnabled) {
    getVideoCapturer();
    videoSource = factory.createVideoSource(videoCapturer);
    videoCapturer.startCapture(pcParams.videoWidth, pcParams.videoHeight, pcParams.videoFps);
    videoTrack = factory.createVideoTrack("ARDAMSv0", videoSource);
    videoTrack.setEnabled(true);
    localMS.addTrack(videoTrack);
}
audioSource = factory.createAudioSource(new MediaConstraints());
audioTrack = factory.createAudioTrack("ARDAMSa0", audioSource);
localMS.addTrack(audioTrack);
mListener.onLocalStream(localMS, true);
The EglContext is created in the activity that creates the SurfaceViewRenderer, and it is passed as a parameter to the setVideoHwAccelerationOptions method.
The following line shows how the EGL context is created:
rootEglBase = EglBase.create();
For more details, see the following link:
Hi, I am using the libjingle_peerconnection.jar and libjingle_peerconnection_so.so files, but even though I set the maximum width and height to 1920x1080, the stream that is sent is 720p. I have set my Samsung tablet to 1080p resolution. Could you please guide me on how to get 1080p, or how to update the project beyond these files? That would be very helpful. Or at least guide me on where to obtain these files. Awaiting your reply.
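One likely reason for the 720p fallback: if the camera (or the old libjingle build) does not advertise a 1920x1080 capture format, the capturer silently picks the nearest format it does advertise. The sketch below shows that "closest supported size" idea in plain Java; the size list and the distance metric are illustrative assumptions, not taken from the libjingle sources.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Demonstrates closest-supported-size selection: requesting 1080p from a
// camera whose best advertised format is 720p yields 1280x720.
public class ClosestSize {
    static final class Size {
        final int w, h;
        Size(int w, int h) { this.w = w; this.h = h; }
        @Override public String toString() { return w + "x" + h; }
    }

    // Pick the advertised size minimizing the absolute width+height
    // difference from the requested size (an assumed metric for the demo).
    static Size closest(List<Size> supported, int reqW, int reqH) {
        return supported.stream()
                .min(Comparator.comparingInt(s -> Math.abs(s.w - reqW) + Math.abs(s.h - reqH)))
                .orElseThrow(IllegalArgumentException::new);
    }

    public static void main(String[] args) {
        // A camera that tops out at 720p: 1080p is not in the list.
        List<Size> supported = Arrays.asList(new Size(640, 480), new Size(1280, 720));
        System.out.println(closest(supported, 1920, 1080)); // prints 1280x720
    }
}
```

So before assuming the jar caps the resolution, it is worth checking which formats the device actually reports for the selected camera.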
private void createCapturer(CameraEnumerator enumerator) {
    final String[] deviceNames = enumerator.getDeviceNames();
    Logging.d(TAG, "Looking for front facing cameras.");
    for (String deviceName : deviceNames) {
        if (enumerator.isFrontFacing(deviceName)) {
            Logging.d(TAG, "Creating front facing camera capturer.");
            videoCapturer = enumerator.createCapturer(deviceName, null);
            if (videoCapturer != null) {
                System.out.println("deviceName = " + deviceName);
                return;
            }
        }
    }
    // Front facing camera not found, try something else
    Logging.d(TAG, "Looking for other cameras.");
    for (String deviceName : deviceNames) {
        if (!enumerator.isFrontFacing(deviceName)) {
            Logging.d(TAG, "Creating other camera capturer.");
            videoCapturer = enumerator.createCapturer(deviceName, null);
            if (videoCapturer != null) {
                return;
            }
        }
    }
}
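The selection logic above can be isolated and tested without the WebRTC library: prefer the first front-facing device, otherwise fall back to any other camera. The Enumerator interface and the fake device names below are stand-ins for illustration; the real code uses WebRTC's CameraEnumerator.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

// Minimal sketch of the two-pass camera selection in createCapturer above.
public class CameraPicker {
    interface Enumerator {
        List<String> getDeviceNames();
        boolean isFrontFacing(String name);
    }

    static Optional<String> pickCamera(Enumerator e) {
        // First pass: front-facing cameras win.
        for (String name : e.getDeviceNames()) {
            if (e.isFrontFacing(name)) return Optional.of(name);
        }
        // Second pass: anything else (back camera, external, ...).
        for (String name : e.getDeviceNames()) {
            if (!e.isFrontFacing(name)) return Optional.of(name);
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        Enumerator fake = new Enumerator() {
            public List<String> getDeviceNames() { return Arrays.asList("back0", "front1"); }
            public boolean isFrontFacing(String n) { return n.startsWith("front"); }
        };
        // The front camera is chosen even though it is listed second.
        System.out.println(pickCamera(fake).orElse("none")); // prints front1
    }
}
```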
localStream.videoTracks.get(0).addRenderer(new VideoRenderer(localRender));
localVideoRenderGui.update(localRender,
        LOCAL_X_CONNECTING, LOCAL_Y_CONNECTING,
        LOCAL_WIDTH_CONNECTING, LOCAL_HEIGHT_CONNECTING,
        scalingType, true);
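In the old VideoRendererGui-style API, the x/y/width/height arguments above are percentages of the parent GL surface rather than pixels. The small sketch below converts such a percentage rectangle to pixels; the full-screen values (0, 0, 100, 100) are an assumed example, not the actual LOCAL_*_CONNECTING constants from this project.

```java
// Converts a percentage-based layout rectangle (VideoRendererGui convention)
// into pixel coordinates on a concrete surface.
public class PercentRect {
    static int[] toPixels(int xPct, int yPct, int wPct, int hPct,
                          int surfaceW, int surfaceH) {
        return new int[] {
            surfaceW * xPct / 100,   // left edge in pixels
            surfaceH * yPct / 100,   // top edge in pixels
            surfaceW * wPct / 100,   // width in pixels
            surfaceH * hPct / 100    // height in pixels
        };
    }

    public static void main(String[] args) {
        // Full-screen local preview on a 1080x1920 portrait surface.
        int[] r = toPixels(0, 0, 100, 100, 1080, 1920);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]); // prints 0,0,1080,1920
    }
}
```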