
Flutter WebRTC: playing audio

Tags: flutter, dart, webrtc, just-audio

I want to play the audio from the remote stream of a WebRTC connection in Flutter. The flutter_webrtc examples use RTCVideoRenderer, but in my case there is no video; the remote stream contains only audio.

In short:

pc.onTrack = (event) {
  // how can I play the Audio stream in event.streams[0] ?
};
Code:

MediaStream _localStream = await createStream();
RTCPeerConnection pc = await createPeerConnection({});

_localStream.getTracks().forEach((track) async => await pc.addTrack(track, _localStream));

pc.onTrack = (event) {
  // how can I play the audio stream in event.streams[0] ?
};

Future<MediaStream> createStream() async {
  final Map<String, dynamic> mediaConstraints = {
    'audio': true,
    'video': false
  };

  MediaStream stream = await MediaDevices.getUserMedia(mediaConstraints);
  return stream;
}

How can I play the audio from the remote stream in Flutter?

I think you need to add it to the video renderer.
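Here is a minimal sketch of that idea, assuming the flutter_webrtc plugin and assuming that assigning the remote stream to an RTCVideoRenderer also routes its audio even when the stream carries no video track. The playRemoteAudio helper name and the audio-kind check are my own additions for illustration, not from the original answer.

import 'package:flutter_webrtc/flutter_webrtc.dart';

final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();

Future<void> playRemoteAudio(RTCPeerConnection pc) async {
  // The renderer must be initialized before a stream can be assigned to it.
  await _remoteRenderer.initialize();

  pc.onTrack = (RTCTrackEvent event) {
    if (event.track.kind == 'audio' && event.streams.isNotEmpty) {
      // Assigning the remote stream as the renderer's source starts playback;
      // there is no video to draw, but the stream's audio is played.
      _remoteRenderer.srcObject = event.streams[0];
    }
  };
}

If the platform requires the renderer to be mounted, an RTCVideoView(_remoteRenderer) can be placed somewhere in the widget tree (it can be given zero size), and _remoteRenderer.dispose() should be called when the call ends.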