
Android WebRTC can't stream voice


I'm building a voice chat application on Android with WebRTC. I successfully established a connection between my PC and my emulator and streamed my voice in both directions. My phone ran Android 5.1, but it broke, so now I'm using a device running 4.4.2. When I try the app, it connects fine but cannot stream my voice. This is what I see in the log:

D/OFFER: v=0
             o=- 757416304722047422 2 IN IP4 127.0.0.1
             s=-
             t=0 0
             a=group:BUNDLE audio
             a=msid-semantic: WMS LOCAL_MEDIA_STREAM_ID
             m=audio 9 RTP/SAVPF 111 103 9 102 0 8 106 105 13 127 126
             c=IN IP4 0.0.0.0
             a=rtcp:9 IN IP4 0.0.0.0
             a=ice-ufrag:FkcTrOiVjoakQJoa
             a=ice-pwd:16SQFwIpnPEdmqyYC2PdSDzI
             a=fingerprint:sha-1 1F:85:D7:8C:DB:98:72:E7:D2:DE:52:A7:A4:B5:48:85:F1:BC:F3:AC
             a=setup:actpass
             a=mid:audio
             a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
             a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
             a=sendrecv
             a=rtcp-mux
             a=rtpmap:111 opus/48000/2
             a=fmtp:111 minptime=10; useinbandfec=1
             a=rtpmap:103 ISAC/16000
             a=rtpmap:9 G722/8000
             a=rtpmap:102 ILBC/8000
             a=rtpmap:0 PCMU/8000
             a=rtpmap:8 PCMA/8000
             a=rtpmap:106 CN/32000
             a=rtpmap:105 CN/16000
             a=rtpmap:13 CN/8000
             a=rtpmap:127 red/8000
             a=rtpmap:126 telephone-event/8000
             a=maxptime:60
             a=ssrc:954003986 cname:QR3nFlCQG7p7qQNo
             a=ssrc:954003986 msid:LOCAL_MEDIA_STREAM_ID AUDIO_TRACK_ID_LOCAL
             a=ssrc:954003986 mslabel:LOCAL_MEDIA_STREAM_ID
             a=ssrc:954003986 label:AUDIO_TRACK_ID_LOCAL
    D/AudioManager: SetCommunicationMode(1)@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioManager: setCommunicationMode(true)@[name=Thread-303, id=303]
    D/WebRtcAudioManager: changing audio mode to: MODE_IN_COMMUNICATION
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioTrackJni: InitPlayout@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioTrack: InitPlayout(sampleRate=44100, channels=1)
    D/WebRtcAudioTrack: byteBuffer.capacity: 882
    D/AudioTrackJni: OnCacheDirectBufferAddress
    D/AudioTrackJni: direct buffer capacity: 882
    D/AudioTrackJni: frames_per_buffer: 441
    D/WebRtcAudioTrack: AudioTrack.getMinBufferSize: 4096
    D/AudioTrackJni: delay_in_milliseconds: 46
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioTrackJni: StartPlayout@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioTrack: StartPlayout
    I/dalvikvm: Could not find method android.media.AudioTrack.write, referenced from method org.webrtc.voiceengine.WebRtcAudioTrack$AudioTrackThread.run
    W/dalvikvm: VFY: unable to resolve virtual method 1101: Landroid/media/AudioTrack;.write (Ljava/nio/ByteBuffer;II)I
    D/dalvikvm: VFY: replacing opcode 0x6e at 0x0078
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/WebRtcAudioTrack: AudioTrackThread@[name=AudioTrackJavaThread, id=307]
    D/ICE: IceCandidate added :candidate:547260449 1 udp 2122260223 10.0.2.15 36170 typ host generation 0
    D/ICE: IceCandidate added :candidate:1847424209 1 tcp 1518280447 10.0.2.15 60568 typ host tcptype passive generation 0
    D/AudioTrackJni: StopPlayout@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioTrack: StopPlayout
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioManager: SetCommunicationMode(0)@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioManager: setCommunicationMode(false)@[name=Thread-319, id=319]
    D/WebRtcAudioManager: restoring audio mode to: MODE_NORMAL
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioManager: SetCommunicationMode(1)@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioManager: setCommunicationMode(true)@[name=Thread-320, id=320]
    D/WebRtcAudioManager: changing audio mode to: MODE_IN_COMMUNICATION
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioTrackJni: InitPlayout@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioTrack: InitPlayout(sampleRate=44100, channels=1)
    D/WebRtcAudioTrack: byteBuffer.capacity: 882
    D/AudioTrackJni: OnCacheDirectBufferAddress
    D/AudioTrackJni: direct buffer capacity: 882
    D/AudioTrackJni: frames_per_buffer: 441
    D/WebRtcAudioTrack: AudioTrack.getMinBufferSize: 4096
    D/AudioTrackJni: delay_in_milliseconds: 46
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/AudioTrackJni: StartPlayout@[tid=19672]
    D/HelpersAndroid: Attaching thread to JVM@[tid=19672]
    D/WebRtcAudioTrack: StartPlayout
    D/HelpersAndroid: Detaching thread from JVM@[tid=19672]
    D/WebRtcAudioTrack: AudioTrackThread@[name=AudioTrackJavaThread, id=324]
    D/OFFER: v=0
             o=- 2122221720328118009 2 IN IP4 127.0.0.1
             s=-
             t=0 0
             a=group:BUNDLE audio
             a=msid-semantic: WMS LOCAL_MEDIA_STREAM_ID
             m=audio 9 RTP/SAVPF 111 103 9 102 0 8 106 105 13 127 126
             c=IN IP4 0.0.0.0
             a=rtcp:9 IN IP4 0.0.0.0
             a=ice-ufrag:FoasgPNFAm6dZWo8
             a=ice-pwd:lGnZzKSNLhH0vjt0sPw+NIaQ
             a=fingerprint:sha-1 45:15:D5:D0:6B:87:81:5D:61:A4:F8:AC:56:EB:E4:2F:1A:59:AA:16
             a=setup:actpass
             a=mid:audio
             a=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level
             a=extmap:3 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
             a=sendrecv
             a=rtcp-mux
             a=rtpmap:111 opus/48000/2
             a=fmtp:111 minptime=10; useinbandfec=1
             a=rtpmap:103 ISAC/16000
             a=rtpmap:9 G722/8000
             a=rtpmap:102 ILBC/8000
             a=rtpmap:0 PCMU/8000
             a=rtpmap:8 PCMA/8000
             a=rtpmap:106 CN/32000
             a=rtpmap:105 CN/16000
             a=rtpmap:13 CN/8000
             a=rtpmap:127 red/8000
             a=rtpmap:126 telephone-event/8000
             a=maxptime:60
             a=ssrc:3345216954 cname:chD7I2bAd/Iwdbk1
             a=ssrc:3345216954 msid:LOCAL_MEDIA_STREAM_ID AUDIO_TRACK_ID_LOCAL
             a=ssrc:3345216954 mslabel:LOCAL_MEDIA_STREAM_ID
             a=ssrc:3345216954 label:AUDIO_TRACK_ID_LOCAL
    D/ICE: IceCandidate added :candidate:547260449 1 udp 2122260223 10.0.2.15 41078 typ host generation 1
    D/ICE: IceCandidate added :candidate:1847424209 1 tcp 1518280447 10.0.2.15 33099 typ host tcptype passive generation 1
I read on some forums that this happens because the android.media.AudioTrack.write(ByteBuffer, int, int) overload does not exist in Android versions before 5.0, which matches the dalvikvm resolution failure in the log above. I'm using io.pristine:libjingle:9127@aar. What can I do to solve this?
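If that diagnosis is right, the playout path needs an API-level guard: AudioTrack.write(ByteBuffer, int, int) only exists on API 21+, while the write(byte[], int, int) overload has been available since API 3. Below is a minimal sketch of such a guard; the helper name AudioTrackWriteCompat is hypothetical, and in practice you would either patch the library's org.webrtc.voiceengine.WebRtcAudioTrack playout thread to go through something like it, or move to a newer libjingle/WebRTC build whose playout code already branches on the API level.

    import android.media.AudioTrack;
    import android.os.Build;

    import java.nio.ByteBuffer;

    // Hypothetical helper (not part of libjingle): dispatches to whichever
    // AudioTrack.write overload exists on the current API level.
    final class AudioTrackWriteCompat {
        private AudioTrackWriteCompat() {}

        static int write(AudioTrack track, ByteBuffer buffer, int sizeInBytes) {
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
                // API 21+: write straight from the (direct) ByteBuffer,
                // which is what WebRtcAudioTrack's playout thread attempts.
                return track.write(buffer, sizeInBytes, AudioTrack.WRITE_BLOCKING);
            }
            // Pre-21 fallback: copy into a byte[] and use the overload that
            // has existed since API 3. Note that get() advances the buffer
            // position, so the caller must rewind the buffer before reusing
            // it for the next 10 ms chunk of playout data.
            byte[] data = new byte[sizeInBytes];
            buffer.get(data, 0, sizeInBytes);
            return track.write(data, 0, sizeInBytes);
        }
    }

Allocating the byte[] once and reusing it across callbacks would avoid a per-callback allocation, but the shape of the guard is the same either way.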

Here is my source code:

package com.example.nyari.webopeer;

    import android.support.v7.app.AppCompatActivity;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.View;
    import android.widget.ArrayAdapter;
    import android.widget.Button;
    import android.widget.EditText;
    import android.widget.ListView;
    import android.widget.TextView;
    import android.widget.Toast;

    import org.json.JSONException;
    import org.json.JSONObject;
    import org.webrtc.AudioSource;
    import org.webrtc.AudioTrack;
    import org.webrtc.DataChannel;
    import org.webrtc.IceCandidate;
    import org.webrtc.MediaConstraints;
    import org.webrtc.MediaStream;
    import org.webrtc.PeerConnection;
    import org.webrtc.PeerConnectionFactory;
    import org.webrtc.SdpObserver;
    import org.webrtc.SessionDescription;

    import java.util.ArrayList;
    import java.util.List;

    import io.socket.client.Socket;
    import io.socket.emitter.Emitter;

    public class MainActivity extends AppCompatActivity implements PeerConnection.Observer,SdpObserver {

        static {
            System.loadLibrary("louts");
        }

        public native Socket socketIO();

        EditText edit;
        TextView hello;
        Button button, button2, button3;
        ListView listView;
        ArrayList<String> list;
        ArrayAdapter<String> adapter;
        Socket client;
        Thread t, voice, play_v;
        Emitter.Listener Hello, enterGroup, leaveGroup, message, androidi, connect, candidate, offer, answer;
        String MESSAGE;
        List<PeerConnection.IceServer> iceServer;
        String Type_Signal;
        /////SOUND MANAGEMENT
        private static String AUDIO_TRACK_ID_LOCAL = "AUDIO_TRACK_ID_LOCAL";
        private static String AUDIO_TRACK_ID_REMOTE = "AUDIO_TRACK_ID_REMOTE";
        private static String LOCAL_MEDIA_STREAM_ID = "LOCAL_MEDIA_STREAM_ID";
        PeerConnectionFactory peerConnectionFactory;
        AudioTrack localAudioTrack, remoteAudioTrack;
        PeerConnection peerConnection;
        MediaConstraints audioConstraints;
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            //Initialise the PeerConnectionFactory globals and verify that initialisation succeeded; if not, do not continue
            boolean peer = PeerConnectionFactory.initializeAndroidGlobals(
                    getApplicationContext(),
                    true,//initialize the audio portion of webrtc
                    false,//initialize the video portion of webrtc
                    true,//enable hardware acceleration
                    null//renderEGLContext: can be provided to support HW video decoding to texture, and is used to create a shared EGL context on the video decoding thread
            );
            if (peer) {//note: "peer = true" would be an assignment, not a comparison
                Toast.makeText(getApplicationContext(), "Webrtc initialised", Toast.LENGTH_LONG).show();
                ///If the globals initialised correctly, create a PeerConnectionFactory object
                peerConnectionFactory = new PeerConnectionFactory();
                Log.i("PCTEST", " factory value " + String.valueOf(peerConnectionFactory));
            } else {
                Toast.makeText(getApplicationContext(), "Webrtc did not initialise", Toast.LENGTH_LONG).show();
            }

            //SET media constraints
            audioConstraints = new MediaConstraints();
            audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveAudio", "true"));
            audioConstraints.mandatory.add(new MediaConstraints.KeyValuePair("OfferToReceiveVideo", "false"));
            audioConstraints.optional.add(new MediaConstraints.KeyValuePair("DtlsSrtpKeyAgreement", "true"));
            //// First we create an AudioSource
            AudioSource audioSource = peerConnectionFactory.createAudioSource(audioConstraints);
            // Once we have that, we can create our AudioTrack.
            // Note that AUDIO_TRACK_ID can be any string that uniquely
            // identifies that audio track in your application.
            localAudioTrack = peerConnectionFactory.createAudioTrack(AUDIO_TRACK_ID_LOCAL, audioSource);
            // We start out with an empty MediaStream object,
            // created with help from our PeerConnectionFactory.
            // Note that LOCAL_MEDIA_STREAM_ID can be any string.
            MediaStream mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
            mediaStream.addTrack(localAudioTrack);

            /////////////
            //////////////////////
            //BELOW WE DEAL WITH SIGNALING AND SOCKET.IO
            setContentView(R.layout.activity_main);
            hello = (TextView) findViewById(R.id.textView);
            ///////////////
            list = new ArrayList<String>();
            adapter = new ArrayAdapter<String>(this, android.R.layout.simple_expandable_list_item_2, list);

            listView = (ListView) findViewById(R.id.list);
            listView.setAdapter(adapter);
    ////////////////////////
            edit = (EditText) findViewById(R.id.edit);
            ///////////////////////
            emit();
            button = (Button) findViewById(R.id.button1);
            button.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View view) {

                }
            });
            //////////////////////
            button2 = (Button) findViewById(R.id.button2);
            button2.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View view) {

                }
            });
            /////////////////////
            button3 = (Button) findViewById(R.id.button3);
            button3.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View view) {

                }
            });
    ////INIT WEBRTC PEERCONNECTION
            iceServer = new ArrayList<PeerConnection.IceServer>();
            iceServer.add(new PeerConnection.IceServer("", "", ""));
            peerConnection = peerConnectionFactory.createPeerConnection(iceServer, audioConstraints, this);
            peerConnection.addStream(mediaStream);
            t = new Thread(new Runnable() {

                @Override
                public void run() {
                    client = socketIO();
                    client.on("Hello", Hello);
                    client.on("connect", connect);
                    client.on("candidate", candidate);
                    client.on("offer", offer);
                    client.on("answer", answer);
                    client.on("android", androidi);
           /* client.on("leaveGroup",leaveGroup);
            client.on("message",message);*/
                    client.connect();
                }
            });
            t.start();


        }

        @Override
        protected void onDestroy() {
            super.onDestroy();

            client.disconnect();
            t.interrupt();
        }

        private void emit() {
            Hello = new Emitter.Listener() {///HELLO MESSAGE WITH SERVER
                @Override
                public void call(final Object... args) {
                    runOnUiThread(new Runnable() {
                        @Override
                        public void run() {
                            final JSONObject obj = (JSONObject) args[0];
                            try {
                                MESSAGE = obj.getString("ki");
                               hello.setText(MESSAGE);

                            } catch (JSONException e) {
                                e.printStackTrace();
                            }
                        }
                    });

                }
            };
            //////////
            connect = new Emitter.Listener() {///ON CONNECT: JOIN THE SIGNALING GROUP
                @Override
                public void call(final Object... args) {
                    JSONObject reg = new JSONObject();
                    try {
                        reg.put("grp", "form_1");
                    } catch (JSONException e) {
                        e.printStackTrace();
                    }
                    client.emit("enterGroup", reg);
                }
            };
            ///////
            //////////
            candidate = new Emitter.Listener() {///REMOTE ICE CANDIDATE RECEIVED FROM SERVER
                @Override
                public void call(final Object... args) {
                    JSONObject reg = new JSONObject();
                    try {
                        reg.put("grp", "form_1");
                    } catch (JSONException e) {
                        e.printStackTrace();
                    }

                }
            };
            //////////
            offer = new Emitter.Listener() {///REMOTE OFFER RECEIVED FROM SERVER
                @Override
                public void call(final Object... args) {
                    Type_Signal="answer";
                   /* peerConnection.createAnswer(MainActivity.this,audioConstraints);*/
                  //  SessionDescription fi=(SessionDescription)args[0];
                    final SessionDescription sesso=new SessionDescription(SessionDescription.Type.OFFER,args[0].toString());

                    peerConnection.setRemoteDescription(MainActivity.this,sesso);
                    peerConnection.createAnswer(MainActivity.this,audioConstraints);
                    /*
    runOnUiThread(new Runnable(){

        @Override
        public void run() {
            Toast.makeText(getApplicationContext(),sesso.toString(), Toast.LENGTH_LONG).show();
        }
    });*/
                    Log.d("OFFER", sesso.description);

                }
            };

            //////////
            answer = new Emitter.Listener() {///REMOTE ANSWER RECEIVED FROM SERVER
                @Override
                public void call(final Object... args) {
                    final SessionDescription sesso=new SessionDescription(SessionDescription.Type.ANSWER,args[0].toString());
                    peerConnection.setRemoteDescription(MainActivity.this,sesso);
                 /*   runOnUiThread(new Runnable(){

                        @Override
                        public void run() {
                            Toast.makeText(getApplicationContext(),sesso.toString(), Toast.LENGTH_LONG).show();
                        }
                    });*/
                    Log.d("ANSWER",sesso.toString());
                }
            };
            //////////
            androidi = new Emitter.Listener() {///SIGNAL THAT WOULD TRIGGER OFFER CREATION
                @Override
                public void call(final Object... args) {
                    ///////////////USED WHEN OFFER IS CREATED TO EMIT OFFER
            /*       Type_Signal="offer";
                    peerConnection.createOffer(MainActivity.this, audioConstraints);*/
                }
            };
        }


        @Override
        public void onStart() {
            super.onStart();


        }

        @Override
        public void onStop() {
            super.onStop();


        }

        @Override
        public void onSignalingChange(PeerConnection.SignalingState signalingState) {

        }

        @Override
        public void onIceConnectionChange(PeerConnection.IceConnectionState iceConnectionState) {

        }

        @Override
        public void onIceGatheringChange(PeerConnection.IceGatheringState iceGatheringState) {

        }

        @Override
        public void onIceCandidate(IceCandidate iceCandidate) {
            peerConnection.addIceCandidate(iceCandidate);
            Log.d("ICE","IceCandidate added :"+iceCandidate.sdp);
        }

        @Override
        public void onAddStream(MediaStream mediaStream) {
            if(mediaStream.audioTracks.size()>0){
                remoteAudioTrack=mediaStream.audioTracks.get(0);
            }
            Log.d("STREAMA","Receiving streams");
        }

        @Override
        public void onRemoveStream(MediaStream mediaStream) {
            remoteAudioTrack=mediaStream.audioTracks.remove();
        }

        @Override
        public void onDataChannel(DataChannel dataChannel) {

        }

        @Override
        public void onRenegotiationNeeded() {

        }

        @Override
        public void onCreateSuccess(SessionDescription sessionDescription) {
          //  hello.setText(sessionDescription.description);
            peerConnection.setLocalDescription(MainActivity.this,sessionDescription);
        /*    JSONObject regu = new JSONObject();
            try {
                regu.put(Type_Signal, sessionDescription);
                client.emit(Type_Signal,regu);

            } catch (JSONException e) {
                e.printStackTrace();
            }*/
            client.emit(Type_Signal,sessionDescription.description);
           // Log.d("ANSWERING",sessionDescription.description);
        }

        @Override
        public void onSetSuccess() {

        }

        @Override
        public void onCreateFailure(final String s) {
            runOnUiThread(new Runnable(){

                @Override
                public void run() {
                    Toast.makeText(getApplicationContext(), "Failed to create offer because " + s, Toast.LENGTH_LONG).show();
                }
            });

        }

        @Override
        public void onSetFailure(final String s) {
            runOnUiThread(new Runnable(){

                @Override
                public void run() {
                    Toast.makeText(getApplicationContext(), "Failed to set offer because " + s, Toast.LENGTH_LONG).show();

                }
            });
        }
    }