
Java: relaying a Real-time Transport Protocol (RTP) stream in a peer-to-peer ad hoc network (java, rtp, jmf)


I am using the JMF framework to build peer-to-peer real-time voice communication over an ad hoc network. Each peer runs two threads: a receiver and a sender. The receiver thread receives the real-time audio data, and the sender thread transmits real-time data to the other peer.

This works correctly between two peers that are one hop apart, but I want to relay the RTP stream between two peers that are two hops apart. For example, a, b and c are three peers connected as a -> b -> c (a is connected to b, and b is connected to c, but a is not connected to c), and I want to relay the RTP stream from a to c through b.

Is there any way to do this in Java, with JMF or any other library?

Receiver thread

public void run() {
    // Build an RTP locator for the incoming audio session, e.g. rtp://<ip>:<port>/audio/16
    String url = "rtp://" + ip + ":" + port + "/audio/16";
    MediaLocator mrl = new MediaLocator(url);
    System.out.println(mrl);

    // Create a player for this RTP session
    Player player = null;
    try {
        player = Manager.createPlayer(mrl);
    } catch (Exception e) {
        System.err.println("Error: " + e);
        System.exit(-1);
    }

    if (player != null) {
        System.out.println("Player created.");
        // Realize the player and poll until it reaches the Realized state
        player.realize();
        while (player.getState() != Player.Realized) {
            try {
                Thread.sleep(10);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        player.start();
    } else {
        System.err.println("Player could not be created.");
        System.exit(-1);
    }
}

Sender thread

public void run() {
    // Capture 8 kHz, 8-bit, mono linear audio from the first matching capture device
    AudioFormat format = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
    Vector devices = CaptureDeviceManager.getDeviceList(format);
    CaptureDeviceInfo di = null;
    if (devices.size() > 0) {
        di = (CaptureDeviceInfo) devices.elementAt(0);
    } else {
        System.out.println("No capture device found.");
        System.exit(-1);
    }

    Processor processor = null;
    try {
        processor = Manager.createProcessor(di.getLocator());
    } catch (Exception e) {
        System.exit(-1);
    }

    // Configure the processor and poll until it reaches the Configured state
    processor.configure();
    while (processor.getState() != Processor.Configured) {
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW));

    // Try to set the first audio track to GSM/RTP and disable every other track
    TrackControl track[] = processor.getTrackControls();
    boolean encodingOk = false;
    for (int i = 0; i < track.length; i++) {
        if (!encodingOk && track[i] instanceof FormatControl) {
            if (((FormatControl) track[i]).setFormat(
                    new AudioFormat(AudioFormat.GSM_RTP, 8000, 8, 1)) == null) {
                track[i].setEnabled(false);
            } else {
                encodingOk = true;
            }
        } else {
            // We could not set this track to GSM, so disable it
            track[i].setEnabled(false);
        }
    }

    if (encodingOk) {
        // Realize the processor and poll until it reaches the Realized state
        processor.realize();
        while (processor.getState() != Processor.Realized) {
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }

        // Get the output DataSource of the processor; exit if we fail
        DataSource ds = null;
        try {
            ds = processor.getDataOutput();
        } catch (NotRealizedError e) {
            System.exit(-1);
        }

        // Hand this DataSource to the Manager to create an RTP DataSink
        // that sends the audio to the remote peer
        try {
            String url = "rtp://" + ip + ":" + port + "/audio/16";
            MediaLocator m = new MediaLocator(url);
            DataSink d = Manager.createDataSink(ds, m);
            d.open();
            d.start();
            processor.start();
        } catch (Exception e) {
            System.exit(-1);
        }
    }
}

I think b should open a separate connection to c and forward the packets it receives from a on to c.
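
One way to sketch that forwarding step with the JMF classes the question already uses is an RTPManager on b that listens for the stream arriving from a and, whenever a new ReceiveStream appears, hands that stream's DataSource to a DataSink whose locator points at c. The RtpRelay class below is only an illustrative sketch, not tested code: the class name, constructor parameters and the rtp:// locator for c are assumptions, and a real relay would also have to forward RTCP and tear the session down when a stops sending.

import java.net.InetAddress;

import javax.media.DataSink;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.protocol.DataSource;
import javax.media.rtp.RTPManager;
import javax.media.rtp.ReceiveStream;
import javax.media.rtp.ReceiveStreamListener;
import javax.media.rtp.SessionAddress;
import javax.media.rtp.event.NewReceiveStreamEvent;
import javax.media.rtp.event.ReceiveStreamEvent;

// Hypothetical relay running on peer b: it joins the RTP session coming from a
// and re-sends every new stream to c through an RTP DataSink.
public class RtpRelay implements ReceiveStreamListener {

    private final String aHost;   // address of peer a (the original sender)
    private final String cHost;   // address of peer c (the final receiver)
    private final int port;       // RTP port assumed to be the same on all peers

    public RtpRelay(String aHost, String cHost, int port) {
        this.aHost = aHost;
        this.cHost = cHost;
        this.port = port;
    }

    // Open the local RTP session on which a's packets arrive.
    public void listen() throws Exception {
        RTPManager manager = RTPManager.newInstance();
        manager.addReceiveStreamListener(this);
        manager.initialize(new SessionAddress(InetAddress.getLocalHost(), port));
        // Tell JMF where the incoming packets come from (peer a).
        manager.addTarget(new SessionAddress(InetAddress.getByName(aHost), port));
    }

    // Called by JMF for every event on the receive side of the session.
    public void update(ReceiveStreamEvent event) {
        if (event instanceof NewReceiveStreamEvent) {
            try {
                ReceiveStream stream = event.getReceiveStream();
                // The DataSource of a ReceiveStream still carries RTP-formatted
                // data (e.g. GSM_RTP), so it can be handed directly to a DataSink
                // aimed at c, mirroring the sender thread above.
                DataSource ds = stream.getDataSource();
                MediaLocator target =
                        new MediaLocator("rtp://" + cHost + ":" + port + "/audio/16");
                DataSink sink = Manager.createDataSink(ds, target);
                sink.open();
                sink.start();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}

The alternative is for b to fully decode the stream and re-encode it with the existing sender thread, which costs CPU but avoids the SSRC and RTCP complications of forwarding the received stream directly.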