
Apache Spark: SparkR and PySpark throw java.net.BindException on startup, but spark-shell does not?

Tags: apache-spark, pyspark, sparkr

I have already tried setting SPARK_LOCAL_IP to "127.0.0.1" and checking whether the port is already in use. Here is the full error text:

Launching java with spark-submit command /usr/hdp/2.4.0.0-169/spark/bin/spark-submit "sparkr-shell" /tmp/RtmpZo44il/backend_port998540c56917
/usr/hdp/2.4.0.0-169/spark/bin/load-spark-env.sh: line 72: export: `load-spark-env.sh': not a valid identifier
16/06/13 11:28:24 ERROR RBackend: Server shutting down: failed with exception
java.net.BindException: Cannot assign requested address
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
        at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
        at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
        at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
        at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
        at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
        at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
        at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
Error in SparkR::sparkR.init() : JVM is not ready after 10 seconds
The above error occurs when launching ./bin/sparkR. spark-shell, on the other hand, still starts and runs normally.
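For reference, the checks mentioned above looked roughly like this (a sketch of the commands; the port number 9000 is only a placeholder, since the SparkR backend normally picks its own port):

    # force Spark to bind to the loopback address before launching
    export SPARK_LOCAL_IP=127.0.0.1
    ./bin/sparkR

    # check whether a given port is already in use (9000 is just an example)
    netstat -tlnp | grep 9000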


More information: on startup, spark-shell automatically searches through ports until it finds one it can bind without a BindException. Even when I set the default SparkR backend port to an unused port, it still fails.
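As far as I know, that port probing is governed by Spark's spark.port.maxRetries setting (default 16), so the search window can be widened like this (a sketch; the value 32 is arbitrary):

    ./bin/spark-shell --conf spark.port.maxRetries=32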

I found the problem: another user had deleted my /etc/hosts file. I recreated the file with a localhost entry and SparkR now appears to run. I am still curious how spark-shell was able to run without it.

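For anyone hitting the same thing: the entry I restored is the standard loopback mapping (a minimal sketch of /etc/hosts; add your machine's own hostname and IP as appropriate):

    127.0.0.1   localhost
    ::1         localhost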