Scala exception: ERROR SparkContext - Error initializing local SparkContext. java.net.BindException


I am trying to write a test for a Spark application, but I get this exception when trying to run the following test:

    class BasicIT {

      val sparkConf: SparkConf = new SparkConf()
        .setAppName("basic.phase.it")
        .setMaster("local[1]")
      var context: SparkContext = new SparkContext(sparkConf)

      @Test
      def myTest(): Unit = {
        print("test")
      }
    }
It fails with this exception:

2016-07-24 21:04:39,956 [main,95] ERROR SparkContext - Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
I am currently working in IntelliJ on OS X Yosemite.


What am I doing wrong? It is the same code that works at my workplace.

You probably have more log output telling you that the UI port specified in the configuration is already in use. If that is the case, you need to set spark.ui.port explicitly to a value that you know will be a free port on the host. Note that Spark retries with incremented port numbers when a particular port is unavailable.

示例:

    val sparkConf = new SparkConf().setAppName("basic.phase.it")
                                   .setMaster("local[1]")
                                   .set("spark.ui.port", "4080")
Try exporting SPARK_LOCAL_IP="127.0.0.1" in load-spark-env.sh, or just set SPARK_LOCAL_IP="127.0.0.1" before running the Spark application. That worked for me.
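As a minimal sketch (assuming a standard Spark distribution, where conf/spark-env.sh is sourced on startup, and sbt as the build tool), setting the variable in the shell before launching the tests looks like this:

```shell
# Force the Spark driver to bind to the loopback address instead of a
# hostname that may not resolve to a bindable local address.
export SPARK_LOCAL_IP="127.0.0.1"

# Confirm the variable is visible to child processes, then run the tests,
# e.g. with:  sbt test
echo "SPARK_LOCAL_IP=${SPARK_LOCAL_IP}"
```

The BindException typically means the machine's hostname resolves to an address Spark cannot bind (common on macOS after joining a new network), and pinning the driver to 127.0.0.1 sidesteps that lookup.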

Try setting spark.driver.host to localhost:

    val conf = new SparkConf().setMaster("local[2]")
                              .setAppName("AnyName")
                              .set("spark.driver.host", "localhost")

It doesn't work for me; is there any other parameter I should set?

"Doesn't work" is not specific enough. What do you see in the logs? This is a fatal exception, but I believe you have more log output than this, since it retried 16 times. See the linked answer; it offers many options for solving similar problems.

Yes, this is the answer. I have added it to my Spark conf, thank you!

Worked for me. For SparkSession:

    val spark = SparkSession.builder()
                            .appName("spark2")
                            .master("local")
                            .config("spark.driver.host", "localhost")
                            .getOrCreate()