
Scala 'Connection refused' exception when using a streaming query


I am trying to read streaming data input as follows:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.OutputMode

object SocketReadExample {

  def main(args: Array[String]): Unit = {

    val sparkSession = SparkSession.builder
      .master("local")
      .appName("example")
      .config("spark.driver.bindAddress", "127.0.0.1")
      .getOrCreate()

    // create a stream from a socket source
    val socketStreamDf = sparkSession.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 50050)
      .load()

    // write each micro-batch to the console in append mode
    val consoleDataFrameWriter = socketStreamDf.writeStream
      .format("console")
      .outputMode(OutputMode.Append())

    val query = consoleDataFrameWriter.start()

    query.awaitTermination()
  }
}
With this, I am facing the following error:

 Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Connection refused
 === Streaming Query ===
 Identifier: [id = 2bdde43c-319d-48fc-941a-e8d794294a1d, runId = 8b1fd51e-b610-497b-b903-d66367856302]
 Current Committed Offsets: {}
 Current Available Offsets: {}

 Current State: INITIALIZING
 Thread State: RUNNABLE
    at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runBatches(StreamExecution.scala:343)
    at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:206)
 Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at java.net.Socket.connect(Socket.java:538)
    at java.net.Socket.<init>(Socket.java:434)
    at java.net.Socket.<init>(Socket.java:211)
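
The socket source is a plain TCP client: when the query starts it tries to connect to localhost:50050, so the exception simply means nothing was listening on that port yet. As a quick sanity check (a sketch of my own, with a hypothetical portOpen helper, not part of the original answers), you can probe the port before starting the query:

import java.net.Socket
import scala.util.Try

// Hypothetical helper: true if something is already listening on host:port.
def portOpen(host: String, port: Int): Boolean =
  Try { new Socket(host, port).close(); true }.getOrElse(false)

// Fail fast with a clear message instead of a StreamingQueryException.
require(portOpen("localhost", 50050),
  "nothing is listening on localhost:50050; start a server (e.g. nc -lk 50050) first")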
I ran into the same problem, and you gave me the idea of checking the Spark driver configuration; I solved it by setting the host and port as follows:

import org.apache.spark.sql.SparkSession

val session: SparkSession = SparkSession.builder()
  .appName("Spark example")
  .master("local[2]")
  .config("spark.driver.host", "127.0.0.1")
  .config("spark.driver.port", "9999")
  .config("spark.testing.memory", "2147480000")
  .getOrCreate()

...

val query = consoleDataFrameWriter.start()
query.awaitTermination()
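
For what it is worth, spark.driver.host and spark.driver.port configure the driver's own RPC endpoint (used for driver-executor communication); they are independent of the socket source's host and port options, so the source still needs a live server on whatever .option("host", ...) and .option("port", ...) point to.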

I have run into this problem before. You should open the port before starting the program, like this:

nc -lk 50050
Then it will work fine.
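
Once the listener is up, lines typed into the nc session are read by the socket source and printed by the console sink, roughly like this (exact formatting varies by Spark version; the input lines here are just an example):

$ nc -lk 50050
hello streaming
hello spark

-------------------------------------------
Batch: 0
-------------------------------------------
+---------------+
|          value|
+---------------+
|hello streaming|
|    hello spark|
+---------------+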