Apache Spark: "Bootstrap broker localhost:9092 (id: -1 rack: null) disconnected" whenever I run a Spark job in a Docker container

Tags: apache-spark, apache-kafka, docker-compose, spark-streaming

The error I get is:

20/11/02 13:34:51 WARN NetworkClient: [Consumer clientId=consumer-spark-kafka-source-366ac503-c5a4-4338-869c-84786983aab3--188679505-driver-0-1, groupId=spark-kafka-source-366ac503-c5a4-4338-869c-84786983aab3---188679505-driver-0] Bootstrap broker localhost:9092 (id: -1 rack: null) disconnected

My docker-compose.yml file:

version: "3"
services:
  zookeeper:
    image: wurstmeister/zookeeper
    container_name: zookeeper
    ports:
      - 2181:2181

  kafka:
    image: wurstmeister/kafka
    container_name: kafka
    ports:
      - 9092:9092
    environment:
      KAFKA_LISTENERS: PLAINTEXT://:9092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_CREATE_TOPICS: "kafkatutorial:1:1"
      ALLOW_PLAINTEXT_LISTENER: "yes"
      SSL: localhost:9092
    expose:
      - 9092

  spark:
    build:
      dockerfile: DockerFileSpark
      context: .
    environment:
      - SPARK_MODE=master
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    ports:
      - '8080:8080'
    links:
      - kafka
    depends_on:
      - zookeeper
      - kafka
    
  • SSL: localhost:9092 is not a valid environment variable for the Kafka
    container. If you want SSL, you need to configure it in both listener
    variables and add SSL certificates to the image.

  • It is not clear where your Spark code runs, but inside a container,
    localhost can never reach an external service. Remove links from the
    YAML and connect to kafka:9092 in your code.


  • If you have an instructor, then he/she should be teaching you this. Both
    of your previous questions were Docker networking problems, not Kafka
    problems. Adding Spark on top of that at the same time is simply too much.
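Putting the first two comments together, a minimal corrected kafka service might look like the following sketch (assumptions: plaintext only, same compose network as the spark service, same image and topic as in the question). It drops the invalid SSL line and keeps a single PLAINTEXT listener advertised under the service name kafka, which is the address other containers must use:

```yaml
  kafka:
    image: wurstmeister/kafka
    container_name: kafka
    ports:
      - 9092:9092
    environment:
      # Listen on all interfaces inside the container ...
      KAFKA_LISTENERS: PLAINTEXT://:9092
      # ... but advertise the compose service name, which other containers
      # on the same compose network resolve via Docker's built-in DNS.
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_CREATE_TOPICS: "kafkatutorial:1:1"
      ALLOW_PLAINTEXT_LISTENER: "yes"
      # No SSL entry: plaintext only, so no certificates are needed.
    depends_on:
      - zookeeper
```

Note that links is gone from the spark service side as well: on a default compose network, service names already resolve without it.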
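On the Spark side, the fix from the second comment is to point the consumer at kafka:9092 instead of localhost:9092. The following is a hedged sketch, not the poster's actual code: a minimal PySpark Structured Streaming reader for the kafkatutorial topic created by KAFKA_CREATE_TOPICS above. The function and option names around the Kafka source are standard Spark APIs; the app name and file layout are illustrative assumptions.

```python
# The key fix: bootstrap servers must be the compose service name
# "kafka:9092", never "localhost:9092", when Spark runs in its own
# container (localhost there is the Spark container itself).
KAFKA_OPTIONS = {
    "kafka.bootstrap.servers": "kafka:9092",  # compose DNS name of the broker
    "subscribe": "kafkatutorial",             # topic from KAFKA_CREATE_TOPICS
    "startingOffsets": "earliest",
}

def build_kafka_reader(spark):
    """Attach the options above to a SparkSession's streaming reader."""
    reader = spark.readStream.format("kafka")
    for key, value in KAFKA_OPTIONS.items():
        reader = reader.option(key, value)
    return reader

def main():
    # Requires the Kafka source package on the classpath, e.g.:
    #   spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 app.py
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-demo").getOrCreate()
    df = build_kafka_reader(spark).load()
    # Kafka rows carry binary key/value columns; cast value for display.
    query = (df.selectExpr("CAST(value AS STRING)")
               .writeStream.format("console")
               .start())
    query.awaitTermination()
```

Calling main() from spark-submit inside the compose network should then connect without the "Bootstrap broker ... disconnected" warning, because the advertised listener and the consumer address now agree.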