Docker hangs when linking a container to itself


I'm trying to use a Docker image I built myself for Apache Spark. I found that when I try to run a script included in the container, Java throws an exception because the container's name, spark_master, cannot be resolved.

The root cause is that I'm trying to run Spark inside the Docker container via the script ./start-master.sh, but it throws the following error:

Caused by: java.net.UnknownHostException: spark_master
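
A quick way to see the root cause (a sketch; the printed ID will differ per run): by default a container's hostname is its short container ID, not the --name you assign it, so the JVM's lookup of spark_master fails:

# The hostname inside the container defaults to the container ID,
# not the name given with --name:
docker run --rm --name spark_master bernieai/docker-spark hostname
# prints the short container ID (e.g. f3a1c2d4e5b6), not spark_master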

So I googled the problem and followed the advice I found.

The problem is that when I run this command:

docker run -d -t -P --name spark_master --link spark_master:spark_master bernieai/docker-spark

Docker suddenly hangs and the daemon stops responding. There's no error; it just hangs.

Any idea what went wrong? Is there a better way to address the root cause?
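
A note on the hang itself: --link points a new container at an already-running one, so a container cannot be linked to itself at creation time. The usual pattern starts the master first and links a second worker container to it, along these lines (./start-worker.sh is the worker script added in the Dockerfile below):

# Start the master first, then link a separate worker container to it:
docker run -d -t --name spark_master bernieai/docker-spark ./start-master.sh
docker run -d -t --name spark_worker --link spark_master:spark_master bernieai/docker-spark ./start-worker.sh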

Edit: adding my Dockerfile.
############################################################
# Dockerfile for an Apache Spark development environment
# Based on the Ubuntu image
############################################################
FROM ubuntu:latest
MAINTAINER Justin Long

ENV SPARK_VERSION 1.6.1
ENV SCALA_VERSION 2.11.7
ENV SPARK_BIN_VERSION $SPARK_VERSION-bin-hadoop2.6
ENV SPARK_HOME /usr/local/spark
ENV SCALA_HOME /usr/local/scala
ENV PATH $PATH:$SPARK_HOME/bin:$SCALA_HOME/bin

# Update the APT cache
RUN sed -i.bak 's/main$/main universe/' /etc/apt/sources.list
RUN apt-get update
RUN apt-get upgrade -y

# Install and set up project dependencies
RUN apt-get install -y curl wget git
RUN locale-gen en_US en_US.UTF-8

# Prepare for the Java download
RUN apt-get install -y python-software-properties
RUN apt-get install -y software-properties-common

# Grab Oracle Java (auto-accept the license)
RUN add-apt-repository -y ppa:webupd8team/java
RUN apt-get update
RUN echo oracle-java8-installer shared/accepted-oracle-license-v1-1 select true | /usr/bin/debconf-set-selections
RUN apt-get install -y oracle-java8-installer

# Install Scala
RUN wget http://downloads.typesafe.com/scala/$SCALA_VERSION/scala-$SCALA_VERSION.tgz && \
    tar -zxf /scala-$SCALA_VERSION.tgz -C /usr/local/ && \
    ln -s /usr/local/scala-$SCALA_VERSION $SCALA_HOME && \
    rm /scala-$SCALA_VERSION.tgz

# Install Spark for Hadoop
RUN wget http://d3kbcqa49mib13.cloudfront.net/spark-$SPARK_BIN_VERSION.tgz && \
    tar -zxf /spark-$SPARK_BIN_VERSION.tgz -C /usr/local/ && \
    ln -s /usr/local/spark-$SPARK_BIN_VERSION $SPARK_HOME && \
    rm /spark-$SPARK_BIN_VERSION.tgz

ADD scripts/start-master.sh /start-master.sh
ADD scripts/start-worker /start-worker.sh
ADD scripts/spark-shell.sh /spark-shell.sh
ADD scripts/spark-defaults.conf /spark-defaults.conf
ADD scripts/remove_alias.sh /remove_alias.sh

ENV SPARK_MASTER_OPTS="-Dspark.driver.port=7001 -Dspark.fileserver.port=7002 -Dspark.broadcast.port=7003 -Dspark.replClassServer.port=7004 -Dspark.blockManager.port=7005 -Dspark.executor.port=7006 -Dspark.ui.port=4040 -Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory"
ENV SPARK_WORKER_OPTS="-Dspark.driver.port=7001 -Dspark.fileserver.port=7002 -Dspark.broadcast.port=7003 -Dspark.replClassServer.port=7004 -Dspark.blockManager.port=7005 -Dspark.executor.port=7006 -Dspark.ui.port=4040 -Dspark.broadcast.factory=org.apache.spark.broadcast.HttpBroadcastFactory"

ENV SPARK_MASTER_PORT 7077
ENV SPARK_MASTER_WEBUI_PORT 8080
ENV SPARK_WORKER_PORT 8888
ENV SPARK_WORKER_WEBUI_PORT 8081

EXPOSE 8080 7077 8888 8081 4040 7001 7002 7003 7004 7005 7006
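
For reference, an image like this would be built and tagged with something along these lines (the tag matches the image name used in the run commands above):

# Build from the directory containing the Dockerfile and scripts/:
docker build -t bernieai/docker-spark .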

Run with the -h flag; it sets the container's hostname to spark_master:

docker run -it --rm --name spark_master -h spark_master bernieai/docker-spark ./start-master.sh
Here's the output:

starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark--org.apache.spark.deploy.master.Master-1-spark_master.out

root@spark_master:/# tail usr/local/spark/logs/spark--org.apache.spark.deploy.master.Master-1-spark_master.out
16/04/10 03:12:04 INFO SecurityManager: Changing modify acls to: root
16/04/10 03:12:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/04/10 03:12:05 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
16/04/10 03:12:05 INFO Master: Starting Spark master at spark://spark_master:7077
16/04/10 03:12:05 INFO Master: Running Spark version 1.6.1
16/04/10 03:12:06 INFO Utils: Successfully started service 'MasterUI' on port 8080.
16/04/10 03:12:06 INFO MasterWebUI: Started MasterWebUI at http://172.17.0.2:8080
16/04/10 03:12:06 INFO Utils: Successfully started service on port 6066.
16/04/10 03:12:06 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
16/04/10 03:12:06 INFO Master: I have been elected leader! New state: ALIVE
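
What -h does under the hood: it sets the container's kernel hostname and writes a matching entry into the container's /etc/hosts, which is what lets the JVM resolve spark_master. A quick sanity check (a sketch; the IP will vary with your Docker network):

# -h adds a hostname-to-IP mapping inside the container:
docker run --rm -h spark_master bernieai/docker-spark cat /etc/hosts
# expect a line like: 172.17.0.2    spark_master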

Could you maybe just edit the /etc/hosts file inside the container? Would you mind posting your Dockerfile here?

Yes, I've added the Dockerfile as requested.

This works, great! Thanks a lot :) Note that you need to check the output of ./start-master.sh to see which address Spark bound to. In my case, I was able to spark-submit to spark://172.17.0.2:6066.
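
To make that last comment concrete, a sketch with assumptions: the address comes from the log above, the SparkPi example jar ships with the Spark 1.6.1 binary distribution, and REST submissions on port 6066 require cluster deploy mode:

# Submit the bundled SparkPi example against the master's REST endpoint:
spark-submit --master spark://172.17.0.2:6066 --deploy-mode cluster \
    --class org.apache.spark.examples.SparkPi \
    $SPARK_HOME/lib/spark-examples-1.6.1-hadoop2.6.0.jar 100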