
Hadoop: cannot run the Spark master in DSE 4.5, and the slaves file is missing

Tags: hadoop, apache-spark, datastax-enterprise, cassandra-2.0

I have a 5-node cluster up and running on DSE 4.5. Of the 5 nodes, one node has hadoop_enabled and spark_enabled set, but the Spark master is not running:

 ERROR [Thread-709] 2014-07-02 11:35:48,519 ExternalLogger.java (line 73) SparkMaster: Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: /54.xxx.xxx.xxx:7077
Does anyone know about this issue? I also tried exporting SPARK_LOCAL_IP, but that did not work either.

The DSE documentation incorrectly states that the spark-env.sh configuration file is at resources/spark/conf/spark-env.sh. The actual path of the configuration directory is /etc/dse/spark.
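
For a package install of DSE 4.5, a quick way to confirm which directory is actually live (a sketch; tarball installs lay the files out differently):

    # The active Spark conf dir for a DSE 4.5 package install; spark-env.sh
    # should be present here, while the slaves file may be absent:
    ls -l /etc/dse/spark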

The slaves file is also missing from the conf dir, and the run file is missing from the bin dir. I get the following error when I run $ dse spark:

 Welcome to
   ____              __
  / __/__  ___ _____/ /__
 _\ \/ _ \/ _ `/ __/  '_/
/___/ .__/\_,_/_/ /_/\_\   version 0.9.1
   /_/

 Using Scala version 2.10.3 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_51)
 Type in expressions to have them evaluated.
 Type :help for more information.
 Creating SparkContext...
 14/07/03 11:37:41 ERROR Remoting: Remoting error: [Startup failed] [
 akka.remote.RemoteTransportException: Startup failed
    at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
    at akka.remote.Remoting.start(Remoting.scala:194)
    at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
    at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
    at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
    at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
    at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
    at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
    at shark.SharkContext.<init>(SharkContext.scala:42)
    at shark.SharkEnv$.initWithSharkContext(SharkEnv.scala:90)
    at com.datastax.bdp.spark.SparkILoop.createSparkContext(SparkILoop.scala:41)
    at $line3.$read$$iwC$$iwC.<init>(<console>:10)
    at $line3.$read$$iwC.<init>(<console>:32)
    at $line3.$read.<init>(<console>:34)
    at $line3.$read$.<init>(<console>:38)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.<init>(<console>:7)
    at $line3.$eval$.<clinit>(<console>)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:772)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1040)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:609)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:640)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:604)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:793)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:838)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:750)
    at com.datastax.bdp.spark.SparkILoop$$anonfun$initializeSparkContext$1.apply(SparkILoop.scala:66)
    at com.datastax.bdp.spark.SparkILoop$$anonfun$initializeSparkContext$1.apply(SparkILoop.scala:66)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:258)
    at com.datastax.bdp.spark.SparkILoop.initializeSparkContext(SparkILoop.scala:65)
    at com.datastax.bdp.spark.SparkILoop.initializeSpark(SparkILoop.scala:47)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:908)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:140)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:53)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:102)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:53)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:925)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
    at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
    at com.datastax.bdp.spark.SparkReplMain$.main(SparkReplMain.scala:22)
    at com.datastax.bdp.spark.SparkReplMain.main(SparkReplMain.scala)
 Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: /54.xx.xx.xx:0
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Success.map(Try.scala:206)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
 Caused by: java.net.BindException: Cannot assign requested address
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
    at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
]
org.jboss.netty.channel.ChannelException: Failed to bind to: /54.xxx.xxx.xxx:0
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
    at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
    at scala.util.Try$.apply(Try.scala:161)
    at scala.util.Success.map(Try.scala:206)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.net.BindException: Cannot assign requested address
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
    at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
    at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)

Type in expressions to have them evaluated.
Type :help for more information.

scala>

Look at the Akka remote host:port configured for your SparkConf, and any related Akka configuration, in the reference.conf file. This looks like an Akka startup conflict related to akka-remote and a host:port that it expected to use but that was already in use, i.e. the ChannelException: when Spark's Akka actor system started, something else was already using 54.xx.xx.xx:0.
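
Two quick checks from the shell can narrow this down (a sketch using standard Linux tools, not DSE-specific; 7077 is the master port from the log above):

    # Is anything already bound to the master port?
    sudo netstat -anp | grep 7077

    # Is the 54.xx.xx.xx address actually assigned to a local interface?
    # "java.net.BindException: Cannot assign requested address" is what the
    # JVM raises when it is not (common on EC2, where only the private IP
    # is configured on the NIC).
    ip -4 addr show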

Just to clarify: you have 4 nodes running plain Cassandra, and one node is a mixed Hadoop/Spark node?

In the 5-node cluster, 2 are Cassandra, 2 are Solr, and one is on Hadoop/Spark.

Is the interface mentioned in the exception the one you want to bind it to? Maybe master.log has more detailed information, can you check it?

The Spark master uses the broadcast address as the interface it binds to. If you want to change it, you can consider changing the listen_address parameter in cassandra.yaml.

My cluster is on EC2, that's why I left listen_address blank. When I set the broadcast address, I got an invalid-yaml error for the broadcast entry. But the IP address I mentioned, 54.xx.xx.xx, is the correct IP of my node, and it still fails to bind. I also checked master.log, and it shows the same error message: SparkMaster: Remoting error: [Startup failed]. SparkMaster: Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: /54.xx.xx.xx:7077. Where can I find the reference.conf file? As you said, this may be a port conflict, and that is why binding 54.xx.xx.xx:7077 errors out? But I ran netstat -anp | grep 7077, and port 7077 is not in use anywhere. Is there any other way to check whether 54.xx.xx.xx:7077 or 54.xx.xx.xx:0 is assigned to some other process?

The Akka actor system is created internally by Spark, using the host and port from "spark.driver.host" and "spark.driver.port". In Akka, port 0 is passed down to Netty, and Netty treats 0 as a signal to use a dynamic port number. Do you already have a node running? Try adding SPARK_LOCAL_IP=master on the two slave nodes, or check your spark-env.sh.

Helena, I am trying to configure a standalone Spark & Hadoop node. Please check my spark-env.sh file (I updated the question). Is it correct, or do I need to modify something in it?
   export SPARK_HOME="/usr/share/dse/spark"
   export SPARK_MASTER_IP=54.xx.xx.xx   # public IP
   export SPARK_MASTER_PORT=7077
   export SPARK_MASTER_WEBUI_PORT=7080
   export SPARK_WORKER_WEBUI_PORT=7081
   export SPARK_WORKER_MEMORY="4g"
   export SPARK_MEM="2g"
   export SPARK_REPL_MEM="2g"
   export SPARK_CONF_DIR="/etc/dse/spark"
   export SPARK_TMP_DIR="$SPARK_HOME/tmp"
   export SPARK_LOG_DIR="$SPARK_HOME/logs"
   export SPARK_LOCAL_IP=54.xx.xx.xx    # public IP
   export SPARK_COMMON_OPTS="$SPARK_COMMON_OPTS -Dspark.kryoserializer.buffer.mb=10 "
   export SPARK_MASTER_OPTS=" -Dspark.deploy.defaultCores=1 -Dspark.local.dir=$SPARK_TMP_DIR/master -Dlog4j.configuration=file://$SPARK_CONF_DIR/log4j-server.properties -Dspark.log.file=$SPARK_LOG_DIR/master.log "
   export SPARK_WORKER_OPTS=" -Dspark.local.dir=$SPARK_TMP_DIR/worker -Dlog4j.configuration=file://$SPARK_CONF_DIR/log4j-server.properties -Dspark.log.file=$SPARK_LOG_DIR/worker.log "
   export SPARK_EXECUTOR_OPTS=" -Djava.io.tmpdir=$SPARK_TMP_DIR/executor -Dlog4j.configuration=file://$SPARK_CONF_DIR/log4j-executor.properties "
   export SPARK_REPL_OPTS=" -Djava.io.tmpdir=$SPARK_TMP_DIR/repl/$USER "
   export SPARK_APP_OPTS=" -Djava.io.tmpdir=$SPARK_TMP_DIR/app/$USER "

   # Directory to run applications in, which will include both logs and scratch space  (default: SPARK_HOME/work).
   export SPARK_WORKER_DIR="$SPARK_HOME/work"
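
One caveat on the two public-IP exports above: java.net.BindException: Cannot assign requested address is thrown when a socket tries to bind an IP that is not configured on any local interface. On EC2 the public 54.xx.xx.xx address is NATed and normally not present on the NIC, so a hedged variant of those two lines (10.yy.yy.yy is a placeholder for the node's private address, not a value from the original question) would be:

    # Hypothetical private address; substitute the node's own, e.g. the
    # output of `hostname -i` or `ip -4 addr show eth0`:
    export SPARK_MASTER_IP=10.yy.yy.yy
    export SPARK_LOCAL_IP=10.yy.yy.yy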