Apache Spark: why doesn't Spark run locally when, according to the documentation, it should be possible?


The goal is to get started with Spark by running some of the examples and studying the output.
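For reference, a typical clone-and-build sequence per the Spark README (the exact repository URL and build flags below are assumptions, since the original links are not preserved here):

git clone https://github.com/apache/spark.git   # assumed upstream repository
cd spark
./build/mvn -DskipTests clean package           # build command as given in the README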

I have cloned the repository, built it following the instructions in the README, and ran
./bin/spark-shell
which resulted in:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
16/11/10 08:47:48 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/11/10 08:47:48 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
16/11/10 08:47:48 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1764)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:591)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2309)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:843)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:835)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:835)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:42)
    at $line3.$read.<init>(<console>:44)
    at $line3.$read$.<init>(<console>:48)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:38)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:37)
    at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:37)
    at org.apache.spark.repl.SparkILoop.loadFiles(SparkILoop.scala:105)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:920)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:909)
    at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
    at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:909)
    at org.apache.spark.repl.Main$.doMain(Main.scala:68)
    at org.apache.spark.repl.Main$.main(Main.scala:51)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
  at sun.nio.ch.Net.bind0(Native Method)
  at sun.nio.ch.Net.bind(Net.java:433)
  at sun.nio.ch.Net.bind(Net.java:425)
  at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
  at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
  at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
  at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
  at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:505)
  at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:490)
  at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
  at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
  at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
  at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:441)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
  at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
  at java.lang.Thread.run(Thread.java:745)
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0-SNAPSHOT
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_92)
Type in expressions to have them evaluated.
Type :help for more information.

The welcome banner still appears, but since the SparkContext failed to initialize, sc is not defined:

scala> sc.parallelize(1 to 1000).count()
<console>:18: error: not found: value sc
       sc.parallelize(1 to 1000).count()
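What the log shows is the sparkDriver service failing to bind: Spark resolves the machine's hostname to choose the driver's address, and if that name resolves to nothing usable, the bind fails and no SparkContext (and hence no sc) is created. You can check the resolution directly (a diagnostic sketch; getent is Linux-specific):

hostname                      # the name Spark will try to resolve
getent hosts "$(hostname)"    # the address it resolves to, if any
# on macOS: dscacheutil -q host -a name "$(hostname)"

If the second command prints nothing, or prints an address that no local interface carries, you get exactly the BindException shown above.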
There are two common fixes. Either map your hostname to the loopback address in /etc/hosts:

127.0.0.1      your_hostname

or tell Spark explicitly which local IP to use before launching the shell:

export SPARK_LOCAL_IP="127.0.0.1"

To inspect the current mapping, run:

cat /etc/hosts

On the Cloudera quickstart VM, for example, the file contains an entry like:

192.168.245.*** quickstart.cloudera
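The same can also be done per invocation (a sketch; spark.driver.bindAddress is assumed to be available, which should hold for the 2.1 line shown in the banner above):

# driver bind/host settings, assumed available in this 2.1.0-SNAPSHOT build
./bin/spark-shell \
  --conf spark.driver.bindAddress=127.0.0.1 \
  --conf spark.driver.host=127.0.0.1

Once the driver binds, the smoke test from the question should succeed; a fresh session prints something like:

scala> sc.parallelize(1 to 1000).count()
res0: Long = 1000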