"spark not found" error in scala package command

Tags: scala, apache-spark

I am building a package for my Scala program.

I have imported the packages and checked the library dependencies and revisions in build.sbt, but I still get a "spark not found" error. Any help would be appreciated; please let me know if I am missing something.

The configuration in use is: Scala 2.11.8, Hadoop 3.0.0.

sam@testlab:~/mymooc-workspace/MyProject$ cat src/main/scala/MyProgram.scala 
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object MyProgram {
  def main(args: Array[String]): Unit = {
   val conf = new SparkConf().setAppName("scala spark")
   val sc = new SparkContext(conf)
   val df = spark.read
  .format("csv")
  .option("header","true")
  .option("inferSchema","true")
  .option("mode","failfast")
  .load("/home/sam/SparkScala/aadhar_dataset.csv")

  df.show(10,false) 
  }
}
sam@testlab:~/mymooc-workspace/MyProject$ 



sam@testlab:~/mymooc-workspace/MyProject$ sbt package -v
[process_args] java_version = '8'
# Executing command line:
java
-Xms1024m
-Xmx1024m
-XX:ReservedCodeCacheSize=128m
-XX:MaxMetaspaceSize=256m
-jar
/usr/share/sbt/bin/sbt-launch.jar
package

[info] Loading settings from plugins.sbt ...
[info] Loading global plugins from /home/sam/.sbt/1.0/plugins
[info] Loading project definition from /home/sam/mymooc-workspace/MyProject/project
[info] Loading settings from build.sbt ...
[info] Set current project to project (in build file:/home/sam/mymooc-workspace/MyProject/)
[info] Updating ...
[info] Done updating.
[warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.11:2.1.0             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.2.0              (depends on 3.1)
[warn]  * com.google.guava:guava:14.0.1 is selected over 11.0.2
[warn]      +- org.apache.curator:curator-recipes:2.4.0           (depends on 14.0.1)
[warn]      +- org.apache.curator:curator-client:2.4.0            (depends on 14.0.1)
[warn]      +- org.apache.curator:curator-framework:2.4.0         (depends on 14.0.1)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.2.0                (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.2.0              (depends on 11.0.2)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 1 Scala source to /home/sam/mymooc-workspace/MyProject/target/scala-2.11/classes ...
[error] /home/sam/mymooc-workspace/MyProject/src/main/scala/MyProgram.scala:11:13: not found: value spark
[error]    val df = spark.read
[error]             ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 151 s, completed Dec 5, 2018 10:23:26 AM
sam@testlab:~/mymooc-workspace/MyProject$ 

sam@testlab:~/mymooc-workspace/MyProject$ cat build.sbt 
name := "project"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0"
)


sam@testlab:~/mymooc-workspace/MyProject$ 
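
A side note on the build output above: the eviction warnings come from spark-core 2.1.0 transitively pulling in Hadoop 2.2.0 artifacts, and they are unrelated to the compile error. Separately, if the packaged jar is meant to be run with spark-submit, a common variant of this build.sbt (an assumption here, not something stated in the question) marks the Spark artifacts as provided so they are not bundled into the jar:

name := "project"
version := "1.0"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // "provided": compile against Spark, but let spark-submit supply it at runtime
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.1.0" % "provided"
)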

I cannot find any value declared with the name

val spark = ?

but you are using it in your program.

From your code snippet, what I can understand is that you are trying to read a CSV file using a SparkSession, but you have not declared/created the SparkSession itself. Try the following before reading the file:

import org.apache.spark.sql.SparkSession

// create (or reuse) the session; it provides the spark.read entry point
val spark = SparkSession.builder().getOrCreate()
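
Putting it together, a minimal sketch of how MyProgram.scala could look with the fix applied (the appName, CSV path, and read options are taken from the question; dropping the explicit SparkConf/SparkContext in favour of the session alone is an assumption on my part):

import org.apache.spark.sql.SparkSession

object MyProgram {
  def main(args: Array[String]): Unit = {
    // the session replaces SparkConf/SparkContext for DataFrame work;
    // the old sc is still reachable as spark.sparkContext if needed
    val spark = SparkSession.builder()
      .appName("scala spark")
      .getOrCreate()

    val df = spark.read
      .format("csv")
      .option("header", "true")
      .option("inferSchema", "true")
      .option("mode", "failfast")
      .load("/home/sam/SparkScala/aadhar_dataset.csv")

    df.show(10, false)

    spark.stop()
  }
}

Note that no .master(...) is set here, so the master URL is expected to come from spark-submit; for a quick local run, .master("local[*]") can be added to the builder.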