Apache Spark: error creating a StreamingContext
I am trying some basic Spark Streaming programming:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
val appConf = new SparkConf().setMaster("local[2]").setAppName("kafkaapp")
val sContext = new StreamingContext(appConf, Seconds(1))
It fails with the following:
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
My configuration:
Spark Core and Spark Streaming version 2.1.1; Scala version 2.11.8.
What am I doing wrong?
POM file:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.demo</groupId>
<artifactId>streamapp</artifactId>
<version>1.0-SNAPSHOT</version>
<inceptionYear>2008</inceptionYear>
<properties>
<scala.version>2.11.8</scala.version>
</properties>
<repositories>
<repository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.1.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.specs</groupId>
<artifactId>specs</artifactId>
<version>1.2.5</version>
<scope>compile</scope>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<args>
<arg>-target:jvm-1.5</arg>
</args>
</configuration>
</plugin>
</plugins>
</build>
<reporting>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</plugin>
</plugins>
</reporting>
</project>
Check whether your runtime classpath contains a JAR with StreamingContext in it (it should be spark-streaming_2.11.jar). If not, add the missing dependency to your dependency-management tool (Maven, SBT, Gradle, …).

I do have spark-streaming_2.11.

Could you add the pom.xml to the question? That would help in understanding the problem. Thanks.

Because you have marked that dependency as provided, Maven compiles against it but does not put it on the runtime classpath; it expects the runtime environment to supply the JAR. Remove the provided scope and try again.
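As a sketch of the fix suggested above, the spark-streaming dependency in the question's POM would simply drop the provided scope (version numbers taken from the question):

```xml
<!-- spark-streaming without <scope>provided</scope>, so Maven puts it
     on the runtime classpath instead of expecting the environment to -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>2.1.1</version>
</dependency>
```

The provided scope is appropriate when the runtime supplies the Spark jars itself, e.g. when submitting to a cluster with spark-submit; when running locally with master local[2], nothing supplies them, hence the ClassNotFoundException.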