Scala Spark SQL Hive connection error
I am facing an issue with the Scala IDE in a Spark project: I am unable to connect to a HiveContext. The error it gives is
object hive is not a member of package org.apache.spark.sql
The code statement is:
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
The versions used in the project are:
Scala-2.11.8
Java-1.8
Spark-2.1.0
The pom.xml, for reference, is:
<pluginRepositories>
  <pluginRepository>
    <id>scala-tools.org</id>
    <name>Scala-tools Maven2 Repository</name>
    <url>http://scala-tools.org/repo-releases</url>
  </pluginRepository>
</pluginRepositories>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>2.1.0</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <!-- mixed scala/java compile -->
    <plugin>
      <groupId>org.scala-tools</groupId>
      <artifactId>maven-scala-plugin</artifactId>
      <executions>
        <execution>
          <id>compile</id>
          <goals>
            <goal>compile</goal>
          </goals>
          <phase>compile</phase>
        </execution>
        <execution>
          <id>test-compile</id>
          <goals>
            <goal>testCompile</goal>
          </goals>
          <phase>test-compile</phase>
        </execution>
        <execution>
          <phase>process-resources</phase>
          <goals>
            <goal>compile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.8</source>
        <target>1.8</target>
      </configuration>
    </plugin>
    <!-- for fatjar -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <version>2.4</version>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
      <executions>
        <execution>
          <id>assemble-all</id>
          <phase>package</phase>
          <goals>
            <goal>single</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <!-- <configuration> <archive> <manifest> <addClasspath>true</addClasspath>
           <mainClass>fully.qualified.MainClass</mainClass> </manifest> </archive> </configuration> -->
    </plugin>
  </plugins>
  <pluginManagement>
    <plugins>
      <!-- This plugin's configuration is used to store Eclipse m2e settings
           only. It has no influence on the Maven build itself. -->
      <plugin>
        <groupId>org.eclipse.m2e</groupId>
        <artifactId>lifecycle-mapping</artifactId>
        <version>1.0.0</version>
        <configuration>
          <lifecycleMappingMetadata>
            <pluginExecutions>
              <pluginExecution>
                <pluginExecutionFilter>
                  <groupId>org.scala-tools</groupId>
                  <artifactId>maven-scala-plugin</artifactId>
                  <versionRange>[2.15.2,)</versionRange>
                  <goals>
                    <goal>compile</goal>
                    <goal>testCompile</goal>
                  </goals>
                </pluginExecutionFilter>
                <action>
                  <execute />
                </action>
              </pluginExecution>
            </pluginExecutions>
          </lifecycleMappingMetadata>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
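Note that the dependencies above mix Scala binary versions: spark-core and spark-sql are the _2.11 artifacts, while spark-hive is _2.10. A mismatch like this can keep org.apache.spark.sql.hive from resolving in Scala 2.11 code. A sketch of the aligned dependency, as also suggested in the comments below:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.11</artifactId>
  <version>2.1.0</version>
</dependency>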
This seems to have gotten resolved after completing a few mvn installs. Thanks.

Not a solution, but HiveContext is deprecated. You can use SparkSession.builder.enableHiveSupport instead. Also, why don't you use spark-hive_2.11? – philantrovert

@philantrovert Thanks for the reply. Even SparkSession is not available; it raises the same error. I suppose it is a dependency issue; I tried spark-hive_2.11 as well, but no solution.

How are you running this jar?

@philantrovert This seems to have gotten resolved after completing a few mvn installs.
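For reference, a minimal sketch of the SparkSession route mentioned in the comments, assuming Spark 2.x artifacts for a single Scala binary version on the classpath (the app name and query are illustrative, not from the original post):

import org.apache.spark.sql.SparkSession

// In Spark 2.x, SparkSession with Hive support replaces the deprecated HiveContext
val spark = SparkSession.builder()
  .appName("HiveExample")  // hypothetical application name
  .enableHiveSupport()     // requires the spark-hive artifact on the classpath
  .getOrCreate()

// Hive tables can then be queried directly through the session
spark.sql("SHOW TABLES").show()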