
Scala: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState' when reading a CSV file with SparkSession

Tags: scala, apache, maven, apache-spark, intellij-idea

I am trying to read a CSV file using SparkSession, but I get the following error.

Error:

Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
Below is the code I use to read the CSV file with the SparkSession:

ss.read.format("com.databricks.spark.csv")
  .option("header", "true")
  .option("delimiter", ",")
  .schema(fileSchema)
  .load("data_sample.csv")
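As a side note, Spark 2.x ships a built-in CSV data source, so the external com.databricks.spark.csv package is not required. A minimal sketch of the equivalent read, assuming the same fileSchema and file name as above:

```scala
// Sketch: Spark 2.x built-in CSV reader; no Databricks package needed.
// fileSchema and data_sample.csv are taken from the question above.
val df = ss.read
  .option("header", "true")
  .option("delimiter", ",")
  .schema(fileSchema)
  .csv("data_sample.csv")
```

Using the built-in reader also lets you drop the spark-csv dependency from the POM.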
I create the Spark session as follows:

val conf = new SparkConf()
  .set("hive.exec.orc.split.strategy", "ETL")
  .set("hive.exec.dynamic.partition", "true")
  .set("hive.exec.dynamic.partition.mode", "nonstrict")

val ss = SparkSession.builder().enableHiveSupport().config(conf).appName(applicationName).getOrCreate()
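If Hive is not actually installed and configured on the machine, one common workaround (a sketch, not the questioner's code) is to drop enableHiveSupport() entirely; reading a CSV file does not require Hive, so HiveSessionState is never instantiated. None of the hive.exec.* settings above are needed for this either. The warehouse path shown is hypothetical:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: a plain SparkSession without Hive support is
// sufficient for reading CSV files. applicationName is assumed
// to be defined as in the question; the warehouse path is a
// hypothetical example.
val ss = SparkSession.builder()
  .appName(applicationName)
  .config("spark.sql.warehouse.dir", "/tmp/spark-warehouse")
  .getOrCreate()
```

With enableHiveSupport() removed, the spark-hive dependency can also be dropped from the POM unless other code needs it.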
My POM file has the following dependencies:

<properties>
    <scala.compat.version>2.11</scala.compat.version>
    <scala.version>2.11.8</scala.version>
    <spark.version>2.1.2</spark.version>
</properties>

<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-reflect -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-reflect</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.scala-lang.modules/scala-xml -->
    <dependency>
        <groupId>org.scala-lang.modules</groupId>
        <artifactId>scala-xml_2.11</artifactId>
        <version>1.0.6</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.compat.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.compat.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.compat.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.databricks/spark-csv -->
    <dependency>
        <groupId>com.databricks</groupId>
        <artifactId>spark-csv_${scala.compat.version}</artifactId>
        <version>1.5.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.25</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.25</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-log4j12 -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.25</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/commons-configuration/commons-configuration -->
    <dependency>
        <groupId>commons-configuration</groupId>
        <artifactId>commons-configuration</artifactId>
        <version>1.6</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/commons-lang/commons-lang -->
    <dependency>
        <groupId>commons-lang</groupId>
        <artifactId>commons-lang</artifactId>
        <version>2.6</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-core</artifactId>
        <version>2.5.2</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-annotations -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-annotations</artifactId>
        <version>2.5.2</version>
    </dependency>

I tried adding the ("spark.sql.warehouse.dir", "/spark warehouse") configuration when creating the Spark session, but I still get the same error. I am building the project with Maven and running the application in IntelliJ.


Is there any config or POM dependency I am missing?

How are you running this application, with spark-submit? @Bhanu I am running it as an application from IntelliJ. That could be one reason you are getting this error. You need to have Hive installed. I prefer to package the application as a jar and run it with spark-submit. If you want interactive development, spark-shell is recommended. Make sure hive-site.xml is present in your Spark configuration.