PySpark: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog'

I am unable to run Hive queries from PySpark.

I tried copying hive-site.xml into Spark's conf directory, but it still throws the same error.

Full error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/spark-2.4.0/python/pyspark/sql/context.py", line 358, in sql
    return self.sparkSession.sql(sqlQuery)
  File "/usr/local/spark-2.4.0/python/pyspark/sql/session.py", line 767, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/local/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/local/spark-2.4.0/python/pyspark/sql/utils.py", line 79, in deco
    raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':"
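
For reference, a minimal sketch of how Hive support is normally enabled when building a session in PySpark 2.4; the app name and warehouse path here are assumptions, and it still requires a valid hive-site.xml (or a local Derby metastore) to be reachable:

from pyspark.sql import SparkSession

# Enable Hive support explicitly; this sets spark.sql.catalogImplementation=hive.
# The warehouse path is an assumed example; adjust it to your cluster.
spark = SparkSession.builder \
    .appName("hive-test") \
    .config("spark.sql.warehouse.dir", "/user/hive/warehouse") \
    .enableHiveSupport() \
    .getOrCreate()

spark.sql("SHOW DATABASES").show()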

In my tests with Oozie, I had to add the Hive-related jars that Spark needs. Try adding the same jars to Spark's configuration; maybe that helps. A rough sketch of what that could look like is below.
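
This sketch assumes you launch a standalone script with plain python, so the JVM is started by getOrCreate(); the jar paths are hypothetical placeholders, not the exact jars your build needs. In an already-running pyspark shell, pass them with --jars on the command line instead.

from pyspark.sql import SparkSession

# spark.jars is only honored if it is set before the JVM starts, so this
# works from a standalone script; the paths are hypothetical placeholders.
spark = SparkSession.builder \
    .config("spark.jars", "/opt/jars/hive-exec.jar,/opt/jars/hive-metastore.jar") \
    .enableHiveSupport() \
    .getOrCreate()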

@JamesZ I have attached the full error. Please help me, I am stuck.

Hello! I have the same problem. If you have solved yours, please write up the solution. If anyone has a solution, please post it here.