Apache Spark: unable to read a Parquet file locally in Spark

Tags: apache-spark, pyspark, apache-spark-sql, spark-dataframe, pyspark-sql

I am running PySpark locally and trying to read a Parquet file from a notebook and load it into a DataFrame:

df = spark.read.parquet("metastore_db/tmp/userdata1.parquet")

I get this exception:

An error occurred while calling o738.parquet.
: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;

Does anyone know how to fix this?

Assuming you are running Spark locally, you should do the following:

df = spark.read.parquet("file:///metastore_db/tmp/userdata1.parquet")