Apache Spark java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation when using Phoenix in PySpark

I am trying to write a Spark DataFrame to HBase using Phoenix, and the save fails with the error shown below. Any idea what is going wrong?
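
For context, the write that fails looks roughly like the sketch below. This is a minimal reconstruction, not the exact code from the job: the SparkSession setup, the sample DataFrame, the table name, and the zkUrl value are placeholder assumptions; the format string and the table/zkUrl options are the ones documented for the phoenix-spark connector.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("phoenix-write").getOrCreate()

# Placeholder data; the real job builds its own DataFrame.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["ID", "COL1"])

# The .save() here is the o102.save that py4j reports in the error below.
df.write \
    .format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", "MY_TABLE") \
    .option("zkUrl", "zkhost:2181") \
    .save()

The error below is thrown inside that save() call, at the point where Spark hands the DataFrame to Phoenix's DefaultSource.createRelation.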

An error occurred while calling o102.save.
: java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
    at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:471)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:50)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:609)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:233)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:280)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)

Traceback (most recent call last):
  File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/pyspark.zip/pyspark/sql/readwriter.py", line 593, in save
    self._jwrite.save()
  File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application