PySpark Databricks MQTT stream AbstractMethodError

I am trying to create a SQL structured stream from an MQTT broker:

test = spark.readStream.format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider") \
    .option("clientId", "experiment") \
    .option("brokerUrl", "tcp://:1883") \
    .option("topic", "#") \
    .option("QoS", 0) \
    .option("connectionTimeout", 0) \
    .option("keepAlive", 5) \
    .option("autoReconnect", True) \
    .option("persistence", "memory") \
    .load()
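
As a sanity check on the source (a minimal sketch using only the standard structured-streaming DataFrame API; nothing here is specific to the Bahir connector):

# Confirm the DataFrame is registered as a streaming source and inspect the
# columns the MQTT source exposes before attaching a sink.
print(test.isStreaming)
test.printSchema()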
These commands seem to run fine, but when I try to write the stream I get an error:

PARQUET_PATH = "/tmp"
test.writeStream.format("parquet") \
    .option("checkpointLocation", PARQUET_PATH + "/_chk") \
    .start(PARQUET_PATH)
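
For reference, a sketch of the same write with the query handle captured, so the failure can be inspected through the standard StreamingQuery API (the variable name query is mine):

# Same sink, but keeping a handle on the query so its state can be inspected.
query = test.writeStream.format("parquet") \
    .option("checkpointLocation", PARQUET_PATH + "/_chk") \
    .start(PARQUET_PATH)

query.status        # current state of the query
query.lastProgress  # metrics of the most recent micro-batch, if any
query.exception()   # the StreamingQueryException once the query has failed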
java.lang.AbstractMethodError: org.apache.bahir.sql.streaming.mqtt.MQTTStreamSource.planInputPartitions()Ljava/util/List;
Has anyone run into this error? I have tried everything I could think of, with no success.

I am using Spark 2.4.0, Scala 2.11 and org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.4.0-SNAPSHOT.
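
For completeness, this is roughly how those connector coordinates would be supplied when building a session outside Databricks (a minimal sketch; on Databricks the library is attached to the cluster instead, and the snapshot repository URL and app name below are assumptions):

from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("mqtt-experiment")  # hypothetical app name
         # Resolve the Bahir connector at launch time; the repository URL for
         # the -SNAPSHOT artifact is an assumption.
         .config("spark.jars.packages",
                 "org.apache.bahir:spark-sql-streaming-mqtt_2.11:2.4.0-SNAPSHOT")
         .config("spark.jars.repositories",
                 "https://repository.apache.org/snapshots")
         .getOrCreate())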