Kafka Connect HDFS sink: Hive integration throws a create-table exception


I am trying to insert data into HDFS, which works fine, but when I enable Hive integration so the data is visible in Hive, I get a URI-related error.

I tried using store.url instead of hdfs.url, but that failed with a NullPointerException.

My hdfs-sink.json configuration:

"connector.class": "io.confluent.connect.hdfs3.Hdfs3SinkConnector",
"tasks.max": "1",
"topics": "users",
"hdfs.url": "hdfs://192.168.1.221:9000",
"flush.size": "5",
"key.converter": "org.apache.kafka.connect.storage.StringConverter",
"value.converter": "io.confluent.connect.avro.AvroConverter",
"confluent.topic.bootstrap.servers": "localhost:9092",
"confluent.topic.replication.factor": "1",
"key.converter.schema.registry.url":"http://localhost:8081" ,
"value.converter.schema.registry.url":"http://localhost:8081",
"hive.integration":"true",
"hive.metastore.uris":"thrift://192.168.1.221:9083",
"schema.compatibility":"BACKWARD"
I get the following error:

[2019-09-12 15:12:33,533] ERROR Creating Hive table threw unexpected error (io.confluent.connect.hdfs3.TopicPartitionWriter)
io.confluent.connect.storage.errors.HiveMetaStoreException: Hive MetaStore exception
    at io.confluent.connect.storage.hive.HiveMetaStore.doAction(HiveMetaStore.java:99)
    at io.confluent.connect.storage.hive.HiveMetaStore.createTable(HiveMetaStore.java:223)
    at io.confluent.connect.hdfs3.avro.AvroHiveUtil.createTable(AvroHiveUtil.java:52)
    at io.confluent.connect.hdfs3.DataWriter$3.createTable(DataWriter.java:285)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$1.call(TopicPartitionWriter.java:796)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$1.call(TopicPartitionWriter.java:792)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: MetaException(message:java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: hdfs://192.168.1.221:9000./null/topics/users)
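The literal "null" in the failing path suggests that an unset property (store.url, going by the fix below) is being stringified while the Hive table location is built, producing a URI that then fails to parse. A rough Python reconstruction of that failure mode (my guess at the mechanism, not the connector's actual code):

```python
# Reconstruction (an assumption, not the connector's real code) of how
# an unset store.url could surface as the literal "null" inside the
# Hive table location. Java renders a null reference concatenated into
# a string as "null".
store_url = None  # store.url missing from the sink config

def java_str(value) -> str:
    """Mimic Java string concatenation: a null reference renders as 'null'."""
    return "null" if value is None else str(value)

location = f"hdfs://192.168.1.221:9000./{java_str(store_url)}/topics/users"
print(location)  # hdfs://192.168.1.221:9000./null/topics/users
```

The resulting string matches the path in the MetaException above, which java.net.URI rejects as a relative path inside an absolute URI.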


It is followed by this error:

ERROR Altering Hive schema threw unexpected error (io.confluent.connect.hdfs3.TopicPartitionWriter)
io.confluent.connect.storage.errors.HiveMetaStoreException: Hive table not found: default.users
    at io.confluent.connect.storage.hive.HiveMetaStore$9.call(HiveMetaStore.java:297)
    at io.confluent.connect.storage.hive.HiveMetaStore$9.call(HiveMetaStore.java:290)
    at io.confluent.connect.storage.hive.HiveMetaStore.doAction(HiveMetaStore.java:97)
    at io.confluent.connect.storage.hive.HiveMetaStore.getTable(HiveMetaStore.java:303)
    at io.confluent.connect.hdfs3.avro.AvroHiveUtil.alterSchema(AvroHiveUtil.java:61)
    at io.confluent.connect.hdfs3.DataWriter$3.alterSchema(DataWriter.java:290)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$2.call(TopicPartitionWriter.java:811)
    at io.confluent.connect.hdfs3.TopicPartitionWriter$2.call(TopicPartitionWriter.java:807)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Has anyone faced the same issue?

Solved: adding store.url to the sink.json file (in addition to hdfs.url) fixed it. Thanks.
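Based on that fix, the working configuration presumably includes both hdfs.url and store.url; a sketch, assuming store.url points at the same NameNode URI as hdfs.url:

```json
{
  "connector.class": "io.confluent.connect.hdfs3.Hdfs3SinkConnector",
  "tasks.max": "1",
  "topics": "users",
  "hdfs.url": "hdfs://192.168.1.221:9000",
  "store.url": "hdfs://192.168.1.221:9000",
  "flush.size": "5",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "confluent.topic.bootstrap.servers": "localhost:9092",
  "confluent.topic.replication.factor": "1",
  "key.converter.schema.registry.url": "http://localhost:8081",
  "value.converter.schema.registry.url": "http://localhost:8081",
  "hive.integration": "true",
  "hive.metastore.uris": "thrift://192.168.1.221:9083",
  "schema.compatibility": "BACKWARD"
}
```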