
PutHiveStreaming processor NPE in NiFi


I'm debugging a custom HiveProcessor modeled on the official PutHiveStreaming processor, except that it writes to Hive 2.x instead of 3.x. The flow runs in a NiFi 1.7.1 cluster. Despite this exception, the data still gets written to Hive.

The exception is:


java.lang.NullPointerException: null
    at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.getFilteredObjects(AuthorizationMetaStoreFilterHook.java:77)
    at org.apache.hadoop.hive.ql.security.authorization.plugin.AuthorizationMetaStoreFilterHook.filterDatabases(AuthorizationMetaStoreFilterHook.java:54)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabases(HiveMetaStoreClient.java:1147)
    at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.isOpen(HiveClientCache.java:471)
    at sun.reflect.GeneratedMethodAccessor1641.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:169)
    at com.sun.proxy.$Proxy308.isOpen(Unknown Source)
    at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:270)
    at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:558)
    at org.apache.hive.hcatalog.streaming.AbstractRecordWriter.<init>(AbstractRecordWriter.java:95)
    at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:82)
    at org.apache.hive.hcatalog.streaming.StrictJsonWriter.<init>(StrictJsonWriter.java:60)
    at org.apache.nifi.util.hive.HiveWriter.lambda$getRecordWriter$0(HiveWriter.java:91)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.nifi.util.hive.HiveWriter.getRecordWriter(HiveWriter.java:91)
    at org.apache.nifi.util.hive.HiveWriter.<init>(HiveWriter.java:75)
    at org.apache.nifi.util.hive.HiveUtils.makeHiveWriter(HiveUtils.java:46)
    at org.apache.nifi.processors.hive.PutHive2Streaming.makeHiveWriter(PutHive2Streaming.java:1152)
    at org.apache.nifi.processors.hive.PutHive2Streaming.getOrCreateWriter(PutHive2Streaming.java:1065)
    at org.apache.nifi.processors.hive.PutHive2Streaming.access$1000(PutHive2Streaming.java:114)
    at org.apache.nifi.processors.hive.PutHive2Streaming$1.lambda$process$2(PutHive2Streaming.java:858)
    at org.apache.nifi.processor.util.pattern.ExceptionHandler.execute(ExceptionHandler.java:127)
    at org.apache.nifi.processors.hive.PutHive2Streaming$1.process(PutHive2Streaming.java:855)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2211)
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:2179)
    at org.apache.nifi.processors.hive.PutHive2Streaming.onTrigger(PutHive2Streaming.java:808)
    at org.apache.nifi.processors.hive.PutHive2Streaming.lambda$onTrigger$4(PutHive2Streaming.java:672)
    at org.apache.nifi.processor.util.pattern.PartialFunctions.onTrigger(PartialFunctions.java:114)
    at org.apache.nifi.processor.util.pattern.RollbackOnFailure.onTrigger(RollbackOnFailure.java:184)
    at org.apache.nifi.processors.hive.PutHive2Streaming.onTrigger(PutHive2Streaming.java:672)
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1165)
    at org.apache.nifi.controller.tasks.ConnectableTask.invoke(ConnectableTask.java:203)
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:117)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
I'd also like to reproduce the error locally. Would a unit test using TestRunners.newTestRunner(processor) be able to catch it? I've been referring to the test cases for Hive 3.x.
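For what it's worth, a minimal sketch of such a test might look like the following. It assumes nifi-mock is on the classpath, that the custom processor class is named PutHive2Streaming (as the stack trace suggests), and that it reuses the official PutHiveStreaming property names; all of these are assumptions to adjust for the real processor.

```java
// Sketch only: requires nifi-mock and the custom processor on the classpath.
// The property names below mirror the official PutHiveStreaming processor
// and are assumptions; check your processor's PropertyDescriptor names.
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;

public class PutHive2StreamingReproTest {
    public static void main(String[] args) {
        TestRunner runner = TestRunners.newTestRunner(new PutHive2Streaming());
        // Point at whatever Hive 2.x metastore you have reachable; without
        // one, writer creation fails before the NPE path is even exercised.
        runner.setProperty("hive-stream-metastore-uri", "thrift://localhost:9083");
        runner.setProperty("hive-stream-database-name", "default");
        runner.setProperty("hive-stream-table-name", "test_table");
        runner.enqueue("{\"id\": 1}".getBytes());
        runner.run();
        // If reproduced, the NullPointerException from
        // AuthorizationMetaStoreFilterHook shows up in the mock logger
        // output rather than failing the run, since the processor logs it.
    }
}
```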

An alternative is to run Hive 2.x and NiFi containers locally, but then I have to run docker cp to copy the NAR built by mvn into the container and attach a remote JVM debugger from IntelliJ, as described in this blog post.


Has anyone done something similar? Or is there an easier way to debug a custom processor?

This error is a red herring: there is an issue on the Hive side where it cannot get its own IP address or hostname, so it emits this error periodically. I don't believe it causes any actual problems, though; as you said, the data does get written to Hive.

For completeness: in Apache NiFi, PutHiveStreaming is built against Hive 1.2.x, not Hive 2.x. There is currently no Hive 2.x-specific processor, and we never determined whether the Hive 1.2.x processors are compatible with Hive 2.x.

For debugging, if you can run Hive in a container and expose the metastore port (9083 is the default, I believe), you should be able to create an integration test using something like TestRunners and run NiFi locally from your IDE. That's how other integration tests against external systems (such as MongoDB or Elasticsearch) are done.
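As a sketch of that integration-test pattern: assuming a Hive 2.x metastore container is already listening on localhost:9083, and reusing the hypothetical processor class and property names from the question, the test can assert on the routing outcome instead of just logging.

```java
// Integration-test sketch: assumes nifi-mock on the classpath and a Hive 2.x
// metastore container already exposing port 9083 on localhost. Class and
// property names are assumptions modeled on PutHiveStreaming.
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;

public class PutHive2StreamingIT {
    public static void main(String[] args) {
        TestRunner runner = TestRunners.newTestRunner(new PutHive2Streaming());
        runner.setProperty("hive-stream-metastore-uri", "thrift://localhost:9083");
        runner.setProperty("hive-stream-database-name", "default");
        runner.setProperty("hive-stream-table-name", "test_table");
        runner.enqueue("{\"id\": 1}".getBytes());
        runner.run();
        // With a real metastore behind the port, a successful write should
        // route the flowfile to the success relationship; a failure here
        // points at configuration rather than the benign NPE in the logs.
        runner.assertAllFlowFilesTransferred("success", 1);
    }
}
```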

There is a MiniHS2 class in the Hive test suite meant for integration testing, but it isn't in a published artifact, so unfortunately we're stuck running tests against a real Hive instance.