Py4JError: org.apache.flink.table.catalog.hive.HiveCatalog does not exist in the JVM


I am trying to run the flink catalog example from the following url:

from pyflink.table import *
from pyflink.table.catalog import HiveCatalog, CatalogDatabase, ObjectPath, CatalogBaseTable
from pyflink.table.descriptors import Kafka

settings = EnvironmentSettings.new_instance().in_batch_mode().use_blink_planner().build()
t_env = BatchTableEnvironment.create(environment_settings=settings)
catalog = HiveCatalog("myhive", None, "/opt/apache/hive/conf")

But I get these errors:

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/home/sean/code/play_with_data/venv_py36/lib/python3.6/site-packages/py4j/java_gateway.py", line 1188, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/sean/code/play_with_data/venv_py36/lib/python3.6/site-packages/py4j/java_gateway.py", line 1014, in send_command
    response = connection.send_command(command)
  File "/home/sean/code/play_with_data/venv_py36/lib/python3.6/site-packages/py4j/java_gateway.py", line 1193, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving
---------------------------------------------------------------------------
Py4JError                                 Traceback (most recent call last)
<ipython-input-2-bfdc1a237737> in <module>
      3 
      4 # Create a HiveCatalog
----> 5 catalog = HiveCatalog("myhive", None,  "/opt/apache/hive/conf")

~/code/play_with_data/venv_py36/lib/python3.6/site-packages/pyflink/table/catalog.py in __init__(self, catalog_name, default_database, hive_conf_dir, j_hive_catalog)
    994 
    995         if j_hive_catalog is None:
--> 996             j_hive_catalog = gateway.jvm.org.apache.flink.table.catalog.hive.HiveCatalog(
    997                 catalog_name, default_database, hive_conf_dir)
    998         super(HiveCatalog, self).__init__(j_hive_catalog)

~/code/play_with_data/venv_py36/lib/python3.6/site-packages/py4j/java_gateway.py in __getattr__(self, name)
   1625                 answer[proto.CLASS_FQN_START:], self._gateway_client)
   1626         else:
-> 1627             raise Py4JError("{0} does not exist in the JVM".format(new_fqn))
   1628 
   1629 

Py4JError: org.apache.flink.table.catalog.hive.HiveCatalog does not exist in the JVM


${FLINK_HOME}/lib:

lib
├── flink-connector-jdbc_2.11-1.11.2.jar
├── flink-csv-1.11.2.jar
├── flink-dist_2.12-1.11.2.jar
├── flink-hadoop-compatibility_2.12-1.11.2.jar
├── flink-json-1.11.2.jar
├── flink-shaded-zookeeper-3.4.14.jar
├── flink-sql-connector-hive-3.1.2_2.12-1.11.2.jar
├── flink-table_2.12-1.11.2.jar
├── flink-table-blink_2.12-1.11.2.jar
├── hive-common-3.1.2.jar -> /opt/apache/hive/lib/hive-common-3.1.2.jar
├── hive-exec-3.1.2.jar -> /opt/apache/hive/lib/hive-exec-3.1.2.jar
├── libfb303-0.9.3.jar -> /opt/apache/hive/lib/libfb303-0.9.3.jar
├── log4j-1.2-api-2.12.1.jar
├── log4j-api-2.12.1.jar
├── log4j-core-2.12.1.jar
├── log4j-slf4j-impl-2.12.1.jar
└── postgresql-42.2.18.jar
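
A listing like the one above can be checked programmatically. The sketch below (with a hypothetical helper name, `find_hive_connector`) lists any flink-sql-connector-hive jars in a given lib directory; note that a PyFlink wheel installed in a virtualenv may load jars from its own bundled lib directory under site-packages/pyflink rather than from ${FLINK_HOME}/lib, so it can be worth checking both locations.

```python
import glob
import os


def find_hive_connector(lib_dir):
    """Return the names of any flink-sql-connector-hive jars found in lib_dir."""
    pattern = os.path.join(lib_dir, "flink-sql-connector-hive-*.jar")
    return sorted(os.path.basename(p) for p in glob.glob(pattern))


# Example: check the lib directory under $FLINK_HOME (empty result if unset).
flink_lib = os.path.join(os.environ.get("FLINK_HOME", ""), "lib")
print(flink_lib, find_hive_connector(flink_lib))
```

If the jar shows up under ${FLINK_HOME}/lib but not under the directory the Python process actually uses, that mismatch would explain the class not being visible to the JVM started by py4j.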
What is wrong here? It looks like a library is missing, but I cannot work out which one.
