Apache Flink PyFlink 1.12.0: error when converting a Table to a DataStream via to_append_stream (Java API: toAppendStream)

Tags: apache-flink, flink-streaming, pyflink

Thanks in advance for your help.

Code:

from pyflink.common.typeinfo import RowTypeInfo, Types, BasicTypeInfo, TupleTypeInfo
from pyflink.table import EnvironmentSettings, StreamTableEnvironment

# create the environment in streaming mode
env_settings_stream = EnvironmentSettings.new_instance().use_blink_planner().in_streaming_mode().build()
env_stream = StreamTableEnvironment.create(environment_settings=env_settings_stream)

table1 = env_stream.from_elements([(1, 23.4, 'lili'), (2, 33.4, 'er'), (3, 45.6, 'yu')], ['id', 'order_amt', 'name'])
table2 = env_stream.from_elements([(1, 43.4, 'xixi'), (2, 53.4, 'rr'), (3, 65.6, 'ww')], ['id2', 'order_amt2', 'name'])

# types: List[TypeInformation], field_names: List[str]
# row_type_info = RowTypeInfo([BasicTypeInfo.STRING_TYPE_INFO(), BasicTypeInfo.FLOAT_TYPE_INFO(), BasicTypeInfo.STRING_TYPE_INFO()], ['id', 'order_amt', 'name'])
row_type_info = TupleTypeInfo([BasicTypeInfo.STRING_TYPE_INFO(), BasicTypeInfo.FLOAT_TYPE_INFO(), BasicTypeInfo.STRING_TYPE_INFO()])
stream = env_stream.to_append_stream(table1, row_type_info)
Error message:

Traceback (most recent call last):
  File "/Users/hulc/anaconda3/envs/myenv_3_6/lib/python3.6/site-packages/pyflink/util/exceptions.py", line 147, in deco
    return f(*a, **kw)
  File "/Users/hulc/anaconda3/envs/myenv_3_6/lib/python3.6/site-packages/py4j/protocol.py", line 332, in get_return_value
    format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling o4.toAppendStream. Trace:
org.apache.flink.api.python.shaded.py4j.Py4JException: Method toAppendStream([class org.apache.flink.table.api.internal.TableImpl, class org.apache.flink.api.java.typeutils.TupleTypeInfo]) does not exist
    at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
    at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
    at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:274)
    at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
    at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)
Environment:

  • Apache Flink 1.12.0 (PyFlink)
  • py4j 0.10.8.1 (installed automatically as a dependency of `pip3 install apache-flink`)
  • Python 3.7 (Anaconda)
  • PyCharm 2020.1.1
  • macOS 11.1
  • Debug info screenshot 1: (image not included)
  • Debug info screenshot 2: (image not included)

Steps to reproduce:

  • In the same environment, run the code locally (local mode).
  • Set a breakpoint on the line: stream = env_stream.to_append_stream(table1, row_type_info)
  • Run in debug mode; the breakpoint fires twice. On the first pass the toAppendStream method is not found, on the second pass it is found, but the exception is raised on the first pass.