
Apache Flink: creating a Flink DataStream from a Postgres table


I am trying to process a high-volume data stream (source = Kinesis stream) and write it into a Postgres database. While doing this, I first need to join the incoming stream with some master data that already exists in the Postgres database.

I am creating a keyed stream from the incoming Kinesis stream, and a second stream from a JDBC catalog using the Flink Table API. I have set up my database sink as follows:

  public class PosgresSink extends RichSinkFunction<Clazz> implements CheckpointedFunction, CheckpointListener { .. }

  final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
  EnvironmentSettings bsSettings = EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
  StreamTableEnvironment bsTableEnv = StreamTableEnvironment.create(env, bsSettings);

  // Register a JDBC catalog
  static JdbcCatalog registerJdbcCatalog(StreamTableEnvironment bsTableEnv) {
    String name = "<>";
    String defaultDatabase = "<>";
    String username = "<>";
    String password = "<>";
    String baseUrl = "jdbc:postgresql://localhost:5432/";

    JdbcCatalog jdbcCatalog = new JdbcCatalog(name, defaultDatabase, username, password, baseUrl);
    bsTableEnv.registerCatalog("catalogName", jdbcCatalog);
    bsTableEnv.useCatalog("catalogName");
    return jdbcCatalog;
  }

  // get the table
  Table table = bsTableEnv.from("table");

  // create a data stream from the table (convert the rows, not the Table type itself)
  DataStream<Row> myStream = bsTableEnv.toAppendStream(table, Row.class);
This stalls my sink, because every checkpoint gets aborted.

My JDBC source seems to finish very early; when Flink then tries to checkpoint, it finds tasks that are no longer running and aborts the checkpoint. Flink appears to have a limitation that it only takes checkpoints while all operators/tasks are still running.
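If upgrading is an option, newer Flink releases (1.14+) can keep checkpointing after some tasks have finished. A minimal sketch of enabling this, assuming Flink 1.14+ (the config key below is the documented one for 1.14, where the feature was opt-in; verify it against your version):

```java
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointAfterFinishExample {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // Flink 1.14+: allow checkpoints to continue once the bounded
        // JDBC source tasks have finished (on by default from 1.15).
        conf.setString(
            "execution.checkpointing.checkpoints-after-tasks-finish.enabled",
            "true");
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment(conf);
        env.enableCheckpointing(10_000); // checkpoint every 10 seconds
        // ... build the rest of the pipeline here ...
    }
}
```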

I am setting up the JDBC stream with the same code shown above.

Is this understanding correct? Is there a way around this?
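One way to avoid a finishing source entirely is to read the Postgres master data through a JDBC lookup join rather than a bounded scan, so enrichment happens per record and no source task ever completes. A hedged sketch, assuming flink-connector-jdbc is on the classpath and the Kinesis-backed table `events` has a processing-time attribute `proc_time`; the table and column names here are hypothetical:

```java
// Register the Postgres master data as a JDBC table usable in lookup joins.
bsTableEnv.executeSql(
    "CREATE TEMPORARY TABLE master_data (" +
    "  id STRING," +
    "  attr STRING," +
    "  PRIMARY KEY (id) NOT ENFORCED" +
    ") WITH (" +
    "  'connector' = 'jdbc'," +
    "  'url' = 'jdbc:postgresql://localhost:5432/<>'," +
    "  'table-name' = 'master_data'," +
    "  'username' = '<>'," +
    "  'password' = '<>'" +
    ")");

// Enrich each incoming event by looking up its master-data row at
// processing time; Postgres is queried per key instead of being scanned
// once by a task that then finishes.
Table enriched = bsTableEnv.sqlQuery(
    "SELECT e.*, m.attr " +
    "FROM events AS e " +
    "JOIN master_data FOR SYSTEM_TIME AS OF e.proc_time AS m " +
    "ON e.id = m.id");
```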

How are you setting up the stream that uses JDBC? @DavidAnderson - I have updated the question with the steps for setting up the stream that uses JDBC.