Apache Flink: could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in the classpath


I am using Flink 1.12, and I have the following simple code snippet that demonstrates the integration of Table and DataSet.

When I run the application, it fails with the following exception:

Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.

Reason: Required context properties mismatch.

The following properties are requested:
connector=filesystem
format=csv
path=D:/stock.csv
schema.0.data-type=VARCHAR(2147483647)
schema.0.name=key
schema.1.data-type=VARCHAR(2147483647)
schema.1.name=date
schema.2.data-type=DOUBLE
schema.2.name=price

The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
org.apache.flink.connector.jdbc.table.JdbcTableSourceSinkFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:96)
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:46)
    ... 76 more
The code is:

  test("batch test 3") {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tenv = BatchTableEnvironment.create(env)
    val ddl =
      """
        |create table sourceTable (
        |  key STRING,
        |  deal_date STRING,
        |  price DOUBLE
        |) with (
        |  'connector' = 'filesystem',
        |  'path' = 'D:/stock.csv',
        |  'format' = 'csv'
        |)
        |""".stripMargin
    tenv.executeSql(ddl)

    val table = tenv.sqlQuery(
      """
        |select key, deal_date from sourceTable
        |""".stripMargin)

    // Convert the table to a DataSet
    tenv.toDataSet[(String, String)](table).print()

    env.execute()
  }
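For context, the new-style `'connector' = 'filesystem'` option in the DDL is resolved through the Blink planner's dynamic table factories, whereas the legacy `BatchTableEnvironment` (old planner, DataSet-based) only consults the legacy factories listed in the exception, none of which matches these properties. A minimal sketch of running the same DDL in batch mode on Flink 1.12 with the Blink planner, assuming `flink-table-planner-blink` is on the classpath (note that the unified `TableEnvironment` has no `toDataSet`, so results are consumed through the Table API instead):

```scala
import org.apache.flink.table.api.{EnvironmentSettings, TableEnvironment}

// Sketch, not the asker's code: create a batch-mode TableEnvironment backed by
// the Blink planner, which knows how to resolve 'connector' = 'filesystem'.
val settings = EnvironmentSettings.newInstance()
  .useBlinkPlanner()
  .inBatchMode()
  .build()
val tenv = TableEnvironment.create(settings)

tenv.executeSql(
  """
    |create table sourceTable (
    |  key STRING,
    |  deal_date STRING,
    |  price DOUBLE
    |) with (
    |  'connector' = 'filesystem',
    |  'path' = 'D:/stock.csv',
    |  'format' = 'csv'
    |)
    |""".stripMargin)

// Consume the result via the Table API rather than converting to a DataSet.
tenv.executeSql("select key, deal_date from sourceTable").print()
```

This is a configuration sketch that requires the Flink 1.12 table jars at runtime; it is not runnable standalone.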