Scala Flink-NiFi connector


I need some help with getting data from an output NiFi port into Flink using Scala code.

I am stuck at the .addSource() function: it asks for an extra type parameter ([OUT]), but whenever I supply it I get an error. The Scala code and the error messages are below.

package flinkTest

import java.nio.charset.{Charset, StandardCharsets}

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment
import org.apache.flink.streaming.api.scala.DataStream
import org.apache.flink.streaming.connectors.nifi.NiFiSource
import org.apache.flink.streaming.api.functions.source.SourceFunction
import org.apache.flink.streaming.connectors.nifi.NiFiDataPacket

import org.apache.nifi.remote.client.{SiteToSiteClient, SiteToSiteClientConfig}

object NifiFlow {
  def main(): Unit = {

    // get the execution environment
    val env: StreamExecutionEnvironment = 
    StreamExecutionEnvironment.getExecutionEnvironment

    // get input data by connecting to NiFi
    val clientConfig: SiteToSiteClientConfig = new SiteToSiteClient.Builder()
      .url("http://localhost:8080/nifi")
      .portName("Data to flink")
      .requestBatchCount(2)
      .buildConfig()

    val nifiSource: SourceFunction[NiFiDataPacket] = new NiFiSource(clientConfig)

    val streamSource: DataStream[NiFiDataPacket] = 
    env.addSource(nifiSource).setParallelism(2)

    val dataStream = streamSource.map(dataPacket => new String(dataPacket.getContent, StandardCharsets.UTF_8))

    dataStream.print()

    env.execute()
  }
}
1) With [OUT]:

Error:(28, 76) value nifiSource of type org.apache.flink.streaming.api.functions.source.SourceFunction[org.apache.flink.streaming.connectors.nifi.NiFiDataPacket] does not take type parameters.
    val streamSource: DataStream[NiFiDataPacket] = env.addSource(nifiSource[NiFiDataPacket]).setParallelism(2)
2) Without [OUT]:

Error:(28, 66) type mismatch;
 found   : org.apache.flink.streaming.api.functions.source.SourceFunction[org.apache.flink.streaming.connectors.nifi.NiFiDataPacket]
 required: org.apache.flink.streaming.api.function.source.SourceFunction[?]
    val streamSource: DataStream[NiFiDataPacket] = env.addSource(nifiSource).setParallelism(2)
The example was rewritten into Scala.

I would appreciate your advice.

UPD2: the modified code and the full error it produces are appended at the end of the post.


Answer:

There is a special implementation of the execution environment for Scala (org.apache.flink.streaming.api.scala.StreamExecutionEnvironment). Just use it instead of org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.

env.addSource(nifiSource) only works with the preceding setting env.getJavaEnv.getConfig.disableClosureCleaner().

Probably, the Scala sources in this open-source project should be updated a bit (they live in flink-scala_2.11…jar).
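Putting the two points together, the suggestion would look roughly like the sketch below. This is a minimal sketch, not the poster's code: it assumes the Flink Scala streaming API and the NiFi connector are on the classpath, and whether the implicit type extraction accepts the NiFiDataPacket interface depends on the Flink version, as the UPD2 error at the end of the post shows.

package flinkTest

import java.nio.charset.StandardCharsets

// Scala-specific environment and the implicit TypeInformation machinery
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.nifi.{NiFiDataPacket, NiFiSource}
import org.apache.nifi.remote.client.{SiteToSiteClient, SiteToSiteClientConfig}

object NifiFlowSketch {
  def main(args: Array[String]): Unit = {

    // Scala implementation of the execution environment, as suggested above
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    // The setting mentioned above: disable the closure cleaner on the wrapped Java config
    env.getJavaEnv.getConfig.disableClosureCleaner()

    val clientConfig: SiteToSiteClientConfig = new SiteToSiteClient.Builder()
      .url("http://localhost:8080/nifi")
      .portName("Data to flink")
      .requestBatchCount(2)
      .buildConfig()

    // No explicit [OUT]: the element type comes from the SourceFunction[NiFiDataPacket]
    val packets: DataStream[NiFiDataPacket] = env.addSource(new NiFiSource(clientConfig))

    packets
      .map(p => new String(p.getContent, StandardCharsets.UTF_8))
      .print()

    env.execute("nifi-to-flink")
  }
}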

Comments:

It didn't help. It cannot cast the type classes. Error:(37,38) type mismatch; found: org.apache.flink.streaming.api.functions.source.SourceFunction[org.apache.flink.streaming.connectors.nifi.NiFiDataPacket] required: org.apache.flink.streaming.api.source.SourceFunction[?] val streamSource = env.addSource(nifiSource)

Can you attach the whole file with the modified code and the full error?

Updated the original post. The code is too long to paste into a comment.

What is your Flink version? I see a problem with the package name of the SourceFunction class. Your StreamExecutionEnvironment wants org.apache.flink.streaming.api.function.source.SourceFunction, which I can only find in flink v0.8.1, while you need to pass org.apache.flink.streaming.api.functions.source.SourceFunction, which appears from flink v0.9 on. Note that it is .function. vs .functions. I guess you have different libraries on the classpath.
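If the mixed-libraries diagnosis is right, the usual fix is to pin every Flink artifact in the build to a single version, so that the .functions. classes and the execution environment come from the same release. A build sketch (sbt), where the artifact names and the version number are assumptions rather than something taken from the post:

// build.sbt (sketch): one version for all Flink artifacts
val flinkVersion = "1.1.4"  // assumed placeholder; use whatever single release you target

libraryDependencies ++= Seq(
  // brings org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  // brings org.apache.flink.streaming.connectors.nifi.{NiFiSource, NiFiDataPacket}
  "org.apache.flink" %% "flink-connector-nifi"  % flinkVersion
)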
UPD2 — the modified code:

package flinkTest

import org.apache.nifi.remote.client.{SiteToSiteClient, SiteToSiteClientConfig}
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.nifi._

object NifiFlow {
  def main(): Unit = {

    // get the execution environment
    val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment

    // get input data by connecting to NiFi
    val clientConfig: SiteToSiteClientConfig = new SiteToSiteClient.Builder()
      .url("http://localhost:8080/nifi")
      .portName("Data to flink")
      .requestBatchCount(2)
      .buildConfig()

    val nifiSource = new NiFiSource(clientConfig)

    val streamSource: DataStream[String] = env
      .addSource(nifiSource)
      .map(x => x.getAttributes().toString)

    env.execute()
  }
}
And the error:

Connected to the target VM, address: '127.0.0.1:41218', transport: 'socket'
Exception in thread "main" org.apache.flink.api.common.functions.InvalidTypesException: Interfaces and abstract classes are not valid types: interface org.apache.flink.streaming.connectors.nifi.NiFiDataPacket
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:871)
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:863)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:406)
    at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:197)
    at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfo(TypeExtractor.java:184)
    at flinkTest.NifiFlow$.main(NiFiFlow.scala:23)