Logging: Events from a Netcat source don't go through the Kafka channel

Tags: logging, apache-kafka, flume

I am collecting external data through a Flume agent. The external data arrives in batches of roughly 1 MB about every 10 seconds. I configured the Flume agent as follows:

# Flume agent configuration as /flume/conf/agent.conf
agent.sources = netcat-source
agent.channels = kafka-channel
agent.sinks = logger-sink

########################################
#   Netcat Source
########################################

agent.sources.netcat-source.type = netcat
agent.sources.netcat-source.bind = 0.0.0.0
agent.sources.netcat-source.port = 4141
agent.sources.netcat-source.max-line-length = 500000
agent.sources.netcat-source.channels = kafka-channel

########################################
#   Kafka Channel
########################################

agent.channels.kafka-channel.type = org.apache.flume.channel.kafka.KafkaChannel
agent.channels.kafka-channel.brokerList = 10.212.136.108:9092,10.212.136.108:9092
agent.channels.kafka-channel.zookeeperConnect = 10.212.136.108:2181,10.212.136.108:2181/kafka
agent.channels.kafka-channel.topic = channel
agent.channels.kafka-channel.groupId = fcd-group


########################################
#   Logger Sink
########################################

agent.sinks.logger-sink.type = logger
agent.sinks.logger-sink.channel = kafka-channel
I started the agent with:

flume-ng agent -n agent -c /flume/conf -f /flume/conf/agent.conf 
Unfortunately, it turns out the netcat source works fine, but something is wrong with the channel or the sink. From Ubuntu's resource monitor I can see the network performance shown below. [Resource monitor graph not preserved in this page.] Since no other application was generating network I/O, I am confident the graph reflects the Flume agent's traffic.

When I inspect the Kafka topic "channel" with the console consumer, I get nothing. Likewise, when I check flume.log, I only see Flume's status output and no data.

I have verified that data actually arrives on the port by capturing it with:

nc -lk 4141 >> my_data_check_file
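To see exactly what bytes arrive, it can help to inspect the tail of the capture file directly (a diagnostic sketch; `sample_capture` is a hypothetical stand-in for `my_data_check_file`):

```shell
# Create a stand-in capture file (replace with the real my_data_check_file)
printf 'batch-of-data' > sample_capture

# Dump the last few bytes as characters; this reveals whether the stream
# ends with a line terminator or just raw payload bytes
tail -c 4 sample_capture | od -An -c
```

With `od -An -c` each byte is printed as a character, so control characters such as `\n` become visible instead of silently ending a line.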
So what is wrong with my channel or sink?


P.S. When I use a memory channel or a file channel instead, the behavior is just as puzzling.

Ah, I finally solved the problem myself.

The key point is the line delimiter '\n'.

In the Flume source file NetcatSource.java there is a tricky piece of code:

private int processEvents(CharBuffer buffer, Writer writer) throws IOException {
  int numProcessed = 0;

  boolean foundNewLine = true;
  while (foundNewLine) {
    foundNewLine = false;

    int limit = buffer.limit();
    for (int pos = buffer.position(); pos < limit; pos++) {
      if (buffer.get(pos) == '\n') {  
        // parse event body bytes out of CharBuffer
        buffer.limit(pos); // temporary limit
        ByteBuffer bytes = Charsets.UTF_8.encode(buffer);
        buffer.limit(limit); // restore limit
        // ... rest of the method elided ...
This code forces the input data to be terminated with '\n'; otherwise the channel receives no events at all. If needed, we can change this delimiter character in a custom source and put the resulting jar into $FLUME_HOME/lib.
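The delimiter behaviour is easy to demonstrate locally: `printf` does not append a newline while `echo` does, and `wc -l` counts exactly the '\n' characters that NetcatSource scans for (a minimal sketch of the delimiter rule, not of Flume itself):

```shell
# wc -l counts '\n' characters -- the same byte NetcatSource looks for.
printf 'event body' | wc -l   # no newline: Flume would buffer this forever
echo   'event body' | wc -l   # one newline: parsed as one complete event
```

Accordingly, a quick end-to-end check against the agent from the question would be `printf 'hello\n' | nc localhost 4141` (newline included), whereas `printf 'hello' | nc localhost 4141` would never show up in the Kafka topic.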