
Apache Beam counters/metrics not available in the Flink WebUI (Java)


I am using Flink 1.4.1 with Beam 2.3.0, and I would like to know whether my metrics are available in the Flink WebUI (or anywhere else), as they are in the Dataflow WebUI.

I use counters, for example:

import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Metrics;
...
Counter elementsRead = Metrics.counter(getClass(), "elements_read");
...
elementsRead.inc();

But I cannot find an "elements_read" count anywhere in the Flink WebUI (neither under Task Metrics nor under Accumulators). I thought this would be straightforward.

Once you select a job in the dashboard, you will see that job's DAG, with a list of tabs below it.

  • Click the "Task Metrics" tab
  • Click one of the boxes in the DAG
  • Click the "Add Metric" button to display that operator's metrics

  • If the pipeline runs in detached mode, querying metrics is not supported. Reference

However, I can see the metrics using the slf4j reporter.
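For reference, enabling Flink's slf4j metrics reporter is a `flink-conf.yaml` change along these lines (the 60-second interval is an illustrative choice):

```yaml
metrics.reporters: slf4j
metrics.reporter.slf4j.class: org.apache.flink.metrics.slf4j.Slf4jReporter
metrics.reporter.slf4j.interval: 60 SECONDS
```

With this in place, all registered metrics (including Beam counters) are periodically dumped into the TaskManager/JobManager logs.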



Comments:

  • Tried that, but no success; my counter does not appear in the metrics list. How did you create your Beam counters/metrics?
  • Hmm... can you see your counters in the Accumulators tab?
  • @robosoul, any progress? I am facing the same problem: all I can see are the standard metrics, with no sign of my custom ones.
  • @diegoreico, I can see the metrics in the Accumulators tab, but not in the Metrics tab. I am using Flink 1.12.0 with the latest Apache Beam master branch code.
  • @zorro, how do you see the metrics with the slf4j reporter?

In detached mode the Flink runner returns a FlinkDetachedRunnerResult, whose metrics() method simply throws:
    import java.io.IOException;

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricResults;
    import org.joda.time.Duration;

    public class FlinkDetachedRunnerResult implements PipelineResult {
    
      FlinkDetachedRunnerResult() {}
    
      @Override
      public State getState() {
        return State.UNKNOWN;
      }
    
      @Override
      public MetricResults metrics() {
        throw new UnsupportedOperationException("The FlinkRunner does not currently support metrics.");
      }
    
      @Override
      public State cancel() throws IOException {
        throw new UnsupportedOperationException("Cancelling is not yet supported.");
      }
    
      @Override
      public State waitUntilFinish() {
        return State.UNKNOWN;
      }
    
      @Override
      public State waitUntilFinish(Duration duration) {
        return State.UNKNOWN;
      }
    
      @Override
      public String toString() {
        return "FlinkDetachedRunnerResult{}";
      }
    }
    
    from apache_beam.metrics.metric import Metrics
    from apache_beam.metrics.metric import MetricsFilter
    from apache_beam.options.pipeline_options import PipelineOptions
    import apache_beam as beam
    import csv
    import logging
    
    GAME_DATA = [
    'user1_team1,team1,18,1447686663000,2015-11-16 15:11:03.921',
    'user1_team1,team1,18,1447690263000,2015-11-16 16:11:03.921',
    'user2_team2,team2,2,1447690263000,2015-11-16 16:11:03.955',
    'user3_team3,team3,8,1447690263000,2015-11-16 16:11:03.955',
    'user4_team3,team3,5,1447690263000,2015-11-16 16:11:03.959',
    'user1_team1,team1,14,1447697463000,2015-11-16 18:11:03.955',
    'robot1_team1,team1,9000,1447697463000,2015-11-16 18:11:03.955',
    'robot2_team2,team2,1,1447697463000,2015-11-16 20:11:03.955',
    'robot2_team2,team2,9000,1447697463000,2015-11-16 21:11:03.955',
    'robot1_team1,1000,2447697463000,2915-11-16 21:11:03.955',
    'robot2_team2,9000,1447697463000,2015-11-16 21:11:03.955']
    
    class ParseGameEventFn(beam.DoFn):
        def __init__(self):
            super(ParseGameEventFn, self).__init__()
            # initialize the counter once per DoFn instance, inside __init__
            self.game_events = Metrics.counter(self.__class__, 'game_events')
    
        def process(self, element, *args, **kwargs):
            try:
                self.game_events.inc()
                row = list(csv.reader([element]))[0]
                if int(row[2]) < 5:
                   return
                yield {
                    'user': row[0],
                    'team': row[1],
                    'score': int(row[2]),
                    'timestamp': int(row[3]) / 1000.0,
                }
            except Exception as ex:
                logging.error('Parse error on {}: {}'.format(element, ex))
    
    pipeline_options = PipelineOptions()  # default options suffice for this local example
    with beam.Pipeline(options=pipeline_options) as pipeline:
        results = (
            pipeline
            | "Create" >> beam.Create(GAME_DATA)
            | "Parsing" >> beam.ParDo(ParseGameEventFn())
            | "AddEventTimestamps" >> beam.Map(
                 lambda elem: beam.window.TimestampedValue(elem, elem['timestamp']))
            | "Print" >> beam.Map(print))
    
    metric_results = pipeline.result.metrics().query(MetricsFilter().with_name('game_events'))
    outputs_user_counter = metric_results['counters'][0]
    print(outputs_user_counter.committed)
    
The reporter settings below go in flink-conf.yaml to expose the metrics to Prometheus:

    metrics.reporters: prom
    metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
    metrics.reporter.prom.port: 9250-9260
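Prometheus then needs a scrape job pointed at the reporter's port range. A minimal hypothetical `prometheus.yml` fragment (the host and the concrete ports are assumptions; each Flink process binds one port from the 9250-9260 range above):

```yaml
scrape_configs:
  - job_name: 'flink'
    static_configs:
      # one target per JobManager/TaskManager process
      - targets: ['localhost:9250', 'localhost:9251']
```

Once scraped, the Beam counters appear as Prometheus time series and can be graphed outside the Flink WebUI.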