Python: How to calculate the F-score in pyspark using MulticlassMetrics?


As far as I can see in the pyspark documentation, the fMeasure() function takes two arguments, label and beta:

fMeasure(label=None, beta=None)
What is beta here?

I am using a very simple dataset in an RDD (it was in a dataframe, but I converted it to an RDD).
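For illustration only, a toy stand-in of the expected shape, an RDD of (prediction, label) pairs of doubles, which is what MulticlassMetrics expects, could be built like this (the sqlContext, the column names and the values are placeholders, not the original data):

# hypothetical example data with the shape MulticlassMetrics expects
df = sqlContext.createDataFrame(
    [(1.0, 1.0), (0.0, 1.0), (1.0, 0.0), (0.0, 0.0), (1.0, 1.0)],
    ['prediction', 'label'])
# convert the dataframe to an RDD of (prediction, label) float pairs
rdd = df.rdd.map(lambda row: (float(row.prediction), float(row.label)))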

When I run this:

multi_metrics = MulticlassMetrics(rdd)
print 'fMeasure: ', multi_metrics.fMeasure(1)
I get this error:

print 'fMeasure: ', multi_metrics.fMeasure(1)
  File "/usr/hdp/current/spark-client/python/pyspark/mllib/evaluation.py", line 259, in fMeasure
    return self.call("fMeasure", label)
  File "/usr/hdp/current/spark-client/python/pyspark/mllib/common.py", line 146, in call
    return callJavaFunc(self._sc, getattr(self._java_model, name), *a)
  File "/usr/hdp/current/spark-client/python/pyspark/mllib/common.py", line 123, in callJavaFunc
    return _java2py(sc, func(*args))
  File "/usr/hdp/current/spark-client/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 813, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/usr/hdp/current/spark-client/python/pyspark/sql/utils.py", line 45, in deco
    return f(*a, **kw)
  File "/usr/hdp/current/spark-client/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 312, in get_return_value
    format(target_id, ".", name, value))
Py4JError: An error occurred while calling o154.fMeasure. Trace:
py4j.Py4JException: Method fMeasure([class java.lang.Integer]) does not exist
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:335)
    at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:344)
    at py4j.Gateway.invoke(Gateway.java:252)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)

Spark's MulticlassMetrics implements the $F_\beta$-measure; if you set $\beta = 1$, it coincides with the traditional F-measure (F1). The $\beta$ parameter lets you weight recall against precision: $\beta > 1$ puts more emphasis on recall, while $\beta < 1$ puts more emphasis on precision.
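For reference, the standard definition, which Spark evaluates from the per-label precision and recall:

$$F_\beta = (1 + \beta^2)\cdot\frac{\mathrm{precision}\cdot\mathrm{recall}}{\beta^2\cdot\mathrm{precision} + \mathrm{recall}}$$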

Regarding the error: if you look at the implementation, it actually expects a Double. The pyspark method is only a wrapper; the actual implementation is in Scala.

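As far as I can tell from the MLlib source, the relevant Scala overloads are declared roughly like this (label, and optionally beta, as Double):

def fMeasure(label: Double): Double                // F1 for the given label
def fMeasure(label: Double, beta: Double): Double  // F-beta for the given label

Passing the Python int 1 reaches the JVM as a java.lang.Integer, which matches neither overload, hence the Py4JError above.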
So you should call it like this, for example:

multi_metrics = MulticlassMetrics(rdd)
print 'fMeasure: ', multi_metrics.fMeasure(1.0,1.0)
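For completeness, a minimal sketch of the calls, assuming rdd is an RDD of (prediction, label) pairs of doubles as above; fMeasure with only a label gives F1 for that label, and weightedFMeasure averages over all labels:

from pyspark.mllib.evaluation import MulticlassMetrics

multi_metrics = MulticlassMetrics(rdd)
print 'F1 for label 1.0:  ', multi_metrics.fMeasure(1.0)       # beta omitted -> F1 for this label
print 'F0.5 for label 1.0:', multi_metrics.fMeasure(1.0, 0.5)  # beta < 1 leans towards precision
print 'F2 for label 1.0:  ', multi_metrics.fMeasure(1.0, 2.0)  # beta > 1 leans towards recall
print 'weighted F-measure:', multi_metrics.weightedFMeasure()  # weighted average over all labels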