Scikit-learn / Keras - finding the recall value for binary classification

I'm having a bit of difficulty computing recall with sklearn. I'm working on a binary classification problem with Keras 2.0. To get the recall I need to rely on the sklearn metrics, but I get a ValueError. Here are my sample code and the error stack:

>> print(y_true[:5])
[[ 0.  1.]
 [ 0.  1.]
 [ 1.  0.]
 [ 0.  1.]
 [ 1.  0.]]

>> y_scores = model.predict(x_val)
>> print(y_scores[:5])
[[  7.00690389e-01   2.99309582e-01]
 [  9.36253404e-04   9.99063790e-01]
 [  9.99530196e-01   4.69864986e-04]
 [  6.66563153e-01   3.33436847e-01]
 [  9.98917222e-01   1.08276575e-03]]

>> from sklearn import metrics
>> recall_score(y_true, y_scores)

ValueError                                Traceback (most recent call last)
<ipython-input-39-9f93c0c66265> in <module>()
      1 from sklearn import metrics
----> 2 recall_score(y_true, y_scores)

~\AppData\Local\Continuum\miniconda3\envs\deepnetwork\lib\site-packages\sklearn\metrics\classification.py in recall_score(y_true, y_pred, labels, pos_label, average, sample_weight)
   1357                                                  average=average,
   1358                                                  warn_for=('recall',),
-> 1359                                                  sample_weight=sample_weight)
   1360     return r
   1361 

~\AppData\Local\Continuum\miniconda3\envs\deepnetwork\lib\site-packages\sklearn\metrics\classification.py in precision_recall_fscore_support(y_true, y_pred, beta, labels, pos_label, average, warn_for, sample_weight)
   1023         raise ValueError("beta should be >0 in the F-beta score")
   1024 
-> 1025     y_type, y_true, y_pred = _check_targets(y_true, y_pred)
   1026     present_labels = unique_labels(y_true, y_pred)
   1027 

~\AppData\Local\Continuum\miniconda3\envs\deepnetwork\lib\site-packages\sklearn\metrics\classification.py in _check_targets(y_true, y_pred)
     79     if len(y_type) > 1:
     80         raise ValueError("Classification metrics can't handle a mix of {0} "
---> 81                          "and {1} targets".format(type_true, type_pred))
     82 
     83     # We can't have more than one value on y_type => The set is no more needed

ValueError: Classification metrics can't handle a mix of multilabel-indicator and continuous-multioutput targets

Your y_true and y_scores are one-hot encoded, whereas recall_score expects them to be 1D arrays of class labels:

from sklearn.metrics import recall_score
import numpy as np

y_true = [[0., 1.],
          [0., 1.],
          [1., 0.],
          [0., 1.],
          [1., 0.]]
y_scores = [[7.00690389e-01, 2.99309582e-01],
            [9.36253404e-04, 9.99063790e-01],
            [9.99530196e-01, 4.69864986e-04],
            [6.66563153e-01, 3.33436847e-01],
            [9.98917222e-01, 1.08276575e-03]]

# collapse each one-hot / probability row to the index of its largest entry
yy_true = [np.argmax(i) for i in y_true]
yy_true
# [1, 1, 0, 1, 0]
yy_scores = [np.argmax(i) for i in y_scores]
yy_scores
# [0, 1, 0, 0, 0]
recall_score(yy_true, yy_scores)
# 0.33333333333333331
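
The same conversion can also be done without the list comprehensions. As a minimal sketch (assuming y_true and the output of model.predict(x_val) are NumPy arrays of shape (n_samples, 2)), np.argmax with axis=1 collapses both to 1D label arrays in a single call; the small arrays below are stand-ins for your real data:

import numpy as np
from sklearn.metrics import recall_score

# stand-ins for y_true and model.predict(x_val)
y_true = np.array([[0., 1.], [0., 1.], [1., 0.], [0., 1.], [1., 0.]])
y_scores = np.array([[0.70, 0.30], [0.001, 0.999], [0.999, 0.0005],
                     [0.67, 0.33], [0.999, 0.001]])

# argmax over the class axis yields 1D arrays of class indices
true_labels = np.argmax(y_true, axis=1)    # array([1, 1, 0, 1, 0])
pred_labels = np.argmax(y_scores, axis=1)  # array([0, 1, 0, 0, 0])

print(recall_score(true_labels, pred_labels))  # 0.3333...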

Ha, you got there first :) @MarcinMożejko Yes, I even gave a little lecture on "screenshots are not accepted", see the comments ;)