Scikit-learn: can the classifier parameters differ across the classes of a OneVsRestClassifier?
Does anyone know whether sklearn supports different parameters for the different classifiers inside a OneVsRestClassifier? For example, in this snippet I would like a different value of C for each class:
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC
text_clf = OneVsRestClassifier(LinearSVC(C=1.0, class_weight="balanced"))
Currently OneVsRestClassifier does not support using different estimator parameters, or different estimators, for different classes. Related ideas exist elsewhere, such as automatically adjusting a parameter's value per class, but that has not been extended to OneVsRestClassifier. If you need it, though, you can change the source to achieve it. Current source: as you can see there, the same estimator, self.estimator, is passed to every per-class fit. So we will make a new version of OneVsRestClassifier that changes this:
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import LabelBinarizer
from joblib import Parallel, delayed  # sklearn.externals.joblib was removed in recent versions
from sklearn.multiclass import _fit_binary  # private helper; may change between versions

class CustomOneVsRestClassifier(OneVsRestClassifier):

    # Changed `estimator` to `estimators`, which now takes a list
    def __init__(self, estimators, n_jobs=1):
        self.estimators = estimators
        self.n_jobs = n_jobs

    def fit(self, X, y):
        self.label_binarizer_ = LabelBinarizer(sparse_output=True)
        Y = self.label_binarizer_.fit_transform(y)
        Y = Y.tocsc()
        self.classes_ = self.label_binarizer_.classes_
        columns = (col.toarray().ravel() for col in Y.T)

        # This is where we change the training method:
        # each class is fitted with its own estimator from the list
        self.estimators_ = Parallel(n_jobs=self.n_jobs)(delayed(_fit_binary)(
            estimator, X, column, classes=[
                "not %s" % self.label_binarizer_.classes_[i],
                self.label_binarizer_.classes_[i]])
            for i, (column, estimator) in enumerate(zip(columns, self.estimators)))

        return self
Now you can use it:
# Make sure you add as many estimators as there are classes
# (in the binary case, only a single estimator should be used)
estimators = []

# Assuming 3 classes here
estimators.append(LinearSVC(C=1.0, class_weight="balanced"))
estimators.append(LinearSVC(C=0.1, class_weight="balanced"))
estimators.append(LinearSVC(C=10, class_weight="balanced"))

clf = CustomOneVsRestClassifier(estimators)
clf.fit(X, y)
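For reference, the same per-class-C idea can be sketched without subclassing at all, by binarizing the labels and fitting one LinearSVC per class by hand, then predicting with the argmax of the decision functions. This is a minimal self-contained sketch (the synthetic data and the three C values are arbitrary choices of mine, not from the answer above):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import LabelBinarizer
from sklearn.svm import LinearSVC

# Arbitrary 3-class toy data for illustration
X, y = make_classification(n_samples=300, n_features=10, n_informative=5,
                           n_classes=3, random_state=0)

lb = LabelBinarizer()
Y = lb.fit_transform(y)  # shape (n_samples, n_classes), one 0/1 column per class

# One C value per class, in the order of lb.classes_
per_class_C = [1.0, 0.1, 10]
estimators = [LinearSVC(C=C, class_weight="balanced").fit(X, Y[:, i])
              for i, C in enumerate(per_class_C)]

# Predict the class whose binary classifier is most confident
scores = np.column_stack([est.decision_function(X) for est in estimators])
y_pred = lb.classes_[scores.argmax(axis=1)]
print("training accuracy:", (y_pred == y).mean())
```

The trade-off is that you lose the estimator interface (no single `predict` object to drop into a pipeline), which is exactly what the subclass above preserves.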
Note: I haven't implemented partial_fit in this yet. If you plan to use it, we can work on that.