Scikit Learn: "Killed: 9" error when fitting sklearn.naive_bayes.MultinomialNB()


Tags: scikit-learn, classification, feature-extraction, text-classification, countvectorizer

I get a "Killed: 9" error when fitting a multinomial naive Bayes classifier on a TF-IDF CountVectorizer output:

import pandas as pd
import joblib
from sklearn.naive_bayes import MultinomialNB

def classify(vector, df, name='mnb'):
    # Fit a multinomial naive Bayes model on the vectorized features;
    # the labels are taken from the second column of the DataFrame.
    clf = MultinomialNB()
    model = clf.fit(vector, df.iloc[0:, 1].values)
    return model

if __name__ == "__main__":
    train, test = gen_train_test(pd.read_csv('Data/datalabel.csv'))
    vector = joblib.load('Data/tf.pkl')
    classify(vector, train, name='mnb')
    print('Program executed!')
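For reference, here is a minimal self-contained sketch of the same kind of fit on a toy corpus (the documents and labels are invented for illustration). The point is that TfidfVectorizer returns a scipy sparse matrix and MultinomialNB accepts it directly, so the feature matrix never needs to be densified:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus standing in for the real training data.
docs = ["spam spam offer", "meeting at noon", "free offer now", "lunch meeting today"]
labels = [1, 0, 1, 0]

# fit_transform returns a scipy.sparse matrix; it is passed to
# MultinomialNB as-is, without converting it to a dense array.
vec = TfidfVectorizer()
X = vec.fit_transform(docs)

clf = MultinomialNB()
clf.fit(X, labels)
print(clf.predict(vec.transform(["free spam offer"])))
```

If the 20.3 GB vector in the question is dense (or gets densified somewhere), keeping it sparse end-to-end is usually the first thing to check on an 8 GB machine.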
Here, train is 409 MB and vector is 20.3 GB in size.


I am using a 2017 MacBook Pro 13 with 8 GB RAM and a 256 GB SSD.

This simple code change solved the problem:

vector = joblib.load(open('Data/tf.pkl', 'rb'))
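When the pickled object is a plain NumPy array, load-time memory use can also be reduced with joblib's memory mapping, which leaves the data on disk and pages it in on demand. A sketch with a made-up filename (note that mmap_mode only works for arrays stored uncompressed):

```python
import numpy as np
import joblib

# Save a large-ish array uncompressed; mmap requires the raw on-disk layout.
arr = np.arange(1_000_000, dtype=np.float64)
joblib.dump(arr, '/tmp/big_array.pkl')

# mmap_mode='r' maps the array read-only from disk instead of
# copying the whole thing into RAM.
mapped = joblib.load('/tmp/big_array.pkl', mmap_mode='r')
print(mapped[:3])
```

This will not help if the stored object is a fitted vectorizer or a sparse matrix wrapper, but for raw dense arrays it can make the difference between paging and getting killed by the OS.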

Comments:

- How did you load a 20.3 GB vector into 8 GB of RAM? Is there any stack trace for the error? How do you call the script, in a terminal or in an IDE?
- @MaxU - I dump and load the vectorizer with the joblib package instead of pickle. joblib compresses the data.
- @Vivek Kumar - It only shows "Killed: 9", which is memory-related. I call the script with the python3 command from PyCharm.
- joblib compresses the data when saving it to disk, but decompresses it fully back to its original format when loading it into memory. How did you save it in the first place?
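The last comment turns on how the object was saved. With joblib, compression is opt-in via the compress argument, and a compressed file is still fully decompressed into RAM on load. A sketch with made-up filenames showing the round trip:

```python
import os
import numpy as np
import joblib

# Highly compressible toy data standing in for the real vector.
data = np.zeros((1000, 1000))

joblib.dump(data, '/tmp/raw.pkl')                 # default: no compression
joblib.dump(data, '/tmp/packed.pkl', compress=3)  # zlib, level 3

print(os.path.getsize('/tmp/raw.pkl'), os.path.getsize('/tmp/packed.pkl'))

# Loading decompresses the whole object back into memory.
restored = joblib.load('/tmp/packed.pkl')
```

So a 20.3 GB in-memory object can look much smaller on disk, which is consistent with the "Killed: 9" appearing only at load/fit time rather than at dump time.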