Overfitting in a Python DNN model
I have a dataset on which I train a DNN model. The dataset contains 398 samples and 330 features, and I used ExtraTreesClassifier() to reduce the features to 39. This is my model:
from sklearn.model_selection import train_test_split
from keras.models import Sequential
from keras.layers import Dense

X_train, X_test, y_train, y_test = train_test_split(xfinal, val_y, test_size=0.2, random_state=0)

model = Sequential()
model.add(Dense(units=20, kernel_initializer='uniform', activation='relu', input_dim=nb_features))
model.add(Dense(units=20, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=10, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=5, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=1, kernel_initializer='uniform', activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
history = model.fit(X_train, y_train, validation_data=(X_test, y_test), batch_size=32, epochs=250)
I have tried dropout, but my model is still overfitting:
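One quick way to confirm the overfitting is to compare the final training and validation accuracy recorded in the fit history. A minimal sketch (the numbers below are illustrative, standing in for an actual history.history from model.fit, not results from the question's run):

```python
# history.history maps metric names to per-epoch values.
# Illustrative values for a typical overfitting run:
history_dict = {
    'accuracy':     [0.60, 0.75, 0.88, 0.95, 0.99],
    'val_accuracy': [0.58, 0.66, 0.68, 0.67, 0.65],
}

train_acc = history_dict['accuracy'][-1]
val_acc = history_dict['val_accuracy'][-1]
gap = train_acc - val_acc  # a large, growing gap signals overfitting

print(f"train={train_acc:.2f} val={val_acc:.2f} gap={gap:.2f}")
```

If validation accuracy plateaus or falls while training accuracy keeps climbing, the model is memorizing the training set.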
Is there any solution for my model?

You can add a Dropout layer between the Dense layers, like this:
model.add(Dropout(0.2))
You can also remove one or more hidden layers from the architecture.
One more thing: you can use the EarlyStopping callback to stop training at the right epoch.
Your final model architecture could look like this:
from keras.callbacks import EarlyStopping
from keras.layers import Dropout

callbacks = [EarlyStopping(monitor='val_loss', patience=5)]

model = Sequential()
model.add(Dense(units=20, kernel_initializer='uniform', activation='relu', input_dim=nb_features))
model.add(Dropout(0.2))
model.add(Dense(units=5, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=1, kernel_initializer='uniform', activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
history = model.fit(X_train, y_train, validation_data=(X_test, y_test), batch_size=32, epochs=250, callbacks=callbacks)
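The EarlyStopping(monitor='val_loss', patience=5) callback stops training once val_loss has gone 5 consecutive epochs without improving. A simplified pure-Python sketch of that logic (not Keras's actual implementation, which also supports min_delta and restore_best_weights):

```python
def early_stop_epoch(val_losses, patience=5):
    """Return the 1-based epoch at which training would stop,
    or None if the patience budget is never exhausted."""
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:      # improvement: remember it, reset the counter
            best = loss
            wait = 0
        else:                # no improvement this epoch
            wait += 1
            if wait >= patience:
                return epoch
    return None

# val_loss improves for the first 3 epochs, then never beats 0.50 again,
# so the 5-epoch patience budget runs out at epoch 8:
losses = [0.70, 0.55, 0.50, 0.52, 0.53, 0.51, 0.54, 0.55, 0.49]
print(early_stop_epoch(losses, patience=5))  # -> 8
```

With epochs=250, early stopping usually ends training long before the budget is spent, which is exactly what prevents the model from overfitting in the late epochs.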
Also, remove all the kernel_initializer='uniform' arguments; the default glorot_uniform initializer works best.
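For reference, glorot_uniform (the Keras default) samples each weight from a uniform distribution on [-limit, limit] with limit = sqrt(6 / (fan_in + fan_out)). A minimal NumPy sketch of the idea (not Keras's actual initializer code):

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix the way
    Glorot/Xavier uniform initialization does."""
    rng = rng or np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# First hidden layer of the model above: 39 input features -> 20 units
W = glorot_uniform(39, 20)
print(W.shape)
```

Scaling the range by the fan-in and fan-out keeps activation variance roughly constant across layers, which is why it tends to train better than a fixed-range 'uniform' initializer.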