Python: How to plot epoch vs. val_acc and epoch vs. val_loss graphs in a CNN?


I am using a convolutional neural network (CNN) to train a dataset. Here I get epoch, loss, total loss, training time, and so on as the history. If I want to calculate the average accuracy, how do I access val_acc, and how do I plot the epoch vs. val_acc and epoch vs. val_loss graphs?

convnet = input_data(shape=[None, IMG_SIZE, IMG_SIZE, 3], name='input')

convnet = conv_2d(convnet, 32, 3, activation='relu')
convnet = max_pool_2d(convnet, 3)

convnet = conv_2d(convnet, 64, 3, activation='relu')
convnet = max_pool_2d(convnet, 3)

convnet = conv_2d(convnet, 128, 3, activation='relu')
convnet = max_pool_2d(convnet, 3)

convnet = conv_2d(convnet, 32, 3, activation='relu')
convnet = max_pool_2d(convnet, 3)

convnet = conv_2d(convnet, 64, 3, activation='relu')
convnet = max_pool_2d(convnet, 3)

convnet = fully_connected(convnet, 1024, activation='relu')
convnet = dropout(convnet, 0.8)

convnet = fully_connected(convnet, 4, activation='softmax')
convnet = regression(convnet, optimizer='adam', learning_rate=LR, loss='categorical_crossentropy', name='targets')

model = tflearn.DNN(convnet, tensorboard_dir='log')

if os.path.exists('{}.meta'.format(MODEL_NAME)):
    model.load(MODEL_NAME)
    print('model loaded!')

train = train_data[:-150]
test = train_data[-50:]

X = np.array([i[0] for i in train]).reshape(-1, IMG_SIZE, IMG_SIZE, 3)
Y = [i[1] for i in train]

test_x = np.array([i[0] for i in test]).reshape(-1, IMG_SIZE, IMG_SIZE, 3)
test_y = [i[1] for i in test]

hist = model.fit({'input': X}, {'targets': Y}, n_epoch=8,
                 validation_set=({'input': test_x}, {'targets': test_y}),
                 snapshot_step=40, show_metric=True, run_id=MODEL_NAME)
model.save(MODEL_NAME)

You can use callbacks. In particular, you can use CSVLogger, which streams your epoch results to a CSV file. From there you can do all sorts of analysis.

An example based on your code:

from keras.callbacks import CSVLogger

csv_logger = CSVLogger('training.log')
model.fit({'input': X}, {'targets': Y}, ..., callbacks=[csv_logger])
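To illustrate the kind of analysis you can do with that log, here is a minimal sketch that reads training.log back in with pandas and plots the validation curves. The column names 'epoch', 'val_acc', and 'val_loss' are assumptions; the exact headers depend on your Keras version and on which metrics were compiled, so check the first line of training.log.

import pandas as pd
import matplotlib.pyplot as plt

# Read the per-epoch log written by CSVLogger
log = pd.read_csv('training.log')

# Plot validation accuracy and loss against the epoch column
plt.plot(log['epoch'], log['val_acc'], label='Validation Accuracy')   # may be 'val_accuracy' in newer Keras
plt.plot(log['epoch'], log['val_loss'], label='Validation Loss')
plt.xlabel('Epoch')
plt.legend()
plt.show()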
Try the following:

import matplotlib.pyplot as plt

history = model.fit(X_train, Y_train, validation_data=(X_test, Y_test), batch_size=32, epochs=10, verbose=1)

# Get training and test loss histories
training_loss = history.history['loss']
test_loss = history.history['val_loss']

# Create count of the number of epochs
epoch_count = range(1, len(training_loss) + 1)

# Visualize loss history
plt.plot(epoch_count, training_loss, 'r--')
plt.plot(epoch_count, test_loss, 'b-')
plt.legend(['Training Loss', 'Test Loss'])
plt.xlabel('Epoch')
plt.ylabel('Loss')
plt.show()
Credit to
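The code above only plots the loss curves. For the epoch vs. val_acc graph and the average accuracy asked about in the question, a sketch along the same lines is below. It assumes the history dictionary has a 'val_acc' key (newer Keras versions call it 'val_accuracy'), which requires the model to have been compiled with metrics=['accuracy'].

import numpy as np
import matplotlib.pyplot as plt

# Validation accuracy per epoch from the History object
val_acc = history.history['val_acc']   # use 'val_accuracy' on newer Keras versions
epoch_count = range(1, len(val_acc) + 1)

# Visualize accuracy history
plt.plot(epoch_count, val_acc, 'g-')
plt.xlabel('Epoch')
plt.ylabel('Validation Accuracy')
plt.show()

# Average validation accuracy over all epochs
print('Average validation accuracy:', np.mean(val_acc))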