Python ValueError: Classification metrics can't handle a mix of continuous-multioutput and multilabel-indicator targets

python, tensorflow, keras, deep-learning

Hello, I am getting this error with my prediction model. I take the data from an Excel file that has 4 inputs and 4 outputs. I am new to deep learning. I know it is about y_test, but I do not know what I should write. Here is my code. Thanks in advance for your help.

# Imports implied by the snippet (not shown in the original post; assuming standalone Keras)
import numpy as np
import pandas as pd
import seaborn as sns
from sklearn import preprocessing
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix
from keras.models import Sequential
from keras.layers import Conv1D, Dropout, MaxPooling1D, Flatten, Dense
from keras import optimizers

# Load and min-max scale the 4 input columns
df = pd.read_excel("C:/Users/hayri/Desktop/aa.xlsx")
xdata = df
print(xdata)

min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(xdata)

# Load and min-max scale the 4 output columns
df = pd.read_excel("C:/Users/hayri/Desktop/bb.xlsx")
ydata = df

min_max_scaler = preprocessing.MinMaxScaler()
y_scaled = min_max_scaler.fit_transform(ydata)

# 80/20 train/test split
(x_train, x_test, y_train, y_test) = train_test_split(x_scaled, y_scaled, test_size=0.2, random_state=0)

# 1-D convolutional network with four sigmoid outputs
conv = Sequential()
conv.add(Conv1D(filters=32, kernel_size=4, activation='relu', input_shape=(4, 4)))
conv.add(Dropout(0.5))
conv.add(MaxPooling1D(3))
conv.add(Flatten())
conv.add(Dense(4, activation='sigmoid'))

sgd = optimizers.SGD(lr=0.3, momentum=0.6, decay=0, nesterov=False)
conv.compile(loss='binary_crossentropy', optimizer='sgd', metrics=['accuracy'])
history = conv.fit(x_train, y_train, batch_size=10, epochs=200, validation_data=(x_test, y_test), verbose=1)
score = conv.evaluate(x_test, y_test, batch_size=10)

# Class indices for the confusion matrix
y_test = np.argmax(y_test, axis=1)

y_pred = conv.predict(x_test, batch_size=64)
y_pred = np.argmax(y_pred, axis=1)

results = confusion_matrix(y_test, y_pred)
sns.heatmap(results, cmap="Blues")

# Note: evaluate() returns [loss, accuracy] when a metric is configured
accuracy = conv.evaluate(x_test, y_test)
print('Accuracy: %.2f' % (accuracy*100))

    # Variant of the compile/fit/predict block above, using the adam optimizer
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    history = model.fit(x_train, y_train, batch_size=10, epochs=200, validation_data=(x_test, y_test), verbose=1)
    score = model.evaluate(x_test, y_test, batch_size=10)

    y_pred = model.predict(x_test, batch_size=64)
    y_pred = np.argmax(y_pred, axis=1)
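For context on the error in the title: scikit-learn raises this ValueError when a classification metric such as confusion_matrix receives its two arguments in different label representations, for example a continuous multi-output array (like min-max-scaled targets) mixed with a multilabel indicator matrix. Both arguments have to use the same representation, typically 1-D class indices. A minimal sketch of the type check with made-up stand-in arrays (the names y_true_scaled and y_pred_binary are hypothetical, not from the post):

    import numpy as np
    from sklearn.metrics import confusion_matrix

    y_true_scaled = np.array([[0.9, 0.1, 0.0, 0.0],
                              [0.0, 0.2, 1.0, 0.1]])   # continuous multi-output (e.g. MinMaxScaler output)
    y_pred_binary = np.array([[1, 0, 0, 0],
                              [0, 0, 1, 0]])           # multilabel-indicator (e.g. thresholded sigmoids)

    # confusion_matrix(y_true_scaled, y_pred_binary) raises the ValueError from the title.
    # Converting both sides to 1-D class indices satisfies the type check:
    print(confusion_matrix(np.argmax(y_true_scaled, axis=1),
                           np.argmax(y_pred_binary, axis=1)))

If the four outputs are genuinely continuous quantities rather than class indicators, classification metrics such as confusion_matrix do not apply at all, and a regression metric (for example mean squared error) would be the appropriate evaluation.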

It is hard to tell without seeing more of the code. If you want to post the whole thing, along with your input shapes and dtypes, I am happy to help.

ValueError: Negative dimension size caused by subtracting 4 from 1 for 'conv1d_15/convolution' (op: 'Conv2D') with input shapes: [?,1,1,64], [1,4,64,32].
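The second traceback ("Negative dimension size caused by subtracting 4 from 1") typically means a convolution or pooling window is larger than the length of the dimension it is applied to: here a kernel_size of 4 is slid over a dimension of length 1. A minimal sketch, using random stand-in arrays instead of the Excel data, of a reshape and layer sizes that keep every window inside a 4-step input:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Conv1D, MaxPooling1D, Flatten, Dense

    # Random stand-ins for the scaled Excel data: 100 samples, 4 inputs, 4 outputs
    x_scaled = np.random.rand(100, 4)
    y_scaled = np.random.rand(100, 4)

    # Conv1D expects 3-D input (samples, steps, channels); treat the 4 features
    # as 4 steps of 1 channel each.
    x_conv = x_scaled.reshape(-1, 4, 1)

    model = Sequential()
    # kernel_size=2 on a length-4 input leaves length 3; a kernel_size of 4 on a
    # length-1 input is what produces the negative dimension size.
    model.add(Conv1D(filters=32, kernel_size=2, activation='relu', input_shape=(4, 1)))
    model.add(MaxPooling1D(pool_size=2))  # the pool window must also fit in the remaining length (3)
    model.add(Flatten())
    model.add(Dense(4, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    model.summary()

With these sizes the model trains on the reshaped 3-D array, e.g. model.fit(x_conv, y_scaled, batch_size=10, epochs=200), instead of the flat (samples, 4) array used above.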