Python ValueError: TensorFlow 2 output shape error

I am working with the Iris dataset using TensorFlow 2.

After fitting the model, I get this error message:

ValueError: A target array with shape (135, 4, 8) was passed for an output of shape (None, 3) while using as loss `categorical_crossentropy`. This loss expects targets to have the same shape as the output.
This is how I import / split / one-hot encode the data:

from sklearn import datasets, model_selection

iris_data = datasets.load_iris()

def read_in_and_split_data(iris_data):
    return model_selection.train_test_split(iris_data["data"], iris_data["data"], test_size=0.1)

train_data, test_data, train_targets, test_targets = read_in_and_split_data(iris_data)
The train data shape is (135, 4).

The train targets shape is (135, 4).

The loss is loss="categorical_crossentropy".
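The shapes reported above can be reproduced with a minimal sketch (assuming scikit-learn is available), which also shows why the targets end up with the feature matrix's shape:

```python
from sklearn import datasets, model_selection

iris_data = datasets.load_iris()

# The split above passes iris_data["data"] as *both* features and targets,
# so the "targets" returned are really the (n, 4) feature matrix.
train_data, test_data, train_targets, test_targets = model_selection.train_test_split(
    iris_data["data"], iris_data["data"], test_size=0.1)

print(train_data.shape)     # (135, 4)
print(train_targets.shape)  # (135, 4) -- class labels should be (135,) or (135, 3)
```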

Thanks for any help.

Solved it.

In def read_in_and_split_data(iris_data): I was loading "data" twice instead of loading "data" and then "target".

Here is the correct code:

def read_in_and_split_data(iris_data):
    return model_selection.train_test_split(iris_data["data"], iris_data["target"], test_size=0.1)
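With the corrected split, the targets come from iris_data["target"] and have the expected label shape. A quick check (again assuming scikit-learn):

```python
from sklearn import datasets, model_selection

iris_data = datasets.load_iris()
train_data, test_data, train_targets, test_targets = model_selection.train_test_split(
    iris_data["data"], iris_data["target"], test_size=0.1)

print(train_data.shape)     # (135, 4)
print(train_targets.shape)  # (135,) -- integer class labels 0..2
```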

Your train_data and train_targets don't have the same number of samples?

Sorry, I mixed up test_data and train_targets when asking the question. Thank you for noticing that! I edited the question and added how I build the model. Do you have any suggestions? Thanks a lot.

Try changing the last layer to Dense(4, activation="softmax"). However, your error suggests that your train_targets dataset does not have the shape you say it does.

Actually, the Iris dataset's targets should not have shape (135, 4). Since there are 3 classes in the dataset, I set the output layer to 3 units. I will now fix how I load the sets.
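Since categorical_crossentropy expects targets with the same shape as the (None, 3) softmax output, the integer labels still need one-hot encoding. A minimal NumPy sketch (tf.keras.utils.to_categorical does the same job):

```python
import numpy as np

# Hypothetical integer labels, as returned by the corrected split (shape (135,))
train_targets = np.random.randint(0, 3, size=135)

# One-hot encode by indexing into a 3x3 identity matrix:
# label 0 -> [1, 0, 0], label 1 -> [0, 1, 0], label 2 -> [0, 0, 1]
one_hot_targets = np.eye(3)[train_targets]

print(one_hot_targets.shape)  # (135, 3) -- matches the Dense(3, softmax) output
```

Alternatively, compiling with loss="sparse_categorical_crossentropy" accepts the integer labels directly, with no one-hot step.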
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def get_model(input_shape):
    model = Sequential([
        Dense(64, activation="relu", kernel_initializer='he_uniform', bias_initializer='ones', input_shape=input_shape),
        Dense(128, activation="relu"),
        Dense(128, activation="relu"),
        Dense(128, activation="relu"),
        Dense(128, activation="relu"),
        Dense(64, activation="relu"),
        Dense(64, activation="relu"),
        Dense(64, activation="relu"),
        Dense(64, activation="relu"),
        Dense(3, activation="softmax"),
    ])
    return model

model = get_model(train_data[0].shape)

def train_model(model, train_data, train_targets, epochs):
    # Pass epochs by keyword: the third positional argument of fit() is batch_size.
    return model.fit(train_data, train_targets, epochs=epochs)

history = train_model(model, train_data, train_targets, epochs=800)