Python: How to use the mean relative error function as a loss in Keras


I have built a neural network and I am trying to give it the built-in mean relative error as its loss function. I set it up as follows:

from keras.layers import Input, Dense, Lambda, Add
from keras.models import Model

def customLoss(yTrue, yPred):
    # the built-in metric returns a pair of values; the second one is used as the loss here
    err, loss_value = mean_relative_error(yTrue, yPred, yTrue)
    return loss_value

def model(inp_size):
    inp = Input(shape=(inp_size,))
    x1 = Dense(100, activation='relu')(inp)
    for i in range(6):
        x1 = Dense(100, activation='relu')(x1)
    x1 = Dense(1, activation='linear')(x1)

    x2 = Dense(100, activation='relu')(inp)
    for i in range(6):
        x2 = Dense(100, activation='relu')(x2)
    x2 = Dense(1, activation='linear')(x2)

    x3 = Dense(100, activation='relu')(inp)
    for i in range(6):
        x3 = Dense(100, activation='relu')(x3)
    x3 = Dense(1, activation='linear')(x3)

    x4 = Dense(100, activation='relu')(inp)
    for i in range(6):
        x4 = Dense(100, activation='relu')(x4)
    x4 = Dense(1, activation='linear')(x4)

    # 'baseline' is not defined in the question; each branch output is scaled by one of its entries
    x1 = Lambda(lambda x: x * baseline[0])(x1)
    x2 = Lambda(lambda x: x * baseline[1])(x2)
    x3 = Lambda(lambda x: x * baseline[2])(x3)
    x4 = Lambda(lambda x: x * baseline[3])(x4)

    out = Add()([x1, x2, x3, x4])

    return Model(inputs=inp, outputs=out)
y_train = y_train.astype('float32')
y_test = y_test.astype('float32')

NN_model = model(X_train.shape[1])
NN_model.compile(loss=customLoss, optimizer='Adamax', metrics=[customLoss])
NN_model.build(X_train.shape)

#NN_model.summary()
NN_model.fit(X_train, y_train, epochs=2,verbose = 1)
train_predictions = NN_model.predict(X_train)


predictions = NN_model.predict(X_test)
However, I get the following error:

ValueError: An operation has `None` for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval.

Does anyone know what is going on? Thanks.

Use

import tensorflow as tf

def customLoss(yTrue, yPred):
    # mean absolute error, which is differentiable
    return tf.reduce_mean(tf.abs(yPred - yTrue))
This is the mean absolute error. mean_relative_error is meant for evaluation; it has no gradient and therefore cannot be used for backpropagation.


That is not exactly what I was looking for, but it helped me reach my goal efficiently. Thanks for the hint!
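
If the relative error itself is wanted as the training objective, a differentiable version can be written directly from TensorFlow ops instead of the built-in metric. The following is only a minimal sketch, not part of the answer above: the function name and the epsilon guard against division by zero are my own additions.

import tensorflow as tf

def relative_error_loss(yTrue, yPred, eps=1e-7):
    # mean of |yPred - yTrue| / |yTrue|; eps avoids division by zero when yTrue is 0
    return tf.reduce_mean(tf.abs(yPred - yTrue) / (tf.abs(yTrue) + eps))

# usage: NN_model.compile(loss=relative_error_loss, optimizer='Adamax')

Since every op here (abs, division, reduce_mean) has a defined gradient, this loss can be used for backpropagation, unlike the evaluation-only metric.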