
Python: I can't train my custom loss function with gradient descent


I'm trying to train my multi-input (100 features), multi-output (3 outputs) model with gradient descent, but the loss does not tend towards 0.

The loss function is shown in the image below (not reproduced here). This loss rewards classifying the data points with large values correctly (if you get the sign right).

Here is the code for my loss function:

import tensorflow as tf

def my_loss_fn(y_true, y_pred):
    # Denominator: total weight = sum of |y_true| over the outputs
    d = tf.raw_ops.Sum(input=tf.raw_ops.Abs(x=y_true), axis=-1, keep_dims=False)
    # Numerator: +|y_true| where the predicted sign is correct, -|y_true| where it is wrong
    n = tf.raw_ops.Sum(input=tf.raw_ops.Sign(x=y_true * y_pred) * tf.raw_ops.Abs(x=y_true), axis=-1, keep_dims=False)
    return 1 - n / d
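For reference, here is a minimal NumPy sketch (with made-up values) of why this loss resists gradient descent: sign() is piecewise constant, so the loss is flat in y_pred almost everywhere and its gradient is zero unless a sign actually flips:

```python
import numpy as np

def my_loss_np(y_true, y_pred):
    # NumPy version of the loss: 1 - sum(sign(y_true*y_pred) * |y_true|) / sum(|y_true|)
    d = np.sum(np.abs(y_true), axis=-1)
    n = np.sum(np.sign(y_true * y_pred) * np.abs(y_true), axis=-1)
    return 1 - n / d

y_true = np.array([[1.0, -2.0, 0.5]])
y_pred = np.array([[0.3, 0.4, -0.1]])

# Finite-difference gradient w.r.t. the first prediction: nudging y_pred
# does not flip any sign, so the loss does not change at all.
eps = 1e-4
bumped = y_pred.copy()
bumped[0, 0] += eps
grad = (my_loss_np(y_true, bumped) - my_loss_np(y_true, y_pred)) / eps
print(grad)  # zero almost everywhere
```

In TensorFlow terms this means the gradient of Sign is zero (TensorFlow defines it as 0 everywhere), so the optimizer receives no signal, whether you call tf.sign or tf.raw_ops.Sign.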
I'm using tf.raw_ops because sign() and abs() don't seem to be differentiable (strangely enough…).

I implemented this loss function to train the following model (an ANN with 3 layers, 50 nodes each):

The loss does not tend towards zero; it oscillates between 0.99 and 1.05.

I am using tensorflow 1.14.0.

Could you help me solve this problem?

Thanks in advance, everyone.

Answer: Your output layer has a relu activation; you probably want to change that first.

Reply from the asker: I also tried 'tanh' for the output layer, since I need both negative and positive values (the absolute value doesn't matter), and I got the same result.
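A toy example of the answer's point (illustrative values only): relu clips negatives to zero, so with a relu output layer the model can never produce a negative prediction, and for a negative target sign(y_true * y_pred) can never be +1 — the model can never be "right" on negative targets:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

logits = np.array([-1.5, -0.2, 0.0, 0.7])
outputs = relu(logits)
print(outputs)  # negatives are clipped to 0

# For a negative target, a non-negative prediction never has the right sign:
y_true = -2.0
signs = np.sign(y_true * outputs)
print(signs)  # only 0 or -1, never +1
```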
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Dropout

classifier = Sequential()

# Layers
classifier.add(Dense(units = 50, input_shape = (100, ), activation = "relu"))
classifier.add(Dropout(0.2))
classifier.add(Dense(units = 50, activation = "relu"))
classifier.add(Dropout(0.2))
classifier.add(Dense(units = 50, activation = "relu"))
classifier.add(Dropout(0.2))

# Output layer
classifier.add(Dense(units=3, activation = 'tanh'))

# Compilation
classifier.compile(optimizer = 'adam', loss = my_loss_fn) 

# Training
nb_epochs = 300
history = classifier.fit(X_train, y_train, epochs = nb_epochs, batch_size = 128, verbose = True)
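One direction that might help (my suggestion, not part of the original post): replace the hard sign() with a smooth surrogate such as tanh(k * x), which behaves like ±1 for large |x| but has a nonzero gradient, so the optimizer gets a usable signal. A NumPy sketch of the idea:

```python
import numpy as np

def surrogate_loss(y_true, y_pred, k=10.0):
    # tanh(k*x) approximates sign(x) but is differentiable everywhere
    d = np.sum(np.abs(y_true), axis=-1)
    n = np.sum(np.tanh(k * y_true * y_pred) * np.abs(y_true), axis=-1)
    return 1 - n / d

y_true = np.array([[1.0, -2.0, 0.5]])
y_pred = np.array([[0.3, 0.4, -0.1]])

# The finite-difference gradient is now nonzero, so gradient descent can move.
eps = 1e-6
bumped = y_pred.copy()
bumped[0, 0] += eps
grad = (surrogate_loss(y_true, bumped) - surrogate_loss(y_true, y_pred)) / eps
print(grad)
```

In the Keras loss above, the equivalent change would swap tf.raw_ops.Sign(x = y_true*y_pred) for tf.tanh(k * y_true * y_pred); k is a hypothetical sharpness constant you would need to tune.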