Deep learning: why is the generator loss 0.00, yet the GAN still produces white images?


I have built a GAN (Generative Adversarial Network) to generate CIFAR-100 images. The model trains without errors, but it only produces blank white images.

My code is in the following Colab notebook:

I have run it for several epochs; here is the log:

Epoch [0/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.7558
Epoch [1/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [2/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0010
Epoch [3/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0002
Epoch [4/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0004
Epoch [5/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [6/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0007
Epoch [7/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [8/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [9/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [10/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [11/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [12/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [13/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [14/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [15/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [16/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [17/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [18/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [19/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [20/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [21/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [22/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [23/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [24/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [25/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [26/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [27/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [28/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0002
Epoch [29/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [30/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [31/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [32/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [33/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [34/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0004
Epoch [35/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [36/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0004
Epoch [37/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [38/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [39/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [40/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [41/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [42/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [43/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0002
Epoch [44/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [45/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [46/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [47/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [48/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [49/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0002
Epoch [50/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [51/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0008
Epoch [52/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [53/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [54/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [55/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [56/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [57/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [58/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [59/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [60/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [61/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [62/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [63/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [64/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [65/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [66/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001
Epoch [67/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [68/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [69/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [70/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [71/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [72/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0002
Epoch [73/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0000
Epoch [74/500] Batch 0/100                       Loss D: 0.6931, loss G: 0.0001

As you can see, both the discriminator loss and the generator loss are stagnant, and the generated images are blank white. I cannot work out what is causing this. Could someone explain? Thanks.
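For reference, a discriminator loss pinned at exactly 0.6931 is ln(2), which is what binary cross-entropy returns when the discriminator outputs 0.5 for every sample (logits stuck at zero). A minimal check, assuming the usual `nn.BCELoss` setup:

```python
import math

import torch
import torch.nn as nn

# If D(x) = 0.5 for every input, BCE loss is -log(0.5) = ln(2) ≈ 0.6931
# against both real (1) and fake (0) labels -- exactly the stagnant
# value in the log above.
criterion = nn.BCELoss()
pred = torch.full((8,), 0.5)           # discriminator output stuck at 0.5
loss_real = criterion(pred, torch.ones(8))
loss_fake = criterion(pred, torch.zeros(8))
print(loss_real.item(), loss_fake.item(), math.log(2))  # all ≈ 0.6931
```

So the discriminator is not learning at all: its output never moves off 0.5, which usually points to a wiring bug (e.g. it never receives gradients, a double sigmoid, or saturated activations) rather than a balanced adversarial game.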

Have you checked the tensors before they go into the writer? If that is where they are written out, are they all 1s? Alternatively, they might all be very small values that get stretched up to 1 when you renormalize them (with normalize=True — not sure why you are doing that?). I would also try lowering the learning rate to Adam's default, and sharply reducing the batch size, possibly even to a batch size of 1, to add noise to the loss.
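To check this concretely, a small hypothetical debugging helper (the generator and noise names are placeholders for whatever your notebook uses) that prints the raw output statistics before the tensor reaches `make_grid`/`add_image`:

```python
import torch
import torch.nn as nn


@torch.no_grad()
def inspect_fake_batch(gen, noise):
    """Print the raw generator output statistics before any rescaling.

    If the tanh output has saturated, min/max/mean will all sit at (or
    very near) +1, and make_grid(..., normalize=True) will stretch that
    nearly flat tensor into an all-white image.
    """
    fake = gen(noise)
    print(f"min={fake.min().item():.4f}  max={fake.max().item():.4f}  "
          f"mean={fake.mean().item():.4f}  std={fake.std().item():.4f}")
    return fake


# Toy stand-in generator just to show the call pattern; substitute your
# own model and latent-vector shape.
dummy_gen = nn.Sequential(nn.Linear(10, 3 * 8 * 8), nn.Tanh())
_ = inspect_fake_batch(dummy_gen, torch.randn(4, 10))
```

If min and max both come out near +1, the white images are a rendering symptom, not the root cause: the generator has collapsed to a constant, which is consistent with the stagnant losses above.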