Python: Why does PyTorch need to retain the graph?


I train my model as follows:

for i in range(5):
  optimizer.zero_grad()
  y = next_input()
  loss = model(y)
  loss.backward()
  optimizer.step()
and then get this error:

RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time.
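This error can be reproduced with a minimal standalone snippet (a sketch, not the asker's actual model): calling `backward()` a second time on the same graph fails because the intermediate buffers were freed by the first call.

```python
import torch

# Minimal reproduction: one graph, two backward passes.
x = torch.ones(3, requires_grad=True)
loss = (x * 2).sum()

loss.backward()          # first backward succeeds; graph buffers are then freed
try:
    loss.backward()      # second backward on the same graph fails
except RuntimeError as e:
    print("RuntimeError:", e)
```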
Why do I need to retain the graph? If the derivatives have been freed, it could simply recompute them. To illustrate my point, consider this code:

for i in range(5):
  optimizer.zero_grad()
  model.zero_grad() # drop derivatives
  y = next_input()
  loss = model(y)
  loss.backward(retain_graph=True)
  optimizer.step()
In this case the derivatives from the previous iteration are zeroed out as well, yet Torch does not complain, because the flag `retain_graph=True` is set.
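For reference, here is a minimal sketch (using a bare tensor in place of the asker's model) showing that zeroing gradients and retaining the graph are independent: zeroing only clears the accumulated `.grad` values, while `retain_graph=True` keeps the graph's buffers alive so a second `backward()` still succeeds.

```python
import torch

x = torch.ones(3, requires_grad=True)
loss = (x * 2).sum()

loss.backward(retain_graph=True)  # graph buffers are kept alive
x.grad.zero_()                    # zero the gradients (like zero_grad())
loss.backward()                   # works: the retained graph is reused
print(x.grad)                     # tensor([2., 2., 2.])
```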

Am I right in saying that `model.zero_grad()` cancels the effect of `retain_graph=True` (i.e., it deletes the retained derivatives)?