PyTorch: when and why should I use buffers?
I use a buffer to pass along the hidden state of my LSTM network:

def __init__(self, ...):
    ...
    self.register_buffer('hidden_state1', hidden_state1)
    self.hidden_state1 = hidden_state1
    .. # other code
To avoid the error:
RuntimeError: Trying to backward through the graph a second time,
but the buffers have already been freed.
Specify retain_graph=True when calling backward the first time.
I detach the buffers with .clone().detach().
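For context, a minimal sketch of this pattern (the module, sizes, and tensor names here are illustrative, not from the question): the hidden state is kept in buffers and detached after each forward pass so that a later backward() does not try to traverse the graph of a previous sequence.

```python
import torch
import torch.nn as nn

class StatefulLSTM(nn.Module):
    """Hypothetical module that keeps LSTM hidden state between forward calls."""
    def __init__(self, input_size=4, hidden_size=8):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        # Buffers are part of the module's state (state_dict, .to(device))
        # but are not trained as parameters.
        self.register_buffer('h', torch.zeros(1, 1, hidden_size))
        self.register_buffer('c', torch.zeros(1, 1, hidden_size))

    def forward(self, x):
        out, (h, c) = self.lstm(x, (self.h, self.c))
        # Detach so the next backward() does not reach back into this graph.
        # Assigning a tensor to a registered buffer name updates the buffer.
        self.h = h.detach()
        self.c = c.detach()
        return out

model = StatefulLSTM()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
for _ in range(2):  # without the detach, the second backward() raises the RuntimeError above
    x = torch.randn(1, 5, 4)
    loss = model(x).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Without the two detach lines, the second iteration reproduces the "Trying to backward through the graph a second time" error, because the stored hidden state still carries the autograd graph of the first sequence.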
Since I have to detach them manually anyway, is there still a reason to use buffers rather than ordinary attributes in PyTorch?
Would a plain tensor attribute with requires_grad=False be enough to replace the buffer?
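As background for the second question: requires_grad=False is not the only thing a buffer gives you. Registering a buffer makes the tensor part of the module's state, so it is saved in state_dict and moved by .to(device)/.cuda(), while a plain attribute is invisible to the module. A small sketch (names are my own) of that difference:

```python
import torch
import torch.nn as nn

class WithBuffer(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer('state', torch.zeros(3))  # tracked by the module
        self.plain = torch.zeros(3)                    # ordinary attribute, not tracked

m = WithBuffer()
print('state' in m.state_dict())  # True: buffers are saved/loaded with the model
print('plain' in m.state_dict())  # False: plain tensors are not
# Likewise, m.to(device) would move 'state' but leave 'plain' on its original device.
```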
(Actually, I am not sure whether passing the hidden state around like this is a good approach in the first place.)