PyTorch RuntimeError: Gradients aren't CUDA tensors


I am getting the following error:

File "/net/if5/wua4nw/wasi/academic/research_with_prof_chang/projects/question_answering/duplicate_question_detection/source/train.py", line 62, in train
    loss.backward()
  File "/if5/wua4nw/anaconda3/lib/python3.5/site-packages/torch/autograd/variable.py", line 145, in backward
    self._execution_engine.run_backward((self,), (gradient,), retain_variables)
  File "/if5/wua4nw/anaconda3/lib/python3.5/site-packages/torch/autograd/function.py", line 208, in _do_backward
    result = super(NestedIOFunction, self)._do_backward(gradients, retain_variables)
  File "/if5/wua4nw/anaconda3/lib/python3.5/site-packages/torch/autograd/function.py", line 216, in backward
    result = self.backward_extended(*nested_gradients)
  File "/if5/wua4nw/anaconda3/lib/python3.5/site-packages/torch/nn/_functions/rnn.py", line 210, in backward_extended
    grad_hx)
  File "/if5/wua4nw/anaconda3/lib/python3.5/site-packages/torch/backends/cudnn/rnn.py", line 360, in backward_grad
    raise RuntimeError('Gradients aren\'t CUDA tensors')
RuntimeError: Gradients aren't CUDA tensors
when running
loss.backward()
in PyTorch. The forward pass works fine, but the error is raised as soon as the backward step executes. Can anyone suggest how to fix this?


Note: I haven't posted the source code because it is too long.
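Without seeing the source, the usual cause of "Gradients aren't CUDA tensors" in the cuDNN RNN backward is a device mismatch: the RNN itself was moved to the GPU, but the inputs, targets, or the initial hidden state were left as CPU tensors, so the gradients flowing back are not CUDA tensors. A minimal sketch of the fix, using a hypothetical LSTM with made-up sizes (modern PyTorch device API; in the 0.x-era API shown in the traceback you would call .cuda() on each tensor/Variable instead):

```python
import torch
import torch.nn as nn

# Pick the GPU when available; every tensor that participates in the
# graph must live on this same device, or the cuDNN RNN backward will
# complain that the gradients aren't CUDA tensors.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

rnn = nn.LSTM(input_size=8, hidden_size=16, batch_first=True).to(device)
criterion = nn.MSELoss()

# Inputs, initial hidden/cell state, AND targets all go to the device.
x = torch.randn(4, 5, 8, device=device)
h0 = torch.zeros(1, 4, 16, device=device)
c0 = torch.zeros(1, 4, 16, device=device)
target = torch.randn(4, 5, 16, device=device)

out, _ = rnn(x, (h0, c0))
loss = criterion(out, target)
loss.backward()  # succeeds: every tensor in the graph is on one device
```

If any one of x, h0, c0, or target is created without the device argument (and never moved), the forward pass may still run but the backward pass through the cuDNN RNN kernels fails with exactly this RuntimeError.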

Could you post the relevant code snippet that causes this error? Just show the contents of
train.py up to the line that raises the error (line 62).