Python: "argmax_cuda" not implemented for 'Bool'


I ran into this error in Google Colab. I tried other data types, such as a boolean tensor, but had no success. Please help.

Code

    def _mask(prev_generated_seq):
        # torch.eq returns a Bool tensor
        prev_mask = torch.eq(prev_generated_seq, 1)
        # The next line raises: RuntimeError: "argmax_cuda" not implemented for 'Bool'
        lengths = torch.argmax(prev_mask, dim=1)
        #test = torch.max(prev_mask, dim=1)
        #lengths = torch.FloatTensor(test)
        max_len = prev_generated_seq.size(1)
        mask = []
        for i in range(prev_generated_seq.size(0)):
            if lengths[i] == 0:
                mask_line = [0] * max_len
            else:
                mask_line = [0] * lengths[i].item()
                mask_line.extend([1] * (max_len - lengths[i].item()))
            mask.append(mask_line)
        mask = torch.ByteTensor(mask)
        if args.cuda:
            mask = mask.cuda()
        return prev_generated_seq.data.masked_fill_(mask, 0)
Error

File "main.py", line 179, in <module>
    train_epoches(abstracts, model, config.epochs, teacher_forcing_ratio=1)
  File "main.py", line 155, in train_epoches
    target_variables, model, teacher_forcing_ratio)
  File "main.py", line 139, in train_batch
    prev_generated_seq = _mask(prev_generated_seq)
  File "main.py", line 101, in _mask
    lengths = torch.argmax(prev_mask,dim=1)
RuntimeError: "argmax_cuda" not implemented for 'Bool'

Did you check the types of prev_generated_seq and prev_mask?
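
The error message itself points at the cause: torch.eq returns a tensor of dtype torch.bool, and the PyTorch CUDA build in this traceback has no argmax kernel for Bool tensors. A minimal sketch of one workaround, using a made-up input tensor purely for illustration, is to cast the mask to an integer dtype before calling torch.argmax:

    import torch

    # Hypothetical batch of token-id sequences (values chosen for illustration)
    prev_generated_seq = torch.tensor([[5, 1, 0, 0],
                                       [7, 8, 1, 0]])

    prev_mask = torch.eq(prev_generated_seq, 1)
    print(prev_mask.dtype)  # torch.bool

    # Workaround: cast the boolean mask to an integer dtype before argmax
    lengths = torch.argmax(prev_mask.int(), dim=1)
    print(lengths)  # tensor([1, 2])

Applied to the question's code, the one-line change lengths = torch.argmax(prev_mask.int(), dim=1) leaves the rest of _mask untouched, since lengths still holds integer column indices.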