Machine learning: BertSumExt produces no summary


I am trying to get the extractive BertSum summarizer working () but I keep getting the message

"xent 0 at step -1"

and no summary at all. What am I doing wrong? Could someone help me out, perhaps with a working example? The message above appears when I run the following in Google Colab:

1. Clone the GitHub repository

!git clone https://github.com/Alcamech/PreSumm.git
2. Switch to the Git branch for summarizing raw text data

%cd /content/PreSumm
!git checkout -b Raw_Input origin/PreSumm_Raw_Input_Text_Setup
!git pull
3. Install the requirements

!pip install torch==1.1.0 pytorch_transformers tensorboardX multiprocess pyrouge
4. Download the CNN/DM extractive model bertext_cnndm_transformer.pt

!gdown "https://drive.google.com/uc?id=1kKWoV0QCbeIuFt85beQgJ4v0lujaXobJ&export=download"
!unzip /content/PreSumm/models/bertext_cnndm_transformer.zip
4.1 Download the preprocessed CNN/DailyMail data

%cd /content/PreSumm/bert_data/
!gdown "https://drive.google.com/uc?id=1DN7ClZCCXsk2KegmC6t4ClBwtAf5galI&export=download"
!unzip /content/PreSumm/bert_data/bert_data_cnndm_final.zip
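Before moving on, a quick sanity check (the directory is the unzip target assumed from the step above, and the check itself is my own addition) that the preprocessed shards actually landed where the model expects them:

```python
import glob
import os

# Assumed target directory from the unzip step above.
data_dir = "/content/PreSumm/bert_data"
if not os.path.isdir(data_dir):
    data_dir = "."  # fall back so the check still runs outside Colab

# The preprocessed CNN/DM data ships as torch-serialized .pt shards.
shards = sorted(glob.glob(os.path.join(data_dir, "*.pt")))
print(f"{len(shards)} preprocessed shards found")
```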
5. Change to the /src folder

%cd /content/PreSumm/src/
6. Run the extractive summarizer

!python /content/PreSumm/src/train.py -task ext -mode test_text -test_from /content/PreSumm/models/bertext_cnndm_transformer.pt -text_src /content/PreSumm/raw_data/temp_ext.raw_src -text_tgt /content/PreSumm/results/result.txt -log_file /content/PreSumm/logs/ext_bert_cnndm
The output of step 6 is:

[2020-05-07 11:20:12,355 INFO] Loading checkpoint from /content/PreSumm/models/bertext_cnndm_transformer.pt
Namespace(accum_count=1, alpha=0.6, batch_size=140, beam_size=5, bert_data_path='../bert_data_new/cnndm', beta1=0.9, beta2=0.999, block_trigram=True, dec_dropout=0.2, dec_ff_size=2048, dec_heads=8, dec_hidden_size=768, dec_layers=6, enc_dropout=0.2, enc_ff_size=512, enc_hidden_size=512, enc_layers=6, encoder='bert', ext_dropout=0.2, ext_ff_size=2048, ext_heads=8, ext_hidden_size=768, ext_layers=2, finetune_bert=True, generator_shard_size=32, gpu_ranks=[0], label_smoothing=0.1, large=False, load_from_extractive='', log_file='/content/PreSumm/logs/ext_bert_cnndm', lr=1, lr_bert=0.002, lr_dec=0.002, max_grad_norm=0, max_length=150, max_ndocs_in_batch=6, max_pos=512, max_tgt_len=140, min_length=15, mode='test_text', model_path='../models/', optim='adam', param_init=0, param_init_glorot=True, recall_eval=False, report_every=1, report_rouge=True, result_path='../results/cnndm', save_checkpoint_steps=5, seed=666, sep_optim=False, share_emb=False, task='ext', temp_dir='../temp', test_all=False, test_batch_size=200, test_from='/content/PreSumm/models/bertext_cnndm_transformer.pt', test_start_from=-1, text_src='/content/PreSumm/raw_data/temp_ext.raw_src', text_tgt='/content/PreSumm/results/result.txt', train_from='', train_steps=1000, use_bert_emb=False, use_interval=True, visible_gpus='-1', warmup_steps=8000, warmup_steps_bert=8000, warmup_steps_dec=8000, world_size=1)
[2020-05-07 11:20:13,361 INFO] https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json not found in cache or force_download set to True, downloading to /tmp/tmpvck0jwoy
100% 433/433 [00:00<00:00, 309339.74B/s]
[2020-05-07 11:20:13,498 INFO] copying /tmp/tmpvck0jwoy to cache at ../temp/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
[2020-05-07 11:20:13,499 INFO] creating metadata file for ../temp/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
[2020-05-07 11:20:13,499 INFO] removing temp file /tmp/tmpvck0jwoy
[2020-05-07 11:20:13,499 INFO] loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at ../temp/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517
[2020-05-07 11:20:13,500 INFO] Model config {
  "architectures": [
    "BertForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "num_labels": 2,
  "output_attentions": false,
  "output_hidden_states": false,
  "pad_token_id": 0,
  "pruned_heads": {},
  "torchscript": false,
  "type_vocab_size": 2,
  "vocab_size": 30522
}

[2020-05-07 11:20:13,571 INFO] https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin not found in cache or force_download set to True, downloading to /tmp/tmp6b78t4_2
100% 440473133/440473133 [00:06<00:00, 71548841.10B/s]
[2020-05-07 11:20:19,804 INFO] copying /tmp/tmp6b78t4_2 to cache at ../temp/aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
[2020-05-07 11:20:21,212 INFO] creating metadata file for ../temp/aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
[2020-05-07 11:20:21,212 INFO] removing temp file /tmp/tmp6b78t4_2
[2020-05-07 11:20:21,267 INFO] loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin from cache at ../temp/aa1ef1aede4482d0dbcd4d52baad8ae300e60902e88fcb0bebdec09afd232066.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157
gpu_rank 0
[2020-05-07 11:20:24,645 INFO] * number of parameters: 120512513
[2020-05-07 11:20:24,736 INFO] https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt not found in cache or force_download set to True, downloading to /tmp/tmpyv3mwnb6
100% 231508/231508 [00:00<00:00, 4268647.82B/s]
[2020-05-07 11:20:25,044 INFO] copying /tmp/tmpyv3mwnb6 to cache at /root/.cache/torch/pytorch_transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
[2020-05-07 11:20:25,045 INFO] creating metadata file for /root/.cache/torch/pytorch_transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
[2020-05-07 11:20:25,045 INFO] removing temp file /tmp/tmpyv3mwnb6
[2020-05-07 11:20:25,046 INFO] loading vocabulary file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/pytorch_transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084
  0% 0/2 [00:00<?, ?it/s]
[2020-05-07 11:20:25,115 INFO] Validation xent: 0 at step -1
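For completeness, this is a sketch of how the raw input file that `-text_src` points at can be created; as I read the PreSumm README, `test_text` mode expects one source document per line, and the two documents below are made-up examples:

```python
import os

# Hypothetical example documents; PreSumm's test_text mode (as I read the
# README) takes one document per line in the -text_src file.
docs = [
    "the quick brown fox jumps over the lazy dog . it then naps in the sun . later it chases a rabbit .",
    "extractive summarization selects sentences from the source . bertsumext scores each sentence with bert .",
]

src_path = "raw_data/temp_ext.raw_src"  # relative path so the sketch runs anywhere
os.makedirs(os.path.dirname(src_path), exist_ok=True)
with open(src_path, "w") as f:
    f.write("\n".join(docs) + "\n")

# An empty or missing -text_src file gives the model zero documents to
# summarize, which would leave result.txt empty.
with open(src_path) as f:
    print(sum(1 for line in f if line.strip()), "documents written")
```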
You can take a look at a bertsum extractive summarization example at

Please post the example here rather than as an external link.