Invalid syntax error in Spyder on Windows when running a py file


I am trying to run the last example in the linked tutorial. I have cloned the repository into the directory
C:/Users/nn/Desktop/BERT/transformers-master
on a Windows machine, and I am using the Spyder IDE. Why do I get the error below, and how do I fix it? And how do I feed in the opening lines of the poem?

import os

os.chdir('C:/Users/nn/Desktop/BERT/transformers-master/examples')
os.listdir()# It shows run_generation.py file

python run_generation.py \
    --model_type=gpt2 \
    --length=100 \
    --model_name_or_path=gpt2 \

python run_generation.py \
    --model_type=gpt2 \
    --length=100 \
    --model_name_or_path=gpt2 \
  File "<ipython-input-10-501d266b0e64>", line 1
    python run_generation.py \
                        ^
SyntaxError: invalid syntax
Nothing happens :(

When I try to do the same thing with the python command, I get the error shown above :(
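
For context: python run_generation.py \ is a shell command, not Python, so typing it into Spyder's IPython console raises the SyntaxError shown above (the console does accept shell commands when they are prefixed with !). Below is a minimal sketch, not from the original post, of launching the script from inside Python with subprocess instead; the paths and arguments are taken from the question.

# Minimal sketch (not from the original post): run the script as a shell
# command from inside Python rather than pasting the command into the console.
import subprocess

result = subprocess.run(
    [
        "python", "run_generation.py",
        "--model_type=gpt2",
        "--length=100",
        "--model_name_or_path=gpt2",
        "--prompt=My job is",
    ],
    cwd="C:/Users/nn/Desktop/BERT/transformers-master/examples",  # path from the question
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr)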

##### Update 2 ---------------------------

I followed the suggestions in the comments, and the code seems to download 3 files.

  • Can I copy these files manually so that I do not have to download them into the temp folder every time?
  • Where should I store these files? In which folder? Is it C:\Users\nnn\Desktop\BERT\transformers-master\examples, the same folder as the run_generation.py file? (A caching sketch follows right after this list.)
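
As the log below shows, the three files end up cached under C:\Users\nnn\.cache\torch\transformers. One hedged option, assuming the installed library version exposes a cache_dir argument on from_pretrained (recent transformers releases do), is to pin the cache to a folder you control so the files are reused across runs; CACHE_DIR below is a hypothetical location, not from the original post.

    # Sketch only: pin the download cache to a folder you control.
    # CACHE_DIR is hypothetical; `cache_dir` is assumed to be supported
    # by the installed transformers version.
    import os
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    CACHE_DIR = r"C:\Users\nnn\Desktop\BERT\hf_cache"
    os.makedirs(CACHE_DIR, exist_ok=True)

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2", cache_dir=CACHE_DIR)
    model = GPT2LMHeadModel.from_pretrained("gpt2", cache_dir=CACHE_DIR)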

    C:\Users\nnn\Desktop\BERT\transformers-master\examples>python run_generation.py --model_type=gpt2 --length=100 --model_name_or_path=gpt2 --prompt="My job is"
    
    2019-12-12 11:11:57.740810: W tensorflow/stream_executor/platform/default/dso_loader.cc:55] Could not load dynamic library 'cudart64_100.dll'; dlerror: cudart64_100.dll not found
    2019-12-12 11:11:57.748330: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.
    12/12/2019 11:12:01 - INFO - transformers.file_utils -   https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json not found in cache or force_download set to True, downloading to C:\Users\nnn\AppData\Local\Temp\tmpt_29gyqi
    100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1042301/1042301 [00:00<00:00, 2275416.04B/s]
    12/12/2019 11:12:02 - INFO - transformers.file_utils -   copying C:\Users\nnn\AppData\Local\Temp\tmpt_29gyqi to cache at C:\Users\nnn\.cache\torch\transformers\f2808208f9bec2320371a9f5f891c184ae0b674ef866b79c58177067d15732dd.1512018be4ba4e8726e41b9145129dc30651ea4fec86aa61f4b9f40bf94eac71
    12/12/2019 11:12:02 - INFO - transformers.file_utils -   creating metadata file for C:\Users\nnn\.cache\torch\transformers\f2808208f9bec2320371a9f5f891c184ae0b674ef866b79c58177067d15732dd.1512018be4ba4e8726e41b9145129dc30651ea4fec86aa61f4b9f40bf94eac71
    12/12/2019 11:12:02 - INFO - transformers.file_utils -   removing temp file C:\Users\nnn\AppData\Local\Temp\tmpt_29gyqi
    12/12/2019 11:12:03 - INFO - transformers.file_utils -   https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt not found in cache or force_download set to True, downloading to C:\Users\nnn\AppData\Local\Temp\tmpj1_y4sn8
    100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 456318/456318 [00:00<00:00, 1456594.78B/s]
    12/12/2019 11:12:03 - INFO - transformers.file_utils -   copying C:\Users\nnn\AppData\Local\Temp\tmpj1_y4sn8 to cache at C:\Users\nnn\.cache\torch\transformers\d629f792e430b3c76a1291bb2766b0a047e36fae0588f9dbc1ae51decdff691b.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda
    12/12/2019 11:12:03 - INFO - transformers.file_utils -   creating metadata file for C:\Users\nnn\.cache\torch\transformers\d629f792e430b3c76a1291bb2766b0a047e36fae0588f9dbc1ae51decdff691b.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda
    12/12/2019 11:12:03 - INFO - transformers.file_utils -   removing temp file C:\Users\nnn\AppData\Local\Temp\tmpj1_y4sn8
    12/12/2019 11:12:03 - INFO - transformers.tokenization_utils -   loading file https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json from cache at C:\Users\nnn\.cache\torch\transformers\f2808208f9bec2320371a9f5f891c184ae0b674ef866b79c58177067d15732dd.1512018be4ba4e8726e41b9145129dc30651ea4fec86aa61f4b9f40bf94eac71
    12/12/2019 11:12:03 - INFO - transformers.tokenization_utils -   loading file https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt from cache at C:\Users\nnn\.cache\torch\transformers\d629f792e430b3c76a1291bb2766b0a047e36fae0588f9dbc1ae51decdff691b.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda
    12/12/2019 11:12:04 - INFO - transformers.file_utils -   https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-config.json not found in cache or force_download set to True, downloading to C:\Users\nnn\AppData\Local\Temp\tmpyxywrts1
    100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 176/176 [00:00<00:00, 17738.31B/s]
    12/12/2019 11:12:04 - INFO - transformers.file_utils -   copying C:\Users\nnn\AppData\Local\Temp\tmpyxywrts1 to cache at C:\Users\nnn\.cache\torch\transformers\4be02c5697d91738003fb1685c9872f284166aa32e061576bbe6aaeb95649fcf.085d5f6a8e7812ea05ff0e6ed0645ab2e75d80387ad55c1ad9806ee70d272f80
    12/12/2019 11:12:04 - INFO - transformers.file_utils -   creating metadata file for C:\Users\nnn\.cache\torch\transformers\4be02c5697d91738003fb1685c9872f284166aa32e061576bbe6aaeb95649fcf.085d5f6a8e7812ea05ff0e6ed0645ab2e75d80387ad55c1ad9806ee70d272f80
    12/12/2019 11:12:04 - INFO - transformers.file_utils -   removing temp file C:\Users\nnn\AppData\Local\Temp\tmpyxywrts1
    12/12/2019 11:12:04 - INFO - transformers.configuration_utils -   loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-config.json from cache at C:\Users\nnn\.cache\torch\transformers\4be02c5697d91738003fb1685c9872f284166aa32e061576bbe6aaeb95649fcf.085d5f6a8e7812ea05ff0e6ed0645ab2e75d80387ad55c1ad9806ee70d272f80
    12/12/2019 11:12:04 - INFO - transformers.configuration_utils -   Model config {
      "attn_pdrop": 0.1,
      "embd_pdrop": 0.1,
      "finetuning_task": null,
      "initializer_range": 0.02,
      "layer_norm_epsilon": 1e-05,
      "n_ctx": 1024,
      "n_embd": 768,
      "n_head": 12,
      "n_layer": 12,
      "n_positions": 1024,
      "num_labels": 1,
      "output_attentions": false,
      "output_hidden_states": false,
      "output_past": true,
      "pruned_heads": {},
      "resid_pdrop": 0.1,
      "summary_activation": null,
      "summary_first_dropout": 0.1,
      "summary_proj_to_labels": true,
      "summary_type": "cls_index",
      "summary_use_proj": true,
      "torchscript": false,
      "use_bfloat16": false,
      "vocab_size": 50257
    }
    
    12/12/2019 11:12:04 - INFO - transformers.file_utils -   https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-pytorch_model.bin not found in cache or force_download set to True, downloading to C:\Users\nnn\AppData\Local\Temp\tmpn8i9o_tm
    100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 548118077/548118077 [01:12<00:00, 7544610.26B/s]
    12/12/2019 11:13:18 - INFO - transformers.file_utils -   copying C:\Users\nnn\AppData\Local\Temp\tmpn8i9o_tm to cache at C:\Users\nnn\.cache\torch\transformers\4295d67f022061768f4adc386234dbdb781c814c39662dd1662221c309962c55.778cf36f5c4e5d94c8cd9cefcf2a580c8643570eb327f0d4a1f007fab2acbdf1
    12/12/2019 11:13:24 - INFO - transformers.file_utils -   creating metadata file for C:\Users\nnn\.cache\torch\transformers\4295d67f022061768f4adc386234dbdb781c814c39662dd1662221c309962c55.778cf36f5c4e5d94c8cd9cefcf2a580c8643570eb327f0d4a1f007fab2acbdf1
    12/12/2019 11:13:24 - INFO - transformers.file_utils -   removing temp file C:\Users\nnn\AppData\Local\Temp\tmpn8i9o_tm
    12/12/2019 11:13:24 - INFO - transformers.modeling_utils -   loading weights file https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-pytorch_model.bin from cache at C:\Users\nnn\.cache\torch\transformers\4295d67f022061768f4adc386234dbdb781c814c39662dd1662221c309962c55.778cf36f5c4e5d94c8cd9cefcf2a580c8643570eb327f0d4a1f007fab2acbdf1
    12/12/2019 11:13:32 - INFO - __main__ -   Namespace(device=device(type='cpu'), length=100, model_name_or_path='gpt2', model_type='gpt2', n_gpu=0, no_cuda=False, num_samples=1, padding_text='', prompt='My job is', repetition_penalty=1.0, seed=42, stop_token=None, temperature=1.0, top_k=0, top_p=0.9, xlm_lang='')
    100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 100/100 [00:23<00:00,  2.49it/s]
     to know when it will change, it's up to you."
    
    National Communications Director Alex Brynner said the Trump administration needs to help then-Secretary of State Rex Tillerson learn from him.
    
    "The Cabinet, like any other government job, has to be attentive to the needs of an individual that might challenge his or her position," Brynner said. "This is especially true in times of renewed volatility."
    
    Brynner said Tillerson has not "failed at vetting
    

    Did you read the prerequisites for the task:

    Installing PyTorch-Transformers on your machine: installing PyTorch-Transformers in Python is pretty straightforward. You can simply install it with pip:

         pip install pytorch-transformers

    Or, if you are working on Colab:

         !pip install pytorch-transformers

    Since most of these models are GPU-heavy, I would suggest working with Google Colab for this part of the article.

    Link to Colab:

    The command syntax used in Colab is slightly different, as you can observe from the pip commands above.

    The example command runs better when you remove the \ characters (when using cmd) and put all the command arguments on the same line, separated only by spaces.

    The command can be computationally heavy, so Colab, where the syntax from the examples works as-is, is a convenient place to run it.
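
    For reference, a hedged sketch of what this might look like in a Colab notebook cell, assuming the repository has already been cloned into the notebook's working directory (the ! prefix runs a line as a shell command, and the backslash continuations work there because the underlying shell is bash):

        # Hypothetical Colab cell, assuming the repo is already cloned next to the notebook.
        !pip install pytorch-transformers
        !python pytorch-transformers/examples/run_generation.py \
            --model_type=gpt2 \
            --length=100 \
            --model_name_or_path=gpt2 \
            --prompt="My job is"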


    Edit:

    Now, looking back at your code, there is one important typo:

    Use prompt with a second p, not promt. That way you will most likely be able to feed the seed text to the algorithm. I am testing this issue on my own computer and will comment soon if I find any further problems.


    Edit2:

    My tests took some time, and now I am ready. I had to follow 3 Readme.md files to get the tutorial working: one from the root folder, one under transformers, and the last one from samples. Then, running this command, I got my first result:

      python pytorch-transformers/examples/run_generation.py --model_type=gpt2 --length=2 --model_name_or_path=gpt2 --prompt="My job is"
    

    The computer thought for a while and nicely said: " to know".

    Comments:

    I have pytorch-transformers installed on my machine. I tried running the command without the \ characters, but it did not work.

    Have you tried running the code with the same parameters but a different prompt?

    I tried that, and it automatically picks up the 3 files from the cache. I do not think running with a different prompt will help if I restart my machine and then run the command. By prompt I mean the --prompt option.

    But yes, the cache will not live forever, and frankly I do not have a solution for that. Well, looking back: /pytorch-transformers/transformers/file_utils.py does the file handling and caching. You would somehow have to hardcode the file locations to reuse the same cached files, or something along those lines. Hard to say; it is probably not an easy task.
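
    Regarding the caching question in the comments, one possible workaround (a sketch only, under the assumption that --model_name_or_path accepts a local directory, which the examples script generally does) is to save the model and tokenizer to a normal folder once and reuse that folder afterwards; LOCAL_DIR below is a hypothetical path.

        # Sketch only: download once, save to a regular folder, and point
        # --model_name_or_path at that folder afterwards. LOCAL_DIR is hypothetical.
        import os
        from transformers import GPT2LMHeadModel, GPT2Tokenizer

        LOCAL_DIR = r"C:\Users\nnn\Desktop\BERT\gpt2-local"
        os.makedirs(LOCAL_DIR, exist_ok=True)

        GPT2Tokenizer.from_pretrained("gpt2").save_pretrained(LOCAL_DIR)
        GPT2LMHeadModel.from_pretrained("gpt2").save_pretrained(LOCAL_DIR)

    The command would then look something like: python run_generation.py --model_type=gpt2 --length=100 --model_name_or_path=C:\Users\nnn\Desktop\BERT\gpt2-local --prompt="My job is" (hypothetical path).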