Python: unrecognized arguments when using generate.py with StyleGAN 2 ADA

I have trained the ffqh1024 model of StyleGAN 2 ADA in Google Colab on a custom mammography dataset. My trained model.pkl file is sitting in a Drive folder, and I want to use that .pkl file to generate images. I tried:

!python generate.py --outdir='/content/drive/MyDrive/TFM/Generated' --trunc=1 --seeds=85,265,297,849 \
    --network='/content/drive/MyDrive/TFM/colab-sg2-ada/stylegan2-ada/results/00025-ddsm-auto1-bg-resumecustom/network-snapshot-000096.pkl'
which follows what was suggested on GitHub, but I got the following error:

usage: generate.py [-h]
                   {generate-images,truncation-traversal,generate-latent-walk,generate-neighbors,lerp-video}
                   ...
generate.py: error: unrecognized arguments: --outdir='/content/drive/MyDrive/TFM/Generated' --trunc=1 --seeds=85,265,297,849 --network='/content/drive/MyDrive/TFM/colab-sg2-ada/stylegan2-ada/results/00025-ddsm-auto1-bg-resumecustom/network-snapshot-000096.pkl'
I really don't know why generate.py doesn't recognize the arguments... I did have to run !pip install opensimplex to get generate.py to run at all; I don't know whether that is related to this problem.

The StyleGAN 2 ADA repo gives these examples for generating images with a trained model:

# Generate curated MetFaces images without truncation (Fig.10 left)
python generate.py --outdir=out --trunc=1 --seeds=85,265,297,849 \
    --network=https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada/pretrained/metfaces.pkl
And this is the output of !python generate.py -h:

usage: generate.py [-h]
                   {generate-images,truncation-traversal,generate-latent-walk,generate-neighbors,lerp-video}
                   ...

Generate images using pretrained network pickle.

positional arguments:
  {generate-images,truncation-traversal,generate-latent-walk,generate-neighbors,lerp-video}
                        Sub-commands
    generate-images     Generate images
    truncation-traversal
                        Generate truncation walk
    generate-latent-walk
                        Generate latent walk
    generate-neighbors  Generate random neighbors of a seed
    lerp-video          Generate interpolation video (lerp) between random
                        vectors

optional arguments:
  -h, --help            show this help message and exit

examples:

  # Generate curated MetFaces images without truncation (Fig.10 left)
  python generate.py --outdir=out --trunc=1 --seeds=85,265,297,849 \
      --network=https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada/pretrained/metfaces.pkl

  # Generate uncurated MetFaces images with truncation (Fig.12 upper left)
  python generate.py --outdir=out --trunc=0.7 --seeds=600-605 \
      --network=https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada/pretrained/metfaces.pkl

  # Generate class conditional CIFAR-10 images (Fig.17 left, Car)
  python generate.py --outdir=out --trunc=1 --seeds=0-35 --class=1 \
      --network=https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada/pretrained/cifar10.pkl

  # Render image from projected latent vector
  python generate.py --outdir=out --dlatents=out/dlatents.npz \
      --network=https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada/pretrained/ffhq.pkl
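Note that the help output above lists sub-commands (generate-images, truncation-traversal, ...), which the official NVlabs generate.py does not have; this suggests the Colab notebook is using a fork whose generate.py requires a sub-command before the flags (an assumption, based only on the usage line above). With argparse sub-commands, top-level flags that belong to a sub-parser are rejected exactly like this. A minimal sketch with a hypothetical parser (not the repo's actual code) reproduces the behavior:

```python
import argparse

# Hypothetical parser mimicking a sub-command CLI like the one in the -h output.
parser = argparse.ArgumentParser(prog="generate.py")
sub = parser.add_subparsers(dest="command")
gen = sub.add_parser("generate-images")
gen.add_argument("--outdir")
gen.add_argument("--trunc")
gen.add_argument("--seeds")
gen.add_argument("--network")

# Without the sub-command, the top-level parser does not know these flags,
# so they end up in the "unrecognized arguments" bucket:
_, unknown = parser.parse_known_args(["--outdir=out", "--trunc=1"])
print(unknown)  # ['--outdir=out', '--trunc=1']

# With the sub-command first, the same flags parse cleanly:
args = parser.parse_args(["generate-images", "--outdir=out", "--trunc=1"])
print(args.outdir)  # out
```

If that is what is happening here, the fix would be to put the sub-command first, e.g. `!python generate.py generate-images --outdir=... --trunc=1 --seeds=85,265,297,849 --network=...`, and to run `!python generate.py generate-images -h` to confirm that fork's exact flag names.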