NLP runtime error: `config = RobertaConfig.from_pretrained("/Absolute-path-to/BERTweet_base_transformers/config.json")`

I'm trying to run the "transformers" version of the code to use the new pre-trained BERTweet model, but I'm running into an error. The following lines of code ran successfully in my Google Colab notebook:
!pip install fairseq
import fairseq
!pip install fastBPE
import fastBPE
# download the pre-trained BERTweet model zipped file
!wget https://public.vinai.io/BERTweet_base_fairseq.tar.gz
# unzip the pre-trained BERTweet model files
!tar -xzvf BERTweet_base_fairseq.tar.gz
!pip install transformers
import transformers
import torch
import argparse
from transformers import RobertaConfig
from transformers import RobertaModel
from fairseq.data.encoders.fastbpe import fastBPE
from fairseq.data import Dictionary
Then I tried to run the following code:
# Load model
config = RobertaConfig.from_pretrained(
"/Absolute-path-to/BERTweet_base_transformers/config.json"
)
BERTweet = RobertaModel.from_pretrained(
"/Absolute-path-to/BERTweet_base_transformers/model.bin",
config=config
)
…which raised an error:
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
242 if resolved_config_file is None:
--> 243 raise EnvironmentError
244 config_dict = cls._dict_from_json_file(resolved_config_file)
OSError:
During handling of the above exception, another exception occurred:
OSError Traceback (most recent call last)
2 frames
/usr/local/lib/python3.6/dist-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
250 f"- or '{pretrained_model_name_or_path}' is the correct path to a directory containing a {CONFIG_NAME} file\n\n"
251 )
--> 252 raise EnvironmentError(msg)
253
254 except json.JSONDecodeError:
OSError: Can't load config for '/Absolute-path-to/BERTweet_base_transformers/config.json'. Make sure that:
- '/Absolute-path-to/BERTweet_base_transformers/config.json' is a correct model identifier listed on 'https://huggingface.co/models'
- or '/Absolute-path-to/BERTweet_base_transformers/config.json' is the correct path to a directory containing a config.json file
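The second bullet in the error message is the relevant one here: the loader could not find a file at that path. A quick existence check can confirm this before calling `from_pretrained` (a minimal sketch; the path below is the literal placeholder and should be substituted with your own):

```python
import os

# Substitute the actual path you pass to from_pretrained
path = "/Absolute-path-to/BERTweet_base_transformers/config.json"
print(os.path.isfile(path))  # False means the file is not at that location
```

If this prints `False`, the OSError above is expected: nothing was ever downloaded or extracted to that location.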
I'm guessing the problem is that I need to replace "/Absolute-path-to" with something else, but if so, with what? This is probably a very simple answer and I feel silly asking, but I need help.

First, you have to download the correct package as described in the GitHub README:
!wget https://public.vinai.io/BERTweet_base_transformers.tar.gz
!tar -xzvf BERTweet_base_transformers.tar.gz
After that, you can click the directory icon (on the left side of the screen) to list the downloaded data. Right-click BERTweet_base_transformers, choose "Copy path", and then paste the clipboard contents into the code:
config = RobertaConfig.from_pretrained(
"/content/BERTweet_base_transformers/config.json"
)
BERTweet = RobertaModel.from_pretrained(
"/content/BERTweet_base_transformers/model.bin",
config=config
)
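If copying the path through the file browser is awkward, the extracted files can also be located programmatically (a sketch assuming the archive was unpacked somewhere under /content, Colab's default working directory):

```python
import glob

# Recursively search for the extracted config file under /content
matches = glob.glob("/content/**/config.json", recursive=True)
print(matches)  # the parent directory of each hit is the path to use
```

Whatever directory contains `config.json` is the absolute path to substitute into the `from_pretrained` calls.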
Did you replace "/Absolute-path-to/" with the actual path? — No, that's my problem. What is the absolute path? How do I find out what I should replace "/Absolute-path-to/" with? — Ahhh yes, it looks like I downloaded the fairseq files (!wget / !tar -xzvf BERTweet_base_fairseq.tar.gz) but not the transformers files. Thanks!
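The mix-up in the comments, where the fairseq archive was extracted but the transformers one was not, can be caught by checking which directories actually exist (a sketch assuming the default extraction directory names from the two tarballs):

```python
import os

# The two archives unpack into differently named directories
for name in ("BERTweet_base_fairseq", "BERTweet_base_transformers"):
    status = "present" if os.path.isdir(name) else "missing"
    print(name, status)
```

If only `BERTweet_base_fairseq` shows up as present, the transformers archive still needs to be downloaded and extracted.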