Machine learning: How to get multiple answers from a context using BertForQuestionAnswering

How can I get multiple answers from a passage using BertForQuestionAnswering, as in the question below, which has two possible answers:

a nice puppet
a software engineer

Below is the snippet:
from transformers import BertTokenizer, BertForQuestionAnswering
import torch
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertForQuestionAnswering.from_pretrained('bert-large-uncased-whole-word-masking-finetuned-squad')
question, text = "Who was Jim Henson?", "Jim Henson was a nice puppet.Jim Henson was a software engineer."
input_ids = tokenizer.encode(question, text)
token_type_ids = [0 if i <= input_ids.index(102) else 1 for i in range(len(input_ids))]
# Note: in transformers v4+ the model returns an output object; use
# outputs.start_logits / outputs.end_logits instead of tuple unpacking.
start_scores, end_scores = model(torch.tensor([input_ids]), token_type_ids=torch.tensor([token_type_ids]))
all_tokens = tokenizer.convert_ids_to_tokens(input_ids)
answer = ' '.join(all_tokens[torch.argmax(start_scores) : torch.argmax(end_scores)+1])
print(answer)
Output: 'a software engineer'
Thanks in advance!!
You are only extracting the single most likely answer from the BERT output. If you want multiple answers, you need to actually select multiple of them, in this case the first and second most likely. To do that, create new answer variables and fill them from the start_scores and end_scores variables. I suggest you use torch.topk() for this.

Let me know how it goes:
# torch.topk returns (values, indices); request the two highest-scoring positions
_, start_indices = torch.topk(start_scores, k=2)
_, end_indices = torch.topk(end_scores, k=2)
answer1_start, answer2_start = start_indices[0]
answer1_end, answer2_end = end_indices[0]
answer1 = ' '.join(all_tokens[answer1_start : answer1_end + 1])
answer2 = ' '.join(all_tokens[answer2_start : answer2_end + 1])
print(answer1, answer2)
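To make the topk idea concrete without loading the model, here is a minimal sketch on hypothetical score tensors shaped like the model output ([batch, seq_len]); the values are made up for illustration. torch.topk returns both values and indices, and it is worth filtering out pairs where the end position precedes the start:

```python
import torch

# Hypothetical start/end score tensors, shaped like the model output: [1, seq_len]
start_scores = torch.tensor([[0.1, 0.3, 5.0, 0.2, 4.0, 0.1]])
end_scores = torch.tensor([[0.2, 0.1, 0.3, 6.0, 0.1, 5.5]])

# topk returns (values, indices); take the two highest-scoring positions each
_, start_idx = torch.topk(start_scores, k=2)
_, end_idx = torch.topk(end_scores, k=2)

# Pair the i-th best start with the i-th best end, keeping only valid spans
spans = [(int(s), int(e)) for s, e in zip(start_idx[0], end_idx[0]) if s <= e]
print(spans)  # [(2, 3), (4, 5)]
```

Note this simple pairing assumes the i-th best start belongs with the i-th best end; for overlapping candidates you would instead score all valid (start, end) combinations jointly, as SQuAD-style decoders do.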