TypeError with a Python list

Tags: python, string, list, nltk

I am using nltk. I tokenized an entire text file containing reviews and stored it in the variable text:

with open("reviews.txt") as f:
    text = f.read()
Then I split the whole text into sentences with sentence tokenization:

import nltk
from nltk.tokenize import sent_tokenize

tokenized = sent_tokenize(text)
Now the entire tokenized text is stored, sentence by sentence, in tokenized.

When I try to write this sentence-tokenized data to a txt file, I get a TypeError:

with open("sentences.txt","w+") as f1:
     f1.write(tokenized)
When I execute it, I get this TypeError:

TypeError: must be str, not list
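The error is easy to reproduce without nltk: the write method of a text file (or any text stream) accepts a single string, not a list. A minimal sketch, using io.StringIO as a stand-in for the file and a hard-coded list standing in for sent_tokenize's output:

```python
import io

# Hard-coded list standing in for sent_tokenize's output.
tokenized = ["First sentence.", "Second sentence."]

buf = io.StringIO()  # behaves like a text file opened for writing
try:
    buf.write(tokenized)  # passing a list, as in the question
except TypeError as err:
    print("TypeError:", err)
```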
sent_tokenize returns a list of strings (in this context, sentence strings), not a single string. If you want to write them to a file, replace

with open("sentences.txt","w+") as f1:
     f1.write(tokenized)
with this code (assuming you want one sentence per line):

with open("sentences.txt","w+") as f1:
     f1.write('\n'.join(tokenized))

Here '\n' can be replaced with whatever sentence separator you want (a comma ',', a semicolon ';', and so on).
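For example, joining the same hypothetical two-sentence list with a few different separators:

```python
sentences = ["Great phone.", "Battery lasts all day."]

print('\n'.join(sentences))   # one sentence per line
print(', '.join(sentences))   # comma-separated
print('; '.join(sentences))   # semicolon-separated
```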

You should be able to answer this yourself in interactive mode:

  • print(tokenized)
  • help(f1.write)
sent_tokenize(text) returns a list, while write expects a single string; you are passing a list, whose contents you can join, i.e.: f1.write("".join(tokenized))
with open("sentences.txt","w+") as f1:
     f1.write('\n'.join(tokenized))
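Putting it all together, a self-contained sketch that runs without nltk or reviews.txt (the hard-coded list stands in for sent_tokenize's output):

```python
# Stand-in for sent_tokenize(text); in the real script this comes from nltk.
tokenized = ["Great product.", "Fast shipping.", "Would buy again."]

with open("sentences.txt", "w+") as f1:
    f1.write('\n'.join(tokenized))  # one sentence per line

# Read it back to confirm what was written.
with open("sentences.txt") as f1:
    print(f1.read())
```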