Python 2.7: NLTK lookup error with the Stanford neural dependency parser

I am trying to use the Stanford neural dependency parser provided by NLTK. The problem is that when I call st = nltk.parse.stanford.StanfordNeuralDependencyParser(), I get the following error:

>>> st = nltk.parse.stanford.StanfordNeuralDependencyParser()
Traceback (most recent call last):
  File "C:\Users\<user>\Anaconda2\lib\site-packages\IPython\core\interactiveshell.py", line 2885, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-5-ca2dec4f3c1f>", line 1, in <module>
    st = nltk.parse.stanford.StanfordNeuralDependencyParser()
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\parse\stanford.py", line 378, in __init__
    super(StanfordNeuralDependencyParser, self).__init__(*args, **kwargs)
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\parse\stanford.py", line 51, in __init__
    key=lambda model_name: re.match(self._JAR, model_name)
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\internals.py", line 714, in find_jar_iter
    raise LookupError('\n\n%s\n%s\n%s' % (div, msg, div))
LookupError: 

===========================================================================
  NLTK was unable to find stanford-corenlp-(\d+)(\.(\d+))+\.jar! Set
  the CLASSPATH environment variable.

  For more information, on stanford-corenlp-(\d+)(\.(\d+))+\.jar, see:
    <http://nlp.stanford.edu/software/lex-parser.shtml>
===========================================================================
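The error message asks for the CLASSPATH environment variable to be set so that NLTK can locate the CoreNLP jars. A minimal sketch of doing that from within Python before constructing the parser (the directory is the one used later in this question; the version-numbered jar name in the comment is an assumption, not a file the question confirms exists):

```python
import os

# Point NLTK's jar lookup at the directory that holds the CoreNLP jars.
# This is the directory used elsewhere in this question; adjust as needed.
corenlp_dir = r'C:\nltk_data\stanford'
os.environ['CLASSPATH'] = corenlp_dir

# With CLASSPATH set, NLTK scans that directory for a jar whose name
# matches stanford-corenlp-(\d+)(\.(\d+))+\.jar, e.g. a hypothetical
# stanford-corenlp-3.6.0.jar. The parser call itself would then be:
# st = nltk.parse.stanford.StanfordNeuralDependencyParser()
```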
I know that I have the CoreNLP jar file in C:\nltk_data\stanford\, so I ran the following and ended up with a slightly different error:

>>> st = nltk.parse.stanford.StanfordNeuralDependencyParser('C:\\nltk_data\\stanford\\')
Traceback (most recent call last):
  File "C:\Users\<user>\Anaconda2\lib\site-packages\IPython\core\interactiveshell.py", line 2885, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-22-28d797d702d9>", line 1, in <module>
    st = StanfordNeuralDependencyParser('C:\\nltk_data\\stanford\\')
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\parse\stanford.py", line 378, in __init__
    super(StanfordNeuralDependencyParser, self).__init__(*args, **kwargs)
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\parse\stanford.py", line 51, in __init__
    key=lambda model_name: re.match(self._JAR, model_name)
  File "C:\Users\<user>\Anaconda2\lib\site-packages\nltk\internals.py", line 635, in find_jar_iter
    (name_pattern, path_to_jar))
LookupError: Could not find stanford-corenlp-(\d+)(\.(\d+))+\.jar jar file at C:\nltk_data\stanford\
I downloaded the jar stanford-english-corenlp-2016-01-10-models.jar from the Stanford NLP website and renamed it to stanford-corenlp-2016-01-10.jar to try to match the pattern, but I still ended up with the same error. I also downloaded Stanford Parser version 3.6.0, but it does not contain any CoreNLP files.
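One possible reason the renamed jar is still not found: the pattern quoted in the LookupError only accepts dot-separated version components after the first number, while a date-stamped name uses dashes. A quick check of the pattern against the two illustrative file names (the dotted one is hypothetical, chosen only to show a match):

```python
import re

# The jar-name pattern quoted verbatim in the LookupError above.
pattern = r'stanford-corenlp-(\d+)(\.(\d+))+\.jar'

# A dot-separated version number satisfies the pattern ...
print(bool(re.match(pattern, 'stanford-corenlp-3.6.0.jar')))       # True
# ... but the dash-separated date in the renamed jar does not,
# because the pattern requires a literal '.' after the first digits.
print(bool(re.match(pattern, 'stanford-corenlp-2016-01-10.jar')))  # False
```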

Is there a way to get this working, or am I misunderstanding something?
