
Python (Pattern, NLP) ValueError: numpy.dtype has the wrong size, try recompiling

Tags: python, numpy, design-patterns, nlp, python-pattern

I have been trying to install a Reddit scraper (Python with Pattern/NLP), but I keep running into incompatibility problems with my NumPy install.

I have tried uninstalling and reinstalling NumPy, Pattern, and Python... any idea how to fix this?

This is my error message:

Traceback (most recent call last):
  File "example-TRP.py", line 1, in <module>
    from redditnlp import RedditWordCounter, TfidfCorpus
  File "/Users/-----/Desktop/reddit-nlp/redditnlp/__init__.py", line 12, in <module>
    import nltk
  File "/Library/Python/2.7/site-packages/nltk/__init__.py", line 114, in <module>
    from nltk.collocations import *
  File "/Library/Python/2.7/site-packages/nltk/collocations.py", line 39, in <module>
    from nltk.metrics import ContingencyMeasures, BigramAssocMeasures, TrigramAssocMeasures
  File "/Library/Python/2.7/site-packages/nltk/metrics/__init__.py", line 16, in <module>
    from nltk.metrics.scores import (accuracy, precision, recall, f_measure,
  File "/Library/Python/2.7/site-packages/nltk/metrics/scores.py", line 18, in <module>
    from scipy.stats.stats import betai
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/scipy/stats/__init__.py", line 324, in <module>
    from .stats import *
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/scipy/stats/stats.py", line 242, in <module>
    import scipy.special as special
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/scipy/special/__init__.py", line 531, in <module>
    from ._ufuncs import *
  File "numpy.pxd", line 155, in init scipy.special._ufuncs (scipy/special/_ufuncs.c:19983)
ValueError: numpy.dtype has the wrong size, try recompiling
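For context: the failure happens inside Apple's system SciPy (under /System/Library/...), whose compiled extensions were built against a specific NumPy; this dtype-size error typically means the NumPy being imported has a different C ABI than the one SciPy was compiled against. A minimal diagnostic sketch to see which NumPy the interpreter actually picks up (this interpretation is my assumption, not stated in the question):

```python
# Print which NumPy this interpreter imports. If __file__ points to a
# user- or pip-installed copy rather than the system one, SciPy's compiled
# extensions may have been built against a different NumPy ABI.
import numpy

print(numpy.__version__)  # version of the NumPy actually being imported
print(numpy.__file__)     # its location: system copy vs. pip-installed copy
```

Comparing this path against the SciPy path in the traceback shows whether the two come from different installations.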