Python Scrapy startproject ends with ImportError: No module named spiders


I created a new environment

conda create -n scraping python=2.7
which I activate using

source activate scraping
I then went on to install Scrapy (the -n scraping flag no longer seems necessary, but I added it just in case).

I now have the following installed:

 % conda list
# packages in environment at /Users/alexis/anaconda3/envs/scraping:
#
cffi                      1.1.2                    py27_0
cryptography              0.9.2                    py27_0
cssselect                 0.9.1                    py27_0
enum34                    1.0.4                    py27_0
idna                      2.0                      py27_0
ipaddress                 1.0.7                    py27_0
ipython                   3.2.1                    py27_0
libxml2                   2.9.2                         0
libxslt                   1.1.28                        2
lxml                      3.4.4                    py27_0
nose                      1.3.7                    py27_0
openssl                   1.0.1k                        1
pip                       7.1.0                    py27_0
pyasn1                    0.1.7                    py27_0
pycparser                 2.14                     py27_0
pyopenssl                 0.14                     py27_0
python                    2.7.10                        0
python.app                1.2                      py27_4
queuelib                  1.2.2                    py27_0
readline                  6.2                           2
scrapy                    0.24.4                   py27_0
setuptools                18.0.1                   py27_0
six                       1.9.0                    py27_0
sqlite                    3.8.4.1                       1
tk                        8.5.18                        0
twisted                   15.2.1                   py27_0
w3lib                     1.8.1                    py27_1
zlib                      1.2.8                         0
zope.interface            4.1.2                    py27_1
(I also installed ipython.)

Now, when I try to start a project, I get

 % scrapy startproject megadeluxe
Traceback (most recent call last):
  File "/Users/alexis/anaconda3/envs/scraping/bin/scrapy", line 4, in     <module>
    from scrapy.cmdline import execute
  File "/Users/alexis/anaconda3/envs/scraping/lib/python2.7/site-    packages/scrapy/__init__.py", line 48, in <module>
    from scrapy.spiders import Spider
ImportError: No module named spiders

I notice you installed Scrapy version 0.24.4. Is there a reason you aren't running the newer 1.0 release?


I believe scrapy.spiders is a Scrapy 1.0 module, not a 0.24 one. I would try installing the latest version in your environment and see if that works.
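
If you do upgrade, a quick sanity check from inside the activated environment shows which Scrapy the interpreter actually picks up (a minimal sketch; the version numbers in the comments are assumptions based on the package list above):

    import scrapy

    # Print the version and install location of the Scrapy this environment resolves
    print(scrapy.__version__)  # e.g. 0.24.4 before upgrading, 1.0.x after
    print(scrapy.__file__)     # should point into .../envs/scraping/.../site-packages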

I did exactly the same thing as you.

I created an environment following the steps here:

Activated my environment as described here:

I did this from the Anaconda command prompt.

I had installed Scrapy a long time ago, so it was already there when I created the environment.

Currently I am having issues importing items.py:


Let me know if I can be of further help.

conda seems to install this version by default. See my edit following your suggestion. I think I have figured this out: the structure of the Scrapy library has changed quite a bit, and some imports that used to work no longer do. from scrapy.spiders import Spider ==> this is based on the Scrapy 1.0 layout. If you want to run the same import on version 0.24, it should look like this: from scrapy.contrib.spiders import Spider
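
For code that needs to import Spider under either layout, a guarded import is one option. This is just a sketch: the pre-1.0 fallback path used here (scrapy.spider) differs from the one quoted in the comment above and is an assumption to verify against the version you actually have installed.

    # Import Spider under the Scrapy >= 1.0 layout, falling back to an older path
    try:
        from scrapy.spiders import Spider   # Scrapy >= 1.0
    except ImportError:
        from scrapy.spider import Spider    # assumed pre-1.0 location; verify locally

    class ExampleSpider(Spider):
        # "ExampleSpider", its name and start_urls are illustrative placeholders
        name = "example"
        start_urls = ["http://example.com"]

        def parse(self, response):
            self.log("Visited %s" % response.url)
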
  DYLD_FRAMEWORK_PATH
         This is a colon separated list of directories that contain
         frameworks. The dynamic linker searches these directories
         before it searches for the framework by its install name.
         It allows you to test new versions of existing frameworks.
         (A framework is a library install name that ends in the form
         XXX.framework/Versions/YYY/XXX or XXX.framework/XXX, where
         XXX and YYY are any name.)

         For each framework that a program uses, the dynamic linker
         looks for the framework in each directory in
         DYLD_FRAMEWORK_PATH in turn. If it looks in all the
         directories and can't find the framework, it searches the
         directories in DYLD_LIBRARY_PATH in turn. If it still can't
         find the framework, it then searches
         DYLD_FALLBACK_FRAMEWORK_PATH and DYLD_FALLBACK_LIBRARY_PATH
         in turn.

         Use the -L option to otool(1) to discover the frameworks and
         shared libraries that the executable is linked against.