Python Pylint import and ModuleNotFound errors - unable to import modules such as scrapy and sqlalchemy
I am writing a spider to scrape a website, but when I run scrapy crawl scraper_bot_name I get a ModuleNotFoundError: No module named 'scrapy.spider'. scrapy.spider is in the first import statement.
Additionally, I checked in VSCode to see whether I could find anything else, and I see this problem on all of the from … import … statements; the error there is - Unable to import 'scrapy.spider' pylint(import-error)
I tried to find a solution, and the most common fix mentioned is to check the project directory structure. Specifically, to check whether any file in the project is named scrapy.py. That is not the case in my project, and I am also facing the same problem with sqlalchemy, so I don't think that is the issue.
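One quick way to rule out the shadowing explanation is to ask Python where it would actually load the package from. This is a small sketch (the helper name `module_origin` is mine, not from the question); if it prints a path inside your project instead of the virtualenv's site-packages, a local file or directory is shadowing the installed package.

```python
import importlib.util


def module_origin(name):
    """Return the file a top-level module would be loaded from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None


# Expected: a path like .../site-packages/scrapy/__init__.py.
# A path inside your own project means a scrapy.py (or scrapy/ directory)
# is shadowing the real package.
print(module_origin("scrapy"))
```

Running the same check for sqlalchemy would tell you whether both failures share one cause (e.g. the wrong interpreter/virtualenv being active) or are separate shadowing problems.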
Here is the directory structure -
- scrape_workspace
  - my_scraper
    - scraper_app
      - __init__.py
      - items.py
      - models.py
      - settings.py
      - pipelines.py
      - spiders
        - __init__.py
        - angellist_spider.py
    - scrapy.cfg
  - test (virtual environment - there is another virtual environment in a directory parallel to scrape_workspace)
Here is the traceback -
Traceback (most recent call last):
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/bin/scrapy", line 10, in <module>
    sys.exit(execute())
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/cmdline.py", line 142, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/crawler.py", line 280, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/crawler.py", line 152, in __init__
    self.spider_loader = self._get_spider_loader(settings)
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/crawler.py", line 146, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/spiderloader.py", line 68, in from_settings
    return cls(settings)
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/spiderloader.py", line 24, in __init__
    self._load_all_spiders()
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/spiderloader.py", line 51, in _load_all_spiders
    for module in walk_modules(name):
  File "/Users/arif/newcoderProjects/scrape/ScrapProj/lib/python3.8/site-packages/scrapy/utils/misc.py", line 77, in walk_modules
    submod = import_module(fullpath)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/Users/arif/newcoderProjects/scrape_workspace/my_scraper/scraper_app/spiders/angellist_spider.py", line 1, in <module>
    from scrapy.spider import BaseSpider
ModuleNotFoundError: No module named 'scrapy.spider'
It looks like you are not importing BaseSpider correctly.
from scrapy.spiders import Spider
The module was renamed to scrapy.spiders (plural) in Scrapy 1.0, and BaseSpider was replaced by Spider, so from scrapy.spiders import Spider, not scrapy.spider, is what you probably want.
Yes, found the problem - I was referring to Scrapy's old documentation, hence the error.