Python scrapy shell in terminal: KeyError: 'e'

In my Windows command prompt, `python3 -m scrapy shell 'https://scrapy.org'` works as expected and provides an interactive shell.

However, if I open Windows Terminal and type `ubuntu` to enter the Windows Subsystem for Linux, and then try to start the Scrapy shell with `python3 -m scrapy shell 'https://scrapy.org'` or `scrapy shell 'https://scrapy.org'`, I get the following error message:

Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 193, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/__main__.py", line 4, in <module>
    execute()
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/cmdline.py", line 145, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/cmdline.py", line 100, in _run_print_help
    func(*a, **kw)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/cmdline.py", line 153, in _run_command
    cmd.run(args, opts)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/commands/shell.py", line 68, in run
    crawler.engine = crawler._create_engine()
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/crawler.py", line 101, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/core/engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/core/scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/middleware.py", line 35, in from_settings
    mw = create_instance(mwcls, settings, crawler)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/utils/misc.py", line 156, in create_instance
    instance = objcls.from_crawler(crawler, *args, **kwargs)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/pipelines/media.py", line 68, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/pipelines/images.py", line 105, in from_settings
    return cls(store_uri, settings=settings)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/pipelines/images.py", line 48, in __init__
    super(ImagesPipeline, self).__init__(store_uri, settings=settings,
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/pipelines/files.py", line 361, in __init__
    self.store = self._get_store(store_uri)
  File "/mnt/e/scrapy-project/miniscrapy/tutorial/env/lib/python3.8/site-packages/scrapy/pipelines/files.py", line 409, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
KeyError: 'e'
To debug, I added print statements in lib/python3.8/site-packages/scrapy/pipelines/files.py,

which print:

uri: E:/scrapy project/miniscrapy/tutorial/images

_get_store pipelines/files scheme: e

I have already tried setting the path with `export PATH=$PATH:/usr/local/bin` and storing it in ~/.bashrc. All Python-related libraries were installed with pip in a virtual environment inside the tutorial folder, which also contains the /images folder. I suspect the os.path.isabs() check works when running from the regular Windows command prompt, but somehow fails on my WSL Ubuntu.
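The `scheme: e` debug output above can be explained with a minimal sketch of the scheme-detection logic (an assumption based on the traceback, not a verbatim copy of Scrapy's `_get_store`): an absolute local path is mapped to the `file` scheme, while anything else is parsed as a URI, so on Linux a Windows drive letter like `E:` is interpreted as the URI scheme `e`.

```python
import os
from urllib.parse import urlparse

def guess_scheme(uri: str) -> str:
    """Sketch of how a store URI's scheme is derived.

    On Linux, 'E:/...' is NOT an absolute path, so it falls through
    to urlparse(), which reads the drive letter as a URI scheme.
    """
    if os.path.isabs(uri):           # '/mnt/e/...' is absolute on Linux
        return "file"
    return urlparse(uri).scheme      # 'E:/...' parses with scheme 'e'

print(guess_scheme("/mnt/e/scrapy-project/miniscrapy/tutorial/images"))  # file
print(guess_scheme("E:/scrapy project/miniscrapy/tutorial/images"))      # e
```

This is why the lookup `self.STORE_SCHEMES[scheme]` in the traceback raises `KeyError: 'e'`: there is no registered storage backend for a scheme named `e`.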

echo $PATH inside the virtual environment:


/mnt/e/scrapy project/miniscrapy/tutorial/env/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/usr/games:/usr/local/games

Are you sure 'E:/scrapy project/miniscrapy/tutorial/images' is a valid path inside the Ubuntu subsystem? What happens if you run
ls E:/scrapy project/miniscrapy/tutorial/images
from the Ubuntu subsystem?

`ls E:/scrapy project/miniscrapy/tutorial/images` in the venv returns "No such file or directory", whereas `ls /mnt/e/scrapy project/miniscrapy/tutorial/images` in the venv finds the folder with all its subfolders.

What is the value of your
IMAGES_STORE
setting? It sounds like it contains a Windows path, but it should contain an Ubuntu path when you run the spider in Ubuntu.

IMAGES_STORE was indeed set to 'E:/scrapy project/miniscrapy/tutorial/images' rather than '/mnt/e/scrapy project/miniscrapy/tutorial/images'. I changed the path, and now there is some other unexpected behavior: when I run `python3 -m scrapy shell ''` or `scrapy shell ''` in the venv, it tries to start running the spider in my project folder. It should not do that; it should just open the website in the terminal so I can practice response.xpath commands. It turned out I had some unrelated package imports in my project that are now resolved. But yes, updating the images store path in my settings.py to the path Ubuntu needs solved the problem. Thank you very much, `python3 -m scrapy shell` now works correctly in my Ubuntu environment.
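The fix described in the comments amounts to one line in the project's settings.py. A sketch of the corrected setting (the exact directory below is assumed from the paths in the traceback):

```python
# settings.py -- under WSL, Windows drives are mounted beneath /mnt,
# so IMAGES_STORE must use the mount path, not the drive letter.
IMAGES_STORE = "/mnt/e/scrapy-project/miniscrapy/tutorial/images"

# The original value fails on WSL with KeyError: 'e', because 'E:' is
# parsed as a URI scheme rather than a drive letter:
# IMAGES_STORE = "E:/scrapy-project/miniscrapy/tutorial/images"
```

With an absolute POSIX path, the store URI resolves to the `file` scheme and the images pipeline can initialize normally.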