Scrapy project not showing in the list


I'm new to this. I've put the following configuration into my scrapy.cfg file:

[settings]
default = uk.settings


[deploy:scrapyd]
url = http://localhost:6800/
project=ukmall

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
username = john
password = secret
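
Note that only the [deploy:scrapyd] target sets a project. To deploy to the scrapyd2 target you would either add a project line to that section (first sketch below) or name the project on the command line (second sketch, assuming the installed scrapyd-client's scrapyd-deploy accepts -p):

[deploy:scrapyd2]
url = http://scrapyd.mydomain.com/api/scrapyd/
project = ukmall
username = john
password = secret

$ scrapyd-deploy scrapyd2 -p ukmall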
If I run the command below:

$ scrapyd-deploy -l
I can see:

scrapyd2             http://scrapyd.mydomain.com/api/scrapyd/

scrapyd              http://localhost:6800/
To see all available projects:

scrapyd-deploy -L scrapyd
But nothing shows up on my machine.
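
For reference: scrapyd-deploy -L queries the target server for the projects deployed to it, so it prints nothing until at least one deploy has succeeded. Once the ukmall project from the config above had been deployed, the expected output would be along these lines (hypothetical):

ukmall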


If I do:

 $ scrapy deploy scrapyd2
anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ scrapy deploy scrapyd2
Packing version 1412322816
Traceback (most recent call last):
  File "/usr/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 142, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 88, in _run_print_help
    func(*a, **kw)
  File "/usr/lib/pymodules/python2.7/scrapy/cmdline.py", line 149, in _run_command
    cmd.run(args, opts)
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 103, in run
    egg, tmpdir = _build_egg()
  File "/usr/lib/pymodules/python2.7/scrapy/commands/deploy.py", line 228, in _build_egg
    retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
  File "/usr/lib/pymodules/python2.7/scrapy/utils/python.py", line 276, in retry_on_eintr
    return function(*args, **kw)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-VLM6W7']' returned non-zero exit status 1
anandhakumar@MMTPC104:~/ScrapyProject/mall_uk$ 
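
The traceback shows that scrapy deploy runs setup.py via check_call with stdout and stderr redirected to temporary files, so the real build error is hidden. One way to surface it (a debugging sketch; the project path comes from the prompt above, /tmp/egg is an arbitrary scratch directory, and a setup.py is assumed to exist in the project root) is to run the same command by hand:

$ cd ~/ScrapyProject/mall_uk
$ mkdir -p /tmp/egg
$ python setup.py clean -a bdist_egg -d /tmp/egg

Whatever this prints is the actual cause of the non-zero exit status (in this thread it turned out to be a file permission problem, as the answers below note).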

You can only list spiders that have already been deployed. If you haven't deployed anything yet, first deploy your spider using scrapy deploy:

scrapy deploy [ <target:project> | -l <target> | -L ]

vagrant@portia:~/takeovertheworld$ scrapy deploy scrapyd2
Packing version 1410145736
Deploying to project "takeovertheworld" in http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/addversion.json
Server response (200):
{"status": "ok", "project": "takeovertheworld", "version": "1410145736", "spiders": 1}

I had the same error too. As @hugsbrugs said, it was because a folder in the Scrapy project had root permissions. So I did this:

sudo scrapy deploy scrapyd2
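
Running the deploy under sudo works around the problem, but it leaves the build running as root. An alternative sketch, assuming the project directory is the current one and your login user should own everything in it, is to fix the ownership instead and deploy normally:

$ sudo chown -R $USER:$USER .
$ scrapy deploy scrapyd2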


Comments:

Have you deployed anything to your local scrapyd?

I don't know; please tell me how to deploy. I just followed the documentation and did the same (I have a Scrapy project). This is the error raised after running the above command: subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-VLM6W7']' returned non-zero exit status 1. Do I need to clean setup.py and the egg? If so, please tell me how.

Can you post the full traceback in your question?

I had the same error because a folder in my Scrapy project had root permissions. I deleted that folder and that was it. Maybe check for some file permission issue...
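
To locate the root-owned files mentioned in the last comment, a quick check from the project root (assuming a Unix shell with find available):

$ find . -user root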
You can verify that the project was installed correctly by querying the scrapyd API:

vagrant@portia:~/takeovertheworld$ curl http://ec2-xx-xxx-xx-xxx.compute-1.amazonaws.com:6800/listprojects.json
{"status": "ok", "projects": ["takeovertheworld"]}