Error when deploying a scrapyd project


When trying to run these commands:

scrapyd-deploy test -p project=myProject
curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1.py
I get the following error:

Traceback (most recent call last):
      File "/usr/bin/scrapyd-deploy", line 269, in <module>
        main()
      File "/usr/bin/scrapyd-deploy", line 95, in main
        egg, tmpdir = _build_egg()
      File "/usr/bin/scrapyd-deploy", line 236, in _build_egg
        retry_on_eintr(check_call, [sys.executable, 'setup.py', 'clean', '-a', 'bdist_egg', '-d', d], stdout=o, stderr=e)
      File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/python.py", line 331, in retry_on_eintr
        return function(*args, **kw)
      File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
        raise CalledProcessError(retcode, cmd)
    subprocess.CalledProcessError: Command '['/usr/bin/python', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-wV3h4k']' returned non-zero exit status 1
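(Note: the CalledProcessError only says that the egg-build subprocess exited with a non-zero status; scrapyd-deploy redirects that subprocess's stdout and stderr, so the real build error is hidden. One way to see it, assuming the current directory is the project root containing scrapy.cfg and setup.py, is to run the same build step by hand; the /tmp/egg-test output directory below is just a placeholder.)

    cd /path/to/myProject                                  # placeholder: the directory with scrapy.cfg and setup.py
    python setup.py clean -a bdist_egg -d /tmp/egg-test    # same command scrapyd-deploy runs internally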
When I then try to schedule the spider with these commands:

scrapyd-deploy test -p project=myProject
curl http://localhost:6800/schedule.json -d project=myproject -d spider=spider1.py
I get this error:

Traceback (most recent call last):
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            req.requestReceived(command, path, version)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            self.process()
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            self.render(resrc)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            body = resrc.render(self)
        --- <exception caught here> ---
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
            return JsonResource.render(self, txrequest)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
            r = resource.Resource.render(self, txrequest)
          File "/usr/local/lib/python2.7/dist-packages/twisted/web
            return m(request)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/web
            spiders = get_spider_list(project)
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/uti
            runner = Config().get('runner')
          File "/usr/local/lib/python2.7/dist-packages/scrapyd/con
            self.cp.read(sources)
          File "/usr/lib/python2.7/ConfigParser.py", line 305, in
            self._read(fp, filename)
          File "/usr/lib/python2.7/ConfigParser.py", line 512, in
            raise MissingSectionHeaderError(fpname, lineno, line)
        ConfigParser.MissingSectionHeaderError: File contains no s
        file: /etc/scrapyd/conf.d/twistd.pid, line: 1
        '24262'
Interestingly, if I add what I believe is a section header to the twistd.pid file, I then get an error saying the file does not contain the numeric pid of twistd.
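(For context on the second traceback: scrapyd builds its configuration by reading every file under /etc/scrapyd/conf.d/ as an INI file, which is why a file containing only a bare PID such as '24262' raises MissingSectionHeaderError, and why adding a section header then makes it invalid as a PID file. As a rough sketch, a file scrapyd would be happy to parse in that directory looks like the following; option names are from scrapyd's default configuration and the paths are only illustrative.)

    # /etc/scrapyd/conf.d/000-default  -- example config file; twistd.pid is not one
    [scrapyd]
    eggs_dir  = /var/lib/scrapyd/eggs
    logs_dir  = /var/log/scrapyd
    http_port = 6800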


Are these two problems related?

I ran into the same error. What worked for me was running the command with sudo.

The error is probably caused by the command not having the right permissions:

sudo scrapyd-deploy test -p project=myProject
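(If you would rather not run the deploy under sudo, one thing worth checking is whether your user actually owns the project tree and the build artifacts that bdist_egg writes; the paths and names below are illustrative and vary by project.)

    ls -ld . build dist *.egg-info    # root-owned entries here would explain the permission failure
    sudo chown -R "$USER": .          # reclaim ownership of the project tree, then retry without sudo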

What console output do you get when you run python setup.py clean -a bdist_egg directly?

OK, I edited my post to include the output of python setup.py clean -a bdist_egg, so that part seems to work. I have no idea why scrapyd-deploy has a problem with it.

Yes, and the strange part is what I just added: when I try to schedule the spider, it tells me I need section headers in the twistd.pid file, and if I add them it yells at me instead. Super confusing.