
Python Django dynamic scraper error "check mandatory variables"


I am building a dynamic scraper with the django-dynamic-scraper library. The scope of the app is to crawl websites and save the data dynamically into my database; from there I render it in my Django app so the content is displayed automatically. However, when I run it with the command

scrapy crawl article_spider -a id=1 -a do_action=yes
I get the following error:

    2016-05-08 12:56:06 [django.db.backends] DEBUG: (0.000) QUERY = u'SELECT "dynamic_scraper_scrapedobjclass"."id", "dynamic_scraper_scrapedobjclass"."name", "dynamic_scraper_scrapedobjclass"."scraper_scheduler_conf", "dynamic_scraper_scrapedobjclass"."checker_scheduler_conf", "dynamic_scraper_scrapedobjclass"."comments" FROM "dynamic_scraper_scrapedobjclass" WHERE "dynamic_scraper_scrapedobjclass"."id" = %s' - PARAMS = (1,); args=(1,)
    2016-05-08 12:56:06 [root] INFO: Spider for NewsWebsite "Wikinews" (1) initialized.
    Unhandled error in Deferred:
    2016-05-08 12:56:06 [twisted] CRITICAL: Unhandled error in Deferred:

    Traceback (most recent call last):
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\cmdline.py", line 150, in _run_command
        cmd.run(args, opts)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\commands\crawl.py", line 57, in run
        self.crawler_process.crawl(spname, **opts.spargs)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 153, in crawl
        d = crawler.crawl(*args, **kwargs)
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
        return _inlineCallbacks(None, gen, Deferred())
    --- <exception caught here> ---
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
        result = g.send(result)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 71, in crawl
        self.engine = self._create_engine()
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\crawler.py", line 83, in _create_engine
        return ExecutionEngine(self, lambda _: self.stop())
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\engine.py", line 66, in __init__
        self.downloader = downloader_cls(crawler)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\__init__.py", line 65, in __init__
        self.handlers = DownloadHandlers(crawler)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\__init__.py", line 23, in __init__
        cls = load_object(clspath)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
        mod = import_module(module)
      File "C:\Python27\Lib\importlib\__init__.py", line 37, in import_module
        __import__(name)
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\s3.py", line 6, in <module>
        from .http import HTTPDownloadHandler
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\http.py", line 5, in <module>
        from .http11 import HTTP11DownloadHandler as HTTPDownloadHandler
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\core\downloader\handlers\http11.py", line 15, in <module>
        from scrapy.xlib.tx import Agent, ProxyAgent, ResponseDone, \
      File "C:\Users\shazia\testscrapy\lib\site-packages\scrapy\xlib\tx\__init__.py", line 3, in <module>
        from twisted.web import client
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\web\client.py", line 41, in <module>
        from twisted.internet.endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\endpoints.py", line 34, in <module>
        from twisted.internet.stdio import StandardIO, PipeAddress
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\stdio.py", line 30, in <module>
        from twisted.internet import _win32stdio
      File "C:\Users\shazia\testscrapy\lib\site-packages\twisted\internet\_win32stdio.py", line 7, in <module>
        import win32api
    exceptions.ImportError: No module named win32api
    2016-05-08 12:56:06 [twisted] CRITICAL:

For testing I am using the complete code from the reference. Please advise.
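For context, the spider behind scrapy crawl article_spider in the DDS open_news tutorial looks roughly like the sketch below. This is reconstructed from the django-dynamic-scraper documentation, not the poster's exact code; the NewsWebsite, Article, and ArticleItem names are the tutorial's and are assumptions about the project in use:

    # Sketch of the DDS tutorial spider (open_news example); reconstructed, may differ from the actual code.
    from dynamic_scraper.spiders.django_spider import DjangoSpider
    from open_news.models import NewsWebsite, Article, ArticleItem

    class ArticleSpider(DjangoSpider):
        name = 'article_spider'  # matched by "scrapy crawl article_spider"

        def __init__(self, *args, **kwargs):
            # "-a id=1" selects the NewsWebsite database row to scrape
            self._set_ref_object(NewsWebsite, **kwargs)
            self.scraper = self.ref_object.scraper
            self.scrape_url = self.ref_object.url
            self.scheduler_runtime = self.ref_object.scraper_runtime
            self.scraped_obj_class = Article
            self.scraped_obj_item_class = ArticleItem
            super(ArticleSpider, self).__init__(self, *args, **kwargs)

Note that the traceback above fails before any of this spider code runs: Scrapy's download handlers import Twisted, and Twisted's Windows stdio support imports win32api, which is missing from the environment.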

This happened to me when I tried pywin32 from SourceForge; download the package from there. Make sure your pip is upgraded to the latest version; you can upgrade it with easy_install --upgrade pip or pip install --upgrade pip. Then reinstall pywin32 from the given location. Hope it works for you. Let me know if you still have problems.
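A minimal sketch of that sequence, run inside the virtualenv that Scrapy uses. Note the second command swaps in pypiwin32, a pip-installable repackaging of pywin32, as an alternative to the SourceForge installer the answer names:

    REM Upgrade pip inside the virtualenv first
    pip install --upgrade pip
    REM Install pywin32 via pip (pypiwin32) instead of the SourceForge .exe
    pip install pypiwin32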

I believe this is related to installing Twisted on Windows; see the note that PyWin32 is required.
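A quick way to confirm the dependency is in place after installing (run inside the same virtualenv that Scrapy runs in):

    REM If the import succeeds silently, Twisted's _win32stdio can load win32api
    python -c "import win32api"
    REM then re-run the spider
    scrapy crawl article_spider -a id=1 -a do_action=yes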