Logging: Setting the log level has no effect on Scrapy - Logging, Scrapy - Fatal编程技术网


I am running a Scrapy crawler with CrawlerProcess, as follows:

import logging

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

logging.basicConfig(level=logging.INFO)
l = logging.getLogger("crawl")

try:
    p = CrawlerProcess(get_project_settings())
    crawler = p.create_crawler('my_crawler')
    p.crawl(crawler)
    p.start()
    crawl_stats = crawler.stats.get_stats()
    # ... using crawl_stats
except Exception:
    l.exception("Failed to crawl")
My settings.py has the following logging settings:

LOG_ENABLED = True
LOG_LEVEL = 'WARNING'
When I run the crawler, Scrapy prints a large number of debug messages to the console. Setting the log level to 'WARNING' has no effect.

Environment: Scrapy 2.5.0, Python 3.8, Debian
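For context on why the debug messages keep appearing, this is standard Python logging behavior: a level set directly on a specific logger (which Scrapy does for its own loggers when it configures logging) takes precedence over the root level from basicConfig(). A minimal stdlib-only sketch, with no Scrapy involved (ListHandler is a helper made up for this illustration):

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collects formatted messages into a list for inspection."""
    def emit(self, record):
        records.append(record.getMessage())

root = logging.getLogger()
root.setLevel(logging.INFO)      # what basicConfig(level=logging.INFO) does
root.addHandler(ListHandler())   # handler level is NOTSET: passes everything

lib = logging.getLogger("scrapy_like")
lib.setLevel(logging.DEBUG)      # a library lowering its own logger's level

lib.debug("debug from library")                   # passes: effective level is DEBUG
logging.getLogger("app").debug("debug from app")  # filtered: inherits root's INFO

print(records)  # → ['debug from library']
```

So if the project settings (and with them LOG_LEVEL) are never actually loaded, Scrapy's own loggers keep their default, lower level, and DEBUG output reaches the console regardless of the root configuration.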

I had the same problem. For some reason, the project settings were not picked up automatically unless they were specified explicitly:

import logging

from scrapy.crawler import CrawlerProcess
from scrapy.settings import Settings

from path.to.your.settings import settings as st

logging.basicConfig(level=logging.INFO)
l = logging.getLogger("crawl")

try:
    crawler_settings = Settings()
    crawler_settings.setmodule(st)
    p = CrawlerProcess(settings=crawler_settings)
    crawler = p.create_crawler('my_crawler')
    p.crawl(crawler)
    p.start()
    crawl_stats = crawler.stats.get_stats()
    # ... using crawl_stats
except Exception:
    l.exception("Failed to crawl")

That worked for me.
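As a rough mental model of why passing the module explicitly helps: Scrapy's Settings.setmodule() reads the uppercase attributes of the given module into the settings object (the real implementation does more, e.g. per-setting priorities). A stdlib-only sketch with a hypothetical helper, using a fake module in place of a real settings.py:

```python
import types

def load_settings_module(module):
    """Hypothetical sketch: copy every UPPERCASE attribute of a module
    into a plain dict, the way Scrapy settings are conceptually loaded."""
    return {name: getattr(module, name)
            for name in dir(module) if name.isupper()}

# Simulate a settings.py module
fake_settings = types.ModuleType("settings")
fake_settings.LOG_ENABLED = True
fake_settings.LOG_LEVEL = "WARNING"
fake_settings._private = "ignored"  # lowercase: skipped

loaded = load_settings_module(fake_settings)
print(loaded)  # → {'LOG_ENABLED': True, 'LOG_LEVEL': 'WARNING'}
```

If get_project_settings() cannot locate the project (e.g. the SCRAPY_SETTINGS_MODULE environment variable is unset and no scrapy.cfg is found from the working directory), none of these attributes are loaded and Scrapy falls back to its defaults, which is consistent with LOG_LEVEL appearing to have no effect.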

No, it didn't solve the problem for me.