Scrapy can't connect spider_closed when yielding to an S3 bucket


The program never reaches the _close function. No errors are raised, and the items yielded from the spider code are uploaded normally (it's just that nothing in _close ever happens).

If I remove the S3 feed from the settings, it works fine (i.e., it enters the _close function).
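
By "removing S3" I just mean pointing the feed at a local file instead; roughly something like the following (the local filename here is only an illustration):

custom_settings = {
    # Same export fields as in the full spider below, but writing to a
    # local CSV instead of the S3 bucket.
    'FEED_URI': 'fulltest_local.csv',
    'FEED_EXPORT_FIELDS': ['ITEM_ID', 'URL', 'SELLER', 'PRICE', 'SALE_PRICE',
                           'MAIN_IMAGE', 'OTHER_IMAGE', 'SKU', 'PRODUCT_NAME'],
}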


How can I fix this?

Try the code below; it should work:

# -*- coding: utf-8 -*-
import scrapy
from scrapy import signals
from scrapy.xlib.pydispatch import dispatcher

class ExampleSpider(scrapy.Spider):
    name = 'forever'

    def __init__(self):
        dispatcher.connect(self.spider_closed, signals.spider_closed)

    def spider_closed(self, spider):
        print(f"\n\nClosing Spider with {len(self.items)} New Items")
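
On recent Scrapy releases scrapy.xlib.pydispatch may no longer be importable (it was deprecated and eventually removed), so here is a minimal sketch of the same signal hookup using the documented from_crawler hook instead; everything else in the spider is assumed to stay as in the question:

import scrapy
from scrapy import signals

class ExampleSpider(scrapy.Spider):
    name = 'forever'
    items = []

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        # Build the spider as usual, then register the handler on the
        # crawler's signal manager.
        spider = super().from_crawler(crawler, *args, **kwargs)
        crawler.signals.connect(spider.spider_closed, signal=signals.spider_closed)
        return spider

    def spider_closed(self, spider):
        # Runs once when the spider finishes.
        print(f"\n\nClosing Spider with {len(self.items)} New Items")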

For reference, here is the spider from the question:

# -*- coding: utf-8 -*-
import scrapy
from scrapy.utils.response import open_in_browser
from pydispatch import dispatcher
from scrapy.signalmanager import SignalManager
#from scrapy.xlib.pydispatch import dispatcher
from scrapy import signals

class ExampleSpider(scrapy.Spider):
    name = 'forever'
    allowed_domains = ['example.com']
    kohlind = max_kohls = 0
    total_products = 0
    colected = 0
    items = []
    #AWS_ACCESS_KEY_ID = 'id'
    #AWS_SECRET_ACCESS_KEY = 'pass'
    start_urls = ['https://www.example.com/']
    custom_settings = {
        'FEED_URI': f's3://example-products/fulltest.csv',
        'FEED_EXPORT_FIELDS': ['ITEM_ID', 'URL', 'SELLER', 'PRICE', 'SALE_PRICE',
                               'MAIN_IMAGE', 'OTHER_IMAGE', 'SKU', 'PRODUCT_NAME'],
    }

    def __init__(self):
        SignalManager(dispatcher.Any).connect(receiver=self._close, signal=signals.spider_closed)

    #spider code

    def _close(self):
        print(f"\n\nClosing Spider with {len(self.items)} New Items")
        for i in self.items:
            if "URL" in i.keys():
                yield i

        print("Done")