Python: Generating an exe from a Scrapy project


I am trying to generate an exe file from a project that uses Scrapy, using PyInstaller (more specifically, its GUI, auto-py-to-exe).

The main file runs two spiders sequentially:

from scrapy.crawler import CrawlerRunner
from twisted.internet import reactor, defer
from spiderDummy1 import SpiderDummy1
from spiderDummy2 import SpiderDummy2

# ... some logging configuration

@defer.inlineCallbacks
def crawl(runner):
    yield runner.crawl(SpiderDummy1)
    yield runner.crawl(SpiderDummy2, start_urls=["https://google.com"])
    reactor.stop()

runner = CrawlerRunner(
    {
        "LOG_STDOUT": True,
        "LOG_ENABLED": True,
        "FEEDS": {
            "items.jl": {
                "format": "jsonlines",
                "encoding": "utf8"
            }
        },
    }
)

crawl(runner)
reactor.run()
Here is my first spider:

import scrapy
class SpiderDummy1(scrapy.Spider):
    name = "spiderDummy1"

    def start_requests(self):
        url = "https://google.com"
        yield scrapy.Request(url, self.parse)

    def parse(self, response):
        yield {"foo1": "bar1"}
And here is my second spider:

import scrapy
class SpiderDummy2(scrapy.Spider):
    name = "spiderDummy2"

    def parse(self, response):
        yield {"foo2": "bar2"}
Running this code with python main.py generates items.jl as expected, containing:

{"foo1": "bar1"}
{"foo2": "bar2"}
Here is the log of that run:

2020-08-13 17:28:37,552 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:28:37,575 INFO scrapy.extensions.telnet Telnet Password: ec57f644dc444404
2020-08-13 17:28:37,622 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:28:37,942 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:28:37,949 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:28:37,950 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:28:37,950 INFO scrapy.core.engine Spider opened
2020-08-13 17:28:37,954 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:28:37,955 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:28:38,206 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:28:38,490 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:28:38,593 DEBUG scrapy.core.scraper Scraped from <200 https://www.google.com/>

{'foo1': 'bar1'}
2020-08-13 17:28:38,604 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:28:38,606 INFO scrapy.extensions.feedexport Stored jsonlines feed (1 items) in: items.jl
2020-08-13 17:28:38,608 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 7252,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 1,
 'downloader/response_status_count/301': 1,
 'elapsed_time_seconds': 0.651448,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 8, 13, 20, 28, 38, 605292),
 'item_scraped_count': 1,
 'log_count/DEBUG': 3,
 'log_count/INFO': 11,
 'response_received_count': 1,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'start_time': datetime.datetime(2020, 8, 13, 20, 28, 37, 953844)}
2020-08-13 17:28:38,609 INFO scrapy.core.engine Spider closed (finished)
2020-08-13 17:28:38,646 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:28:38,647 INFO scrapy.extensions.telnet Telnet Password: 717ee8d5719577dd
2020-08-13 17:28:38,649 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:28:38,652 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:28:38,653 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:28:38,653 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:28:38,653 INFO scrapy.core.engine Spider opened
2020-08-13 17:28:38,654 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:28:38,655 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:28:38,890 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:28:39,149 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:28:39,252 DEBUG scrapy.core.scraper Scraped from <200 https://www.google.com/>

{'foo2': 'bar2'}
2020-08-13 17:28:39,254 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:28:39,255 INFO scrapy.extensions.feedexport Stored jsonlines feed (1 items) in: items.jl
2020-08-13 17:28:39,256 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 7242,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 1,
 'downloader/response_status_count/301': 1,
 'elapsed_time_seconds': 0.601099,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 8, 13, 20, 28, 39, 255262),
 'item_scraped_count': 1,
 'log_count/DEBUG': 3,
 'log_count/INFO': 11,
 'response_received_count': 1,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'start_time': datetime.datetime(2020, 8, 13, 20, 28, 38, 654163)}
2020-08-13 17:28:39,256 INFO scrapy.core.engine Spider closed (finished)
Running the generated exe, however, does not produce the items.jl file, and gives the following log:

2020-08-13 17:36:14,927 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:36:14,941 INFO scrapy.extensions.telnet Telnet Password: 50e993c30b2eb0fd
2020-08-13 17:36:14,959 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:36:15,323 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:36:15,326 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:36:15,327 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:36:15,327 INFO scrapy.core.engine Spider opened
2020-08-13 17:36:15,330 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:36:15,331 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:36:15,570 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:36:15,825 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:36:15,926 ERROR scrapy.core.scraper Spider error processing <GET https://www.google.com/> (referer: None)
Traceback (most recent call last):
  File "twisted\internet\defer.py", line 1418, in _inlineCallbacks
StopIteration: <200 https://www.google.com/>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "scrapy\utils\defer.py", line 55, in mustbe_deferred
  File "scrapy\core\spidermw.py", line 60, in process_spider_input
  File "scrapy\core\scraper.py", line 152, in call_spider
  File "scrapy\utils\misc.py", line 212, in warn_on_generator_with_return_value
  File "scrapy\utils\misc.py", line 197, in is_generator_with_return_value
  File "inspect.py", line 985, in getsource
  File "inspect.py", line 967, in getsourcelines
  File "inspect.py", line 798, in findsource
OSError: could not get source code
2020-08-13 17:36:16,028 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:36:16,029 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 7275,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 1,
 'downloader/response_status_count/301': 1,
 'elapsed_time_seconds': 0.698791,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 28306),
 'log_count/DEBUG': 2,
 'log_count/ERROR': 1,
 'log_count/INFO': 10,
 'response_received_count': 1,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'spider_exceptions/OSError': 1,
 'start_time': datetime.datetime(2020, 8, 13, 20, 36, 15, 329515)}
2020-08-13 17:36:16,029 INFO scrapy.core.engine Spider closed (finished)
2020-08-13 17:36:16,061 INFO scrapy.crawler Overridden settings:
{'LOG_STDOUT': True}
2020-08-13 17:36:16,062 INFO scrapy.extensions.telnet Telnet Password: 345875f820220cf6
2020-08-13 17:36:16,065 INFO scrapy.middleware Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.logstats.LogStats']
2020-08-13 17:36:16,070 INFO scrapy.middleware Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2020-08-13 17:36:16,071 INFO scrapy.middleware Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2020-08-13 17:36:16,071 INFO scrapy.middleware Enabled item pipelines:
[]
2020-08-13 17:36:16,072 INFO scrapy.core.engine Spider opened
2020-08-13 17:36:16,073 INFO scrapy.extensions.logstats Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2020-08-13 17:36:16,074 INFO scrapy.extensions.telnet Telnet console listening on 127.0.0.1:6023
2020-08-13 17:36:16,308 DEBUG scrapy.downloadermiddlewares.redirect Redirecting (301) to <GET https://www.google.com/> from <GET https://google.com>
2020-08-13 17:36:16,573 DEBUG scrapy.core.engine Crawled (200) <GET https://www.google.com/> (referer: None)
2020-08-13 17:36:16,675 ERROR scrapy.core.scraper Spider error processing <GET https://www.google.com/> (referer: None)
Traceback (most recent call last):
  File "twisted\internet\defer.py", line 1418, in _inlineCallbacks
StopIteration: <200 https://www.google.com/>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "scrapy\utils\defer.py", line 55, in mustbe_deferred
  File "scrapy\core\spidermw.py", line 60, in process_spider_input
  File "scrapy\core\scraper.py", line 152, in call_spider
  File "scrapy\utils\misc.py", line 212, in warn_on_generator_with_return_value
  File "scrapy\utils\misc.py", line 197, in is_generator_with_return_value
  File "inspect.py", line 985, in getsource
  File "inspect.py", line 967, in getsourcelines
  File "inspect.py", line 798, in findsource
OSError: could not get source code
2020-08-13 17:36:16,776 INFO scrapy.core.engine Closing spider (finished)
2020-08-13 17:36:16,777 INFO scrapy.statscollectors Dumping Scrapy stats:
{'downloader/request_bytes': 424,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 7272,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 1,
 'downloader/response_status_count/301': 1,
 'elapsed_time_seconds': 0.704001,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 776310),
 'log_count/DEBUG': 2,
 'log_count/ERROR': 1,
 'log_count/INFO': 10,
 'response_received_count': 1,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'spider_exceptions/OSError': 1,
 'start_time': datetime.datetime(2020, 8, 13, 20, 36, 16, 72309)}
2020-08-13 17:36:16,777 INFO scrapy.core.engine Spider closed (finished)
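The key line in both tracebacks is OSError: could not get source code, raised from inspect.getsource() inside Scrapy's warn_on_generator_with_return_value check. That call needs the original .py file on disk, which a PyInstaller bundle does not ship. The failure mode can be reproduced without Scrapy or PyInstaller (a sketch; the fake filename merely simulates a missing source file):

```python
import inspect

# inspect.getsource() needs the original .py file on disk. A code object whose
# source file cannot be found (as in a frozen PyInstaller app; simulated here
# via exec() with a fake filename) raises OSError.
ns = {}
code = compile("def parse(response):\n    yield {'foo1': 'bar1'}\n", "<no source file>", "exec")
exec(code, ns)

try:
    inspect.getsource(ns["parse"])
    source_available = True
except OSError:
    # Same "could not get source code" error as in the exe log above.
    source_available = False
```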

Is there any package or configuration I am missing for PyInstaller to generate a functional exe?

Does this answer your question? Unfortunately not... my log shows neither that FileNotFoundError nor the ModuleNotFoundError, and I did not try generating the exe with the --onefile option.
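Based on the traceback above, one possible workaround (an untested sketch, not an official Scrapy or PyInstaller fix) is to replace Scrapy's warn_on_generator_with_return_value helper with a no-op at the top of main.py, so the frozen app never reaches inspect.getsource(). scrapy.core.scraper imports the helper by name, so both references are patched; the import is guarded so the snippet also runs where Scrapy is not installed:

```python
def _no_warn(spider, callable):
    """No-op replacement: skip Scrapy's generator-return-value source check."""

try:
    import scrapy.core.scraper
    import scrapy.utils.misc

    # scrapy.core.scraper imports the helper by name, so patch both references
    # before reactor.run() / the first crawl starts.
    scrapy.utils.misc.warn_on_generator_with_return_value = _no_warn
    scrapy.core.scraper.warn_on_generator_with_return_value = _no_warn
    patched = True
except ImportError:
    patched = False  # Scrapy not installed in this environment
```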
Here is the auto-py-to-exe build log:
Running auto-py-to-exe v2.7.5
Building directory: C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3
Provided command: pyinstaller --noconfirm --onedir --console --add-data "C:/Users/nayra/Desktop/scrapy/log;log/"  "C:/Users/nayra/Desktop/scrapy/main.py"
Recursion Limit is set to 5000
Executing: pyinstaller --noconfirm --onedir --console --add-data C:/Users/nayra/Desktop/scrapy/log;log/ C:/Users/nayra/Desktop/scrapy/main.py --distpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\application --workpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build --specpath C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3

200432 INFO: PyInstaller: 4.0
200437 INFO: Python: 3.8.3 (conda)
200443 INFO: Platform: Windows-10-10.0.18362-SP0
200449 INFO: wrote C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\main.spec
200459 INFO: UPX is not available.
200477 INFO: Extending PYTHONPATH with paths
['C:\\Users\\nayra\\Desktop\\scrapy',
 'C:\\Users\\nayra\\AppData\\Local\\Temp\\tmp4p2fxau3']
200514 INFO: checking Analysis
200520 INFO: Building Analysis because Analysis-00.toc is non existent
200528 INFO: Initializing module dependency graph...
200539 INFO: Caching module graph hooks...
200565 INFO: Analyzing base_library.zip ...
203878 INFO: Processing pre-find module path hook distutils from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-distutils.py'.
203884 INFO: distutils: retargeting to non-venv dir 'd:\\miniconda3\\lib'
208951 INFO: Caching module dependency graph...
209154 INFO: running Analysis Analysis-00.toc
209189 INFO: Adding Microsoft.Windows.Common-Controls to dependent assemblies of final executable
  required by d:\miniconda3\python.exe
209627 INFO: Analyzing C:\Users\nayra\Desktop\scrapy\main.py
212480 INFO: Processing pre-safe import module hook six.moves from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-six.moves.py'.
216208 INFO: Processing module hooks...
216212 INFO: Loading module hook 'hook-cryptography.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
216579 INFO: Loading module hook 'hook-lxml.etree.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
216585 INFO: Loading module hook 'hook-pywintypes.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
217108 INFO: Loading module hook 'hook-distutils.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217114 INFO: Loading module hook 'hook-encodings.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217232 INFO: Loading module hook 'hook-lib2to3.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
217287 INFO: Loading module hook 'hook-scrapy.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
221723 INFO: Processing pre-find module path hook site from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-site.py'.
221730 INFO: site: retargeting to fake-dir 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\fake-modules'
223193 INFO: Processing pre-safe import module hook setuptools.extern.six.moves from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-setuptools.extern.six.moves.py'.
227991 INFO: Loading module hook 'hook-setuptools.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229014 INFO: Loading module hook 'hook-sqlite3.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229160 INFO: Loading module hook 'hook-sysconfig.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229168 INFO: Loading module hook 'hook-xml.dom.domreg.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229178 INFO: Loading module hook 'hook-xml.etree.cElementTree.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229184 INFO: Loading module hook 'hook-xml.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229191 INFO: Loading module hook 'hook-_tkinter.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
229421 INFO: checking Tree
229429 INFO: Building Tree because Tree-00.toc is non existent
229438 INFO: Building Tree Tree-00.toc
229556 INFO: checking Tree
229565 INFO: Building Tree because Tree-01.toc is non existent
229582 INFO: Building Tree Tree-01.toc
229614 INFO: Loading module hook 'hook-eel.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
229836 INFO: Loading module hook 'hook-pycparser.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
229844 INFO: Loading module hook 'hook-gevent.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
230387 INFO: Determining a mapping of distributions to packages...
250993 WARNING: Unable to find package for requirement zope.event from package gevent.
251003 WARNING: Unable to find package for requirement zope.interface from package gevent.
251019 WARNING: Unable to find package for requirement greenlet from package gevent.
251026 INFO: Packages required by gevent:
['setuptools', 'cffi']
252462 INFO: Loading module hook 'hook-pkg_resources.py' from 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks'...
252903 INFO: Processing pre-safe import module hook win32com from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\pre_safe_import_module\\hook-win32com.py'.
253530 WARNING: Hidden import "pkg_resources.py2_warn" not found!
253542 WARNING: Hidden import "pkg_resources.markers" not found!
253556 INFO: Excluding import '__main__'
253572 INFO:   Removing import of __main__ from module pkg_resources
253588 INFO: Loading module hook 'hook-pythoncom.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
254138 INFO: Loading module hook 'hook-win32com.py' from 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\stdhooks'...
254787 INFO: Looking for ctypes DLLs
254945 INFO: Analyzing run-time hooks ...
254968 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth__tkinter.py'
254979 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_multiprocessing.py'
254996 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_pkgres.py'
255008 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\PyInstaller\\hooks\\rthooks\\pyi_rth_win32comgenpy.py'
255022 INFO: Including run-time hook 'd:\\miniconda3\\lib\\site-packages\\_pyinstaller_hooks_contrib\\hooks\\rthooks\\pyi_rth_twisted.py'
255061 INFO: Looking for dynamic libraries
255771 INFO: Looking for eggs
255779 INFO: Using Python library d:\miniconda3\python38.dll
255794 INFO: Found binding redirects: 
[]
255821 INFO: Warnings written to C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\warn-main.txt
256088 INFO: Graph cross-reference written to C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\xref-main.html
256165 INFO: Appending 'datas' from .spec
256179 INFO: checking PYZ
256189 INFO: Building PYZ because PYZ-00.toc is non existent
256203 INFO: Building PYZ (ZlibArchive) C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\PYZ-00.pyz
258584 INFO: Building PYZ (ZlibArchive) C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\PYZ-00.pyz completed successfully.
258634 INFO: checking PKG
258643 INFO: Building PKG because PKG-00.toc is non existent
258658 INFO: Building PKG (CArchive) PKG-00.pkg
258716 INFO: Building PKG (CArchive) PKG-00.pkg completed successfully.
258727 INFO: Bootloader d:\miniconda3\lib\site-packages\PyInstaller\bootloader\Windows-64bit\run.exe
258740 INFO: checking EXE
258756 INFO: Building EXE because EXE-00.toc is non existent
258770 INFO: Building EXE from EXE-00.toc
258792 INFO: Appending archive to EXE C:\Users\nayra\AppData\Local\Temp\tmp4p2fxau3\build\main\main.exe
258846 INFO: Building EXE from EXE-00.toc completed successfully.
258870 INFO: checking COLLECT
258886 INFO: Building COLLECT because COLLECT-00.toc is non existent
258911 INFO: Building COLLECT COLLECT-00.toc
262077 INFO: Building COLLECT COLLECT-00.toc completed successfully.

Moving project to: C:\Users\nayra\Desktop\scrapy\dist
Complete.