
Python: checking HTTP response codes for 100,000 URLs


I have a list of roughly 100,000 links and I want to check the HTTP response code of each one. What is the best way to do this programmatically?

I am considering the following Python code:

import requests

for x in range(0, 100000):
    # The real URLs will be read from a file and aren't sequential
    url = "http://stackoverflow.com/" + str(x)
    try:
        r = requests.head(url)
        print(r.status_code)
    except requests.ConnectionError:
        print("failed to connect")

...but I'm not sure about the potential side effects of checking this many URLs in one go. Any thoughts?

The only side effect I can think of is time, which you can mitigate by making the requests in parallel (using or ).
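The library links in that answer were lost in scraping, so as a minimal sketch of the parallel approach, here is one standard way to do it with concurrent.futures from the standard library. The file name urls.txt and the pool size of 20 workers are assumptions for illustration, not part of the original answer:

import concurrent.futures
import requests

def check(url):
    # HEAD keeps each transfer small; a timeout stops one slow host from stalling the run
    try:
        r = requests.head(url, timeout=10)
        return url, r.status_code
    except requests.RequestException:
        return url, None

def main():
    # "urls.txt" is a placeholder: one URL per line
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]
    # A modest thread pool parallelizes the I/O without hammering any single host too hard
    with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
        for url, code in pool.map(check, urls):
            print(url, code if code is not None else "failed to connect")

if __name__ == "__main__":
    main()

Threads work well here because the job is almost entirely network-bound; tune max_workers to how politely you need to treat the target servers.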

Found an answer to exactly the same question!