Python: How to retry the current loop iteration when the connection through a proxy fails

So I have a set of links to scrape, but what often happens is that mid-loop (when trying to connect to one of the links), the connection through the proxy suddenly fails, the loop stops, and the program exits.

Tags: python, python-3.x, exception, beautifulsoup

The code is as follows:

import requests
from bs4 import BeautifulSoup as soup


#Setting Proxy
proxies = {"http": "http://232.454.676.898:8888"}

#List Of Links
link_strings = ['http://foo1.com','http://foo2.com','http://foo3.com', ... ,'http://foo999.com']

for link in link_strings:
    url = link

    uClient = requests.get(url, proxies=proxies)
    page_html = uClient.text
    uClient.close()

    page_soup = soup(page_html, "html.parser")

    #Do some scraping
So how should this be handled?
Should I just keep trying to connect through the proxy until it succeeds? But how?

Or should I run the current iteration again? Again, how?

Catch the exception raised by requests.get() and loop until the connection works:

exception = True
while exception:
    exception = False
    try:
        uClient = requests.get(url, proxies=proxies)
    except requests.exceptions.RequestException:
        exception = True  # connection failed; try again
Note that this can create an infinite loop if the connection never works! Alternatively, when the connection fails you can continue to the next link:

try:
    uClient = requests.get(url, proxies=proxies)
except requests.exceptions.RequestException:
    continue  # skip this link
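Dropped into the original loop, the skip-on-failure approach might look like the sketch below (the proxy address and link list are placeholders for your own values, and a timeout is added so a stalled proxy cannot hang the loop):

```python
import requests
from bs4 import BeautifulSoup as soup

# Placeholder proxy and links; substitute your own values.
proxies = {"http": "http://127.0.0.1:1"}
link_strings = ['http://foo1.com', 'http://foo2.com', 'http://foo3.com']

scraped = []  # parsed pages that were fetched successfully
for link in link_strings:
    try:
        uClient = requests.get(link, proxies=proxies, timeout=5)
    except requests.exceptions.RequestException:
        continue  # connection through the proxy failed; skip this link

    page_soup = soup(uClient.text, "html.parser")
    scraped.append(page_soup)
    # ... do some scraping ...
```

Each failed link is skipped instead of crashing the whole run; `scraped` collects the pages that did come through.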


Comments:
Catch the exception with a try-except block and handle it in whatever way fits.
Could you give me a code example?
A normal try-except with an explicit exception type; including a retry counter might be a good idea.
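Following the retry-counter suggestion from the comments, one possible helper (a sketch, not from the original answer; the function name and defaults are my own):

```python
import requests

def get_with_retries(url, proxies, max_retries=3, timeout=10):
    """Try requests.get up to max_retries times; return None if every attempt fails."""
    for attempt in range(max_retries):
        try:
            return requests.get(url, proxies=proxies, timeout=timeout)
        except requests.exceptions.RequestException:
            pass  # connection through the proxy failed; retry
    return None  # all retries exhausted
```

In the loop you would then write `uClient = get_with_retries(link, proxies)` and `continue` when it returns `None`, which bounds the retries instead of looping forever.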