Python: How can I do this? Should I use requests or urllib.error for exceptions?


I am trying to handle exceptions from an HTTP response.

The problem with my code is that I am forced to use an if condition to catch HTTP error codes:

if page.status_code != requests.codes.ok:
    page.raise_for_status()
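(As an aside, `raise_for_status()` is already a no-op for successful status codes, so the `if` guard is redundant. A minimal sketch, using hand-built `requests.Response` objects so it runs without a network call:)

```python
import requests

# Hand-built Response objects, so this runs without any network call
ok = requests.Response()
ok.status_code = 200

bad = requests.Response()
bad.status_code = 404
bad.url = 'http://someurl.com/404-page.html'

ok.raise_for_status()   # does nothing for 2xx, so no if-guard is needed
try:
    bad.raise_for_status()
except requests.HTTPError as e:
    print(e)            # a 4xx/5xx status raises HTTPError
```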
I don't believe this is the right way to do it, so I am trying the following instead:

import requests

url = 'http://someurl.com/404-page.html'
myHeaders = {'User-agent': 'myUserAgent'}

s = requests.Session()

try:
    page = s.get(url, headers=myHeaders)
    #if page.status_code != requests.codes.ok:
    #     page.raise_for_status()
except requests.ConnectionError:
    print("DNS problem or connection refused")
    # Or do something with it
except requests.HTTPError:
    print("Some HTTP response error")
    # Or do something with it
except requests.Timeout:
    print("Error loading... took too long")
    # Or do something with it, perhaps retry
except requests.TooManyRedirects:
    print("Too many redirects")
    # Or do something with it
except requests.RequestException as e:
    print(e)  # e.message does not exist in Python 3; print the exception itself
    # Or do something with it
else:
    print("nothing happen")
    # Do something if no exception

s.close()
This always prints "nothing happen". How can I catch every possible exception related to fetching the URL?

If you want to catch them all, you can catch the single base class, requests.RequestException:

import requests

try:
    r = requests.get(........)
except requests.RequestException as e:
    print(e)  # RequestException has no .message attribute in Python 3
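Note that a 404 on its own does not raise anything: requests only raises `HTTPError` when you call `raise_for_status()`, which is why the commented-out lines in your code made the `else` branch run every time. `HTTPError`, `ConnectionError`, `Timeout`, and `TooManyRedirects` are all subclasses of `RequestException`, so the single `except` above covers them. A minimal sketch, using a hand-built `requests.Response` to avoid a real network call:

```python
import requests

# All the specific exceptions derive from RequestException
print(issubclass(requests.HTTPError, requests.RequestException))        # True
print(issubclass(requests.ConnectionError, requests.RequestException))  # True
print(issubclass(requests.Timeout, requests.RequestException))          # True

# A 404 response returns normally; it only raises once you call raise_for_status()
resp = requests.Response()
resp.status_code = 404
resp.url = 'http://someurl.com/404-page.html'

try:
    resp.raise_for_status()
except requests.RequestException as e:
    print(e)  # the HTTPError is caught by the base class
```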