Python: How to check for HTTP errors on more than two URLs?



Question: I have 3 URLs - testurl1, testurl2 and testurl3. I want to try testurl1 first; if I get a 404 error, try testurl2; if that also gives a 404, try testurl3. How can I do this? So far I have tried the approach below, but it only works for two URLs - how do I add support for the third?


Another job for a plain old for loop:

from urllib2 import Request, urlopen
from urllib2 import HTTPError

for url in testurl1, testurl2, testurl3:
    req = Request(url)
    try:
        response = urlopen(req)
    except HTTPError as err:
        if err.code == 404:
            continue  # this URL 404ed; move on to the next one
        raise
    else:
        # do what you want with the successful response here (or outside the loop)
        break
else:
    # They ALL errored out with HTTPError code 404.  Handle this?
    raise err
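For reference, here is the same fallback loop in Python 3 terms (urllib2 was split into urllib.request and urllib.error there). The `opener` parameter and `fake_opener` below are my own additions so the sketch can be exercised without touching the network; the testurlN hosts are the placeholders from the question:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def first_working_url(urls, opener=urlopen):
    """Return (url, response) for the first URL that does not 404.

    Re-raises any non-404 HTTPError immediately; if every URL
    returns 404, the last HTTPError is raised.
    """
    err = None
    for url in urls:
        try:
            response = opener(url)
        except HTTPError as e:
            if e.code == 404:
                err = e
                continue  # 404: fall through to the next URL
            raise
        else:
            return url, response
    raise err  # every URL returned 404

# A stand-in opener (hypothetical) so the loop can be demonstrated offline:
def fake_opener(url):
    if url != 'http://testurl3':
        raise HTTPError(url, 404, 'Not Found', None, None)
    return 'ok'

print(first_working_url(['http://testurl1', 'http://testurl2', 'http://testurl3'],
                        opener=fake_opener))
# -> ('http://testurl3', 'ok')
```

Injecting the opener like this also makes the fallback logic unit-testable, which the inline `urlopen` calls in the answers below are not.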

Well, maybe something like this:

from urllib2 import Request, urlopen
from urllib2 import URLError, HTTPError

def checkfiles():
    try:
        response = urlopen(Request('http://testurl1'))
        url1 = 'http://testurl1'
    except (HTTPError, URLError):
        # first URL failed; actually try the second one
        try:
            response = urlopen(Request('http://testurl2'))
            url1 = 'http://testurl2'
        except (HTTPError, URLError):
            url1 = 'http://testurl3'
    print url1

    finalURL = 'wget ' + url1 + '/testfile.tgz'
    print finalURL

checkfiles()
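A small hardening of the string concatenation above: `urljoin` composes the download URL without worrying about slashes between host and path (shown with Python 3's urllib.parse; in urllib2-era Python 2 the same function lives in the urlparse module). The file name testfile.tgz is taken from the answer above:

```python
from urllib.parse import urljoin

# urljoin normalises the slash between host and path for us
final_url = urljoin('http://testurl1/', 'testfile.tgz')
print('wget ' + final_url)  # -> wget http://testurl1/testfile.tgz
```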

Just check the response code; if it equals 404, do whatever you need.
No, urllib raises an HTTPError for a 404, rather than returning a response whose code is 404.
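The point in that last comment can be shown without a network round trip (Python 3 names; the URL is a placeholder): urlopen never hands back a normal response object for a 404 - the status surfaces as an HTTPError exception, whose code attribute carries it:

```python
from urllib.error import HTTPError

# Construct the exception urlopen would raise on a 404 response
err = HTTPError('http://testurl1', 404, 'Not Found', None, None)

print(err.code)                    # -> 404
print(isinstance(err, Exception))  # -> True: it must be caught, not read off a response
```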