Python: Writing Scraped URLs to a File with BeautifulSoup

I'm trying to save the scraped URLs to a text file, but what ends up in the file differs from what is printed: only the last batch of links is in the file.

import urllib2
from bs4 import BeautifulSoup

urls = ["http://google.com/page=", "http://yahoo.com"]
for url in urls:
    for number in range(1, 10):
        conn = urllib2.urlopen(url + str(number))
        html = conn.read()
        soup = BeautifulSoup(html)
        links = soup.find_all('a')
        file = open("file.txt", "w")
        for tag in links:
            link = tag.get('href')
            print >> file, link
            print link
        file.close()

When you open the file in 'w' (write) mode, it is overwritten every time. Open the file in append mode instead:

file = open("file.txt", "a")
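For completeness, here is a minimal sketch of an alternative, keeping the Python 2 style of the question (urllib2, print >>) and assuming BeautifulSoup 4 is imported as from bs4 import BeautifulSoup: open the file once, before the loops, so it is truncated only a single time and every scraped link is kept.

import urllib2
from bs4 import BeautifulSoup

urls = ["http://google.com/page=", "http://yahoo.com"]

# Open the output file once, before the loops, so it is truncated a
# single time and every scraped link from every page ends up in it.
with open("file.txt", "w") as out:
    for url in urls:
        for number in range(1, 10):
            html = urllib2.urlopen(url + str(number)).read()
            soup = BeautifulSoup(html)
            for tag in soup.find_all('a'):
                link = tag.get('href')
                print >> out, link   # write to the file
                print link           # and echo to the console

Unlike append mode, this also gives you a fresh file on every run instead of accumulating links from previous runs.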