
Python: saving scraped data to a CSV file


I am iterating over a process in which I point Python at a website and have it search for each address listed in a csv file on that website. I want to tell Python to save the results the website returns for each address value into a csv file.

from selenium import webdriver
from bs4 import BeautifulSoup
import time
import csv


driver = webdriver.Chrome("C:\Python27\Scripts\chromedriver.exe")
chrome = driver.get('https://etrakit.friscotexas.gov/Search/permit.aspx')
with open('C:/Users/thefirstcolumnedited.csv','r') as f:
    addresses = f.readlines()

    for address in addresses:
        driver.find_element_by_css_selector('#cplMain_txtSearchString').clear()       
        driver.find_element_by_css_selector('#cplMain_txtSearchString').send_keys(address)
        driver.find_element_by_css_selector('#cplMain_btnSearch').click()
        time.sleep(5)

    soup = BeautifulSoup(chrome, 'html.parser')

    writer = csv.writer(open('thematchingresults.csv', 'w'))
    writer.writerow(soup)
For example:

 6579 Mountain Sky Rd

The address value above retrieves five rows of data from the website. How can I tell BeautifulSoup to save the results for each address value into a csv file?

The approach would be to write to the csv file inside the loop (use the "append" mode if you want to produce a single csv file for all of the input addresses). As for extracting the results, I would use an explicit wait (time.sleep() is unreliable and usually slower than it needs to be) for the results element (the element with id="ctl00_cplMain_rgSearchRslts_ctl00"), read the results table into a dataframe with pandas.read_html(), and then conveniently dump the dataframe into a CSV file via DataFrame.to_csv():

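A minimal sketch of that approach (the helper name `table_to_dataframe` and the output filename are assumptions for illustration; the table id comes from the answer, and pandas must be installed):

```python
from io import StringIO

import pandas as pd

# id of the search-results table, taken from the answer text
RESULTS_TABLE_ID = 'ctl00_cplMain_rgSearchRslts_ctl00'


def table_to_dataframe(table_html):
    """Parse the outerHTML of the results table into a pandas DataFrame."""
    return pd.read_html(StringIO(table_html))[0]


def scrape(addresses, out_path='thematchingresults.csv'):
    # Selenium is imported here so the parsing helper above can be used
    # (and tested) without a browser installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    driver = webdriver.Chrome()  # adjust the chromedriver path if needed
    driver.get('https://etrakit.friscotexas.gov/Search/permit.aspx')
    try:
        for index, address in enumerate(addresses):
            box = driver.find_element_by_css_selector('#cplMain_txtSearchString')
            box.clear()
            box.send_keys(address)
            driver.find_element_by_css_selector('#cplMain_btnSearch').click()

            # explicit wait for the results table instead of time.sleep(5)
            table = WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.ID, RESULTS_TABLE_ID)))

            df = table_to_dataframe(table.get_attribute('outerHTML'))
            # "append" mode: one csv for all addresses, header written once
            df.to_csv(out_path, mode='a', header=(index == 0))
    finally:
        driver.quit()
```

The explicit wait both replaces the fixed five-second sleep and guarantees the table exists before it is parsed, so the loop runs as fast as the site responds.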
For the single "6579 Mountain Sky Rd" address, after running the script the contents of thematchingresults.csv would be:

,Permit Number,Address,Street Name,Applicant Name,Contractor Name,SITE_SUBDIVISION,RECORDID
0,B13-2809,6579 MOUNTAIN SKY RD,MOUNTAIN SKY RD,SHADDOCK HOMES LTD,SHADDOCK HOMES LTD,PCR - SHERIDAN,MAC:1308050328358768
1,B13-4096,6579 MOUNTAIN SKY RD,MOUNTAIN SKY RD,MIRAGE CUSTOM POOLS,MIRAGE CUSTOM POOLS,PCR - SHERIDAN,MAC:1312030307087756
2,L14-1640,6579 MOUNTAIN SKY RD,MOUNTAIN SKY RD,TDS IRRIGATION,TDS IRRIGATION,SHERIDAN,ECON:140506012624706
3,P14-0018,6579 MOUNTAIN SKY RD,MOUNTAIN SKY RD,MIRAGE CUSTOM POOLS,,SHERIDAN,LCR:1401130949212891
4,ROW14-3205,6579 MOUNTAIN SKY RD,MOUNTAIN SKY RD,Housley Group,Housley Group,,TLW:1406190424422330
Hope this is a good starting point.
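The "append" mode idea from the answer can also be sketched on its own, independent of Selenium (the helper name and filenames here are made up for illustration): write the header only when the file does not exist yet, then keep appending rows.

```python
import os

import pandas as pd


def append_rows(df, path):
    """Append a DataFrame to a csv file, writing the header only once."""
    df.to_csv(path, mode='a', header=not os.path.exists(path), index=False)
```

Calling append_rows once per scraped address accumulates every result in one file with a single header row.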
