Python web scraping: extracting data next to special characters


I would like to extract the following data:

Total, 17.6
Land, 17.6
Water, 0.0

The "•" on the page is not a period. It is a Unicode bullet character (\u2022).
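A small standalone sketch of why this matters (the cell text here is a hypothetical value mirroring the infobox, not scraped from the page):

```python
# The header cell's text begins with the bullet character U+2022, not a
# period, so an exact comparison against '. Total' fails while a plain
# substring check for 'Total' still succeeds.
label = '\u2022 Total'      # hypothetical cell text; the real page may
                            # also contain non-breaking spaces
print('Total' in label)     # True
print(label == '. Total')   # False
```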

You can use Python's regex (re) module to handle this.
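The idea, in isolation: a compiled pattern passed to BeautifulSoup's text= argument is matched with re.search, so 'Total' is found anywhere in the cell text and the leading bullet no longer matters. A minimal sketch with hypothetical strings:

```python
import re

# re.search looks for the pattern anywhere in the string, so the
# characters before 'Total' are irrelevant.
pattern = re.compile(r'Total')
print(bool(pattern.search('\u2022 Total')))  # True
print(bool(pattern.search('. Total')))       # True
```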

The updated code looks like this:

from urllib.request import urlopen
from bs4 import BeautifulSoup

cityURL = 'https://en.wikipedia.org/wiki/Elko,_Nevada'

def createObj(url):
    html = urlopen(url)
    bsObj = BeautifulSoup(html, 'lxml')
    return bsObj

bsObj1 = createObj(cityURL)

table1 = bsObj1.find("table", {"class":"infobox geography vcard"})
incorporated = table1.find("th", text='Incorporated (city)').findNext('td').get_text()

table1.find("th", text='. Total') # Problem here: because of the special bullet character, I cannot locate the "th"

import re
from urllib.request import urlopen
from bs4 import BeautifulSoup

cityURL = 'https://en.wikipedia.org/wiki/Elko,_Nevada'

def createObj(url):
    html = urlopen(url)
    bsObj = BeautifulSoup(html, 'lxml')
    return bsObj

bsObj1 = createObj(cityURL)

table1 = bsObj1.find("table", {"class":"infobox geography vcard"})
incorporated = table1.find("th", text='Incorporated (city)').findNext('td').get_text()

pattern = re.compile(r'Total')  # matched with re.search, so the bullet is ignored
total = table1.find("th", text=pattern).findNext('td').get_text()

If the answer works for you, please accept it, since that will help others as well. Thanks.
Alternatively, you can use the lxml module, which is much faster than BeautifulSoup:
import requests
from lxml import html

cityURL = 'https://en.wikipedia.org/wiki/Elko,_Nevada'
r = requests.get(cityURL)
root = html.fromstring(r.content)

def normalize(text):
    # Replace every non-ASCII character (such as the bullet \u2022 or a
    # non-breaking space) with a plain space, then keep the first
    # whitespace-separated token.
    return ''.join([i if ord(i) < 128 else ' ' for i in text]).strip().split()[0]

# XPath for the rows following the "Area" header inside the infobox
row_xpath = ('//table[@class="infobox geography vcard"]'
             '//tr[./th/text()="Area"]/following-sibling::tr[{}]//text()')

val_list = [(normalize(root.xpath(row_xpath.format(val))[1]),
             normalize(root.xpath(row_xpath.format(val))[3]))
            for val in range(1, 4)]
print(val_list)
# Output given in the original answer:
# [('Total', '17.6'), ('Land', '17.6'), ('Water', '0.0')]
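To see what normalize does on its own, here is a standalone demo; the input strings are hypothetical values mirroring the infobox cells, not live page content:

```python
def normalize(text):
    # Non-ASCII characters (the bullet \u2022, non-breaking spaces \xa0)
    # become plain spaces; the first remaining token is returned.
    return ''.join([i if ord(i) < 128 else ' ' for i in text]).strip().split()[0]

print(normalize('\u2022 Total'))       # Total
print(normalize('17.6\xa0sq\xa0mi'))   # 17.6
```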