Python: scrape table data from multiple links and merge it into one Excel file
I have a link that lists several products, and each product has a specification table. The first column of that table holds the headers and the second column the corresponding data. The first column differs from table to table, with some overlapping categories. I want one large table containing all of these categories, with each row holding a different product. I was able to get the data for a single table (one product) like this:
import requests
import csv
from bs4 import BeautifulSoup

url = "https://www.1800cpap.com/resmed-airfit-n30-nasal-cpap-mask-with-headgear"
source_code = requests.get(url)
plain_text = source_code.text
soup = BeautifulSoup(plain_text, 'html.parser')

# The spec table: first column = headers, second column = values
table = soup.find("table", {"class": "table"})
headers = [td.text for td in soup.select_one('.table').select('td:nth-of-type(1)')]

with open("data.csv", "w", encoding="utf-8-sig", newline='') as csv_file:
    w = csv.writer(csv_file, delimiter=",", quoting=csv.QUOTE_MINIMAL)
    w.writerow(headers)
    for table in soup.select('table'):
        w.writerow([td.text for td in table.select('td:nth-of-type(2)')])
I know that for different products I have to loop over each product's link, and I am able to do that. But how do I append each table to the previous output while keeping the desired table structure?
import requests
import pandas as pd
from bs4 import BeautifulSoup

url = 'https://www.1800cpap.com/cpap-masks/nasal'

def get_item(url):
    """Scrape one product page into a dict of {spec header: value}."""
    soup = BeautifulSoup(requests.get(url).content, 'html.parser')
    print('Getting {}..'.format(url))
    title = soup.select_one('h1.product-details-full-content-header-title').get_text(strip=True)
    all_data = {'Item Title': title}
    for tr in soup.select('#product-specs-list tr'):
        h, v = [td.get_text(strip=True) for td in tr.select('td')]
        all_data[h.rstrip(':')] = v
    return all_data

all_data = []
for page in range(1, 2):
    print('Page {}...'.format(page))
    soup = BeautifulSoup(requests.get(url, params={'page': page}).content, 'html.parser')
    # Follow every product link on the category page
    for a in soup.select('a.facets-item-cell-grid-title'):
        u = 'https://www.1800cpap.com' + a['href']
        all_data.append(get_item(u))

df = pd.DataFrame(all_data)
df.to_csv('data.csv')
Prints:
Page 1...
Getting https://www.1800cpap.com/resmed-airfit-n30-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/dreamwear-nasal-cpap-mask-with-headgear-by-philips-respironics..
Getting https://www.1800cpap.com/eson-2-nasal-cpap-mask-with-headgear-by-fisher-and-paykel..
Getting https://www.1800cpap.com/resmed-mirage-fx-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n30i-nasal-cpap-mask-by-resmed..
Getting https://www.1800cpap.com/dreamwisp-nasal-cpap-mask-fitpack..
Getting https://www.1800cpap.com/respironics-comfortgel-blue-cpap-nasal-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-mirage-fx-for-her-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n20-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/wisp-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/pico-nasal-cpap-mask-with-headgear-by-philips-respironics-2..
Getting https://www.1800cpap.com/airfit-n20-for-her-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/airfit-f10-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/fisher-and-paykel-zest-q-nasal-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-swift-fx-nano-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-ultra-mirage-2-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n10-for-her-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/eson-nasal-cpap-mask-by-fisher-and-paykel..
Getting https://www.1800cpap.com/resmed-swift-fx-nano-nasal-cpap-mask-for-her-with-headgear..
Getting https://www.1800cpap.com/mirage-activa-lt-cpap-mask-by-resmed..
Getting https://www.1800cpap.com/resmed-mirage-micro-cpap-mask..
Getting https://www.1800cpap.com/phillips-respironics-trueblue-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/fisher-paykel-zest-cpap-mask..
Getting https://www.1800cpap.com/viva-nasal-cpap-mask-by-3b-medical..
and saves data.csv (screenshot from LibreOffice omitted).
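Since the stated goal is an Excel file rather than CSV, the final DataFrame can also be written with `to_excel`. A minimal sketch, assuming an Excel engine such as `openpyxl` or `xlsxwriter` is installed, and using hypothetical stand-in rows for the real scraped dicts:

```python
import pandas as pd

# Hypothetical scraped rows, stand-ins for the real product dicts.
all_data = [
    {'Item Title': 'Mask A', 'Weight': '50 g'},
    {'Item Title': 'Mask B', 'Warranty': '90 days'},
]

df = pd.DataFrame(all_data)
# index=False keeps the spreadsheet to just the spec columns.
df.to_excel('data.xlsx', index=False)
```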
Append all the results to one list, i.e. build the list first and then call
pd.DataFrame(data=my_data, columns=col_names)  # pd.DataFrame takes `columns`, not `headers`
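As a minimal sketch (with hypothetical spec keys), building a DataFrame from a list of dicts unions all the keys into one header row, so the `columns` argument is not even required; a product that lacks a category simply gets NaN in that cell:

```python
import pandas as pd

# Hypothetical spec dicts; keys differ per product, as on the real pages.
my_data = [
    {'Item Title': 'Mask A', 'Weight': '50 g', 'Material': 'Silicone'},
    {'Item Title': 'Mask B', 'Weight': '60 g', 'Warranty': '90 days'},
]

# pandas takes the union of all dict keys as the columns.
df = pd.DataFrame(my_data)
print(sorted(df.columns))  # ['Item Title', 'Material', 'Warranty', 'Weight']
```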
I tried writing this, and now I get all the tables appended. But how do I filter for unique values, so that only the unique headers appear in the first row and the products are classified under those categories by serial number?
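One way to get one row per product under a single set of unique headers is to drop duplicate rows keyed on the serial-number column and use it as the index. A sketch with hypothetical data, where 'SKU' stands in for whatever the real serial-number field is called:

```python
import pandas as pd

# Hypothetical rows; 'SKU' stands in for the product serial-number column.
rows = [
    {'SKU': 'N30', 'Item Title': 'AirFit N30', 'Weight': '50 g'},
    {'SKU': 'N30', 'Item Title': 'AirFit N30', 'Weight': '50 g'},  # duplicate scrape
    {'SKU': 'FX',  'Item Title': 'Mirage FX',  'Material': 'Silicone'},
]

df = pd.DataFrame(rows)
# Keep one row per serial number and use it as the row index,
# so each unique header appears exactly once as a column.
df = df.drop_duplicates(subset='SKU').set_index('SKU')
print(df.shape)  # (2, 3)
```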