How do I scrape specific links from a website with Python?


I successfully scraped the headlines and links.

I would like to replace the Summary column with the main article text from each link (since right now the headline and the summary are the same), e.g.:

link = "https://www.vanglaini.org" + article.a['href']

Please help me modify the code. Below is my code:

import pandas as pd
import requests
from bs4 import BeautifulSoup

source = requests.get('https://www.vanglaini.org/').text
soup = BeautifulSoup(source, 'lxml')

list_with_headlines = []
list_with_summaries = []
list_with_links = []

for article in soup.find_all('article'):
    if article.a is None:
        continue
    headline = article.a.text.strip()
    summary = article.p.text.strip()
    link = "https://www.vanglaini.org" + article.a['href']
    list_with_headlines.append(headline)
    list_with_summaries.append(summary)
    list_with_links.append(link)

news_csv = pd.DataFrame({
    'Headline': list_with_headlines,
    'Summary': list_with_summaries,
    'Link' : list_with_links,
})

print(news_csv)
news_csv.to_csv('test.csv')

Just make another request inside the for loop and grab the text of the relevant tag:

import pandas as pd
import requests
from bs4 import BeautifulSoup

source = requests.get('https://www.vanglaini.org/').text
soup = BeautifulSoup(source, 'lxml')

list_with_headlines = []
list_with_summaries = []
list_with_links = []

for article in soup.find_all('article'):
    if article.a is None:
        continue
    headline = article.a.text.strip()
    link = "https://www.vanglaini.org" + article.a['href']
    list_with_headlines.append(headline)
    list_with_links.append(link)
    # fetch the article page itself; use a fresh variable name so the
    # outer `soup` is not shadowed inside the loop
    article_soup = BeautifulSoup(requests.get(link).text, 'lxml')
    list_with_summaries.append(article_soup.select_one(".pagesContent").text)

news_csv = pd.DataFrame({
    'Headline': list_with_headlines,
    'Summary': list_with_summaries,
    'Link' : list_with_links,
})

print(news_csv)
news_csv.to_csv('test.csv')
The CSV will look like this:


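One caveat with the loop above: `select_one(".pagesContent")` returns `None` when an article page does not contain that class, and calling `.text` on `None` raises `AttributeError`. A defensive helper might look like this (a sketch; the `.pagesContent` selector comes from the answer above, while the helper name and the empty-string fallback are my own choices, and the built-in `html.parser` is used here instead of `lxml` so the snippet has no extra dependency):

```python
from bs4 import BeautifulSoup

def extract_summary(html, selector=".pagesContent", fallback=""):
    """Parse an article page and return the summary text, or a
    fallback string when the selector matches nothing."""
    page = BeautifulSoup(html, "html.parser")
    node = page.select_one(selector)
    return node.get_text(strip=True) if node else fallback
```

The loop body would then call something like `list_with_summaries.append(extract_summary(requests.get(link, timeout=10).text))`; the `timeout` keeps a single slow article page from hanging the whole crawl.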
@anddrew Is this what you wanted?

Yes, thank you so much. I'll ask if I have any other questions, and I hope you can answer them (I'm new to Python). How can I add new news items to the CSV in a single file? I mean, keep crawling every day and append the new information to the same CSV file. I'm getting this error.