
Python 2.7: How do I loop this script over a CSV file of URLs?


I'm trying to use the script below, but instead of typing in the URL I want it to loop through and pull the URLs from a Links.csv file. Finally, I would like to export all the results to a new CSV file.

import csv
import urllib2
import re
import requests
from BeautifulSoup import BeautifulSoup

f = open('Links.csv')

csv_f = csv.reader(f)

Links =[]

for row in csv_f:
    Links.append(row[0])

url = (Links)
response = requests.get(url)
html = response.content

soup = BeautifulSoup(html)
Title = soup.find(id="Title")
Price = soup.find(id="price")

print Title.text, Price.text
f.close()
Any help would be greatly appreciated.

Use a function:

def get_title_and_price(url):
    response = requests.get(url)
    html = response.content
    soup = BeautifulSoup(html)
    return soup.find(id="Title"), soup.find(id="price")

data = [(row[0], get_title_and_price(row[0])) for row in csv_f]
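
The question also asks how to export everything to a new CSV file. One way is to apply the function to each URL collected in Links and write one row per page with csv.writer. This is only a minimal sketch, not the answerer's code: the output file name Results.csv is my own choice, and it assumes both elements exist on every page.

import csv
import requests
from BeautifulSoup import BeautifulSoup

def get_title_and_price(url):
    response = requests.get(url)
    soup = BeautifulSoup(response.content)
    return soup.find(id="Title"), soup.find(id="price")

# Read every URL from the first column of Links.csv into a plain list.
with open('Links.csv') as f:
    Links = [row[0] for row in csv.reader(f)]

# Write one row per URL to a new file ('Results.csv' is a hypothetical name).
out = open('Results.csv', 'wb')
writer = csv.writer(out)
writer.writerow(['URL', 'Title', 'Price'])
for url in Links:
    title, price = get_title_and_price(url)
    # .encode('utf-8') keeps the Python 2 csv module happy with non-ASCII text.
    writer.writerow([url, title.text.encode('utf-8'), price.text.encode('utf-8')])
out.close()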

Loop through the list of URLs? I tried this, but it doesn't seem to work:
import csv
import urllib2
import re
import requests
from BeautifulSoup import BeautifulSoup

f = open('Links.csv', "r")
csv_f = csv.reader(f)

Links = []
for row in csv_f:
    Links.append(row[0])

url = [Links]

def get_title_and_price(url):
    response = requests.get(url)
    html = response.content
    soup = BeautifulSoup(html)
    return soup.find(id="Title"), soup.find(id="price")

data = [(row[0], get_title_and_price(row[0])) for row in csv_f]
print data
f.close()
The response I get is "[]", without any errors. I'm not sure if it has something to do with the actual file itself; it's just a simple list of URLs.
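
The empty list comes from csv_f being exhausted: a csv.reader wraps the open file, and the earlier `for row in csv_f:` loop reads it to the end, so the later list comprehension sees no rows and produces []. Since the URLs are already stored in Links, iterate over that list instead, passing each URL string (not the whole list) to the function. A minimal sketch of just the changed lines, reusing get_title_and_price from above:

# Links already holds one URL string per CSV row, so loop over it directly;
# iterating csv_f a second time yields nothing because the file is at EOF.
data = [(url, get_title_and_price(url)) for url in Links]
print data
f.close()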