
Saving scraped data to a database with Python


I scrape data and save it into five lists, and I have created a table with five columns, but now I don't know how to save the scraped data into the database.

My code is:

import requests
from bs4 import BeautifulSoup
import re
import mysql.connector

url = 'https://www.ebay.com/b/Cars-Trucks/6001?_fsrp=0&_sacat=6001&LH_BIN=1&LH_ItemCondition=3000%7C1000%7C2500&rt=nc&_stpos=95125&Model%2520Year=2020%7C2019%7C2018%7C2017%7C2016%7C2015'
res = requests.get(url)
soup = BeautifulSoup(res.text, 'html.parser')
# Collect up to 20 of each attribute into parallel lists
car_titles = []
titles = soup.find_all('h3', class_='s-item__title', limit=20)
for title_of_car in titles:
    car_titles.append(title_of_car.text)

car_brands = []
brands = soup.find_all('span', class_='s-item__dynamic s-item__dynamicAttributes1', limit=20)
for brand in brands:
    brand = re.sub(r'Make: ', '', brand.text)
    car_brands.append(brand)

car_models = []
models = soup.find_all('span', class_='s-item__dynamic s-item__dynamicAttributes2', limit=20)
for model in models:
    model = re.sub(r'Model: ', '', model.text)
    car_models.append(model)

car_transmissions = []
transmissions = soup.find_all('span', class_='s-item__dynamic s-item__dynamicAttributes3', limit=20)
for transmission in transmissions:
    transmission = re.sub(r'Transmission: ', '', transmission.text)
    car_transmissions.append(transmission)

car_prices = []
char_list = [r'\$', r',', r'\.']  # raw strings avoid invalid-escape warnings
prices = soup.find_all('span', class_='s-item__price', limit=20)
for price in prices:
    price = re.sub('|'.join(char_list), '', price.text)
    car_prices.append(int(price))
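One caveat with the price loop above: `int()` will raise a `ValueError` if a listing shows a price range rather than a single price, and stripping the decimal point turns "$8,500.00" into 850000 rather than 8500. A more defensive sketch of the cleanup step (the sample strings here are hypothetical, not taken from a live page):

```python
import re

# Hypothetical sample of scraped price strings; listings sometimes
# show a range instead of a single price.
raw_prices = ['$8,500.00', '$12,000.00', '$8,500.00 to $9,200.00']

car_prices = []
for text in raw_prices:
    cleaned = re.sub(r'[$,]', '', text)  # drop currency symbol and thousands commas
    try:
        car_prices.append(int(float(cleaned)))  # "8500.00" -> 8500
    except ValueError:
        pass  # skip ranges like "8500.00 to 9200.00"
```

Keeping the decimal point and going through `float()` preserves the dollar amount; the `try`/`except` simply skips anything that is not a single number.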

First, connect to the MySQL database like this:

import mysql.connector

config = {
  'user': 'username',
  'password': 'password',
  'host': '127.0.0.1',
  'database': 'dbname',
  'raise_on_warnings': True
}

cnx = mysql.connector.connect(**config)

# create cursor to execute mysql commands
cursor = cnx.cursor()

# Now create table like this one
cursor.execute("CREATE TABLE car_titles (title VARCHAR(255))")

# insert data to table like this one
for row in car_titles:
    cursor.execute("INSERT INTO car_titles (title) VALUES (%s)", (row,))

cnx.commit()  # commit, or the inserted rows are discarded on close
cnx.close()
The main thing is that you need SQL queries to create the table and insert the data, so learn how to use MySQL together with Python.
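To store all five lists in one five-column table rather than one table per list, `zip()` can pair the lists into row tuples and `executemany()` inserts them in one call. The sketch below uses Python's built-in `sqlite3` as a stand-in so it runs without a MySQL server; with `mysql.connector` the only changes are the connection call and `%s` placeholders instead of `?`. The table and column names, and the sample rows, are my assumptions:

```python
import sqlite3

# Sample data standing in for the scraped lists from the question.
car_titles = ['2019 Ford F-150', '2017 Honda Civic']
car_brands = ['Ford', 'Honda']
car_models = ['F-150', 'Civic']
car_transmissions = ['Automatic', 'Manual']
car_prices = [35000, 18000]

# In-memory database for the sketch; with mysql.connector this would be
# cnx = mysql.connector.connect(**config)
cnx = sqlite3.connect(':memory:')
cursor = cnx.cursor()

# One table, one column per scraped list (names are assumptions).
cursor.execute("""
    CREATE TABLE cars (
        title        VARCHAR(255),
        brand        VARCHAR(100),
        model        VARCHAR(100),
        transmission VARCHAR(50),
        price        INT
    )
""")

# zip() turns the five parallel lists into row tuples;
# executemany() runs the INSERT once per tuple.
rows = list(zip(car_titles, car_brands, car_models, car_transmissions, car_prices))
cursor.executemany(
    "INSERT INTO cars VALUES (?, ?, ?, ?, ?)",  # use %s placeholders with mysql.connector
    rows,
)
cnx.commit()  # persist the inserts

cursor.execute("SELECT COUNT(*), MIN(price) FROM cars")
count, min_price = cursor.fetchone()
```

Parameterized placeholders (never string formatting) keep the inserts safe from SQL injection and handle quoting for you.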


Then create the table.

Hi, what approaches have you tried or researched for storing the data in a database? Have you done anything yet? Remember, this is a site for helping people with specific problems, so don't ask until you have researched or tried something yourself. Also, see this general guide: . Thanks.

Hi, yes, I have been searching on this for a day and tried some approaches, but without success (maybe because my English is not very good). Thanks.

Your English does not have to be perfect, but I don't see anything database-related in your code beyond `import mysql.connector`. Add your MySQL code to the post! Otherwise, how are people supposed to help you?

Yes, you are right. I had removed that part of the code before posting here, thinking it might be better that way. I know how to connect to MySQL from Python and insert data into MySQL, but I don't know how to insert my scraped data into SQL, since I am not that experienced (please guide me on this case if possible).