python pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')

I am scraping a website and storing the data into MySQL. The code runs fine, but after a while it raises the following error. I am using Python 3.5.1 and pymysql to connect to the database.

pymysql.err.OperationalError: (2013, 'Lost connection to MySQL server during query')
Here is my code:

from bs4 import BeautifulSoup
import urllib.request
import re
import json
import pymysql
import pymysql.cursors


connection = pymysql.connect(host='XXX.XXX.XXX.XX',
                             user='XXX',
                             password='XXX',
                             db='XXX',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)

r = urllib.request.urlopen('http://i.cantonfair.org.cn/en/ExpExhibitorList.aspx?k=glassware')
soup = BeautifulSoup(r, "html.parser")

links = soup.find_all("a", href=re.compile(r"expexhibitorlist\.aspx\?categoryno=[0-9]+"))
linksfromcategories = ([link["href"] for link in links])

string = "http://i.cantonfair.org.cn/en/"
linksfromcategories = [string + x for x in linksfromcategories]


for link in linksfromcategories:

  response = urllib.request.urlopen(link)
  soup2 = BeautifulSoup(response, "html.parser")

  links2 = soup2.find_all("a", href=re.compile(r"\ExpExhibitorList\.aspx\?categoryno=[0-9]+"))
  linksfromsubcategories = ([link["href"] for link in links2])

  linksfromsubcategories = [string + x for x in linksfromsubcategories]
  for link in linksfromsubcategories:

        response = urllib.request.urlopen(link)
        soup3 = BeautifulSoup(response, "html.parser")
        links3 = soup3.find_all("a", href=re.compile(r"\ExpExhibitorList\.aspx\?categoryno=[0-9]+"))
        linksfromsubcategories2 = ([link["href"] for link in links3])

        linksfromsubcategories2 = [string + x for x in linksfromsubcategories2]
        for link in linksfromsubcategories2:

              response2 = urllib.request.urlopen(link)
              soup4 = BeautifulSoup(response2, "html.parser")
              companylink = soup4.find_all("a", href=re.compile(r"\expCompany\.aspx\?corpid=[0-9]+"))
              companylink = ([link["href"] for link in companylink])
              companydetail = soup4.find_all("div", id="contact")
              companylink = [string + x for x in companylink]
              my_list = list(set(companylink))

              for link in my_list:
                  print (link)
                  response3 = urllib.request.urlopen(link)
                  soup5 = BeautifulSoup(response3, "html.parser")
                  companydetail = soup5.find_all("div", id="contact")                      
                  for d in companydetail:
                        lis = d.find_all('li')
                        companyname = lis[0].get_text().strip()
                        companyaddress = lis[1].get_text().strip()
                        companycity = lis[2].get_text().strip()
                        try:
                            companypostalcode = lis[3].get_text().strip()
                            companypostalcode = companypostalcode.replace(",","")                                
                        except:
                            companypostalcode = lis[3].get_text().strip()
                        try:
                            companywebsite = lis[4].get_text().strip()
                            companywebsite = companywebsite.replace("\xEF\xBC\x8Cifl...","")
                        except IndexError:
                            companywebsite = 'null'


                        try:
                            with connection.cursor() as cursor:


                                print ('saving company details to db')
                                cursor.execute("""INSERT INTO company(
                                                                       companyname,address,city,pincode,website) 
                                                                   VALUES (%s, %s, %s, %s, %s)""",
                                                                   (companyname, companyaddress, companycity, 
                                                                    companypostalcode, companywebsite))
                            connection.commit()

                        finally:
                            print ("Company Data saved")
                  productlink = soup5.find_all("a", href=re.compile(r"\ExpProduct\.aspx\?corpid=[0-9]+.categoryno=[0-9]+"))
                  productlink = ([link["href"] for link in productlink])

                  productlink = [string + x for x in productlink]
                  productlinkun = list(set(productlink))
                  for link in productlinkun:

                      print (link)
                      responseproduct = urllib.request.urlopen(link)
                      soupproduct = BeautifulSoup(responseproduct, "html.parser")
                      productname = soupproduct.select('div[class="photolist"] li a')
                      for element in productname:
                          print ("====================Product Name=======================")
                          productnames = element.get_text().strip()
                          print (productnames)
                          try:
                              with connection.cursor() as cursor:

                                  # Create a new record
                                  print ('saving products to db')
                                  cursor.execute("""INSERT INTO products(
                                                                       companyname,products) 
                                                                   VALUES (%s, %s)""",
                                                                   (companyname, productnames))
                                  connection.commit()

                          finally:
                              print ("Products Data Saved")
Now I can't figure out where my code is going wrong.

Hope this helps:

from pymysql.err import OperationalError

while True:  # retry until the data is actually saved
    try:
        with connection.cursor() as cursor:
            print('saving company details to db')
            cursor.execute("""INSERT INTO company(
                                   companyname,address,city,pincode,website)
                               VALUES (%s, %s, %s, %s, %s)""",
                           (companyname, companyaddress, companycity,
                            companypostalcode, companywebsite))
        connection.commit()
        break
    except OperationalError:
        connection.ping(True)  # reconnect, then retry the insert
print("Company Data saved")
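Since the asker repeats that insert in two places, the reconnect-and-retry pattern can be factored into a reusable helper. A minimal sketch — the helper name `commit_with_retry` and the `run` callback are illustrative, not part of pymysql's API, and the fallback class only keeps the sketch importable when pymysql isn't installed:

```python
import time

try:
    from pymysql.err import OperationalError
except ImportError:  # fallback so the sketch runs without pymysql installed
    class OperationalError(Exception):
        pass

def commit_with_retry(connection, run, retries=3, delay=1.0):
    """Run `run(cursor)` and commit, reconnecting when the server drops us."""
    for attempt in range(retries):
        try:
            with connection.cursor() as cursor:
                result = run(cursor)
            connection.commit()
            return result
        except OperationalError:
            if attempt == retries - 1:
                raise  # give up after the last attempt
            connection.ping(reconnect=True)  # re-establish the connection
            time.sleep(delay)
```

Each `cursor.execute(...)` call in the scraper would then become a small callback passed to this helper, so the retry logic lives in one place.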
You can also look at a similar case that uses connection pooling.
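A pool along those lines can be sketched with only the standard library — `SimpleConnectionPool` and `factory` are illustrative names for this sketch; a real project would more likely reach for DBUtils or SQLAlchemy's built-in pooling:

```python
import queue

class SimpleConnectionPool:
    """Tiny illustrative pool of pre-opened connections.

    `factory` is assumed to be any zero-argument callable returning a
    connection-like object (e.g. a lambda wrapping pymysql.connect).
    """
    def __init__(self, factory, size=5):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(factory())  # open all connections up front

    def acquire(self):
        # Blocks until a connection is free, so the pool size caps
        # how many connections the scraper can hold at once.
        return self._pool.get()

    def release(self, conn):
        self._pool.put(conn)
```

The scraper would `acquire()` a connection per insert and `release()` it afterwards, instead of sharing one long-lived connection across hours of crawling.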

Or read:

First, please hide the password and login details. This code connects to a remote SQL server; sometimes the network drops and the connection is lost. You should use a try/except/finally block, because finally alone does not handle the error.

It now gives the error "ConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host". Did this work for you? I ran into the same problem; it seems to be triggered when trying to commit more than 500k rows of data. @JeffB it worked. Maybe you are trying to do the insert in a single transaction? If so, just split the data into smaller chunks.
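Splitting the data into smaller chunks, as the last comment suggests, can be sketched like this — the `company` columns come from the question's table, while the helper names (`chunked`, `insert_in_chunks`, `chunk_size`) are illustrative:

```python
def chunked(rows, size):
    """Yield successive `size`-item slices of `rows`."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def insert_in_chunks(connection, rows, chunk_size=10000):
    """Insert `rows` into the company table, committing one chunk at a time.

    Committing each batch separately keeps any single transaction small,
    instead of pushing 500k+ rows through one commit.
    """
    sql = ("INSERT INTO company(companyname, address, city, pincode, website) "
           "VALUES (%s, %s, %s, %s, %s)")
    with connection.cursor() as cursor:
        for chunk in chunked(rows, chunk_size):
            cursor.executemany(sql, chunk)
            connection.commit()
```

With a `chunk_size` of 10000, a 500k-row dataset becomes 50 small commits, each of which can also be retried independently if the connection drops mid-way.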