How to share a single MySQL database connection across multiple processes in Python


How can I create a single database connection and have every process talk to it, so as to minimize the overhead of spawning a new connection on every iteration?

Here is some sample code to illustrate what I'm trying to do:

import multiprocessing
import os.path
import hashlib
import sys

import MySQLdb
import config  # provides the mysql_user / mysql_pass / mysql_db settings used below


VALID_EXTENSIONS = ('.JPG', '.GIF', '.JPEG')
MAX_FILE_SZ = 1000000

#Declare a global mysql connection

db = MySQLdb.connect(host="localhost",
                     user=config.mysql_user,
                     passwd=config.mysql_pass,
                     db=config.mysql_db)

def md5_file(fname):
    try:
        with open(fname, 'rb') as fo:  # binary mode, since these are image files
            m = hashlib.md5()
            chunk_sz = m.block_size * 128
            data = fo.read(chunk_sz)
            while data:
                m.update(data)
                data = fo.read(chunk_sz)

        md5_hash = m.hexdigest()
        md5_file.queue.put((fname, md5_hash))

        #DATABASE LOGIC
        cursor = db.cursor()
        cursor.execute("""INSERT INTO ...""")

    except IOError:
        md5_file.queue.put((fname, None))


def is_valid_file(fname):
    ext = os.path.splitext(fname)[1].upper()
    fsz = os.path.getsize(fname)
    return ext in VALID_EXTENSIONS and fsz <= MAX_FILE_SZ


def init(queue):
    md5_file.queue = queue


def main():
    # Holds tuple (fname, md5sum) / md5sum will be None if an IOError occurs
    queue = multiprocessing.Queue()
    pool = multiprocessing.Pool(None, init, [queue])

    for dirpath, dirnames, filenames in os.walk(sys.argv[1]):
        # Convert filenames to full paths...
        full_path_fnames = map(lambda fn: os.path.join(dirpath, fn), 
                               filenames)
        full_path_fnames = filter(is_valid_file, full_path_fnames)
        pool.map(md5_file, full_path_fnames)

    # Dump the queue
    while not queue.empty():
        print queue.get()
    return 0

if __name__ == '__main__':
    sys.exit(main())
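
One common way to get a literally single connection is to keep MySQL out of the workers entirely: the pool only hashes files and pushes results onto the queue, while one dedicated writer process owns the sole MySQLdb connection and performs every INSERT. The sketch below is illustrative rather than the accepted solution; the file_hashes table and its fname/md5 columns are placeholders (the INSERT in the question is elided), and config is assumed to provide the same mysql_* settings used above.

# Minimal sketch: hashing workers never touch MySQL; a single writer
# process owns the only connection. Table/column names are placeholders.
import hashlib
import multiprocessing

import MySQLdb
import config  # assumed: same mysql_user / mysql_pass / mysql_db as in the question


def md5_worker(fname):
    """Hash one file and hand the result to the writer via the shared queue."""
    try:
        m = hashlib.md5()
        with open(fname, 'rb') as fo:
            for chunk in iter(lambda: fo.read(m.block_size * 128), b''):
                m.update(chunk)
        md5_worker.queue.put((fname, m.hexdigest()))
    except IOError:
        md5_worker.queue.put((fname, None))


def init_worker(queue):
    md5_worker.queue = queue


def db_writer(queue):
    """The only process that ever opens a MySQL connection."""
    db = MySQLdb.connect(host="localhost",
                         user=config.mysql_user,
                         passwd=config.mysql_pass,
                         db=config.mysql_db)
    cursor = db.cursor()
    while True:
        item = queue.get()
        if item is None:                  # sentinel: all workers are finished
            break
        fname, md5_hash = item
        if md5_hash is not None:          # skip files that raised IOError
            cursor.execute("INSERT INTO file_hashes (fname, md5) VALUES (%s, %s)",
                           (fname, md5_hash))
    db.commit()
    db.close()


def process_files(full_path_fnames):
    queue = multiprocessing.Queue()
    writer = multiprocessing.Process(target=db_writer, args=(queue,))
    writer.start()

    pool = multiprocessing.Pool(None, init_worker, [queue])
    pool.map(md5_worker, full_path_fnames)
    pool.close()
    pool.join()

    queue.put(None)                       # tell the writer it can stop
    writer.join()

This keeps exactly one connection open no matter how many worker processes are hashing. If the single writer ever becomes the bottleneck, the usual next step is the opposite trade-off: open one connection per worker inside the Pool initializer, or use a real connection pool.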