
Python 3.x: chunked list throws 'zip argument #2 must support iteration' when creating multiple dicts


I am having trouble converting a chunked list into multiple dictionaries so I can send requests in batches:

fd = open(filename, 'r')
sqlFile = fd.read()
fd.close()
commands = sqlFile.split(';')
for command in commands:
    try:
        c = conn.cursor()
        c.execute(command)

        # create a list with the query results in batches of size 100
        for batch in grouper(c.fetchall(), 100):
            # This is where the error occurs:
            result = [dict(zip([key[0] for key in c.description], i)) for i in batch]
            # TODO: Send the json with 100 items to API

    except RuntimeError:
        print('Error.')
The problem is that it iterates over the batches only once and then raises the error below. The query actually returns 167 rows, so the first request should send 100 items and the second request the remaining 67.

TypeError: zip argument #2 must support iteration
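For context, if `grouper` is the recipe from the itertools documentation (the question does not show its definition, so this is an assumption), the last, shorter chunk is padded with a fill value of `None`, and `zip(keys, None)` raises exactly this `TypeError`. A minimal sketch that reproduces the padding and filters it out before zipping:

```python
from itertools import zip_longest

def grouper(iterable, n, fillvalue=None):
    # itertools-docs recipe: the final chunk is padded with fillvalue
    args = [iter(iterable)] * n
    return zip_longest(*args, fillvalue=fillvalue)

rows = list(range(7))  # stand-in for c.fetchall() results
# dropping the padding restores the shorter last batch
batches = [[x for x in batch if x is not None] for batch in grouper(rows, 3)]
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

With 167 rows and a chunk size of 100, the same filtering would yield one batch of 100 and one of 67 instead of failing on the `None` padding.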

I solved the problem by building a dictionary per row directly with

c.rowfactory = makeDictFactory(c)

def makeDictFactory(cursor):
    columnNames = [d[0] for d in cursor.description]
    def createRow(*args):
        return dict(zip(columnNames, args))
    return createRow
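The factory captures the column names once and returns a closure that zips them with each row's values. A small sketch of that behavior, using a hypothetical `FakeCursor` class that mimics the 7-item tuples of a DB-API cursor's `description`:

```python
def makeDictFactory(cursor):
    columnNames = [d[0] for d in cursor.description]
    def createRow(*args):
        return dict(zip(columnNames, args))
    return createRow

class FakeCursor:
    # hypothetical stand-in: only the first element (the name) is used
    description = [("ID", None, None, None, None, None, None),
                   ("NAME", None, None, None, None, None, None)]

row_factory = makeDictFactory(FakeCursor())
print(row_factory(1, "alice"))  # {'ID': 1, 'NAME': 'alice'}
```

Assigned to `c.rowfactory` (a cx_Oracle feature), this makes `fetchall()` return dicts instead of tuples, so no separate zipping step is needed per batch.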

def getAndConvertDataFromDatabase(filename):
    fd = open(filename, 'r')
    sqlFile = fd.read()
    fd.close()
    commands = sqlFile.split(';')
    for command in commands:
        try:
            c = conn.cursor()
            c.execute(command)
            c.rowfactory = makeDictFactory(c)
            data = c.fetchall()
            for batch in [data[x:x+100] for x in range(0, len(data), 100)]:
                return postBody(json.dumps(batch, default=myconverter), dataList[filename])
        except RuntimeError:
            print('Error.')
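Note that the slicing comprehension avoids padding entirely: the last slice is simply shorter. Be aware, though, that `return` inside the loop exits after posting only the first batch; to send every batch, the post call should run once per iteration without returning. A minimal sketch of the slicing behavior, with `batched` as an illustrative helper name:

```python
def batched(data, size=100):
    # plain list slicing: the final batch is shorter, never None-padded
    return [data[i:i + size] for i in range(0, len(data), size)]

rows = list(range(167))  # stand-in for the 167 fetched rows
batches = batched(rows)
print([len(b) for b in batches])  # [100, 67]
```

Looping `for batch in batched(rows): post(batch)` (with `post` standing in for the API call) would then send all 167 rows across two requests.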