Python: Why is SQLAlchemy insert with SQLite 25 times slower than using sqlite3 directly?


Why is this simple test case, which inserts 100,000 rows, 25 times slower with SQLAlchemy than it is using the sqlite3 driver directly? I have seen similar slowdowns in real-world applications. Am I doing something wrong?

#!/usr/bin/env python
# Why is SQLAlchemy with SQLite so slow?
# Output from this program:
# SqlAlchemy: Total time for 100000 records 10.74 secs
# sqlite3:    Total time for 100000 records  0.40 secs


import time
import sqlite3

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String,  create_engine 
from sqlalchemy.orm import scoped_session, sessionmaker

Base = declarative_base()
DBSession = scoped_session(sessionmaker())

class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))

def init_sqlalchemy(dbname = 'sqlite:///sqlalchemy.db'):
    engine  = create_engine(dbname, echo=False)
    DBSession.configure(bind=engine, autoflush=False, expire_on_commit=False)
    Base.metadata.drop_all(engine)
    Base.metadata.create_all(engine)

def test_sqlalchemy(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer()
        customer.name = 'NAME ' + str(i)
        DBSession.add(customer)
    DBSession.commit()
    print("SqlAlchemy: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")

def init_sqlite3(dbname):
    conn = sqlite3.connect(dbname)
    c = conn.cursor()
    c.execute("DROP TABLE IF EXISTS customer")
    c.execute("CREATE TABLE customer (id INTEGER NOT NULL, name VARCHAR(255), PRIMARY KEY(id))")
    conn.commit()
    return conn

def test_sqlite3(n=100000, dbname = 'sqlite3.db'):
    conn = init_sqlite3(dbname)
    c = conn.cursor()
    t0 = time.time()
    for i in range(n):
        row = ('NAME ' + str(i),)
        c.execute("INSERT INTO customer (name) VALUES (?)", row)
    conn.commit()
    print("sqlite3: Total time for " + str(n) + " records " + str(time.time() - t0) + " sec")

if __name__ == '__main__':
    test_sqlalchemy(100000)
    test_sqlite3(100000)
I have tried many variations (see )

I will try the test, then benchmark it

It may still be somewhat slower due to overhead, or due to mapper overhead, but I would hope not that much slower


Would you mind trying it and posting the results? This is very interesting stuff.

The SQLAlchemy ORM uses the unit of work pattern when synchronizing changes to the database. This pattern goes far beyond a simple "insert" of data. It includes that attributes assigned to objects are received via an attribute instrumentation system that tracks changes on objects as they are made; it includes that all rows inserted are tracked in an identity map, which has the effect that for each row, SQLAlchemy must retrieve its "last inserted id" if not already given; and it also involves that rows to be inserted are scanned and sorted for dependencies as needed. Objects are also subject to a fair degree of bookkeeping to keep all of this running, which for a very large number of rows at once can mean an inordinate amount of time spent building large data structures, so it is best to chunk these.
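The "chunk these" advice amounts to flushing every N pending objects instead of accumulating all of them. The batching pattern itself can be sketched without any library; the helper below is purely illustrative, not SQLAlchemy API:

```python
def batched(items, size):
    """Yield successive lists of at most `size` items."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# With a real Session, you would add one batch, call session.flush(),
# and repeat, so the pending-object bookkeeping never grows past `size`.
sizes = [len(b) for b in batched(range(100000), 1000)]
print(len(sizes), sizes[0])  # 100 1000
```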

Basically, the unit of work is a large degree of automation: it automates the task of persisting a complex object graph into a relational database with no explicit persistence code, and that automation has a price.

So ORMs are basically not intended for high-performance bulk inserts. This is the whole reason SQLAlchemy offers two separate libraries: if you look at the index page, you will see two distinct halves, one for the ORM and one for the Core. You cannot use SQLAlchemy effectively without understanding both.

For the use case of fast bulk inserts, SQLAlchemy provides the Core, the SQL generation and execution system that the ORM is built on top of. Using this system effectively, we can produce an INSERT that is competitive with the raw SQLite version. The script below illustrates this, as well as an ORM version that pre-assigns primary key identifiers so that the ORM can use executemany() to insert the rows. Both ORM versions also chunk the flushes into batches of 1000 records at a time, which has a significant performance impact.

The runtimes observed here are:

SqlAlchemy ORM: Total time for 100000 records 16.4133379459 secs
SqlAlchemy ORM pk given: Total time for 100000 records 9.77570986748 secs
SqlAlchemy Core: Total time for 100000 records 0.568737983704 secs
sqlite3: Total time for 100000 records 0.595796823502 sec
The script:

import time
import sqlite3

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String,  create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

Base = declarative_base()
DBSession = scoped_session(sessionmaker())

class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))

def init_sqlalchemy(dbname = 'sqlite:///sqlalchemy.db'):
    global engine
    engine = create_engine(dbname, echo=False)
    DBSession.remove()
    DBSession.configure(bind=engine, autoflush=False, expire_on_commit=False)
    Base.metadata.drop_all(engine)
    Base.metadata.create_all(engine)

def test_sqlalchemy_orm(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer()
        customer.name = 'NAME ' + str(i)
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print("SqlAlchemy ORM: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")

def test_sqlalchemy_orm_pk_given(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer(id=i+1, name="NAME " + str(i))
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print("SqlAlchemy ORM pk given: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")

def test_sqlalchemy_core(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    engine.execute(
        Customer.__table__.insert(),
        [{"name":'NAME ' + str(i)} for i in range(n)]
    )
    print("SqlAlchemy Core: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")

def init_sqlite3(dbname):
    conn = sqlite3.connect(dbname)
    c = conn.cursor()
    c.execute("DROP TABLE IF EXISTS customer")
    c.execute("CREATE TABLE customer (id INTEGER NOT NULL, name VARCHAR(255), PRIMARY KEY(id))")
    conn.commit()
    return conn

def test_sqlite3(n=100000, dbname = 'sqlite3.db'):
    conn = init_sqlite3(dbname)
    c = conn.cursor()
    t0 = time.time()
    for i in range(n):
        row = ('NAME ' + str(i),)
        c.execute("INSERT INTO customer (name) VALUES (?)", row)
    conn.commit()
    print("sqlite3: Total time for " + str(n) + " records " + str(time.time() - t0) + " sec")

if __name__ == '__main__':
    test_sqlalchemy_orm(100000)
    test_sqlalchemy_orm_pk_given(100000)
    test_sqlalchemy_core(100000)
    test_sqlite3(100000)

See also:

Excellent answer from @zzzeek. For those wondering about the same statistics for queries, I modified @zzzeek's code slightly to query those same records right after inserting them and then convert them to a list of dicts.

Here are the results:

SqlAlchemy ORM: Total time for 100000 records 11.9210000038 secs
SqlAlchemy ORM query: Total time for 100000 records 2.94099998474 secs
SqlAlchemy ORM pk given: Total time for 100000 records 7.51800012589 secs
SqlAlchemy ORM pk given query: Total time for 100000 records 3.07699990273 secs
SqlAlchemy Core: Total time for 100000 records 0.431999921799 secs
SqlAlchemy Core query: Total time for 100000 records 0.389000177383 secs
sqlite3: Total time for 100000 records 0.459000110626 sec
sqlite3 query: Total time for 100000 records 0.103999853134 secs
Interestingly, querying with bare sqlite3 is still about 3 times faster than with SQLAlchemy Core. I guess that is the price you pay for having the rows wrapped in result objects instead of returned as bare sqlite3 rows.
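The cost of wrapping rows can be seen even within plain sqlite3: asking for richer, name-addressable `sqlite3.Row` objects instead of bare tuples adds per-row overhead. This is only loosely analogous to SQLAlchemy's result-row wrapping, not a claim about its internals, but it illustrates the general effect:

```python
import sqlite3
import time

# Fill an in-memory table with 100,000 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name VARCHAR(255))")
conn.executemany("INSERT INTO customer (name) VALUES (?)",
                 [("NAME " + str(i),) for i in range(100000)])

# Default: rows come back as bare tuples.
t0 = time.time()
plain = conn.execute("SELECT * FROM customer").fetchall()
t_plain = time.time() - t0

# Switch to wrapped, name-addressable rows.
conn.row_factory = sqlite3.Row
t0 = time.time()
wrapped = conn.execute("SELECT * FROM customer").fetchall()
t_wrapped = time.time() - t0

print(type(plain[0]).__name__, type(wrapped[0]).__name__)  # tuple Row
print("bare tuples: %.4f secs, sqlite3.Row: %.4f secs" % (t_plain, t_wrapped))
```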

SQLAlchemy Core is about 8 times faster than the ORM. So querying with the ORM is a lot slower no matter what.

Here is the code I used:

import time
import sqlite3

from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, Integer, String,  create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.sql import select

Base = declarative_base()
DBSession = scoped_session(sessionmaker())

class Customer(Base):
    __tablename__ = "customer"
    id = Column(Integer, primary_key=True)
    name = Column(String(255))

def init_sqlalchemy(dbname = 'sqlite:///sqlalchemy.db'):
    global engine
    engine = create_engine(dbname, echo=False)
    DBSession.remove()
    DBSession.configure(bind=engine, autoflush=False, expire_on_commit=False)
    Base.metadata.drop_all(engine)
    Base.metadata.create_all(engine)

def test_sqlalchemy_orm(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer()
        customer.name = 'NAME ' + str(i)
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print("SqlAlchemy ORM: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")
    t0 = time.time()
    q = DBSession.query(Customer)
    rows = [{'id': r.id, 'name': r.name} for r in q]
    print("SqlAlchemy ORM query: Total time for " + str(len(rows)) + " records " + str(time.time() - t0) + " secs")


def test_sqlalchemy_orm_pk_given(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    for i in range(n):
        customer = Customer(id=i+1, name="NAME " + str(i))
        DBSession.add(customer)
        if i % 1000 == 0:
            DBSession.flush()
    DBSession.commit()
    print("SqlAlchemy ORM pk given: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")
    t0 = time.time()
    q = DBSession.query(Customer)
    rows = [{'id': r.id, 'name': r.name} for r in q]
    print("SqlAlchemy ORM pk given query: Total time for " + str(len(rows)) + " records " + str(time.time() - t0) + " secs")

def test_sqlalchemy_core(n=100000):
    init_sqlalchemy()
    t0 = time.time()
    engine.execute(
        Customer.__table__.insert(),
        [{"name":'NAME ' + str(i)} for i in range(n)]
    )
    print("SqlAlchemy Core: Total time for " + str(n) + " records " + str(time.time() - t0) + " secs")
    conn = engine.connect()
    t0 = time.time()
    sql = select([Customer.__table__])
    q = conn.execute(sql)
    rows = [{'id': r[0], 'name': r[1]} for r in q]
    print("SqlAlchemy Core query: Total time for " + str(len(rows)) + " records " + str(time.time() - t0) + " secs")

def init_sqlite3(dbname):
    conn = sqlite3.connect(dbname)
    c = conn.cursor()
    c.execute("DROP TABLE IF EXISTS customer")
    c.execute("CREATE TABLE customer (id INTEGER NOT NULL, name VARCHAR(255), PRIMARY KEY(id))")
    conn.commit()
    return conn

def test_sqlite3(n=100000, dbname = 'sqlite3.db'):
    conn = init_sqlite3(dbname)
    c = conn.cursor()
    t0 = time.time()
    for i in range(n):
        row = ('NAME ' + str(i),)
        c.execute("INSERT INTO customer (name) VALUES (?)", row)
    conn.commit()
    print("sqlite3: Total time for " + str(n) + " records " + str(time.time() - t0) + " sec")
    t0 = time.time()
    q = conn.execute("SELECT * FROM customer").fetchall()
    rows = [{'id': r[0], 'name': r[1]} for r in q]
    print("sqlite3 query: Total time for " + str(len(rows)) + " records " + str(time.time() - t0) + " secs")


if __name__ == '__main__':
    test_sqlalchemy_orm(100000)
    test_sqlalchemy_orm_pk_given(100000)
    test_sqlalchemy_core(100000)
    test_sqlite3(100000)
I also ran the tests without converting the query results to dicts, and the statistics are similar:

SqlAlchemy ORM: Total time for 100000 records 11.9189999104 secs
SqlAlchemy ORM query: Total time for 100000 records 2.78500008583 secs
SqlAlchemy ORM pk given: Total time for 100000 records 7.67199993134 secs
SqlAlchemy ORM pk given query: Total time for 100000 records 2.94000005722 secs
SqlAlchemy Core: Total time for 100000 records 0.43700003624 secs
SqlAlchemy Core query: Total time for 100000 records 0.131000041962 secs
sqlite3: Total time for 100000 records 0.500999927521 sec
sqlite3 query: Total time for 100000 records 0.0859999656677 secs
Querying with SQLAlchemy Core is about 20 times faster compared to the ORM.
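Much of the ORM's query-side cost plausibly comes from constructing one mapped object per row, while Core hands back the rows as-is. The general effect of per-row object construction can be sketched without SQLAlchemy at all; the class below is a plain stand-in, and real ORM instances carry far more instrumentation than this:

```python
import time

class Customer(object):
    # Bare class standing in for a mapped object; illustrative only.
    def __init__(self, id, name):
        self.id = id
        self.name = name

rows = [(i, "NAME " + str(i)) for i in range(100000)]

# Core-style: keep the rows as tuples.
t0 = time.time()
tuples = [r for r in rows]
t_tuples = time.time() - t0

# ORM-style: build one object per row.
t0 = time.time()
objects = [Customer(*r) for r in rows]
t_objects = time.time() - t0

print("tuples: %.4f secs, objects: %.4f secs" % (t_tuples, t_objects))
```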

Note that these tests are very superficial and should not be taken too seriously. I might be missing some obvious tricks that could change the statistics completely.


The best way to measure performance improvements is directly in your own application. Don't take my statistics for granted.

Using an insert expression was only about 10% faster. I wish I knew why: SqlAlchemy Insert: Total time for 100000 records 9.47 secs

Not to bother you, but if you are interested, you could time the db-session-related code after the inserts with timeit.

I have the same problem with the insert expression, it is extremely slow, see

Thanks for the explanation. Is engine.execute() significantly different from DBSession.execute()? I had tried an insert expression using DBSession.execute(), and it was not noticeably faster than the full ORM version.

engine.execute() and DBSession.execute() are mostly the same, except that DBSession.execute() will wrap a given plain SQL string in text(). It makes a huge difference if you use the execute/executemany syntax. pysqlite is written entirely in C and has almost no latency, so any Python overhead added to its execute() call will show up noticeably in profiling. Even a single pure-Python function call is significantly slower than a pure-C function call such as pysqlite's execute(). You also need to consider that SQLAlchemy expression constructs go through a compilation step on each execute().

The Core was created first, although after the first few weeks, once the Core proof of concept worked (and it was terrible), the ORM and the Core were developed in parallel from that point on.

I really don't know why anyone would choose the ORM model then. Most projects using a database will have 10,000+ rows. Maintaining two update methods (one for single rows and one for bulk) just doesn't sound wise.

Will they need to insert 10,000 rows at once, in bulk? Not particularly. The vast majority of web applications, for example, probably exchange half a dozen rows per request. The ORM is very popular with some very famous, high-traffic websites.

Just to let you know: in 2019, with the latest versions of everything, I do not observe significant relative deviations from your timings. Still, I am also curious whether some "trick" has been missed.
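The execute-vs-executemany point raised in the comments is easy to demonstrate with the stdlib driver alone. A minimal sketch (timings will vary by machine):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name VARCHAR(255))")
rows = [("NAME " + str(i),) for i in range(100000)]

# One execute() call per row: pays the per-call Python overhead 100,000 times.
t0 = time.time()
for row in rows:
    conn.execute("INSERT INTO customer (name) VALUES (?)", row)
conn.commit()
t_loop = time.time() - t0

conn.execute("DELETE FROM customer")
conn.commit()

# A single executemany() call: one entry into the C driver for all rows.
t0 = time.time()
conn.executemany("INSERT INTO customer (name) VALUES (?)", rows)
conn.commit()
t_many = time.time() - t0

print("execute loop: %.3f secs, executemany: %.3f secs" % (t_loop, t_many))
count = conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0]
```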