
Bulk update values with SQLAlchemy #546

Open
Yureien opened this issue Mar 14, 2019 · 4 comments

Comments

Yureien commented Mar 14, 2019

Since aiopg does not support bulk insert (#112), I use this to insert everything in a single query:

await conn.execute(
    sa_foo_table
    .insert()
    .values([
        dict(name='name1', x=1),
        dict(name='name2', x=2),
        dict(name='name3', x=3),
    ])
)

Is there any such thing for bulk updating? If I update rows one by one, it could take quite some time (there are thousands of rows).

@aio-libs-bot

GitMate.io thinks possibly related issues are #43 (Error with sqlalchemy), #345 (SQLAlchemy Dialects), #140 (Using cursors with SQLAlchemy ), #478 (Update all secret: values in .travis.yml), and #45 (Possibly use SQLAlchemy Strategies).


vir-mir commented Mar 23, 2019

@fadedcoder hi.

You probably mean queries of this type:

async def test_bulk(connect):
    query = tbl.update().where(
        tbl.c.id == sa.bindparam('id')
    ).values(name=sa.bindparam('name'))

    await connect.execute(query, [
        dict(id='1', name='test_update1'),
        dict(id='2', name='test_update2'),
    ])

It's impossible: asynchronous mode does not support executemany. In any case, according to the psycopg2 documentation (http://initd.org/psycopg/docs/cursor.html#cursor.executemany): "Warning: In its current implementation this method is not faster than executing execute() in a loop."


iAnanich commented Jan 2, 2020

But isn't there a way to compile many updates/inserts into a single SQL query and execute it faster than executing single queries sequentially? It only requires value sanitization, so it should be fine.
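For PostgreSQL, one way to do this in a single round trip is `UPDATE ... FROM (VALUES ...)`. Below is a minimal stdlib-only sketch of composing such a statement with driver placeholders (the table/column names `tbl`, `id`, `name` are assumptions for illustration; the values themselves are still sanitized by the driver, but the identifiers are interpolated directly, so they must come from trusted code, never from user input):

```python
def build_bulk_update_sql(table, key_col, val_col, rows):
    """Compose one parameterized UPDATE ... FROM (VALUES ...) statement.

    `rows` is a list of (key, value) pairs. Returns (sql, params), where
    the SQL uses %s placeholders as psycopg2/aiopg expect.
    """
    # One "(%s, %s)" tuple per row, joined into a single VALUES list.
    placeholders = ", ".join("(%s, %s)" for _ in rows)
    sql = (
        f"UPDATE {table} AS t "
        f"SET {val_col} = v.{val_col} "
        f"FROM (VALUES {placeholders}) AS v ({key_col}, {val_col}) "
        f"WHERE t.{key_col} = v.{key_col}"
    )
    # Flatten [(k1, v1), (k2, v2)] -> [k1, v1, k2, v2] to match placeholders.
    params = [x for row in rows for x in row]
    return sql, params

sql, params = build_bulk_update_sql(
    "tbl", "id", "name",
    [(1, "test_update1"), (2, "test_update2")],
)
```

Note that Postgres sometimes cannot infer the column types of a bare `VALUES` list, in which case the placeholders in the first row may need explicit casts (e.g. `(%s::int, %s)`).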


Yureien commented Jan 15, 2020

@vir-mir is it somehow possible to use execute_batch() or execute_values()?

According to psycopg/psycopg2#491 (comment):

cur.executemany(): 761.322767019 sec
execute_batch(): 14.6529989243 sec
execute_values(): 12.4037430286 sec
pgcopy: 2.85529208183 sec
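For context, `execute_batch()` and `execute_values()` live in `psycopg2.extras` and are synchronous helpers, so they cannot be awaited through aiopg directly; their speedup comes from folding many rows into few statements, cutting round trips. Here is a stdlib-only illustration of that folding idea (not safe SQL composition: the real `execute_values` escapes each value through `cursor.mogrify()`, while this sketch inlines `repr()` purely to show the mechanism):

```python
def fold_values(template_sql, rows, page_size=100):
    """Illustrate how execute_values-style folding reduces round trips.

    The single %s placeholder in the template is replaced by a
    comma-joined list of row tuples, producing one statement per page
    instead of one statement per row. Do NOT build SQL this way in
    production; values here are not properly escaped.
    """
    statements = []
    for start in range(0, len(rows), page_size):
        page = rows[start:start + page_size]
        values = ", ".join(
            "(" + ", ".join(repr(v) for v in row) + ")" for row in page
        )
        statements.append(template_sql.replace("%s", values, 1))
    return statements

stmts = fold_values(
    "INSERT INTO tbl (id, name) VALUES %s",
    [(1, "a"), (2, "b"), (3, "c")],
    page_size=2,
)
# Two statements: one covering rows 1-2, one covering row 3.
```

With thousands of rows this turns thousands of server round trips into a handful, which is where most of the benchmark difference above comes from.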


4 participants