Most performant method to update many records? #801
-
I use Postgres 12, and a few times a day I need to update 30-80k rows, ideally in less than a minute. I'm using SQLAlchemy / psycopg2 with an Account model.
A snippet of what I'm doing now (minus the chunking & multithreading, for readability) is sketched below. My desired outcome is a significant speed improvement on this; 1k row updates per second would be ideal, though I'm not sure what is realistic.
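Roughly this shape (a sketch only: the Account model import, engine URL, and session handling here are placeholders, not the exact code):

```python
# Per-row ORM update loop (sketch; the model import and engine URL are placeholders)
from sqlalchemy import create_engine
from sqlalchemy.orm import Session

from myapp.models import Account  # hypothetical module holding the Account model

engine = create_engine("postgresql+psycopg2://user:password@localhost/mydb")

def update_accounts(updates):
    # updates: iterable of (account_id, new_address, additional_protocol)
    with Session(engine) as session:
        for account_id, new_address, additional_protocol in updates:
            # one SELECT + one UPDATE per row
            account = session.get(Account, account_id)
            account.address = new_address
            account.protocols_used = account.protocols_used + [additional_protocol]
        session.commit()
```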
From what I understand, this could be a job for executemany?
Replies: 2 comments · 5 replies
-
Indeed, use `executemany`:

```python
# Build the parameter tuples from your data source (placeholder name here)
updates = [
    (account_id, new_address, additional_protocol)
    for account_id, new_address, additional_protocol in data_source
]

await connection.executemany(
    """
    UPDATE accounts
    SET
        address = $2,
        protocols_used = array_append(protocols_used, $3)
    WHERE
        id = $1
    """,
    updates,
)
```
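For reference, one way this might be wired up end to end with asyncpg, chunking the rows into batches (the DSN, chunk size, and shape of the input list are assumptions, not from the thread):

```python
# Sketch: applying the updates in chunks via asyncpg's executemany.
# The DSN, chunk size, and error handling are assumptions.
import asyncio
import asyncpg

UPDATE_SQL = """
    UPDATE accounts
    SET
        address = $2,
        protocols_used = array_append(protocols_used, $3)
    WHERE
        id = $1
"""

async def apply_updates(updates, chunk_size=10_000):
    # updates: list of (account_id, new_address, additional_protocol) tuples
    conn = await asyncpg.connect("postgresql://user:password@localhost/mydb")
    try:
        for start in range(0, len(updates), chunk_size):
            chunk = updates[start:start + chunk_size]
            # executemany prepares the statement once and runs it for every
            # tuple in the chunk, avoiding a round trip per row
            await conn.executemany(UPDATE_SQL, chunk)
    finally:
        await conn.close()

# asyncio.run(apply_updates(updates))
```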
-
Thank you so much! I refactored my script to use your suggestion, plus switching from lists to sets, and the total duration is now 21s instead of 20 minutes! That involves fetching 500k records and updating 50k of them.
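For anyone curious about the sets-vs-lists part, the win comes from O(1) average membership tests when filtering the fetched rows; a toy sketch (the stand-in data and filtering rule are made up for illustration):

```python
# Toy illustration of sets vs lists for membership tests while filtering
# ~500k fetched rows down to the ~50k that need updating.
fetched_rows = [(i, f"addr-{i}", "https") for i in range(500_000)]  # stand-in data
ids_to_update = set(range(0, 500_000, 10))  # with a list, each `in` check scans it

updates = [
    (account_id, new_address, additional_protocol)
    for account_id, new_address, additional_protocol in fetched_rows
    if account_id in ids_to_update  # O(1) average with a set, O(n) with a list
]
```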