My queries or my Heroku PGBackups (pg_dump) are failing with the "out of shared memory" error, or I need to adjust the max_locks_per_transaction setting.
In PostgreSQL, the shared lock table can track locks on at most max_locks_per_transaction * (max_connections + max_prepared_transactions) objects (for example, tables), so no more than that many distinct objects can be locked at the same time.
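As a quick sketch of that formula, the capacity of the shared lock table can be computed directly from those three settings. The connection limit of 500 below is a hypothetical plan value, not a figure from this article:

```shell
# Shared lock table capacity =
#   max_locks_per_transaction * (max_connections + max_prepared_transactions)
max_locks_per_transaction=64   # PostgreSQL default
max_prepared_transactions=0    # PostgreSQL default
max_connections=500            # hypothetical plan connection limit

echo $(( max_locks_per_transaction * (max_connections + max_prepared_transactions) ))
# prints 32000
```

With these values, no more than 32,000 distinct objects can hold locks across all sessions at once.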
If your database contains a large number of tables and schemas, any query that needs to lock more objects than
max_locks_per_transaction * (max_connections + max_prepared_transactions) allows will raise the error:
ERROR: out of shared memory
HINT: You might need to increase max_locks_per_transaction
At the moment, these three settings (max_locks_per_transaction, max_connections, and max_prepared_transactions) are set by Heroku Postgres and can't be modified by customers:
max_locks_per_transaction is set to PostgreSQL's default of 64
max_prepared_transactions is set to PostgreSQL's default of 0
max_connections is set depending on the Heroku Postgres plan's connection limit
While those settings can't be modified directly, moving to a plan with a larger connection limit (max_connections) effectively raises the shared lock table limit and can help overcome the error.
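To illustrate why a larger plan helps, compare the lock table capacity at two hypothetical connection limits (the figures below are illustrative, not actual Heroku plan limits):

```shell
# Capacity = max_locks_per_transaction * (max_connections + max_prepared_transactions),
# with the defaults of 64 and 0 kept fixed.
echo $(( 64 * (120 + 0) ))   # smaller plan: 7680 lockable objects
echo $(( 64 * (500 + 0) ))   # larger plan: 32000 lockable objects
```

Because max_locks_per_transaction stays at 64 either way, the plan's connection limit is the only lever that grows the lock table.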
This error usually indicates a very large number of tables and schemas in a PostgreSQL database, which can also cause performance issues with PostgreSQL tooling (such as
pg_dump, which inspects all database objects when capturing a backup). In this scenario, and for further scalability of your database and application, we recommend considering database sharding and splitting schemas out into multiple databases.
Capturing backups on databases with a large number of tables and schemas
If you see this error while capturing a backup with Heroku PGBackups, note that this feature only supports moderately loaded databases of up to 20 GB that don't contain a large number of tables, schemas, or large objects.
To work around this limit, you can use
pg_dump with its
--schema and --table options to back up individual schemas or tables (or sets of them) independently, instead of capturing a backup of the whole database at once.
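As an illustration of that approach, the schema and table names below (and the DATABASE_URL variable) are placeholders for your own values:

```shell
# Sketch: back up each schema separately instead of the whole database.
for schema in app_schema_1 app_schema_2; do
  pg_dump "$DATABASE_URL" --format=custom --schema="$schema" \
          --file="backup_${schema}.dump"
done

# Or back up a single large table on its own:
pg_dump "$DATABASE_URL" --format=custom --table=app_schema_1.big_table \
        --file=backup_big_table.dump
```

Each invocation only has to lock the objects in one schema (or one table), keeping it well under the shared lock table limit.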
See also How can I take a logical backup of large Heroku Postgres databases for further guidance on using
pg_dump to capture backups from a one-off dyno. Especially for large databases, we also recommend capturing backups on a fork of your database to avoid impacting your primary database with the backup load.