How can I resolve the error "ERROR: out of shared memory HINT: You might need to increase max_locks_per_transaction"?


My queries or my Heroku PGBackups (pg_dump) are failing with this "out of shared memory" error, or I need to adjust the max_locks_per_transaction setting.


In PostgreSQL, the shared lock table can track locks on at most max_locks_per_transaction * (max_connections + max_prepared_transactions) objects (e.g., tables), so no more than that many distinct objects can be locked at the same time.

If your database contains a large number of tables and schemas, any query that needs to lock more than max_locks_per_transaction * (max_connections + max_prepared_transactions) objects at once will raise the error:

ERROR: out of shared memory
HINT: You might need to increase max_locks_per_transaction
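As a rough sketch, the lock table capacity can be computed from the three settings above. The values below are PostgreSQL's stock defaults, shown purely for illustration; Heroku Postgres plans use their own plan-dependent values:

```shell
#!/bin/sh
# PostgreSQL's stock defaults, used purely for illustration;
# Heroku Postgres plans ship with their own (plan-dependent) values.
max_locks_per_transaction=64
max_connections=100
max_prepared_transactions=0

# Capacity of the shared lock table: the number of distinct objects
# (tables, indexes, ...) that can be locked simultaneously.
lock_table_slots=$(( max_locks_per_transaction * (max_connections + max_prepared_transactions) ))
echo "$lock_table_slots"   # prints 6400 with these defaults
```

With these defaults, a query (or a pg_dump run) that needs to lock more than 6400 distinct objects at once will hit the error.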

At the moment, these three settings (max_locks_per_transaction, max_connections, and max_prepared_transactions) are set by Heroku Postgres and can't be modified by customers.

While those settings can't be modified directly, moving to a plan with a larger connection limit (max_connections) effectively increases the size of the shared lock table and can resolve the error.

This error usually indicates a very large number of tables and schemas in a PostgreSQL database, which can also cause performance issues with PostgreSQL tooling (such as pg_dump, which inspects every database object when capturing a backup). In this scenario, and for further scalability of your database and application, we recommend considering database sharding and splitting schemas out into multiple databases.

Capturing backups on databases with a large number of tables and schemas

If you see this error while capturing a backup with Heroku PGBackups, note that this feature only supports moderately loaded databases up to 20 GB that don't contain a large number of tables, schemas, or large objects.

To work around this limit, you can use pg_dump with its --schema / --table options to back up individual schemas or tables (or sets of them) independently, instead of dumping the whole database at once.
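For example, the per-schema workaround can be sketched as a loop that runs one pg_dump invocation per schema. The connection string and schema names below are placeholders, and the leading echo keeps this a dry run:

```shell
#!/bin/sh
# Dump each schema into its own custom-format archive instead of
# backing up the whole database at once.
# DATABASE_URL and the schema names below are placeholders.
DATABASE_URL="${DATABASE_URL:-postgres://user:pass@host:5432/dbname}"

for schema in tenant_a tenant_b tenant_c; do
  # "echo" keeps this a dry run; remove it to run pg_dump for real.
  echo pg_dump "$DATABASE_URL" --schema="$schema" \
       --format=custom --file="$schema.dump"
done
```

The --format=custom option produces an archive that pg_restore can restore selectively; the same loop works with --table to split the dump by table instead.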

See also How can I take a logical backup of large Heroku Postgres databases for further guidance on using pg_dump to capture backups from a one-off dyno. For large databases especially, we also recommend capturing backups from a fork of your database, so that the backup load doesn't impact your primary database.
