
What is the ideal number of max connections for a postgres database?

I'm currently using the default connection pool in sequelize, which is as follows:

const defaultPoolingConfig = {
  max: 5,            // maximum number of connections in the pool
  min: 0,            // minimum number of connections kept open
  idle: 10000,       // ms a connection may sit idle before being released
  acquire: 10000,    // ms to wait for a free connection before throwing a timeout
  evict: 10000,      // interval (ms) at which idle connections are evicted
  handleDisconnects: true
};

Of late, I'm getting ResourceRequest timed out errors, which are due to the above DB configuration. According to some answers the max pool size should be kept at 5, but those who have faced this resource timeout error suggest increasing the pool size to 30, along with increasing the acquire time.
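
For reference, the change suggested in those answers would look roughly like this (the max value is the one mentioned there; the acquire value is only an illustration, since no exact figure was given):

const suggestedPoolingConfig = {
  max: 30,           // larger pool, as suggested in those answers
  min: 0,
  idle: 10000,
  acquire: 60000,    // longer wait for a free connection (exact value not specified)
  evict: 10000,
  handleDisconnects: true
};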

I need to know what the optimal value of the max pool size is for a web app.

Edit:

  1. Let's say I have 200 concurrent users and 20 concurrent queries. What should the values be in that case?

  2. My database is provided by GCP, with the following configuration: 1 vCPU, 3.75 GB memory, 10 GB SSD storage.

  3. I'm adding some graphs for CPU utilization, read/write operations per second and transactions per second. [Graphs: Transactions per second; CPU utilization; Read/write operations]

  4. My workload resources are as follows:

    resources:
      limits:
        cpu: 500m
        memory: 600Mi
      requests:
        cpu: 200m
        memory: 500Mi

asked Oct 24 '25 by farhan


1 Answer

The number of concurrent connections should be large enough to cover the number of concurrently running queries or transactions you may have.

If you have a lower limit, then new queries/transactions will have to wait for an available connection.

You may want to monitor currently running queries (see pg_stat_activity for instance) to detect such issues.
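
For instance, here is a minimal sketch of such a check from the application side, assuming an existing Sequelize instance named sequelize (the name is only illustrative):

// List currently running queries and how long they have been running,
// using the pg_stat_activity system view.
const { QueryTypes } = require('sequelize');

async function listRunningQueries(sequelize) {
  return sequelize.query(
    `SELECT pid,
            state,
            now() - query_start AS running_for,
            query
       FROM pg_stat_activity
      WHERE state <> 'idle'
      ORDER BY running_for DESC`,
    { type: QueryTypes.SELECT }
  );
}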

However, your database server must be able to handle the number of connections. If you are using a server provided by a third party, it may have set limits. If you are using your own server, then it needs to be configured properly.

Note that to handle more connections, your database server will need more processes and more RAM. Also, if these are long-running queries (as opposed to long-running transactions), then you are most probably resource-constrained on the server (often I/O-bound), and adding more queries running at the same time usually won't help overall performance. You may want to look at the configuration of your DB server (buffers etc.), and of course, if you haven't already done so, optimise your queries (make sure they all use indexes). The other pg_stat_* views and EXPLAIN are your friends here.
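
As a sketch of that last step, still assuming a Sequelize instance named sequelize, you can fetch the plan of a suspect query with EXPLAIN and look for sequential scans on large tables (note that ANALYZE actually executes the query):

// Print the execution plan of a suspect query. The SQL passed in is a
// placeholder for one of your own slow queries.
const { QueryTypes } = require('sequelize');

async function explainQuery(sequelize, sql) {
  const plan = await sequelize.query(`EXPLAIN (ANALYZE, BUFFERS) ${sql}`, {
    type: QueryTypes.SELECT,
  });
  // Each row is one line of the plan; a "Seq Scan" on a large table often
  // means a missing index.
  plan.forEach((row) => console.log(row['QUERY PLAN']));
}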

If you have long-running transactions with lots of idle time, then more concurrent connections may help, though you may have to wonder why you have such long-running transactions.

To summarise, your next steps should be to:

  • Check the immediate state of your database server using pg_stat_activity and friends.

  • If you don't already have that, set up monitoring of I/O, CPU, memory, swap, postgresql statistics over time. This will give you a clearer picture of what is going on on your server. If you don't have that, you're just running blind.

  • If you have long-running transactions, check that you always correctly release transactions/connections, including when errors occur. This is a pretty common issue with Node.js-based web servers. Make sure you use try ... catch blocks wherever needed (see the sketch after this list).

  • If there are any long-running queries, check that they are properly optimised (using indexes). If not, do your utmost to optimise them. This will be the single most useful step you can take if that's where the issue is.

  • If they are properly optimised and you have enough spare resources (RAM, I/O...), then you can consider raising the number of connections. Otherwise it's just pointless.
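
As a sketch of the "release transactions/connections" point above, here is a common shape for it with a Sequelize unmanaged transaction (Sequelize v5+; the Order model is only an example):

async function updateOrder(sequelize, Order, id, changes) {
  const t = await sequelize.transaction();
  try {
    const order = await Order.findByPk(id, { transaction: t });
    await order.update(changes, { transaction: t });
    await t.commit();       // releases the connection back to the pool
    return order;
  } catch (err) {
    await t.rollback();     // without this, the connection stays checked out
    throw err;
  }
}

The managed form, sequelize.transaction(async (t) => { ... }), performs the commit or rollback for you and is harder to get wrong.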

Edit

Since you are not operating the database yourself, you won't necessarily have all the visibility you could have on resource usage.

However, you can still:

  • Check pg_stat_activity. This alone will tell you a lot of things.
  • Check for connections/transactions that are kept around when they shouldn't be
  • Check that your queries are properly optimised

GCP sets a default limit of 100 concurrent connections for instances with 3.75 GiB of RAM, so you could indeed increase the size of your pool. But if any of the above issues are present, you are just delaying or moving the issue a bit further, so start by checking those and fixing them if relevant.
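
If you want to confirm that limit and see how much of it is already in use before raising the pool, here is a quick sketch (same assumptions as the earlier ones):

// Ask the server for its connection limit and the number of open connections,
// so the pool size can be chosen with some headroom.
const { QueryTypes } = require('sequelize');

async function connectionHeadroom(sequelize) {
  const [{ max_connections }] = await sequelize.query(
    "SELECT current_setting('max_connections') AS max_connections",
    { type: QueryTypes.SELECT }
  );
  const [{ open_connections }] = await sequelize.query(
    'SELECT count(*) AS open_connections FROM pg_stat_activity',
    { type: QueryTypes.SELECT }
  );
  return {
    maxConnections: Number(max_connections),
    openConnections: Number(open_connections),
  };
}

Also keep in mind that every instance of your application (each pod, in your case) gets its own pool, so the total number of connections is the pool max multiplied by the number of instances, and that total is what must stay below the server limit.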

answered Oct 26 '25 by jcaron