
I have a multi-threaded process in which 36 threads write to the database at random times, each about once every 10 s on average, so each thread is idle (sleeping) about 99% of the time.

I am not sure whether this means I have 36 active connections, or effectively just one or two because of the sleeping in each thread. Probably not relevant, but they all use the same username.

Should I reduce the max_connections option in the config file to 36, or to something small like 4 (reflecting that at any instant in time, almost certainly no more than 4 threads are writing simultaneously)?

Edit: is it possible that it's implementation-dependent, i.e. that whether the connections are dropped while sleeping depends on how I wrote my Python code?


The neatest setup here would be to use PgBouncer for connection pooling: https://pgbouncer.github.io/config.html

default_pool_size = 4 would keep 4 permanent connections to Postgres, pooling your 36 clients so that each is handed one of the four server connections as sessions complete.
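For illustration, a minimal pgbouncer.ini fragment might look like the following; the database name, host, port, and auth file path are placeholders, not values from the question:

```ini
[databases]
; route clients connecting to "mydb" through the local Postgres server
mydb = host=127.0.0.1 port=5432 dbname=mydb

[pgbouncer]
listen_addr = 127.0.0.1
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
; at most 4 server connections per (database, user) pair
default_pool_size = 4
pool_mode = session
```

Your 36 threads would then connect to port 6432 instead of Postgres directly, and PgBouncer keeps the server-side connection count at 4.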

I'm recommending a pooler because whether a connection persists on the server depends on whether you disconnect or not. Also, zombie sessions would keep holding a connection while your code initiates new ones.

In short: to run a query you have to connect to a database as a user. If you run another transaction on the same session, you reuse the connection (if you did not disconnect). You have to explicitly disconnect to close the session; if you fail to do so, the connection stays on the server, occupying one of the max_connections slots.
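This is why the answer to your edit is "yes, it depends on your code". A minimal sketch of the two client patterns, using a hypothetical stand-in for a real driver's connect()/close() (e.g. psycopg2) that only counts open "connections", shows how differently they occupy max_connections slots:

```python
import threading
import time

class SlotCounter:
    """Stand-in for a DB server: counts concurrently open connections."""
    def __init__(self):
        self._lock = threading.Lock()
        self._open = 0
        self.peak = 0

    def connect(self):
        with self._lock:
            self._open += 1
            self.peak = max(self.peak, self._open)

    def close(self):
        with self._lock:
            self._open -= 1

def persistent_worker(slots, writes=3):
    slots.connect()            # connection held while the thread sleeps
    for _ in range(writes):
        time.sleep(0.02)       # "99% idle" between writes
    slots.close()

def per_write_worker(slots, writes=3):
    for _ in range(writes):
        time.sleep(0.02)
        slots.connect()        # open only for the duration of the write
        slots.close()

def run(worker, n_threads=36):
    slots = SlotCounter()
    threads = [threading.Thread(target=worker, args=(slots,))
               for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return slots.peak

print(run(persistent_worker))  # 36: every sleeping thread holds a slot
print(run(per_write_worker))   # usually far fewer slots held at once
```

With persistent connections your sleeping threads really do occupy 36 slots on the server, which is exactly the situation a pooler avoids.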

Also, from 9.6 on we have idle_in_transaction_session_timeout, which terminates a session whose transaction has been idle for longer than the specified period; that helps fight zombies.
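For example, the timeout can be set cluster-wide from a superuser session; the 5-minute value here is just an illustrative choice, not a recommendation for your workload:

```sql
-- terminate any session whose open transaction has been idle > 5 minutes
ALTER SYSTEM SET idle_in_transaction_session_timeout = '5min';
SELECT pg_reload_conf();
```

It can also be set per database, per role, or in postgresql.conf directly.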
