I am using aredis to handle the connection pool. This is how I instantiate the Redis connection in the main function:
redis_conn = await asyncio.ensure_future(get_redis_conn(redis_host, loop))
This is the coroutine definition:
async def get_redis_conn(host, loop):
return StrictRedisCluster(startup_nodes=host, decode_responses=True, loop=loop, max_connections=96)
I am using Sanic to run the web server. This is how I start it:
app.run(host='0.0.0.0', port=port, after_start=after_start, workers=32)
Is my implementation wrong in some way? I can't figure out how aredis reuses these connections.
A Redis cluster allows roughly 10000 - 32 open connections per node (the default maxclients of 10000, less the file descriptors Redis reserves for internal use). So if you have 10 application servers, each server cannot open more than about 1000 connections. Going one step further, if each server runs 50 workers, then max_connections in the Redis client initialisation should not be more than 1000 / 50 = 20 per worker. Try reducing the max connections per worker; this worked out perfectly for me.
E.g.:
StrictRedisCluster(startup_nodes=host, decode_responses=True, loop=loop, max_connections=35)
So you need to reduce this max_connections limit per worker.
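If it helps, here is a rough sketch of the budget arithmetic described above. The specific numbers (the 10000-client default, the number of application servers, and the worker count taken from workers=32 in app.run) are assumptions for illustration; substitute your own deployment values.

# Sketch of the connection-budget arithmetic; all numbers are assumptions.
REDIS_MAXCLIENTS = 10000 - 32   # default maxclients minus descriptors Redis reserves
NUM_APP_SERVERS = 10            # machines that talk to the cluster (assumed)
WORKERS_PER_SERVER = 32         # e.g. the workers=32 passed to app.run()

# Connection budget available to one application server.
per_server_budget = REDIS_MAXCLIENTS // NUM_APP_SERVERS

# Connection budget available to one worker process on that server;
# this is roughly the ceiling for max_connections in StrictRedisCluster.
max_connections_per_worker = per_server_budget // WORKERS_PER_SERVER

print(per_server_budget, max_connections_per_worker)  # 996, 31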