Run multiple processes in single celery worker on a machine with single CPU

Tags:

python

celery

I am researching Celery as a background worker for my Flask application. The application is hosted on a shared Linux server (I am not entirely sure what this means) on the Linode platform. The description says the server has 1 CPU and 2 GB of RAM. I read that a Celery worker starts worker processes under it, and that their number defaults to the number of cores on the machine, which is 1 in my case.

I will have situations where users ask for multiple background jobs to be run. They would all be placed in a Redis/RabbitMQ queue (not decided yet). So if I start Celery with concurrency greater than 1 (say --concurrency 4), would that be of any use? Or would the extra worker processes be useless in this case, since I have a single CPU? A minimal sketch of the kind of setup I mean is below.
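For reference, here is a rough sketch of the setup in question (the module name, broker URL, and task body are placeholders, not actual application code):

    # tasks.py -- minimal Celery app; broker URL is a placeholder
    from celery import Celery

    app = Celery("tasks", broker="redis://localhost:6379/0")

    @app.task
    def sync_sheet(sheet_id):
        # stand-in for the real Google Sheets / database work
        ...

    # Worker started with, e.g.:
    #   celery -A tasks worker --loglevel=info --concurrency=4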

The tasks would mostly involve reading from and writing to Google Sheets and the application database. These interactions can get heavy at times, taking about 5-15 minutes. Given that, does the answer to the above question change, since there will be stretches where the CPU is not being utilized?

Any help on this would be great, as I don't want one job to keep waiting for the previous one to finish before it can start. Or is the only solution to pay for a better machine?

Thanks

asked Dec 21 '25 by siddharth gupta

1 Answer

This is a common scenario, so do not worry. If your tasks are not CPU-heavy, you can oversubscribe the CPU the way you are planning to. If all they do is I/O, you can pick an even higher number than 4 and it will still work just fine, because the worker processes spend most of their time waiting on the network rather than competing for the CPU.
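For illustration, a sketch of what that might look like on the command line (the tasks module name is carried over from the question and the numbers are arbitrary, not recommendations):

    # I/O-bound tasks: concurrency can safely exceed the core count,
    # since each prefork process mostly waits on network responses.
    celery -A tasks worker --loglevel=info --concurrency=8

    # For purely network-bound work, Celery's eventlet/gevent pools go
    # further, multiplexing many tasks inside a single process:
    #   pip install gevent
    #   celery -A tasks worker --pool=gevent --concurrency=100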

answered Dec 24 '25 by DejanLekic