 

Can I run a background process in Django without starting a parallel process?

This is a very naive question, but I feel I don't understand something fundamental about asynchronous/background tasks in Django and Python.

I am trying to replicate a simple example provided by django-background-tasks (https://github.com/collinmutembei/django-background-tasks-example) in order to make Django execute a task 60 seconds after it is scheduled. I guess the same question applies to any other background task manager, such as Celery or Huey.

The example is pretty simple: as soon as the user accesses the URL, a function that logs a message is scheduled to run 60 seconds later, without blocking the main Django process.

  from background_task import background
  from logging import getLogger

  logger = getLogger(__name__)

  @background(schedule=60)
  def demo_task(message):
      logger.debug('demo_task. message={0}'.format(message))

The problem is that I really don't understand the basics. The task doesn't run unless I start a separate (or detached) process with python manage.py process_tasks. Should I always do that to make background tasks work, or is there a way to do it without starting a parallel process?
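(For what it's worth, plain Python can defer work inside the same process with a thread, e.g. threading.Timer. This is not what django-background-tasks does — it persists tasks to the database and relies on the process_tasks worker — but it shows the in-process alternative being asked about; the 60-second delay and message are illustrative:)

```python
import threading

def demo_task(message):
    print('demo_task. message={0}'.format(message))

# Run demo_task ~60 seconds from now on a daemon thread, inside the
# same process -- no separate worker process needed. The trade-off:
# the pending task is lost if the server process restarts.
timer = threading.Timer(60, demo_task, args=('hello',))
timer.daemon = True
timer.start()
```
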

If I do need to start a parallel process, can I do it from inside the Django code? Something like:

    import subprocess

    process = subprocess.Popen(['python', 'manage.py','process_tasks'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Philipp Chapkovski asked Nov 29 '25


1 Answer

It is not strictly necessary, but it is good and helpful to run a separate process for executing background tasks.

When you run the development server, a process is created (run ps aux | grep runserver to see it) which is responsible for serving web requests. When you say you want certain tasks to run in the background, that implicitly means you want a separate process to execute those tasks. This is where asynchronous task tools like Celery come in.
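The separation described above can be sketched in plain Python with the multiprocessing module: one process puts tasks on a queue, and a distinct worker process pulls them off and executes them. This is only an illustration of the concept, not how Celery or process_tasks is implemented:

```python
import multiprocessing

def worker(queue):
    # A separate process that executes tasks pulled from the queue,
    # analogous in spirit to a Celery worker or process_tasks.
    while True:
        task = queue.get()
        if task is None:          # sentinel: shut down cleanly
            break
        func, args = task
        func(*args)

def say(message):
    print(message)

if __name__ == '__main__':
    queue = multiprocessing.Queue()
    proc = multiprocessing.Process(target=worker, args=(queue,))
    proc.start()                  # the "background" process

    # The main (web-serving) process just enqueues work and moves on.
    queue.put((say, ('hello from the worker process',)))
    queue.put(None)
    proc.join()
```
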

You can also spawn a separate process yourself - as you said - by doing:

import subprocess

process = subprocess.Popen(['python', 'manage.py', 'process_tasks'],
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)

This method is completely fine if you have just one or two small tasks that you want to run in parallel. However, when you have many complicated tasks running in the background, you will want to manage them properly, and you need to be able to debug them when something goes wrong. Later you will also need visibility into what is happening in all the background tasks, their status, and so on. This is where Celery helps: it gives you decorated methods that handle all of those things for you, so you only have to worry about your business logic.
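If you do go the Popen route, keep a handle on the spawned process so you can collect its output and check its exit code. A minimal sketch, using a stand-in command where a real project would pass ['python', 'manage.py', 'process_tasks']:

```python
import subprocess
import sys

# Spawn the worker. The command here is a stand-in so the snippet is
# self-contained; substitute your manage.py invocation in practice.
process = subprocess.Popen(
    [sys.executable, '-c', "print('worker started')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)

# Wait for it (with a timeout), then inspect its output and exit code
# instead of letting the PIPE buffers fill up unattended.
stdout, stderr = process.communicate(timeout=30)
print('exit code:', process.returncode)
print('output:', stdout.strip())
```

A long-lived worker would instead be polled with process.poll() and shut down with process.terminate() when the server stops.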

aash answered Dec 01 '25


