
celery periodic task that executes other celery tasks not working

I have an API which returns a list of other APIs.

I need to access these APIs every 15 minutes and put the returned data into a database.

The following is what I wrote in my celery_worker.py file using Celery and Redis, but none of the tasks start.

list_of_APIs = requests.get(the_api_that_returns_list_of_APIs).json()

CELERYBEAT_SCHEDULE = {
    'every-15-minute': {
        'task': 'fetch_data_of_all_APIs',
        'schedule': timedelta(minutes=15),
    },
}

@celery.task
def access_one_API(one_API):
    return requests.get(one_API).json()

@celery.task(name='fetch_data_of_all_APIs')
def fetch_data_of_all_APIs():
    for one_API in list_of_APIs:
        task = access_one_API.delay(one_API)
        # some code to put all task.id into a list_of_task_id

    for task_id in list_of_task_id:
        # some code to get the results of all tasks
        # some code to put all the results into a database

The fetch_data_of_all_APIs function should run every 15 minutes and is supposed to use multiple workers to run the access_one_API function.

The Celery server starts successfully in the terminal, but neither fetch_data_of_all_APIs nor access_one_API ever runs.

If I pull the code out of the fetch_data_of_all_APIs function, access_one_API starts and is executed by multiple Celery workers. But as soon as I put that code inside a function and decorate it with @celery.task, neither function starts.

So I believe it must have something to do with Celery.

Many thanks in advance.

Asked by Yuxiang Zhu

1 Answer

Here is an example of how to configure periodic tasks with subtasks in Celery (I set the interval to 20 seconds for demonstration purposes). tasks.py:

import celery
from celery.canvas import subtask
from celery.result import AsyncResult

# just an example list of integer values
list_of_APIs = [1, 2, 3, 4]

app = celery.Celery(
    'tasks',
    broker='redis://localhost:6379/0',
    backend='redis://localhost:6379/0',
)


@app.task(name='access_one_API')
def access_one_API(api):
    """
    Dummy subtask for demonstration: just doubles the value.
    :param int api:
    :return: int
    """
    return api + api


@app.task(name='fetch_data_of_all_APIs')
def fetch_data_of_all_APIs(list_of_APIs):
    list_task_ids = []

    for api in list_of_APIs:
        # launch a celery subtask for each item and collect the subtask ids
        task_id = subtask('access_one_API', args=(api, )).apply_async().id
        list_task_ids.append(task_id)

    result_sub_tasks = {}

    for task_id in list_task_ids:
        # NOTE: this busy-waits and assumes every subtask succeeds;
        # it is only meant as a simple demonstration
        while True:
            task_result = AsyncResult(task_id)
            if task_result.status == 'SUCCESS':
                # when a subtask has finished, store its result and
                # move on to the next subtask
                result_sub_tasks[task_id] = task_result.result
                break

    print(result_sub_tasks)
    # do something with the results of the subtasks here...


app.conf.beat_schedule = {
    'add-every-20-seconds': {
        'task': 'fetch_data_of_all_APIs',
        'schedule': 20.0,
        # args for fetch_data_of_all_APIs
        'args': (list_of_APIs, ),
    },
}
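
Note that the schedule here is attached directly to the app via app.conf.beat_schedule. Simply defining a CELERYBEAT_SCHEDULE dict at module level, as in the question, has no effect unless that configuration is actually loaded into the Celery app, for example with something like the following (celeryconfig is a hypothetical name for a settings module):

app.config_from_object('celeryconfig')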

Run celery: celery worker -A tasks.app --loglevel=info --beat
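
On newer Celery releases the command line changed slightly: with Celery 5.x the -A option has to come before the sub-command, so the roughly equivalent invocation is celery -A tasks.app worker --beat --loglevel=info. Embedding beat in a worker with --beat is fine for a demo, but for production the Celery docs recommend running beat as its own process (celery -A tasks.app beat) alongside one or more workers.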

Trace from terminal:

[2017-03-14 10:31:36,361: WARNING/PoolWorker-3] {'929996b3-fc86-4274-b3c3-06c38a6d4edd': 6, 'f44456b4-df93-4a78-9f1d-b2c2d2b05322': 4, '4e44fd57-fbbc-43cd-8616-1eafef559417': 8, '6d943f35-0d74-4319-aa02-30a266aa3cd9': 2}
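
If you would rather not poll AsyncResult in a loop, Celery's chord primitive can fan the subtasks out and pass all of their results to a callback once every subtask has finished. Below is a minimal sketch of how the original use case (fetching a list of API URLs and storing the responses) might look with a chord; the_api_that_returns_list_of_APIs is the placeholder URL from the question, and store_results with its database logic is a hypothetical stand-in:

import requests
from celery import chord


@app.task
def access_one_API(url):
    # fetch one API and return its JSON payload
    return requests.get(url, timeout=10).json()


@app.task
def store_results(results):
    # 'results' is the list of JSON payloads returned by all subtasks;
    # replace the print with your actual database insert
    print(results)


@app.task(name='fetch_data_of_all_APIs')
def fetch_data_of_all_APIs():
    # fetch the list of API URLs inside the task, not at import time
    list_of_APIs = requests.get(the_api_that_returns_list_of_APIs, timeout=10).json()
    # run one subtask per URL in parallel and call store_results with all the results
    chord(access_one_API.s(url) for url in list_of_APIs)(store_results.s())

If you schedule this version, drop the 'args' entry from the beat schedule, since the task now fetches the list of APIs itself.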

Hope this helps.

Answered by Danila Ganchar