 

Execute Airflow DAG instances (tasks) on a list of specific dates

Tags: python, airflow

I would like to manage a couple of future releases using Apache Airflow. All of these releases are known well in advance, and I need to make sure some data pushing won't be forgotten.

The problem is that those future releases do not follow a simple periodic schedule that could be handled with a classic cron expression like 0 1 23 * * or something like @monthly.

It's rather 2019-08-24, 2019-09-30, 2019-10-20, ...

Is there any other way than creating a separate mydag.py file for each of those future releases? What is the standard / recommended way to do this? Am I thinking about this the wrong way? (I wonder because the documentation and tutorials focus on the regular, periodic case.)

Asked by Matt Bannert

1 Answer

I can think of two simple ways of doing this:

  1. Create 3-4 top-level DAGs, one per release, each with a specific start_date (2019-08-24, 2019-09-30, ...) and schedule_interval='@once' (see the first sketch below).

  2. Create a single top-level DAG with schedule_interval=None (its start_date can be anything). Then create a separate "triggering" DAG that employs TriggerDagRunOperator to conditionally trigger your actual workflow on the specific dates (see the second sketch below).
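
Here is a minimal sketch of method 1, assuming Airflow 1.10-style imports; the dag_id pattern and the DummyOperator placeholder task are made up for illustration:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dummy_operator import DummyOperator

    # Hypothetical release dates taken from the question.
    RELEASE_DATES = ["2019-08-24", "2019-09-30", "2019-10-20"]

    for release_date in RELEASE_DATES:
        dag_id = f"release_push_{release_date.replace('-', '_')}"
        dag = DAG(
            dag_id=dag_id,
            start_date=datetime.strptime(release_date, "%Y-%m-%d"),
            schedule_interval="@once",  # run exactly once, as soon as start_date has passed
        )
        # Placeholder for the real data-pushing task.
        push_data = DummyOperator(task_id="push_data", dag=dag)
        # Expose each DAG at module level so the scheduler's DagBag picks it up.
        globals()[dag_id] = dag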

Clearly, method 2 above is the better option: it avoids creating a separate DAG per release and keeps the list of dates in one place.
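
And a minimal sketch of method 2, under the same Airflow 1.10-style assumptions; the target dag_id "release_push", the daily check cadence, and the ShortCircuitOperator gate are illustrative choices, not the only way to do the conditional triggering:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dagrun_operator import TriggerDagRunOperator
    from airflow.operators.python_operator import ShortCircuitOperator

    # Hypothetical release dates; the actual workflow lives in a separate DAG
    # called "release_push" that has schedule_interval=None.
    RELEASE_DATES = {"2019-08-24", "2019-09-30", "2019-10-20"}

    def _is_release_day(**context):
        # Only let the downstream trigger run on one of the listed dates.
        # NB: with Airflow's scheduling semantics, the run with
        # execution_date 2019-08-24 starts once that day is over.
        return context["execution_date"].strftime("%Y-%m-%d") in RELEASE_DATES

    with DAG(
        dag_id="release_trigger",
        start_date=datetime(2019, 8, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        check_date = ShortCircuitOperator(
            task_id="check_release_date",
            python_callable=_is_release_day,
            provide_context=True,
        )
        trigger = TriggerDagRunOperator(
            task_id="trigger_release_push",
            trigger_dag_id="release_push",
        )
        check_date >> trigger

With this layout, adding a new release only means extending RELEASE_DATES; the actual workflow DAG never has to change.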

Answered by y2k-shubham


