I am trying to generate a dynamic workflow in Airflow based on user input. I know there is an option to drive it from data in a file or a database, but in those cases the workflow is not directly dependent on user input, and issues can also arise when multiple users share the same DAG. To avoid this, I am thinking of passing the user input to a SubDAG and generating the workflow from there. However, a SubDAG has no option for receiving user input from the UI.
I guess using Variables is a good solution for this problem, BUT users may overwrite each other's changes, so issues can occur.
Alternative 1:
Airflow exposes a REST API that supports triggering DAGs.
Request example:
curl -X POST \
'http://localhost:8080/api/experimental/dags/<DAG_ID>/dag_runs' \
--header 'Cache-Control: no-cache' \
--header 'Content-Type: application/json' \
--data '{"conf":"{\"key\":\"value\"}"}'
The conf field in the data section can carry user input, which can later be accessed in Airflow operators.
More documentation (note that the example above uses the experimental API from Airflow 1.10.x; Airflow 2 replaces it with the stable REST API): https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html
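The same call can be made from Python. Below is a minimal sketch using the requests library, assuming Airflow 1.10.x with the experimental API enabled and a hypothetical DAG named my_dag:

import json

import requests

# Trigger my_dag, passing user input in "conf".
# The experimental API expects "conf" as a JSON-encoded string.
response = requests.post(
    "http://localhost:8080/api/experimental/dags/my_dag/dag_runs",
    headers={"Cache-Control": "no-cache", "Content-Type": "application/json"},
    data=json.dumps({"conf": json.dumps({"key": "value"})}),
)
response.raise_for_status()
print(response.json())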
Alternative 2:
Airflow also has a CLI that can be used to trigger DAGs. You can pass extra configuration with the -c option, and that configuration can carry the user input.
Command format:
airflow trigger_dag [-h] [-sd SUBDIR] [-r RUN_ID] [-c CONF] [-e EXEC_DATE] dag_id
More documentation: http://airflow.apache.org/docs/apache-airflow/1.10.5/cli.html#trigger_dag
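For example, to trigger a hypothetical DAG named my_dag with user input passed as JSON:

airflow trigger_dag my_dag -c '{"key": "value"}'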
A Stack Overflow question that shows how configuration parameters can be accessed in Airflow operators: Accessing configuration parameters passed to Airflow through CLI. A short sketch follows below.
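As a minimal sketch (reusing the hypothetical my_dag and key from the examples above), the passed configuration is available through dag_run.conf in the task context:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def print_user_input(**context):
    # dag_run.conf holds the JSON passed via the REST API or the -c CLI option.
    user_input = context["dag_run"].conf.get("key")
    print("User input: %s" % user_input)


with DAG("my_dag", start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
    PythonOperator(
        task_id="print_user_input",
        python_callable=print_user_input,
        provide_context=True,  # required on Airflow 1.10.x
    )

The same value can also be read in templated fields, e.g. {{ dag_run.conf["key"] }} in a BashOperator's bash_command.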