Apache Airflow - How to send arguments to a Python script

Tags:

python

airflow

In Admin -> Connections I set up a connection with Conn Type S3.

Basically, I have this code in my Python script:

if __name__ == '__main__':
    AWS_ACCESS_KEY_ID = "..."
    AWS_SECRET_ACCESS_KEY = "..."
    AWS_DEFAULT_REGION = "..."
    Start_Work()  # the part of the script that actually uses these credentials

What I want to do is call my script from Airflow and pass it the connection arguments (instead of hard-coding them in the script).

How do I do that?

Let's assume that this is the connection:

[screenshot of the Airflow connection settings]
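Since the screenshot is not available here, a minimal sketch of what such a connection might look like if created programmatically, assuming the credentials are stored in the connection's Extra field as JSON (the conn_id 'my_s3_conn' is hypothetical):

import json
from airflow import settings
from airflow.models import Connection

# Hypothetical S3 connection; the Extra keys mirror what the script expects.
conn = Connection(
    conn_id='my_s3_conn',  # hypothetical name
    conn_type='s3',
    extra=json.dumps({
        'AWS_ACCESS_KEY_ID': '...',
        'AWS_SECRET_ACCESS_KEY': '...',
        'DEFAULT_REGION': '...',
    }),
)

# Persist the connection in the Airflow metadata database
session = settings.Session()
session.add(conn)
session.commit()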

How do I access each field's data?

asked Nov 25 '25 by jack

1 Answer

One thing you can do is import the provide_session utility and use it to retrieve the connection based on its conn_id. You can then use those values inside the function you hand to the PythonOperator.

So it would look something like this:

from airflow.models import Connection
from airflow.operators.python_operator import PythonOperator
from airflow.utils.db import provide_session

@provide_session
def get_conn(conn_id, session=None):
    # provide_session injects a SQLAlchemy session bound to the metadata DB
    conn = (session.query(Connection)
                   .filter(Connection.conn_id == conn_id)
                   .first())
    return conn

def my_python_function():
    conn = get_conn('connection_id')

    # The credentials live in the connection's Extra field (JSON)
    key_id = conn.extra_dejson.get('AWS_ACCESS_KEY_ID')
    secret_key = conn.extra_dejson.get('AWS_SECRET_ACCESS_KEY')
    default_region = conn.extra_dejson.get('DEFAULT_REGION')

task1 = PythonOperator(task_id='my_task', python_callable=my_python_function, dag=dag)

EDIT: Removed quotes from python callable
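A follow-up note (not part of the original answer): if you would rather pass the conn_id in from the operator instead of hard-coding it inside the callable, PythonOperator accepts an op_kwargs dictionary; the conn_id 'my_s3_conn' below is hypothetical:

def my_python_function(conn_id):
    # Same lookup as above, but the conn_id now arrives as an argument
    conn = get_conn(conn_id)
    key_id = conn.extra_dejson.get('AWS_ACCESS_KEY_ID')
    secret_key = conn.extra_dejson.get('AWS_SECRET_ACCESS_KEY')
    default_region = conn.extra_dejson.get('DEFAULT_REGION')

task1 = PythonOperator(
    task_id='my_task',
    python_callable=my_python_function,
    op_kwargs={'conn_id': 'my_s3_conn'},  # hypothetical conn_id
    dag=dag,
)

Airflow also ships a BaseHook.get_connection helper (from airflow.hooks.base_hook import BaseHook) that performs the same lookup without writing the session query yourself.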

answered Nov 28 '25 by Joshua Bonomini


