Airflow DAGs that demonstrate how to interact with the Experimental Airflow REST API.
This DAG queries a small table of dummy data and checks whether the query parameters passed through the DAG run conf match any rows in that table. The DAG contains two Airflow tasks that communicate via XComs: the first runs an input verification function to check that the values passed in conf are valid, and the second performs the actual query on the table (a sketch of what this could look like follows the table below).
The DAG run conf accepts two parameters:

- `query_field` - the column to search from
- `query_value` - the value to search for in `query_field`

The table of dummy data:
| Student | Class | Grade |
|---|---|---|
| Alice | A | B- |
| Bob | A | B+ |
| Connor | B | A- |
| Daniel | B | C+ |
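A minimal sketch of what this two-task DAG could look like. The task ids, the in-memory `TABLE`, and the helper function names here are illustrative assumptions, not necessarily the repo's exact code:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Dummy data mirroring the table above.
TABLE = [
    {"Student": "Alice", "Class": "A", "Grade": "B-"},
    {"Student": "Bob", "Class": "A", "Grade": "B+"},
    {"Student": "Connor", "Class": "B", "Grade": "A-"},
    {"Student": "Daniel", "Class": "B", "Grade": "C+"},
]
VALID_FIELDS = {"Student", "Class", "Grade"}


def verify_input(**context):
    """Check the conf passed via the API; the return value is pushed to XCom."""
    conf = context["dag_run"].conf or {}
    query_field = conf.get("query_field")
    query_value = conf.get("query_value")
    if query_field not in VALID_FIELDS or query_value is None:
        raise ValueError("Invalid query parameters: %r" % conf)
    return {"query_field": query_field, "query_value": query_value}


def run_query(**context):
    """Pull the verified parameters from XCom and search the table."""
    params = context["ti"].xcom_pull(task_ids="verify_input")
    matches = [row for row in TABLE
               if row[params["query_field"]] == params["query_value"]]
    print("Found %d matching row(s): %s" % (len(matches), matches))


with DAG(
    dag_id="api_task_DAG",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,  # only runs when triggered externally
) as dag:
    verify = PythonOperator(
        task_id="verify_input",
        python_callable=verify_input,
        provide_context=True,  # exposes dag_run/ti in the callable's kwargs
    )
    query = PythonOperator(
        task_id="run_query",
        python_callable=run_query,
        provide_context=True,
    )
    verify >> query
```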
This DAG contains code for triggering `api_task_DAG` through the Airflow API. It is for demonstration purposes only: there is little reason to call the Airflow API from a DAG running on the same server, but the code here is nearly identical to how a Python script would call the API from outside.
The code triggers `api_task_DAG` through the API, then repeatedly makes GET requests to the Airflow server to watch the status of the triggered DAG run. Once the run has either succeeded or failed, this DAG logs the result and finishes.
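A rough sketch of that trigger-and-poll logic. The endpoint paths are the experimental REST API's (`POST /api/experimental/dags/<DAG_ID>/dag_runs` and `GET .../dag_runs/<execution_date>`), but the host, credentials, conf values, and polling interval are assumptions for illustration:

```python
import time

import requests

BASE_URL = "http://webserver:8080/api/experimental"  # container name; see note below
AUTH = ("api_user", "password")  # HTTP Basic Auth (demo credentials)

# Pin our own execution_date so we can poll this specific run afterwards.
execution_date = time.strftime("%Y-%m-%dT%H:%M:%S")

# Trigger api_task_DAG, passing the query parameters through conf.
resp = requests.post(
    "%s/dags/api_task_DAG/dag_runs" % BASE_URL,
    json={
        "execution_date": execution_date,
        "conf": {"query_field": "Class", "query_value": "A"},
    },
    auth=AUTH,
)
resp.raise_for_status()

# Poll the run's state until it reaches a terminal state.
while True:
    state = requests.get(
        "%s/dags/api_task_DAG/dag_runs/%s" % (BASE_URL, execution_date),
        auth=AUTH,
    ).json()["state"]
    if state in ("success", "failed"):
        print("DAG run finished with state: %s" % state)
        break
    time.sleep(5)
```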
- This DAG will succeed assuming that the Airflow server that hosts `api_task_DAG` allows API calls, authenticates using `auth_backend`, and has an API user called `api_user` whose password is `password`.
- The authentication method used here, for simplicity's sake, is HTTP Basic Auth, which is not secure on its own: the credentials are passed over the connection unencrypted, so it should not be used when the Airflow server does not use HTTPS.
- Since this DAG runs from within a Dockerized Airflow instance, the URL in the POST request refers to the name of the `webserver` Docker container. The URL will change when calling the API from outside the Dockerized Airflow environment.
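A quick connectivity check along those lines. The base URL and port mapping here are assumptions that depend on your Docker setup; the `/api/experimental/test` endpoint itself is part of the experimental API and should return 200 when `auth_backend` accepts the credentials:

```python
import requests

# From another container on the same Docker network, use the container name;
# from the host machine, something like http://localhost:8080 instead
# (depending on the port mapping in your docker-compose file).
BASE_URL = "http://webserver:8080"

resp = requests.get(
    BASE_URL + "/api/experimental/test",
    auth=("api_user", "password"),  # HTTP Basic Auth; only safe over HTTPS
)
print(resp.status_code, resp.text)
```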