odaneau-astro / query-tag-sandbox

Simple repo to test adding Query Tags to Snowflake & SQL Operators

Overview

Sandbox repo to test using hook_params to pass query tags to Snowflake for all BaseSQLOperator-related Operators.
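
As a minimal sketch of the idea (the DAG, task, and tag contents below are illustrative assumptions, not copied from this repo), hook_params can carry Snowflake session parameters like this:

```python
from pendulum import datetime

from airflow.decorators import dag
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def query_tag_sketch():
    # hook_params is forwarded to the underlying SnowflakeHook, which sends
    # session_parameters (including QUERY_TAG) to Snowflake when it connects.
    SQLExecuteQueryOperator(
        task_id="tagged_query",
        conn_id="snowflake_admin",
        sql="SELECT 1;",
        hook_params={
            "session_parameters": {
                "QUERY_TAG": "dag_id=query_tag_sketch,task_id=tagged_query"
            }
        },
    )


query_tag_sketch()
```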

To configure the Snowflake connection, you can rename .example_env to .env and provide your credentials in the AIRFLOW_CONN_SNOWFLAKE_ADMIN environment variable.
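
For example, a hypothetical .env entry might look like the following; every value here is a placeholder, and the exact URI fields depend on your Snowflake setup and provider version:

```
AIRFLOW_CONN_SNOWFLAKE_ADMIN='snowflake://MY_USER:MY_PASSWORD@/MY_SCHEMA?account=my_account&database=MY_DB&warehouse=MY_WH&role=MY_ROLE'
```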

AIRFLOW__LOGGING__LOGGING_LEVEL is set to DEBUG. This lets you see the SESSION_PARAMETERS printed in the task logs.

The test_hook_params.py file contains a DAG that tests all of the BaseSQLOperator-related operators. The gusty_dag folder contains a DAG for testing with gusty.

You can customize which Airflow templated variables are passed to Snowflake in the include/utils.py file.
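
The exact contents of include/utils.py are specific to this repo; the sketch below only illustrates the general shape (the names and the chosen variables are hypothetical):

```python
# Hypothetical sketch; the real include/utils.py may look different.
# Maps query-tag keys to Airflow templated variables, which Airflow renders
# at runtime (assuming the field they land in is Jinja-templated).
QUERY_TAG_FIELDS = {
    "dag_id": "{{ dag.dag_id }}",
    "task_id": "{{ task.task_id }}",
    "run_id": "{{ run_id }}",
    "logical_date": "{{ ts }}",
}


def build_hook_params() -> dict:
    """Assemble hook_params that pass the selected variables to Snowflake as QUERY_TAG."""
    tag = ",".join(f"{key}={value}" for key, value in QUERY_TAG_FIELDS.items())
    return {"session_parameters": {"QUERY_TAG": tag}}
```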

Project Contents

Your Astro project contains the following files and folders:

  • dags: This folder contains the Python files for your Airflow DAGs. By default, this directory includes two example DAGs:
    • example_dag_basic: This DAG shows a simple ETL data pipeline example with three TaskFlow API tasks that run daily.
    • example_dag_advanced: This advanced DAG showcases a variety of Airflow features like branching, Jinja templates, task groups and several Airflow operators.
  • Dockerfile: This file contains a versioned Astro Runtime Docker image that provides a differentiated Airflow experience. If you want to execute other commands or overrides at runtime, specify them here.
  • include: This folder contains any additional files that you want to include as part of your project. It is empty by default.
  • packages.txt: Install OS-level packages needed for your project by adding them to this file. It is empty by default.
  • requirements.txt: Install Python packages needed for your project by adding them to this file. It is empty by default; see the example after this list for packages this project would typically need.
  • plugins: Add custom or community plugins for your project to this folder. It is empty by default.
  • airflow_settings.yaml: Use this local-only file to specify Airflow Connections, Variables, and Pools instead of entering them in the Airflow UI as you develop DAGs in this project.
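
Since this project uses the Snowflake provider and gusty, requirements.txt would typically include at least the following (unpinned here; pin versions as needed):

```
apache-airflow-providers-snowflake
gusty
```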

Deploy Your Project Locally

  1. Start Airflow on your local machine by running 'astro dev start'.

This command will spin up 4 Docker containers on your machine, each for a different Airflow component:

  • Postgres: Airflow's Metadata Database
  • Webserver: The Airflow component responsible for rendering the Airflow UI
  • Scheduler: The Airflow component responsible for monitoring and triggering tasks
  • Triggerer: The Airflow component responsible for triggering deferred tasks
  2. Verify that all 4 Docker containers were created by running 'docker ps'.

Note: Running 'astro dev start' will start your project with the Airflow Webserver exposed at port 8080 and Postgres exposed at port 5432. If you already have either of those ports allocated, you can either stop your existing Docker containers or change the port.
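
If you need different ports, the Astro CLI can remap them in its project config, for example (the port numbers below are arbitrary):

```
astro config set webserver.port 8081
astro config set postgres.port 5435
```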

  3. Access the Airflow UI for your local Airflow project. To do so, go to http://localhost:8080/ and log in with 'admin' for both your Username and Password.

You should also be able to access your Postgres Database at 'localhost:5432/postgres'.

Deploy Your Project to Astronomer

If you have an Astronomer account, pushing code to a Deployment on Astronomer is simple. For deployment instructions, refer to the Astronomer documentation: https://docs.astronomer.io/cloud/deploy-code/

Contact

The Astronomer CLI is maintained with love by the Astronomer team. To report a bug or suggest a change, reach out to our support.
