# Airflow Datastore Examples
This repository provides example usage of the Airflow Google Cloud Datastore export operator. The Medium article *Perform Daily Backups of Your Google Cloud Datastore using Airflow* describes this code with some visuals.
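At its core, a Datastore backup DAG asks the Datastore export API to write entities to a dated Cloud Storage folder. As a rough, hypothetical sketch of the kind of request body such a task assembles (the bucket name and `backups/<date>` path layout are illustrative assumptions, not this repo's actual values):

```python
from datetime import date


def export_request_body(bucket: str, execution_date: date) -> dict:
    """Build a Datastore export request targeting a dated GCS folder.

    `outputUrlPrefix` and `entityFilter` are the fields the Datastore
    export REST API expects; the path layout below is an assumption.
    """
    output_url = f"gs://{bucket}/backups/{execution_date.isoformat()}"
    return {
        "outputUrlPrefix": output_url,  # where Datastore writes the export files
        "entityFilter": {},             # empty filter exports all kinds/namespaces
    }


# Using the DAG's execution date keeps each day's backup in its own folder.
body = export_request_body("my-backup-bucket", date(2019, 1, 1))
```

Keying the output path on the execution date means reruns of a given day overwrite that day's backup rather than piling up duplicates.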
## Usage
This assumes you already have Airflow running.
### (Optional) Initial Local Tests
These steps are optional, but they help ensure a healthy DAG before launching it. Run them on a local Airflow instance before saving the DAG to your final deployed one.
- Install Airflow locally:

  ```shell
  pip install -r requirements.txt
  ```

  This gives you access to `airflow` commands in later steps.
- Copy the contents of `dags` into your local Airflow `dags` directory.
- Confirm the DAG compiles with:

  ```shell
  python dags/dag_datastore_backup.py
  ```

- Confirm it can be bagged:

  ```shell
  airflow initdb
  ```

- Confirm the task can be run:

  ```shell
  airflow test dag_datastore_backup datastore_export 2019-01-01
  ```

  - This will only work if you have connection IDs and credentials set up in your local Airflow installation. However, you may still be able to see it run some of the DAG.
## Installation and Running
Similar to other Airflow DAGs, just copy the contents of `dags` into your deployed Airflow `dags` directory.
### (Optional) Install Slack Webhooks
- Uncomment the noted lines in `dag_datastore_backup.py`
- Copy `slack_operator.py` into your DAGs directory, found here
- Follow the `README` found in that link for installing Slack Alerts
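For context on what the Slack integration ultimately does: Slack incoming webhooks accept a simple JSON POST whose `text` field becomes the message. A minimal sketch of the alert payload a failure callback might send (the message format and emoji are illustrative assumptions, not this repo's exact output):

```python
def slack_failure_payload(dag_id: str, task_id: str, ds: str) -> dict:
    """Build the JSON body for a Slack incoming-webhook failure alert.

    Slack's incoming webhooks render the `text` field as the message;
    the wording below is an assumed format for illustration.
    """
    return {
        "text": (
            f":red_circle: Task failed. "
            f"DAG: {dag_id}, task: {task_id}, execution date: {ds}"
        )
    }


# The callback would POST this dict as JSON to the configured webhook URL.
payload = slack_failure_payload("dag_datastore_backup", "datastore_export", "2019-01-01")
```

In a real DAG this payload is sent from an `on_failure_callback`, so a broken backup pings the channel instead of failing silently.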