

SQL Learning Materials

SQL scripts that demonstrate various features and concepts.

This project contains a collection of SQL learning materials aimed at different levels of experience and covering a variety of topics. It focuses on writing SELECT statements, so there are very few resources for anything else.

Jump into https://bilbottom.github.io/sql-learning-materials/ to see a summary of what's covered in this project, and continue below for instructions on setting up the databases.

Acknowledgements

The data used in this project is from a couple of sources.

The SQL Server instance will load the ubiquitous AdventureWorks databases (the transactional one and the analytical one), which are available from various Microsoft pages:

The PostgreSQL instance will load the similarly ubiquitous Sakila database, which is available from the following GitHub repo:

All credit for the data goes to the respective owners; consult those sources for any documentation you need about the data.

The docs are built using MkDocs and the Material for MkDocs theme, with several plugins to make the experience more interactive. I cannot express the love I have for the maintainers of these tools!

Prerequisites

This project uses Poetry to manage the Python dependencies and Docker to spin up the databases.

To install these, follow the instructions on their websites:

Quick start

After installing the prerequisites and cloning this repo, run the resources package to download the files that feed the SQL Server and PostgreSQL databases, then bring everything up with Docker Compose:

```shell
poetry install --sync  # --with dev,test,docs
python -m resources    # Download/create the database files
docker compose up --detach
mkdocs build
mkdocs serve
docker compose down --volumes  # When you're finished
```

This will take a little while to run since there's a fair bit of data to chunk through.

You can connect to the databases using any of your favourite SQL clients.

The credentials for the databases are not sensitive and are defined in the docker-compose.yml file. For reference, the credentials are:

| Database   | Username   | Password     | Host        | Port |
|------------|------------|--------------|-------------|------|
| SQL Server | `SA`       | `Test@12345` | `localhost` | 1433 |
| PostgreSQL | `postgres` | `Test@12345` | `localhost` | 5432 |
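As an illustration, the credentials above can be assembled into client connection strings. The helper names below are hypothetical, and the choice of ODBC/libpq connection-string formats is an assumption about your client, not a project requirement:

```python
# Hypothetical helpers: build connection strings from the credentials
# above (defined in docker-compose.yml -- not sensitive).

def sqlserver_conn_str(host: str = "localhost", port: int = 1433,
                       user: str = "SA", password: str = "Test@12345") -> str:
    """ODBC-style connection string for the SQL Server container."""
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"  # Driver name is an assumption
        f"SERVER={host},{port};UID={user};PWD={password};"
        "TrustServerCertificate=yes;"  # The container uses a self-signed cert
    )

def postgres_conn_str(host: str = "localhost", port: int = 5432,
                      user: str = "postgres", password: str = "Test@12345") -> str:
    """libpq-style connection string for the PostgreSQL container."""
    return f"host={host} port={port} user={user} password={password}"

print(postgres_conn_str())
# host=localhost port=5432 user=postgres password=Test@12345
```

Any client library that accepts these formats (pyodbc, psycopg2, etc.) should work; pick whichever your SQL client of choice supports.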

The SQLite and DuckDB databases are just files, so using Docker for them would be overkill. When you run `python -m resources`, the files for these databases are created on your file system. The file locations are defined in the `resources.toml` config file (you can override them there if you want) and are:

| Database | File location                          |
|----------|----------------------------------------|
| SQLite   | `src/resources/data/sqlite/loan.db`    |
| DuckDB   | `src/resources/data/duckdb/duckdb.db`  |
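Once the files exist, they can be queried directly. A minimal sketch using Python's built-in `sqlite3` module, with the path taken from the table above (it falls back to an in-memory database so the snippet runs even before `python -m resources` has been executed):

```python
import sqlite3
from pathlib import Path

# Path from resources.toml (see the table above).
db_path = Path("src/resources/data/sqlite/loan.db")

# Fall back to an in-memory database so this sketch is runnable standalone.
conn = sqlite3.connect(db_path if db_path.exists() else ":memory:")

# The project focuses on SELECT statements -- here's the simplest possible one.
result = conn.execute("SELECT 1 + 1").fetchone()[0]
print(result)  # 2
conn.close()
```

DuckDB's Python client exposes a near-identical interface via `duckdb.connect(path)`.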

The Metabase instance will be launched on localhost:3000; you will need to configure your own login details.

On an M1 Mac...

...you will have to make sure that you have enabled the virtualisation framework and Rosetta for amd64 support; see the following GitHub issue and comment:


License: Apache License 2.0

