Training project: scrape data and save it in PostgreSQL.
- Prepare a venv for the project and install the requirements:
$ python3 -m venv venv && source venv/bin/activate
$ pip3 install -r requirements.txt
- First, create the database for the data:
$ psql -c "CREATE DATABASE charity;"
- Download the CSV files using the Python script:
$ python3 csv_files_downloading.py
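The downloading step can be sketched roughly as below, assuming the site serves plain CSV files; the URL, field names, and sample data here are hypothetical placeholders, not taken from the actual script:

```python
import csv
import io
import urllib.request

def download_csv(url: str, dest: str) -> None:
    # Fetch a remote CSV and write it to a local file as-is.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        out.write(resp.read())

# Parsing works the same on any CSV; a tiny inline sample for illustration:
sample = "name,amount\nFoo Fund,100\nBar Trust,250\n"
rows = list(csv.DictReader(io.StringIO(sample)))
print(rows[0]["name"], rows[1]["amount"])  # → Foo Fund 250
```

Keeping the raw CSV on disk (rather than parsing in memory only) is what lets the separate Bash loading step below work on the files directly.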
- Load the data into the database using the Bash script:
$ ./loading_data_to_db.sh charity
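A loading script like this typically wraps psql's client-side `\copy` meta-command. A minimal sketch in Python that builds such an invocation is shown below; the table name `records` and the file path are hypothetical, not taken from the actual script:

```python
import shlex
import subprocess  # needed only if you actually execute the command

def copy_command(db: str, table: str, csv_path: str) -> list:
    # \copy streams the file from the client side, so psql does not
    # need filesystem access on the database server.
    meta = f"\\copy {table} FROM '{csv_path}' WITH (FORMAT csv, HEADER true)"
    return ["psql", "-d", db, "-c", meta]

cmd = copy_command("charity", "records", "data/records.csv")
print(" ".join(shlex.quote(part) for part in cmd))
# To run it for real: subprocess.run(cmd, check=True)
```

`\copy ... WITH (FORMAT csv, HEADER true)` skips the header row, which matters when the downloaded files include column names on the first line.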
- Profit!
- Everything lives in one table because the task was to scrape the data, not to design the database schema; there is certainly room for a properly normalized relational structure.
- Some constants are hard-coded from the website, because scraping is mostly a one-time operation and needs to be done efficiently (nothing here checks whether new data arrives later).