- Organization: Foster Love Project
- Client Contacts: Kelly Hughes, Elizabeth Baldoni, Krissy Evans
- Original Creators: Alex Bellomo, Max Kornyev, Austin Leung, Lydia Xing, and Sean Zhou
- Student Consultants: Tomas Viejobueno, Harriet Khang, Aashai Avadhani, Yanyu Lin, and Hikma Redi
- The beta deployment is at flp-app.herokuapp.com
- Python 3.6.6
- Django 3.1.7
- See `requirements.txt` for a complete enumeration of package dependencies
The following will set up a Python environment for this project using `virtualenv`. This allows us to keep all your project dependencies (pip modules) in isolation, running their correct versions.
- If you don't have `virtualenv` installed: `pip install virtualenv`
- In the project directory, create the env: `virtualenv djangoEnv` (set djangoEnv to your preferred env name); it is already added to the .gitignore
- Start the env: `source djangoEnv/bin/activate`
- Install all [new] dependencies: `pip install -r requirements.txt`
- Exit the env: `deactivate`, or exit the terminal
- FOR DEVELOPMENT: After installing new Python libraries to your environment, you must update the `requirements.txt` file. Do this by running `pip freeze > requirements.txt` AFTER you have activated the environment.
- If you created some new models: `python manage.py makemigrations inventory && python manage.py migrate`
- If you added new static items like images or CSS files: `python manage.py collectstatic`
- Run the app: `python manage.py runserver`, then visit localhost:8000
python manage.py createsuperuser
- Create an admin username and password
- Log into the admin view at localhost:8000/admin/
python manage.py populate
- Creates sample model data for testing
- Creates a superuser login (<username>:<pass>): admin:admin
- Creates a staff login (<username>:<pass>): staff:staff
python manage.py drop
- Destroys all objects
python manage.py collectstatic
- Updates static items, e.g. when any images or CSS files are added
- Needs to be run to access new static resources
python manage.py trans
- Marks certain items as outdated
- Adds the quantity of the outdated items to the mapped items
- Applies only to recognizable item names that are outdated; these are marked as outdated and their quantities merged into the new items. Some examples of recognizable names are: jacket (kid boy), pajamas/pj's (baby girl), girl accessories 0-6 mo
- Use only when all of the below are true:
  - the database was constructed using the "import" command, with the provided MANAGE_INVENTORY_FILE and MANAGE_ITEMS_FILE in Google Drive
  - the item table in the database includes an "outdated" column
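As a rough illustration, the merge that `trans` performs can be sketched in plain Python. The real command operates on Django model rows, and the name mapping below is hypothetical:

```python
# Hypothetical mapping from outdated item names to their replacements;
# the real mapping lives inside the trans management command.
OUTDATED_TO_NEW = {
    "jacket (kid boy)": "jacket (boy)",
    "pj's (baby girl)": "pajamas (baby girl)",
}

def merge_outdated(items):
    """Mark mapped names as outdated and fold their quantities into the new item.

    `items` maps item name -> {"quantity": int, "outdated": bool}.
    Only recognizable (mapped) names are touched.
    """
    for old_name, new_name in OUTDATED_TO_NEW.items():
        if old_name not in items:
            continue  # only act on recognizable names
        items[old_name]["outdated"] = True
        new_item = items.setdefault(new_name, {"quantity": 0, "outdated": False})
        new_item["quantity"] += items[old_name]["quantity"]
        items[old_name]["quantity"] = 0
    return items
```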
python manage.py quantity
- If any item quantities have gone below 0, sets them to 0
- The issue that causes negative quantities should be fixed, but this script is useful in case the database has a mishap
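The clamping rule can be sketched in plain Python; the real command applies the same rule to rows in the inventory table, so the function and data shape here are illustrative only:

```python
def clamp_quantities(quantities):
    """Reset any negative item quantity to 0; valid counts are left unchanged.

    `quantities` maps item name -> int. The quantity management command
    performs the equivalent update over the inventory table.
    """
    return {name: max(0, qty) for name, qty in quantities.items()}
```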
- Run the suite with `./manage.py test`
You will follow these directions if there is no EC2 instance created with a docker image running:
- To SSH into AWS, you can find our private key file in Google Drive and use the AWS login credentials in the handoff doc to get the public DNS.
- Deployment tutorial: https://stackabuse.com/deploying-django-applications-to-aws-ec2-with-docker. Also, please make sure you turn off debug mode in settings.py, check out our deployment PR to see what changes you have to make, and save the .sqlite3 database file before you run `docker pull` so you don't overwrite FLP's data. (Note: do not follow the `docker run` commands; use `docker-compose build` to build, as this application now runs using a docker-compose file.)
- Make sure that in `deploy`, you have the `db.sqlite3` and `env` files from the Google Drive. Furthermore, add the `client_secrets.json` file, needed for the Export to Google Drive functionality, to the root directory (`/home/ec2-user/github`) of where the repository is stored.
- If you wish to start with a custom or new database, you can follow this step instead of adding the `db.sqlite3` file as described in the step above. Run `docker-compose exec django /bin/bash` to access the docker container. Then run `python manage.py drop`, then `python manage.py import MANAGE_INVENTORY_FILE MANAGE_ITEMS_FILE` (there are examples of these files in the Google Drive).
- Run `email_backups.py &` within the docker container (`docker-compose exec django /bin/bash` needs to be run beforehand) to send database backup emails weekly.
- Make the starter staff superuser and volunteer accounts by using the commands listed above (in the Admin section) within the docker container.
- Finally, run the server after running `docker-compose build` by running `docker-compose up -d`.
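The weekly cadence that `email_backups.py` implements can be approximated by a loop like the sketch below; the helper names and the hourly wake-up interval are assumptions for illustration, not the script's actual code:

```python
import time

WEEK_SECONDS = 7 * 24 * 60 * 60  # one backup email per week

def due_for_backup(last_sent_ts, now_ts, interval=WEEK_SECONDS):
    """True once at least `interval` seconds have elapsed since the last email."""
    return now_ts - last_sent_ts >= interval

def run_forever(send_backup, clock=time.time, sleep=time.sleep):
    """Wake up hourly and email the database backup when a week has passed.

    `send_backup` stands in for whatever attaches db.sqlite3 and sends it.
    """
    last_sent = 0.0
    while True:
        now = clock()
        if due_for_backup(last_sent, now):
            send_backup()
            last_sent = now
        sleep(60 * 60)  # check again in an hour
```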
You will follow these directions if there is an EC2 instance created with a docker image running:
The AWS/EC2 deployment of the FLP Inventory app is managed using Docker Compose.
To update the code/website: check out a new branch or pull the new code that you wish to update the EC2 instance/website with. Then, you will need to build the image and volume mounts with the specified command(s) below. Afterwards, you can start the docker container with the `up` command below. A good resource to see the past history of commands to verify your process is `cat ~/.bash_history`. This will give you a good sense of how past updates have occurred.
Firstly, you will need to ensure that the SQLite database file, `db.sqlite3`, is moved into the `deploy` directory (i.e. `deploy/db.sqlite3`).
Secondly, you will need to copy the `env` file from Google Drive (containing deployment secrets) to the `deploy` directory (i.e. `deploy/env`).
Finally, you will need to copy the `settings.yaml` file from the Google Drive (containing the Google API client ID and secrets) to the root directory (`/home/ec2-user/github`) of the repository (`cd ..` if within the `deploy` directory).
To build the docker image, you should run:
$ docker-compose build
To launch the server, you should run:
$ docker-compose up -d
To shutdown the server, you should run:
$ docker-compose down
To restart an already-running server, you should run:
$ docker-compose restart
To access the docker container directly (useful for seeing paths and if files are being added to the docker container through .dockerignore):
$ docker-compose exec django /bin/bash
To check on the state of the server via its logs, you can run:
$ docker-compose logs
To check on the state of the server via its logs for a single container (generally `django` for stack tracing), you can run:
$ docker-compose logs <container name>
To have the logs outputted in a paginated fashion (generally the `django` container for stack tracing), you can run:
$ docker-compose logs <container name> | less
Note that all of the above commands should be run from the root of the repository (`/home/ec2-user/github`).