This is my final project for the She Code Africa Cloud School (Cohort 2).

This application is a face-detection app based on the AI/ML Clarifai API. The application allows users to upload images of audiences at an event/meeting and record the audience count.
This application is a three-tier application, with the frontend and the backend split into two repositories:
- The frontend lives in its own repository, while
- The backend is in this current repository.

The backend is made up of the API and the database.
### API

The API collection is documented here.

The API routes are in the `routes.js` file, with the route handlers in the `/controllers` directory.

The server entrypoint is `index.js`, with the server setup in `server.js`.

Configure environment variables in `.env` files and `config.js`, with different `.env` files created for the dev and testing environments, using the dotenv-flow package.
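As a sketch of this setup (the variable names and defaults below are illustrative assumptions, not the project's actual values), `config.js` might look like:

```javascript
// Sketch of a config.js that centralizes environment variables.
// dotenv-flow loads the matching .env file (.env.development, .env.test)
// based on NODE_ENV before this module is required:
// require('dotenv-flow').config();  // uncomment once the package is installed

const config = {
  env: process.env.NODE_ENV || 'development',
  port: Number(process.env.PORT) || 3000,
  db: {
    host: process.env.DB_HOST || 'localhost',
    port: Number(process.env.DB_PORT) || 5432,
    user: process.env.DB_USER || 'postgres',
    name: process.env.DB_NAME || 'facedetection', // placeholder database name
  },
};

module.exports = config;
```

The rest of the codebase then reads settings from this one module rather than touching `process.env` directly.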
### Database - PostgreSQL

The database is made up of three tables - `users`, `login` and `meetings`. SQL scripts to create the database tables and populate them with dummy data are in the `/postgres` directory.
```
          List of relations
 Schema |   Name   | Type
--------+----------+-------
 public | login    | table
 public | meetings | table
 public | users    | table

SELECT * FROM users;
 id | name | email | department | title | joined
----+------+-------+------------+-------+--------

SELECT * FROM meetings;
 id | event_name | no_of_people | location | date_recorded | user_id
----+------------+--------------+----------+---------------+---------

SELECT * FROM login;
 id | hash | email
----+------+-------
```
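The `/postgres` scripts are not reproduced here, but a minimal sketch of the schema implied by the columns above (the types and constraints are assumptions) could be:

```sql
-- Sketch only: column types and constraints are assumed from the output above.
CREATE TABLE users (
    id serial PRIMARY KEY,
    name varchar(100),
    email text UNIQUE NOT NULL,
    department varchar(100),
    title varchar(100),
    joined timestamp NOT NULL
);

CREATE TABLE login (
    id serial PRIMARY KEY,
    hash varchar(100) NOT NULL,
    email text UNIQUE NOT NULL
);

CREATE TABLE meetings (
    id serial PRIMARY KEY,
    event_name varchar(100),
    no_of_people integer,
    location varchar(100),
    date_recorded timestamp,
    user_id integer REFERENCES users(id)  -- ties a recorded count to the user
);
```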
In the project directory, run:

```
docker-compose -f docker-compose.yml up
```

to spin up a development environment (add the `--build` flag when you run the command for the first time or when you add new dependencies). This will create an API container and a database container.
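A minimal sketch of what such a `docker-compose.yml` could contain (the service names, ports and credentials here are illustrative assumptions):

```yaml
# Illustrative sketch only - service names, ports and credentials are assumptions.
version: "3"
services:
  api:
    build: .
    ports:
      - "3000:3000"
    environment:
      NODE_ENV: development
      DB_HOST: postgres   # the database is reachable by its service name
    depends_on:
      - postgres
  postgres:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example
    ports:
      - "5432:5432"
```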
Integration tests are written using Jest, Chai and Supertest, within the `/__tests__` directory.
In the project directory, run:

```
npm run test-script
```

to launch a test environment and avoid testing against the actual production database. This runs a bash script defined in `/bin/test.sh`.
Run

```
docker-compose -f docker-compose-prod.yml build
```

to create a production-ready image for the API.

Use a managed database service in production.

Deploy the frontend and API on the same GKE cluster while using the GCP-managed Cloud SQL database service.
### Frontend deployment (`react_deployment.yml`)

- Deploy the React frontend with a Load Balancer service to make it accessible over the public internet.
- Configure Nginx as a reverse proxy to direct traffic to the API.
- Enable zero-downtime deployment with a rolling update strategy in Kubernetes.
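An illustrative fragment of what that deployment could look like (the names, replica counts and image are assumptions, not the actual `react_deployment.yml`):

```yaml
# Illustrative fragment only - names, counts and image are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: react-frontend
spec:
  replicas: 3
  selector:
    matchLabels:
      app: react-frontend
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxSurge: 1
      maxUnavailable: 0   # zero downtime: old pods stay up until new ones are ready
  template:
    metadata:
      labels:
        app: react-frontend
    spec:
      containers:
        - name: frontend
          image: <dockerhub-user>/react-frontend:latest
          ports:
            - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: react-frontend-svc
spec:
  type: LoadBalancer    # exposes the frontend over the public internet
  selector:
    app: react-frontend
  ports:
    - port: 80
      targetPort: 80
```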
### API deployment

- Create Secrets for database credentials and service accounts for IAM authentication to Cloud SQL.
- Deploy the API with a ClusterIP service, which ensures it is not accessible over the internet.
- Add a service account with the Cloud SQL Client IAM role.
- The `service-acc-key.yml` file holds the service account credentials required for the GKE cluster to access the Cloud SQL database. Deploy this file before `api_deployment.yml`.
- The `api_deployment.yml` deployment file contains the Kubernetes objects - Secrets, Service and Deployment - required for this application, grouped in one file for easy readability. The order of deployment is Secrets, Service and then Deployment.
- Enable zero-downtime deployment with a rolling update strategy in Kubernetes.
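A fragment illustrating the ClusterIP service and the Secret-backed database credentials (object names, ports and keys are assumptions, not the actual `api_deployment.yml`):

```yaml
# Illustrative fragment only - names, ports and secret keys are assumptions.
apiVersion: v1
kind: Service
metadata:
  name: api-svc
spec:
  type: ClusterIP       # reachable only from inside the cluster
  selector:
    app: api
  ports:
    - port: 3000
      targetPort: 3000
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api
spec:
  replicas: 2
  selector:
    matchLabels:
      app: api
  strategy:
    type: RollingUpdate
  template:
    metadata:
      labels:
        app: api
    spec:
      containers:
        - name: api
          image: <dockerhub-user>/api:latest
          env:
            - name: DB_PASSWORD
              valueFrom:
                secretKeyRef:       # pulled from the Secrets object
                  name: db-credentials
                  key: password
```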
### Database deployment

- Set up Postgres on a Cloud SQL instance.
- Use the Cloud SQL Auth Proxy for secure access to the Cloud SQL instance without the need for authorized networks or for configuring SSL.
- Set up the Cloud SQL Auth Proxy as a 'sidecar', running as a container within the pod that runs the API container. Mount the service account secret as a volume on the Cloud SQL Auth Proxy container.
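The sidecar pattern can be sketched as the following pod-spec fragment (the instance connection name, secret name and mount paths are assumptions):

```yaml
# Illustrative sidecar fragment - instance connection name and paths are assumptions.
spec:
  containers:
    - name: api
      image: <dockerhub-user>/api:latest
      env:
        - name: DB_HOST
          value: "127.0.0.1"   # the proxy listens on localhost inside the pod
    - name: cloud-sql-proxy
      image: gcr.io/cloudsql-docker/gce-proxy:1.19.1
      command:
        - "/cloud_sql_proxy"
        - "-instances=<project>:<region>:<instance>=tcp:5432"
        - "-credential_file=/secrets/service-account/key.json"
      volumeMounts:
        - name: service-account-key
          mountPath: /secrets/service-account
          readOnly: true
  volumes:
    - name: service-account-key
      secret:
        secretName: service-acc-key   # the mounted service account secret
```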
Provision infrastructure in GCP using Terraform. The infrastructure for this project includes:
- VPC network
- Firewall rules
- Google Compute Engine VM instances
- Google Kubernetes Engine Cluster
- Cloud SQL Instance for Postgres
- Remote backend
Check out the infrastructure-as-code configuration for this project in this repo.
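The pieces listed above can be sketched in Terraform roughly as follows (the project ID, names, regions and tiers are all assumptions; the real configuration lives in the linked repo):

```hcl
# Illustrative sketch only - project, names, regions and tiers are assumptions.
terraform {
  backend "gcs" {
    bucket = "<tf-state-bucket>"   # remote backend for shared state
    prefix = "terraform/state"
  }
}

provider "google" {
  project = "<project-id>"
  region  = "us-central1"
}

resource "google_compute_network" "vpc" {
  name                    = "app-vpc"
  auto_create_subnetworks = true
}

resource "google_container_cluster" "gke" {
  name               = "app-cluster"
  location           = "us-central1-a"
  initial_node_count = 2
  network            = google_compute_network.vpc.id
}

resource "google_sql_database_instance" "postgres" {
  name             = "app-postgres"
  database_version = "POSTGRES_13"
  region           = "us-central1"

  settings {
    tier = "db-f1-micro"
  }
}
```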
Install Ansible on one of the two VM instances provisioned with Terraform. Then, install and configure Jenkins for this project using Ansible.

To install Ansible, follow these steps:
- Access the Ansible VM instance using SSH: `gcloud compute ssh <ansible-server-name>`
- Generate SSH keys: `ssh-keygen`
- Copy the public key: `sudo cat ~/.ssh/id_rsa.pub`
- Access the Jenkins VM instance using SSH.
- Paste the public key into the `~/.ssh/authorized_keys` file: `sudo vi /home/<username>/.ssh/authorized_keys`
- Confirm the connection between the two instances: `ssh <jenkins-instance-ip-address>`
- Run the following commands to install Ansible:
  - Update the repository by including the official project's PPA, then refresh the package manager:

    ```
    sudo apt-get update
    sudo apt-get install software-properties-common
    sudo apt-add-repository -y ppa:ansible/ansible
    sudo apt-get update
    ```

  - Install Ansible (and Python):

    ```
    sudo apt-get install -y ansible
    sudo apt install python-pip -y
    ```

  - Install the Boto framework:

    ```
    sudo pip install boto boto3
    sudo apt-get install python-boto -y
    ```

  - Check that Ansible is installed: `ansible --version`
- Add the IP address of the Jenkins instance to Ansible's inventory file: `sudo vi /etc/ansible/hosts`

  Add this snippet in `/etc/ansible/hosts`:

  ```
  [jenkins-server]
  <external-ip-address> ansible_ssh_user=<username> ansible_ssh_private_key=path/to/private/key ansible_python_interpreter=path/to/python
  ```
### Install and configure Jenkins

- Create a directory within the Ansible instance named `playbooks`: `mkdir playbooks`
- Within the `playbooks` directory, create the playbooks found in this [repo](https://github.com/Z11mm/ansible-playbooks).
- Run the playbooks one at a time using this command: `sudo ansible-playbook <filename>`
- Install Java first, followed by Jenkins, and then the others in any order.
- Once complete, open `http://<ext-ip-address>:8080` in the browser and follow the prompts.
- In the Jenkins web application, install the following plugins:
  - Node
  - Google Kubernetes Engine
  - Docker
  - Slack Notifications
- Assign a service account with full IAM access to the Jenkins instance to enable it to interact with GCP resources.
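As a sketch of what one such playbook (the Jenkins one) could look like; the repository URL and key follow the standard Jenkins Debian install instructions, but verify against the actual playbooks in the linked repo:

```yaml
# Illustrative sketch of a Jenkins playbook - verify against the actual repo.
- hosts: jenkins-server
  become: yes
  tasks:
    - name: Add Jenkins apt key
      apt_key:
        url: https://pkg.jenkins.io/debian-stable/jenkins.io.key
        state: present

    - name: Add Jenkins apt repository
      apt_repository:
        repo: deb https://pkg.jenkins.io/debian-stable binary/
        state: present

    - name: Install Jenkins
      apt:
        name: jenkins
        state: present
        update_cache: yes

    - name: Ensure Jenkins is running
      service:
        name: jenkins
        state: started
        enabled: yes
```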
A push to the repository triggers the CI/CD script in the Jenkinsfile. The CI portion of the script does the following:
- Runs tests.
- Builds a Docker image using the `docker-compose-prod.yml` file.
- Pushes the Docker image to my Dockerhub account with a tag version corresponding to the build id.
- Sends Slack notifications when the build starts and when it succeeds or fails.

The CD portion of the script does the following:
- Pulls the Docker image from DockerHub.
- Replaces the `:latest` tag version within the deployment file with the updated build id.
- Deploys the application to Google Kubernetes Engine (GKE) using the Jenkins GKE plugin.
- Sends Slack notifications when the build starts and when it succeeds or fails.
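The Jenkinsfile itself is not shown here; a declarative-pipeline sketch of the stages described above (the stage layout, image names and messages are assumptions, not the project's actual Jenkinsfile) might look like:

```groovy
// Illustrative sketch only - stage layout, image names and messages are assumptions.
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                sh 'npm install && npm run test-script'
            }
        }
        stage('Build') {
            steps {
                sh 'docker-compose -f docker-compose-prod.yml build'
            }
        }
        stage('Push') {
            steps {
                // Tag with the build id so each deployment is traceable
                sh "docker tag api <dockerhub-user>/api:${env.BUILD_ID}"
                sh "docker push <dockerhub-user>/api:${env.BUILD_ID}"
            }
        }
        stage('Deploy') {
            steps {
                // Swap :latest for the current build id in the deployment file,
                // then deploy via the Jenkins GKE plugin
                sh "sed -i 's/:latest/:${env.BUILD_ID}/' api_deployment.yml"
            }
        }
    }
    post {
        success { slackSend(color: 'good', message: "Build ${env.BUILD_ID} succeeded") }
        failure { slackSend(color: 'danger', message: "Build ${env.BUILD_ID} failed") }
    }
}
```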
Monitor the application running in GKE through the built-in Cloud Operations for GKE, which provides Cloud Monitoring and Cloud Logging by default.