
SURVEYSPARROW DOCUMENTATION


Terraform code implementation

Prerequisites

Install the following tools on your local system:

  • Terraform
  • AWS CLI
  • Git
  • kubectl
  • Helm
  • aws-iam-authenticator
  • Configure the AWS CLI by running aws configure and provide the following values:

      | Key                   | Value                |
      | --------------------- |:--------------------:|
      | AWS Access Key ID     | AXDJBJD************  |
      | AWS Secret Access Key | HBDKJ4JD*******      |
      | Default region name   | us-east-1            |
      | Default output format | json                 |

  • Create two EC2 key pairs following the naming convention [environment]-bastion, i.e. staging-bastion and dev-bastion (a command sketch follows this list)
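
A minimal shell sketch of these CLI prerequisites, assuming the default AWS profile and local .pem files for the private keys (the file paths are assumptions; only the [environment]-bastion key names come from this document):

```bash
# Configure the default AWS CLI profile (prompts for the access key, secret key,
# region us-east-1 and json output shown in the table above)
aws configure

# Create the bastion key pairs for both environments and keep the private keys locally
for env in dev staging; do
  aws ec2 create-key-pair \
    --key-name "${env}-bastion" \
    --query 'KeyMaterial' \
    --output text > "${env}-bastion.pem"
  chmod 400 "${env}-bastion.pem"
done
```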

Steps to follow

  1. Clone the terraform code repository
  2. Go inside the Remote-Backend directory:
    • Run terraform init to install the providers.
    • Run terraform apply --var-file dev.tfvars to create the remote backend in S3 and DynamoDB for the dev environment.
    • Run terraform apply --var-file staging.tfvars to create the remote backend in S3 and DynamoDB for the staging environment.
  3. Go inside the [environment]/example directory (e.g. dev-environment/dev/)
  4. Fill in the required variables in example.tfvars, such as profile and region
  5. Run terraform init
  6. In the first phase, deploy the base setup (VPC, Bastion, ECR, and EKS cluster): comment out the remaining portion of main.tf from provider "kubernetes" to the end, then run terraform apply --var-file example.tfvars (e.g. dev.tfvars), review the resources that will be created, and approve.
  7. Export the kubeconfig: export KUBECONFIG=./kubeconfig_[environment]-eks-cluster
  8. Update the kubeconfig for the EKS cluster: aws eks update-kubeconfig --region [region-code] --name [environment]-eks-cluster
  9. Fill in the required variables in [environment].tfvars, such as the ECR repository URI and the ACM certificate ARN
  10. In the second phase, uncomment the remaining code in main.tf to the end, then run terraform apply --var-file [environment].tfvars again to set up the deployments inside the EKS cluster and approve
  11. Edit the aws-auth ConfigMap to give users access to the EKS cluster: kubectl edit cm aws-auth -n kube-system, or add the user to the aws-auth.yaml file and then run kubectl apply -f aws-auth.yaml (a command sketch of the full sequence follows this list)
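
A condensed shell sketch of the steps above for the dev environment, assuming region us-east-1 and the directory names used in this document (Remote-Backend, dev-environment/dev/); the repository URL is a placeholder and the comment/uncomment edits to main.tf remain manual:

```bash
# 1. Clone the terraform code repository (placeholder URL)
git clone <terraform-code-repository-url>
cd <terraform-code-repository>

# 2. Create the remote backend (S3 + DynamoDB) for dev
cd Remote-Backend
terraform init
terraform apply --var-file dev.tfvars
cd ..

# 3-6. Phase one: base setup (VPC, Bastion, ECR, EKS cluster).
# Everything in main.tf from provider "kubernetes" onward must be commented out here.
cd dev-environment/dev
terraform init
terraform apply --var-file dev.tfvars

# 7-8. Point kubectl at the new cluster
export KUBECONFIG=./kubeconfig_dev-eks-cluster
aws eks update-kubeconfig --region us-east-1 --name dev-eks-cluster

# 9-10. Phase two: fill in the ECR repository URI and ACM certificate ARN in
# dev.tfvars, uncomment the rest of main.tf, then apply again
terraform apply --var-file dev.tfvars

# 11. Grant additional IAM users access to the cluster
kubectl edit cm aws-auth -n kube-system     # or: kubectl apply -f aws-auth.yaml
```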

Configure Jenkins pipeline

  1. Log in to the Jenkins server
  2. Go inside Manage Jenkins/Manage Plugins/Available and install the NodeJS and Bitbucket plugins
  3. Go inside Manage Jenkins/Global Tool Configuration/NodeJS and follow:
    • Name: 14.16.0
    • Version: NodeJS 14.16.0
    • Global npm packages to install: npm@6.14.12
    • Save
  4. Go inside Manage Jenkins/Credentials, add credentials, and follow:
    • Kind: Username with password
    • Scope: Global
    • Username: ss-comprinno
    • Password: bitbucket-repository-password
    • ID: BitBuketCredsForDockerfile
    • Description: Bitbucket credentials for cloning repositories
  5. Go inside Configure System/Global properties/Environment variables and set two environment variables:

      | Name            | Value  |
      | --------------- |:------:|
      | BRANCH          | master |
      | OPTIONAL_SCRIPT | true   |
    
  6. On the dashboard, go to New Item and follow:
    • Item Name: dev-surveysparrow-pipeline
    • Type: Pipeline
    • Build Triggers: Build when a change is pushed to BitBucket
    • Copy the pipeline script (dev-pipline or staging-pipline) from the terraform code repository and paste it inside the pipeline block
    • Save
  7. Click on Build Now and check that the pipeline is working
  8. Also create a webhook in Bitbucket for continuous integration and continuous deployment
  9. Now commit a change to the Bitbucket repository and follow the pipeline until the deployment completes (see the verification sketch below)
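
A hedged verification sketch for the final step; the namespace (default) and deployment name (surveysparrow-web) are hypothetical placeholders, not names taken from this document:

```bash
# Confirm the pipeline's deployment rolled out in the cluster
# (namespace and deployment name are assumptions - adjust to your setup)
kubectl get pods -n default
kubectl rollout status deployment/surveysparrow-web -n default

# Inspect recent events if the rollout stalls
kubectl get events -n default --sort-by=.metadata.creationTimestamp | tail -n 20
```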
