
AWS DOCKER SWARM SETUP

AWS setup for launching a Docker Swarm cluster on 3 nodes (1 manager, 2 workers)

Infrastructure

(architecture diagram: image not included here)

Dependencies

  • Terraform
  • AWS CLI (for aws configure)

Configuration

  • Create an IAM user in the AWS console and note its ACCESS KEY and SECRET ACCESS KEY
  • Run aws configure and enter the ACCESS KEY and SECRET ACCESS KEY
  • To launch the setup in another region, change the region and availability zone in the variables.tf file; they currently default to us-east-1 (a sketch follows this list)
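
As a rough sketch of what those settings in variables.tf might look like (the actual variable names and defaults in the repo may differ):

    # variables.tf (sketch; variable names and defaults are assumptions)
    variable "region" {
      description = "AWS region to launch the setup in"
      default     = "us-east-1"
    }

    variable "availability_zone" {
      description = "Availability zone for the subnets"
      default     = "us-east-1a"
    }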

Usage

  • Run the init script, which creates an S3 bucket for storing Terraform remote state. Change the bucket name in the script if needed (a matching backend block is sketched after this list)
./init_aws.sh
  • Launch the global resources, which contain the SSH key. Change the key path in ssh_key.tf (also sketched after this list)
cd global
terraform apply
  • Launch the VPC
cd ../vpc
terraform apply
  • Launch the nodes
cd ../nodes
terraform apply
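
For reference, here is a minimal sketch of how each Terraform config might point at the bucket created by init_aws.sh for remote state; the bucket name, key, and region shown are placeholders, not the repo's actual values:

    # Remote-state backend (sketch; use the bucket name created by init_aws.sh)
    terraform {
      backend "s3" {
        bucket = "my-terraform-state"
        key    = "global/terraform.tfstate"
        region = "us-east-1"
      }
    }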
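Similarly, a hedged sketch of what ssh_key.tf might contain; the resource name and key name are assumptions, and the key path is the part you would change:

    # ssh_key.tf (sketch; resource and key names are assumptions)
    resource "aws_key_pair" "deployer" {
      key_name   = "swarm-key"
      public_key = file(pathexpand("~/.ssh/id_rsa.pub"))  # change this path to your own public key
    }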

Output to Note

  • manager_ip
    • It's the IP of the manager node of the swarm, which is launched automatically when the nodes boot (exposed as a Terraform output; see the sketch after this list)
    • Services launched via the controller UI can be accessed at manager_ip:port_specified
  • controller_ip
    • The controller runs Portainer, a UI over the Docker Engine, on port 9000
    • Open controller_ip:9000 and log in
    • Enter manager_ip:2375 when asked for the Docker endpoint at login
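
Both of these values come from Terraform outputs. A minimal sketch of how such outputs are typically declared (the aws_instance resource names here are assumptions):

    # outputs (sketch; resource names are assumptions)
    output "manager_ip" {
      value = aws_instance.manager.public_ip
    }

    output "controller_ip" {
      value = aws_instance.controller.public_ip
    }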

About

Setup to bootstrap a Docker Swarm cluster and a controller on AWS using Terraform.


Languages

  • HCL: 92.9%
  • Shell: 7.1%