qafro1 / terraform-aws-eks

A Terraform module for deploying an EKS cluster in AWS

Home Page: https://registry.terraform.io/modules/telia-oss/eks/aws


AWS EKS Terraform Module


A Terraform module for easy management of EKS clusters on AWS.
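A minimal usage sketch is shown below. The module source follows the registry path above; the version constraint and the input names (`name_prefix`, `vpc_id`, `subnet_ids`) are illustrative assumptions, so check the module's documented inputs before use:

```hcl
# Hypothetical usage sketch for the telia-oss/eks/aws module.
# Input names and values below are assumptions, not the module's
# authoritative interface -- consult the registry documentation.
module "eks" {
  source  = "telia-oss/eks/aws"
  version = "~> 1.0" # pin to a real release

  name_prefix = "example-cluster"
  vpc_id      = "vpc-0123456789abcdef0"
  subnet_ids  = ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"]
}
```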

Usage

Prerequisite

Setup

  1. Have AWS access

  2. Apply the Terraform config to create the cluster (usually slow, i.e. 10+ minutes)

  3. Set your kubeconfig using the aws cli:

    aws eks update-kubeconfig --name <cluster-name> # e.g. example-cluster
  4. Confirm connection to the cluster:

    kubectl get nodes # should return `No resources found` at this point

    Note

    When you create an Amazon EKS cluster, the IAM entity (user or role, e.g. for federated users) that creates the cluster is automatically granted `system:masters` permissions in the cluster's RBAC configuration.

    I.e., if your cluster is created by a machine user role (e.g. as part of a CI/CD task), you will need to assume this role to establish the initial connection to the cluster.

    More information is available in the Amazon EKS documentation on cluster authentication.

  5. Save and apply the `config_map_aws_auth` output from Terraform:

    terraform output config_map_aws_auth # save as auth-config.yml
    kubectl apply -f auth-config.yml
  6. Confirm that the nodes have joined (or are joining) the cluster:

    kubectl get nodes # should show a list of nodes
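Steps 5 and 6 can also be wired into Terraform itself. A hedged sketch, assuming the module exposes the `config_map_aws_auth` output shown above and using the `local_file` resource from the hashicorp/local provider:

```hcl
# Sketch: persist the rendered aws-auth ConfigMap to disk so it can
# be applied with `kubectl apply -f auth-config.yml` (step 5).
resource "local_file" "auth_config" {
  content  = module.eks.config_map_aws_auth
  filename = "${path.module}/auth-config.yml"
}
```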

Note

  • Cluster access requires an authenticated shell against AWS in addition to the kubeconfig being present.
    • E.g., if using `vaulted`, make sure that:
      • it is working
      • the session hasn't timed out
      • the correct AWS role is in use

Examples

A Terraform module which creates an EKS cluster on AWS.

Authors

Currently maintained by these contributors.

License

MIT License. See LICENSE for full details.
