thesandlord / nginx-kubernetes-lb

Example demonstrating how to use Nginx as an L7 load balancer for Kubernetes microservices


This is not an official Google product.

Load Balancing Kubernetes Services with NGINX+

This app demonstrates how to make use of NGINX+ Layer 7 load balancing with Kubernetes services.

The app

Kubernetes Services

The app consists of four Kubernetes services written in Ruby, Python, Node, and Go. The services run in Docker containers, and each one does string manipulation based on a request to an endpoint with a str parameter. Each service has a replication controller (rc.yaml) that configures how many pod replicas run for that service.
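For orientation, here is a minimal sketch of what one of these replication controllers could look like. The names, labels, image path, and port are assumptions for illustration; the actual rc.yaml in each service directory is authoritative.

```yaml
# Illustrative sketch only -- the rc.yaml in each service directory is
# authoritative. Names, labels, image, and port are assumptions.
apiVersion: v1
kind: ReplicationController
metadata:
  name: reverse
spec:
  replicas: 3                  # step 4 of "Deploying" checks for 3 replicas
  selector:
    app: reverse
  template:
    metadata:
      labels:
        app: reverse
    spec:
      containers:
      - name: reverse
        image: gcr.io/YOUR-PROJECT/reverse   # hypothetical image path
        ports:
        - containerPort: 80                  # assumed container port
```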

NGINX+ Load Balancer

The load balancer routes all requests to the backend services through one external IP, and is configured in nginx.conf. There is an upstream group for each of the four services. In the first server block, requests to the four endpoints are proxied to the correct upstream via proxy_pass. The second server block serves a status page on port 8080 for NGINX+ live activity monitoring.
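The sketch below shows the general shape of that configuration. It is not the repo's nginx.conf; the backend hostnames, ports, and the monitoring setup are assumptions (Kubernetes service DNS names on port 80, and the NGINX+ status module with the bundled status.html dashboard).

```nginx
# Illustrative excerpt (inside the http { } context) -- the repo's
# nginx.conf is authoritative. Hostnames and ports are assumptions.
upstream arrayify { server arrayify.default.svc.cluster.local; }
upstream reverse  { server reverse.default.svc.cluster.local; }
upstream to_lower { server to-lower.default.svc.cluster.local; }
upstream to_upper { server to-upper.default.svc.cluster.local; }

# First server block: route each endpoint to its upstream.
server {
    listen 80;

    # Trailing slashes strip the endpoint prefix before proxying;
    # whether the backends expect that is an assumption.
    location /arrayify/ { proxy_pass http://arrayify/; }
    location /reverse/  { proxy_pass http://reverse/; }
    location /to_lower/ { proxy_pass http://to_lower/; }
    location /to_upper/ { proxy_pass http://to_upper/; }
}

# Second server block: NGINX+ live activity monitoring on port 8080.
server {
    listen 8080;
    root /usr/share/nginx/html;      # where NGINX+ ships status.html

    location = /status.html { }      # the live monitoring dashboard
    location /status { status; }     # NGINX+ extended status API (JSON)
}
```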

Load Testing with Siege

The app uses Siege to load test the nginxplus service. The code for this can be found in the load-generator directory.
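For a quick manual run before (or instead of) deploying the load generator, an invocation along these lines hits one of the endpoints. The concurrency and duration here are arbitrary; the parameters the app actually uses live in the load-generator directory.

```sh
# Manual example only -- the load-generator directory defines what the
# app really runs. -c = concurrent users, -t = test duration.
siege -c 25 -t 1M "http://YOUR-IP/reverse/?str=teststring"
```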

Deploying

  1. Create a project in the Google Cloud Developer console.

  2. Install Docker, then create a Docker host instance on Google Compute Engine in your Cloud project.

  3. Register for NGINX+ and copy your certificate and license key into the nginx-repo.crt and nginx-repo.key files.

  4. Deploy the four Kubernetes services by running make deploy inside each service's directory: arrayify, reverse, to_lower, and to_upper. To verify that 3 replicas are running for each service, run kubectl get pods.

  5. Deploy the nginxplus service by running make deploy from the nginx directory. Then run kubectl get svc to get the external IP address for your nginxplus service.

  6. When you navigate to this IP in the browser, you should see the "Nginx is running!" page. Next, verify that each service responds correctly, e.g. YOUR-IP/reverse/?str=teststring (example commands for these checks follow this list).

  7. Try out NGINX+ live monitoring by visiting the status page: YOUR-IP:8080/status.html. When you make a request to one of the services, you should see the request and connection counts update in real time on the status page.

  8. Load test your service by running the Siege load generator.
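Taken together, the verification commands for steps 4-7 look roughly like this. EXTERNAL-IP stands for whatever kubectl reports for the nginxplus service; the exact service name comes from the repo's manifests.

```sh
# Example verification commands for steps 4-7. EXTERNAL-IP is whatever
# `kubectl get svc` reports for the nginxplus service.
kubectl get pods                                    # expect 3 replicas per service
kubectl get svc nginxplus                           # note the external IP
curl "http://EXTERNAL-IP/reverse/?str=teststring"   # should return the reversed string
curl "http://EXTERNAL-IP:8080/status.html"          # NGINX+ live monitoring dashboard
```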


License: Apache License 2.0

