Ramona-Saintandre / udacityDevopsP3

Final project for Udacity Azure Cloud Devops NanoDegree


Build Status

Udacity-Azure-Devops-P3: Ensuring Quality Releases

Project Overview

In this project, I will demonstrate the skills learned about ensuring quality releases in the Udacity Nanodegree: DevOps Engineer for Microsoft Azure.

In this project I will create a VM using Terraform, along with Packer, and deploy an app to Azure App Service.
Then I will set up test environments and run automated tests using tools such as JMeter, Selenium, and Postman.
To finalize the project I will use Azure Log Analytics to provide insight into my application's behavior by querying custom log files.

Dependencies

| Dependency | Link |
| --- | --- |
| Python | https://www.python.org/downloads/ |
| Terraform | https://www.terraform.io/downloads.html |
| Packer | https://www.packer.io/ |
| JMeter | https://jmeter.apache.org/download_jmeter.cgi |
| Postman | https://www.postman.com/downloads/ |
| Selenium | https://sites.google.com/a/chromium.org/chromedriver/getting-started |
| Azure DevOps | https://dev.azure.com/ramonasaintandre0285/UdacityP3-ensuringquality |
| Azure CLI | https://aka.ms/cli_ref |
| Azure Log Analytics | |

Project Steps

  1. Clone the starter repo for the project:

git clone https://github.com/jfcb853/Udacity-DevOps-Azure-Project-3

1. Installation & Configuration

Login to Azure:
az login

Create a password-based authentication service principal for your project and query it for your client ID, client secret, and tenant ID:

az ad sp create-for-rbac --name SPProject3 --query "{client_id: appId, client_secret: password, tenant_id: tenant}"

Make note of the password, because it can't be retrieved later, only reset.

Service Principal info

Log in as the service principal using the following command with your credentials from the previous step:

az login --service-principal --username APP_ID --tenant TENANT_ID --password CLIENT_SECRET

Configure the storage account and state backend.

You will need to create a Storage Account before you can use Azure Storage as a backend.

You can also use the commands in the azurestoragescript.sh file in the repo to do this.
Click Here for instructions on creating the Azure storage account and backend.

Replace the values below in terraform/environments/test/main.tf files with the output from the Azure CLI:

* storage_account_name
* container_name
* key
* access_key  
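
As a sketch, the backend block in terraform/environments/test/main.tf might look like the following once those values are filled in. The resource names shown here are placeholders, not the actual values from my subscription:

```hcl
terraform {
  backend "azurerm" {
    storage_account_name = "tfstate1234"          # your storage account name
    container_name       = "tfstate"              # your container name
    key                  = "terraform.tfstate"    # name of the state blob
    access_key           = "<storage-access-key>" # or pass via the ARM_ACCESS_KEY env var
  }
}
```

Passing the access key through an environment variable instead of hard-coding it keeps the secret out of source control.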

Create a Service Principal for Terraform and replace the below values in the terraform/environments/test/terraform.tfvars files with the output from the Azure CLI.

* subscription_id
* client_id
* client_secret
* tenant_id
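
For illustration, terraform/environments/test/terraform.tfvars would then contain entries along these lines (all values below are placeholders):

```hcl
# terraform.tfvars - service principal credentials (placeholder values)
subscription_id = "00000000-0000-0000-0000-000000000000"
client_id       = "00000000-0000-0000-0000-000000000000"
client_secret   = "<service-principal-password>"
tenant_id       = "00000000-0000-0000-0000-000000000000"
```

Since this file holds a secret, it should be kept out of the repo and uploaded to Azure Pipelines as a secure file, as described below.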

Click Here if you need help with the steps for creating a service principal.
Service Principal in Azure

2. Azure DevOps

  1. Create a new Azure Pipeline from the azure-pipelines.yaml file or start building one from scratch if you prefer.

  2. Create a new service connection: In Azure Devops go to Project Settings > Service Connections > New Service Connection > Azure Resource Manager > Next > Service Principal (Automatic) > Next > Subscription. After choosing your subscription provide a name for the service connection.

  3. If the pipeline runs now it will fail since no resources are provisioned yet in Azure.

  4. Create an SSH key to log in to your VM.

  5. Click Here for instructions on setting up an SSH key.

  6. This SSH key can be saved and added to Azure Pipelines as a secure file. I also loaded my terraform.tfvars to Azure Pipelines as a secure file along with a .env file with my access-key.

  7. Create your Azure resources (Most can be provisioned via Terraform using the Pipeline by adding tasks and jobs to the azure-pipelines.yml file utilizing terraform init, terraform plan, and terraform apply commands).

TerraForm init TerraForm plan TerraForm apply
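
As an illustration, the Terraform tasks can be wired into azure-pipelines.yml roughly like this. The stage name, secure-file name, and paths below are assumptions, not taken from the actual pipeline:

```yaml
# Sketch of a provisioning stage (names and paths are placeholders)
- stage: Provisioning
  jobs:
  - job: TerraformApply
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: DownloadSecureFile@1
      name: tfvars
      inputs:
        secureFile: 'terraform.tfvars'
    - script: |
        cp $(tfvars.secureFilePath) terraform/environments/test/terraform.tfvars
        cd terraform/environments/test
        terraform init
        terraform plan -out=tfplan
        terraform apply -auto-approve tfplan
      displayName: 'terraform init, plan & apply'
```

The DownloadSecureFile@1 task pulls the terraform.tfvars secure file into the agent and exposes its location via $(tfvars.secureFilePath).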

  8. Once the resources are deployed you will have to follow the instructions on setting up an environment in Azure Pipelines to register the Linux VM so your app can be deployed to it. You can find that documentation here. In Azure DevOps go to Pipelines > Environments > TEST > Add resource > select "Virtual Machines" > Next > under Operating System select "Linux". You will be given a registration script to copy. SSH into your VM, paste the script into the terminal, and run it. This registers your VM and allows Azure Pipelines to act as an agent and run commands on it.

TEST-VM Environment

  9. Build the FakeRestAPI and Automated Testing artifacts and publish them to the artifact staging directory in the Pipeline.

Build image
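
A build-and-publish step might look like the following in azure-pipelines.yml; the folder layout and artifact name are assumptions based on the starter repo:

```yaml
# Sketch of packaging and publishing the FakeRestAPI artifact
- task: ArchiveFiles@2
  displayName: 'Archive FakeRestAPI'
  inputs:
    rootFolderOrFile: 'automatedtesting/jmeter/fakerestapi'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/fakerestapi-$(Build.BuildId).zip'
- publish: '$(Build.ArtifactStagingDirectory)/fakerestapi-$(Build.BuildId).zip'
  artifact: drop-fakerestapi
```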

  10. Deploy the FakeRestAPI to the App Service on your VM. The URL for my webapp is (need to add URL).
    It should look like the image below.

Fake Rest Deploy

3. Running Tests in Pipeline

Postman

| Test Type | Technology | Stage in CI/CD pipeline | Status |
| --- | --- | --- | --- |
| Integration | Postman | Test Stage - runs during build stage | |

| Goal | Status |
| --- | --- |
| Verify dummy APIs can pass by running them through Newman | |

Install Postman. With the starter files I was able to test the dummy API locally with different environment variables and then export the results as a collection to my project directory.
Once the collection files have been pushed to GitHub you will need to create a script that runs the Data Validation and Regression tests in the Pipeline and creates a report of the results.
The command-line Collection Runner, Newman, was used to perform these tests.
The following script was used:

    sudo npm install -g newman
    echo 'Starting Tests...'
    echo 'Running Regression Test'
    newman run automatedtesting/postman/RegressionTest.postman_collection.json --delay-request 15000 --reporters cli,junit --suppress-exit-code
    echo 'Running Data Validation Test'
    newman run automatedtesting/postman/DataValidation.postman_collection.json --delay-request 12000 --reporters cli,junit --suppress-exit-code
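
To surface the JUnit reports in the Pipeline run, a publish step along these lines can follow the test script. The results-file pattern is an assumption; it depends on where Newman's junit reporter writes its XML:

```yaml
# Sketch of publishing Newman's JUnit output to the Pipeline test tab
- task: PublishTestResults@2
  displayName: 'Publish Newman test results'
  condition: always()          # publish even if some tests failed
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/newman-*.xml'   # assumed report file pattern
    searchFolder: '$(System.DefaultWorkingDirectory)'
```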

Newman  Table Regression Newman  Graph Regression Newman   Validation

The results of the tests will also appear in the Pipeline output like so:

Selenium

| Test Type | Technology | Stage in CI/CD pipeline | Status |
| --- | --- | --- | --- |
| Functional | Selenium | Test Stage - runs in custom Linux VM | |

| Goal | Status |
| --- | --- |
| Direct the output of the Selenium Test Suite to a log file, and execute the Test Suite. | |

The project requires that Selenium be installed on the VM to test the UI functionality of the https://www.saucedemo.com/ website.

Swag Shop Inventory

The log records which items were added to the shopping cart and which items were removed from it.

SwagShop

I created the Python code locally and made sure the test was working before ingesting it into the Pipeline.
To do this:

  1. Download the latest Chrome driver. Make sure chromedriver is added to PATH.

  2. Run pip install -U selenium

  3. You can test Selenium by executing the login.py file in the Selenium folder. It should open the site and add all items to the shopping cart and then remove them from the shopping cart.

  4. A script will need to be created to perform these tasks in the CLI in the Azure Pipelines at this point. Make sure the script includes logging the items being added and removed from the cart that can be sent to an output file.

Selenium Logs Selenium run
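
The steps above can be sketched as a pipeline step on the registered VM environment. The file names and log path here are assumptions, not the actual values from my pipeline:

```yaml
# Sketch of running the Selenium suite and capturing its output to a log file
- script: |
    cd automatedtesting/selenium
    # tee writes the test output to the terminal and appends it to the log
    python3 login.py | tee -a seleniumlog.txt
  displayName: 'Run Selenium UI tests and capture log'
```

The resulting seleniumlog.txt is the file later ingested by Log Analytics custom logging.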

JMeter

Jmeter install

| Test Type | Technology | Stage in CI/CD pipeline | Status |
| --- | --- | --- | --- |
| Performance | JMeter | Test Stage - runs against the AppService | |

| Goal | Status |
| --- | --- |
| Stress Test | |
| Endurance Test | |

Two test suites (a Stress Test and an Endurance Test) were created using the starter API files.
You will need to replace APPSERVICEURL with the URL of your AppService once it's deployed.

In the Pipeline scripts were created to install JMeter and run the Stress and Endurance Tests.

For the Pipeline two users were used for the tests so the free-tier resources wouldn't be maxed out.

Before submission the tests will need to be run simulating 30 users for a max duration of 60 seconds.
The data output from these tests will need to generate an HTML report.
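
As an illustration, the install-and-run step might look like this in the pipeline. The JMeter version, test-plan file name, and property names are assumptions; they depend on how the .jmx files parameterize users and duration:

```yaml
# Sketch of installing JMeter and running the stress test with an HTML report
- script: |
    wget https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.4.1.tgz
    tar -xf apache-jmeter-5.4.1.tgz
    apache-jmeter-5.4.1/bin/jmeter -n \
      -t automatedtesting/jmeter/StressTest.jmx \
      -Jusers=30 -Jduration=60 \
      -l stress-results.jtl -e -o stress-report
  displayName: 'Install JMeter and run Stress Test'
```

The -e -o flags make JMeter generate the required HTML dashboard from the .jtl results file.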

Note: This may also be done non-CI/CD by installing JMeter locally and running the tests.

Jmeter stress test

Jmeter endurance test

Jmeter endurance html Jmeter stress html

4. Setup Log Analytics

Log Analytics

| Goal | Status |
| --- | --- |
| Configure custom logging in Azure Monitor to ingest this log file. This may be done non-CI/CD. | ✓ |

A Log Analytics Workspace will need to be created to ingest the Selenium log output file.
This resource could be helpful in setting up Log Analytics.

Go to the App Service > Diagnostic Settings > + Add Diagnostic Setting. Check AppServiceHTTPLogs and Send to Log Analytics Workspace, select a workspace (an existing default workspace is fine) > Save. Go back to the App Service > App Service Logs, turn on Detailed Error Messages and Failed Request Tracing > Save. Restart the App Service.

Log Analytics Heartbeat

Return to the Log Analytics Workspace > Logs and run a query such as Heartbeat to see some log results and check that Log Analytics is working properly.
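
Once AppServiceHTTPLogs are flowing, a query along these lines can check for failing requests (a sketch; adjust the time range to taste):

```kusto
// Recent 404 responses from the App Service
AppServiceHTTPLogs
| where TimeGenerated > ago(1h)
| where ScStatus == 404
| project TimeGenerated, CsMethod, CsUriStem, ScStatus
| order by TimeGenerated desc
```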

Analytic Metric

5. Setup Custom Logging

In the Log Analytics Workspace go to Advanced Settings > Data > Custom Logs > Add + > Choose File. Select the seleniumlog.txt file created from the Pipeline run (I downloaded a copy to my local machine and uploaded a copy to the Azure VM via SSH) > Next > Next. Enter the paths on the Linux machine where the file is located, such as:

  • /seleniumlog.txt
  • <your path>/seleniumlog.txt

Custom logging setup

Selenium Run

Give it a name and click Done.

Return to the Log Analytics Workspace > Virtual Machines. Click on your VM, then Connect. An agent will be installed on the VM that allows Azure to collect the logs from it. (Under Settings > Agents you should see you have "1 Linux Computer Connected")

Return to the Log Analytics Workspace > Logs. Under Queries you can enter the name of the custom log created in the previous step and click Run. It may take a while for logs to show. The logs may only appear if the timestamp on the log file is updated after the agent was installed. You may also need to reboot the VM.

Log output
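
For example, custom log tables get a _CL suffix, so if the custom log was named seleniumlog the query might look like this ("seleniumlog_CL" is an assumption based on the name chosen above):

```kusto
// Raw lines ingested from the custom Selenium log
seleniumlog_CL
| project TimeGenerated, RawData
| order by TimeGenerated desc
```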

Note: This may also be done non-CI/CD.

6. Setup Email Alerts

| Goal | Status |
| --- | --- |
| Configure an Action Group (email) | |

The project also calls for creating an alert rule with an HTTP 404 condition, along with an action group that sends an email notification. After the alert takes effect, visit the URL of the AppService and try to cause 404 errors by visiting non-existent pages. After the second 404 error an email alert should be triggered. To set this up:

alerts
alerts
alerts

In the Azure Portal go to the AppService > Alerts > New Alert Rule. Add the HTTP 404 condition with a threshold value of 1; this creates an alert once there are at least two 404 errors. Click Done. Now create the action group with the notification type set to Email/SMS message/Push/Voice, choosing the email option. Give the alert a name and severity level.

Email Alert Email Alert  Alert

Enhancements

One of the enhancements that I would like to make to this project is to recreate it using Azure Bicep.

Note to reviewer

More images can be found in the /projectimagefolder.
