TungTh / ititiu18206-thesis

Thesis project by Zach 'Voxous' Wattz

An application for performance measurement/testing and personnel training using Event-Driven Architecture, event replay, and Apache Kafka.


Good day! This is a prototype branch of the GitHub repository for my thesis project. Please read this file carefully so you know how to set the project up properly!

The project consists of 3 components:

  1. NodeUtilityApp - provides a GUI so users can interact with the application more easily.
  2. JavaUtilityApp - consumes data as streams and broadcasts it to requesting clients.
  3. Kafka Cluster - the Apache Kafka cluster (one ZooKeeper server and two brokers) that carries the event streams between the other components.

Sub-components for testing (a rough data-flow sketch follows this list):

  1. NodeDistributorApp - provides a GUI for sending test data more easily.
  2. PySorterApp - processes the test data.
  3. RubyReporterApp - receives the processed data.
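
Putting the two lists together, a simplified sketch of the intended flow (the exact wiring is a simplification and may differ in detail in the actual prototype) looks roughly like this, with the Kafka cluster in the middle of every path:

    NodeDistributorApp --(test data)--> Kafka Cluster --(streams)--> PySorterApp --(processed data)--> RubyReporterApp
    Kafka Cluster --(streams)--> JavaUtilityApp --(Socket.IO broadcast)--> NodeUtilityApp (GUI)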

NodeUtilityApp

  1. Have Node.js installed.
  2. cd to the directory /NodeUtilityApp.
  3. cd into both backapp and frontapp-react and run npm install in each directory to get the required dependencies.
  4. For backapp, run nodemon startApp or node startApp, or execute the provided deploy.sh script.
  5. For frontapp-react, run npm run build first. Once the build completes, run serve -s build or execute the provided deploy.sh script (a sample command sequence for both apps is sketched after this list).
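
For reference, the full setup could look like the following shell session. This is only a sketch: it assumes the repository has already been cloned, that the commands are run from the repository root, and that nodemon and serve are available (e.g. installed globally with npm).

    # Back app
    cd NodeUtilityApp/backapp
    npm install            # fetch dependencies
    node startApp          # or: nodemon startApp, or: ./deploy.sh

    # Front app (in a second terminal)
    cd NodeUtilityApp/frontapp-react
    npm install            # fetch dependencies
    npm run build          # create the production build
    serve -s build         # or: ./deploy.sh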

Kafka Cluster

This component is an Apache Kafka deployment, so download Apache Kafka first (the version recommended on the official site is preferred). It is also advisable to run Kafka on Linux: many experienced users report that it uses resources more efficiently and performs better there than on Windows, not to mention the extra setup steps required to run Kafka on Windows. After that, follow these steps:

  1. Extract the contents of the downloaded Kafka archive. Make sure the directory path to it DOES NOT contain any spaces! For example, the path C:/ProgramData/Broker Server/kafka is ineligible because it contains a space.
  2. Copy the configuration and launch scripts from the repository's Kafka Configs directory into the installation directory of the downloaded Kafka app.
  3. cd to the Kafka application root and run ./start-ZK.sh to start the ZooKeeper server first (the syntax is the same whether you use WSL or native Linux; on Linux, make sure Java is installed beforehand!).
  4. Run ./start-BKx.sh to start broker #x (x = 1 or 2), where x is the broker ID. Currently there are 2 brokers; more might be added later.
  5. Perform your tests.
  6. To stop all operations, stop the Kafka brokers first by running ./stop-BKs.sh, then run ./stop-ZK.sh to stop the ZooKeeper server. DO NOT STOP ZOOKEEPER FIRST! (A sample start/stop sequence is sketched after this list.)
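
Put together, a full session could look like the sketch below. It assumes the launch scripts from Kafka Configs have already been copied into the Kafka root as described above; the directory name ~/kafka is only illustrative.

    cd ~/kafka           # wherever the Kafka archive was extracted (no spaces in the path!)
    ./start-ZK.sh        # start ZooKeeper first
    ./start-BK1.sh       # start broker #1
    ./start-BK2.sh       # start broker #2

    # ... perform the tests ...

    ./stop-BKs.sh        # stop all brokers first
    ./stop-ZK.sh         # only then stop ZooKeeper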

JavaUtilityApp

This is a Java application used for receiving data as streams. Follow these steps:

  1. Have Java installed. More specifically, JDK 16 or higher is preferred.
  2. Have Maven installed, as this app was built from a Maven archetype. Either the command line or Visual Studio Code's Maven extension is fine.
  3. cd to the jksa directory of the app. If the command line is preferred, run mvn package to build the project. If Visual Studio Code's extension is preferred, simply run the project to build it.
  4. Configure the command-line arguments for the Java application as follows:
    <broker address> <broker port> <socketio address> <socketio port> [enable debug]
    where angle brackets indicate a MANDATORY argument and square brackets indicate an OPTIONAL argument (see the example invocation after this list).
  5. To stop the app, use Ctrl + C or the Stop button in Visual Studio Code.
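
For illustration, building and launching from the command line could look like the lines below. The jar name and the localhost addresses are assumptions (the actual artifact name depends on the Maven configuration inside jksa); the broker and Socket.IO ports are taken from the port list below.

    cd jksa
    mvn package        # build the project

    # <broker address> <broker port> <socketio address> <socketio port> [enable debug]
    # the jar name is illustrative -- check the target/ directory for the real artifact
    java -jar target/jksa.jar localhost 9092 localhost 3004 true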

Port number list (a quick sanity check using these ports is sketched after the list):

  1. NodeUtilityApp
    • Front app: 3000
    • Back app: 3001
  2. JavaUtilityApp
    • Broker Ports: 9091 - 9092
    • Socket IO Port: 3004
  3. Testing Apps
    • Distributor front app: 3002
    • Distributor back app: 3003
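
Assuming everything runs on a single machine (localhost), the ports above allow a quick sanity check once all components are up; the commands below are just one illustrative way to confirm that each service is listening.

    curl -I http://localhost:3000     # NodeUtilityApp front app
    curl -I http://localhost:3001     # NodeUtilityApp back app
    curl -I http://localhost:3002     # Distributor front app
    curl -I http://localhost:3003     # Distributor back app
    nc -zv localhost 9091             # Kafka broker #1
    nc -zv localhost 9092             # Kafka broker #2
    nc -zv localhost 3004             # JavaUtilityApp Socket.IO port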
