FELT-Labs / anomaly-detection

Anomaly detection demo for SBIC 2022


Anomaly Detection

Anomaly detection demo for SBIC 2022

Open In Colab

Demo video on YouTube: Anomaly Detection

Introduction

Anomaly detection is a common problem found in many different industries. This repository demonstrates the use of FELT for anomaly detection on distributed data. Imagine three factories, each producing the same product (cables, in our case). Using FELT and Ocean Protocol, we can calculate statistics across all datasets without revealing the data itself. We can then use those statistics to detect anomalies in each factory.
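The idea above can be illustrated with a minimal sketch (illustrative only, not FELT's actual protocol, which additionally encrypts the local results): each factory shares only aggregate values (count, sum, sum of squares), and these are combined into a global mean and standard deviation without any raw measurement leaving the factory.

```python
import numpy as np

def local_stats(data):
    """Computed at each factory; only these aggregates leave the premises."""
    data = np.asarray(data, dtype=float)
    return len(data), data.sum(), (data ** 2).sum()

def aggregate(stats):
    """Combine local aggregates into a global mean and standard deviation."""
    n = sum(s[0] for s in stats)
    total = sum(s[1] for s in stats)
    sq_total = sum(s[2] for s in stats)
    mean = total / n
    # population standard deviation from E[x^2] - E[x]^2
    std = np.sqrt(sq_total / n - mean ** 2)
    return mean, std

# Three factories with private measurements (made-up numbers)
factories = [[9.8, 10.1, 10.0], [10.2, 9.9], [10.0, 10.3, 9.7, 10.0]]
mean, std = aggregate([local_stats(d) for d in factories])
```

The combined result is identical to computing the statistics over the pooled data directly, which is what makes the z-score step later in this README possible without centralizing the datasets.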

Requirements

This demo uses version 0.5.1 of the FELT Labs application. All requirements are listed in requirements.txt.

Datasets

You can find the dataset files in the datasets folder. The same files are published on Ocean Protocol as 3 different datasets with the following DIDs:

  1. did:op:493def4e00cda410adde2017ebaf5d644cf2bdec81cec5fee29d1fc9c73d66fa
  2. did:op:9e7f56f83422b156c016fe83e87722dac4b882ba9cd03f6d88e39fea04495669
  3. did:op:53ffd278eff009d92a130db6f3b7415158d17f176ceef8114b0071bf6ec40a88

The data are derived from this dataset, which captures basic properties of a copper wire production line. Each row represents one production period.

Algorithm

The algorithm used for anomaly detection is in src/detection_algorithm.py. We published this file as an Ocean Protocol asset with the following DID:

The algorithm works as follows. Based on the trained model (containing value of mean and standard deviation), we calculate z-score for each data point:

$$ \text{z-score} = \frac{x - \mu}{\sigma} $$

We then flag a point as an anomaly if its z-score is greater than 2. Finally, the algorithm creates a simple chart and stores the results.
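The detection step can be sketched as follows. This is a simplified illustration, not the published src/detection_algorithm.py; it uses the absolute z-score so that both unusually low and unusually high points are flagged, while the published algorithm may threshold only one side.

```python
import numpy as np

def detect_anomalies(x, mean, std, threshold=2.0):
    """Flag points whose absolute z-score exceeds the threshold."""
    z = (np.asarray(x, dtype=float) - mean) / std
    return np.abs(z) > threshold

# Example with made-up values: mean=10.0, std=0.5 from the aggregated model
flags = detect_anomalies([10.0, 10.1, 12.5, 9.9], mean=10.0, std=0.5)
# 12.5 has z-score 5, well above the threshold of 2, so it is flagged
```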

Usage - Decentralized

Federated Analytics

  1. Go to FELT web application: https://app.feltlabs.ai/multiple (you will need to connect your MetaMask wallet)
  2. Pick a name and fill in the following DIDs of our datasets:
  3. In the next step, pick Analytics and select both Mean and Standard deviation
  4. In the last step, leave the target column equal to -1 and start the training

This will start a local job on each dataset. We won't be downloading those datasets; the computation happens where the data are stored, and only the final encrypted results are made available to us. To combine the local results into the final model, do the following:

  1. Go to: https://app.feltlabs.ai/jobs
  2. Open up your job and click Reload until all local jobs are finished
  3. Click Aggregation button at the bottom to create final model
  4. Wait for aggregation to finish (use Reload button) and download the final model

Evaluation of Final Model

We already published the algorithm for evaluation (anomaly detection). We just need to run it on our data (which could be new data coming from those factories).

  1. Go to: https://app.feltlabs.ai/experimental
  2. Pick a name, upload the final model, and fill in the following DIDs:
  3. Start the algorithm using Run button
  4. Go to: https://app.feltlabs.ai/jobs
  5. Wait until your compute job is finished (use Reload button) and then download result.jpg with the final chart

Usage - Local

This simulates the decentralized process in a local environment.

Install

Install Python 3.8 or newer.

pip install -r requirements.txt
# If you want to contribute to the repository, also run:
pre-commit install

Then you can start jupyter lab/notebook as follows:

jupyter lab

Using Makefile

Alternatively, you can use the Makefile, which will create a virtual environment and install the requirements for you, using the following command:

make install

Main file

Once you have the requirements installed and Jupyter running, you can open main.ipynb, which will walk you through the main usage of FELT.

You can also open the notebook in Google Colab:

Open In Colab

About


License: GNU General Public License v3.0

