faizankshaikh / expose_klDiv

An Expose on Kullback-Leibler Divergence

This repository contains the write-up and accompanying code for "An Expose on Kullback-Leibler Divergence", a class project for an Information Theory course.

Abstract

The project is an in-depth exploration of Kullback-Leibler divergence (KLD), a widely used measure of how one probability distribution differs from another. We will gently introduce the topic along with an example to familiarize ourselves with the terminology. We will then dive into the mathematical aspects of KLD by deriving a closed-form expression for the KLD between two univariate Gaussian distributions. Lastly, we will lay out a practical application of KLD as a learning objective in a deep learning architecture called the Variational Autoencoder.
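As a rough illustration of the ideas summarized above (this is a minimal sketch, not code taken from the project's notebook, and the function names are assumptions for illustration), the snippet below computes the closed-form KLD between two univariate Gaussians, checks it against a Monte Carlo estimate of E_p[log p(x) - log q(x)], and evaluates the KL term that appears in the standard VAE objective.

```python
# Minimal sketch of the quantities discussed in the abstract.
# Function names (kl_univariate_gaussians, kl_monte_carlo, vae_kl_term)
# are illustrative assumptions, not the repository's actual API.
import numpy as np


def kl_univariate_gaussians(mu1, sigma1, mu2, sigma2):
    """Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
    return (
        np.log(sigma2 / sigma1)
        + (sigma1**2 + (mu1 - mu2) ** 2) / (2 * sigma2**2)
        - 0.5
    )


def kl_monte_carlo(mu1, sigma1, mu2, sigma2, n=1_000_000, seed=0):
    """Numerical check: estimate E_p[log p(x) - log q(x)] from samples of p."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu1, sigma1, size=n)
    log_p = -0.5 * ((x - mu1) / sigma1) ** 2 - np.log(sigma1 * np.sqrt(2 * np.pi))
    log_q = -0.5 * ((x - mu2) / sigma2) ** 2 - np.log(sigma2 * np.sqrt(2 * np.pi))
    return np.mean(log_p - log_q)


def vae_kl_term(mu, log_var):
    """KL( N(mu, exp(log_var)) || N(0, 1) ), summed over latent dimensions,
    as used in the standard VAE loss."""
    return -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var))


if __name__ == "__main__":
    print(kl_univariate_gaussians(0.0, 1.0, 1.0, 2.0))  # exact closed-form value
    print(kl_monte_carlo(0.0, 1.0, 1.0, 2.0))            # should agree closely
    print(vae_kl_term(np.array([0.5, -0.3]), np.array([0.1, -0.2])))
```

The Monte Carlo estimate should match the closed-form value to within sampling error, which is the same sanity check the derivation in the write-up makes possible.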

About

An Expose on Kullback-Leibler Divergence

License: MIT License


Languages

Language: Jupyter Notebook 100.0%