ashutosh1919 / explainable-cnn

📦 PyTorch based visualization package for generating layer-wise explanations for CNNs.

Home Page: https://pypi.org/project/explainable-cnn/

Explainable CNNs

📦 Flexible visualization package for generating layer-wise explanations for CNNs.

Deep Learning models are commonly regarded as black boxes. Working towards this problem, this project provides a flexible and easy-to-use pip package, explainable-cnn, that helps you create visualizations for any torch-based CNN model. Note that it uses a data-centric approach. The project focuses on making the internal workings of the neural layers more transparent. To do so, explainable-cnn is a plug-and-play component that visualizes layers based on their gradients and builds different representations, including Saliency Map, Guided Backpropagation, Grad-CAM, and Guided Grad-CAM.
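
For intuition, here is a minimal sketch of the gradient-based idea behind a saliency map, written in plain PyTorch. This is not the package's internal implementation; the model, input tensor, and class index below are only illustrative placeholders.

import torch
from torchvision import models

# Back-propagate the target class score to the input pixels and keep the
# largest absolute gradient across colour channels as a pixel-importance map.
model = models.vgg16(pretrained=True).eval()
image = torch.rand(1, 3, 224, 224, requires_grad=True)  # stand-in for a preprocessed image
target_class = 243  # hypothetical class index

score = model(image)[0, target_class]
score.backward()
saliency_map = image.grad.abs().max(dim=1)[0]  # shape: (1, 224, 224)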

Architecture

⭐ Star us on GitHub – it helps!

Usage

Install the package

pip install explainable-cnn

To create visualizations, create an instance of CNNExplainer.

from explainable_cnn import CNNExplainer

x_cnn = CNNExplainer(...)
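
As a concrete (and hedged) example, the constructor is typically given a trained torch model together with a mapping from class indices to labels; the exact signature is documented in the arguments reference linked below, and the model and label map here are only placeholders.

import torchvision.models as models
from explainable_cnn import CNNExplainer

# Assumed usage: a trained torch model plus a class-index -> label mapping.
# Verify the exact constructor arguments against the reference linked below.
model = models.resnet18(pretrained=True)
class_labels = {0: "tench", 1: "goldfish"}  # illustrative subset of labels

x_cnn = CNNExplainer(model, class_labels)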

The following method calls return NumPy arrays corresponding to the image for different types of visualizations.

saliency_map = x_cnn.get_saliency_map(...)

grad_cam = x_cnn.get_grad_cam(...)

guided_grad_cam = x_cnn.get_guided_grad_cam(...)
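
The returned arrays are ordinary NumPy images, so they can be displayed or saved with any standard tooling, for example with matplotlib:

import matplotlib.pyplot as plt

# grad_cam is the NumPy array produced by x_cnn.get_grad_cam(...) above.
plt.imshow(grad_cam)
plt.axis("off")
plt.savefig("grad_cam.png", bbox_inches="tight")
plt.show()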

To see the full list of arguments and their usage for all methods, please refer to this file.

You may want to look at example usage in the example notebook.

Output

Below is a comparison of the visualizations generated by Grad-CAM and Guided Grad-CAM.

Contributors ✨

Thanks goes to these wonderful people (emoji key):


Ashutosh Hathidara

💻 🎨 🔬 🚧 ✅ ⚠️

Lalit Pandey

🔬 📖

This project follows the all-contributors specification. Contributions of any kind welcome!

References