Sara-mibo / LRP_EncoderDecoder_GRU

Implementing LRP (Layer-wise Relevance Propagation) for a sequence-to-sequence model with GRU layers.

Table of Contents
  1. Project Description
  2. Dependencies
  3. Usage
  4. Contact
  5. Acknowledgments

Project Description

This repository provides an implementation of the Layer-wise Relevance Propagation (LRP) explanation method for GRU cells (as implemented in the PyTorch framework), as well as for a full sequence-to-sequence neural network architecture. We use LRP to explain the decisions of an encoder-decoder GRU-based pollution forecasting model.
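LRP redistributes a model's output score backwards through the network, layer by layer, until it reaches the inputs. For the linear maps inside a GRU, this is typically done with the epsilon rule. The following is a generic NumPy sketch of that rule under our own naming; it illustrates the idea and is not the repository's actual code:

```python
import numpy as np

def lrp_epsilon_linear(W, b, x, R_out, eps=1e-6):
    """Epsilon-rule LRP for a linear layer y = W @ x + b.

    Redistributes the output relevance R_out onto the inputs x in
    proportion to each input's contribution z_ij = W_ij * x_j.
    Generic sketch, not the repository's implementation.
    """
    z = W * x[np.newaxis, :]               # contributions z_ij, shape (out, in)
    s = z.sum(axis=1) + b                  # pre-activations y_i
    s = s + eps * np.where(s >= 0, 1.0, -1.0)  # stabilizer avoids division by zero
    return (z * (R_out / s)[:, np.newaxis]).sum(axis=0)

# With zero bias, total relevance is (approximately) conserved
rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 4)), np.zeros(3)
x = rng.normal(size=4)
R_out = np.abs(rng.normal(size=3))
R_in = lrp_epsilon_linear(W, b, x, R_out)
```

The conservation property (input relevances summing to the output relevance, up to the stabilizer) is the standard sanity check for an LRP pass.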

Dependencies

To install the required dependencies, run the following steps:

  • create environment lrpenv
    conda create -n lrpenv python=3.8
  • activate environment lrpenv
    conda activate lrpenv
  • install pip
    conda install pip
  • install requirements
    pip install -r requirements.txt

Usage

The folder LRP/ contains the main part of the LRP implementation for a sequence-to-sequence model with GRU layers.

The folder LRP_toyTask/ contains the scripts used for validation of the LRP implementation through a toy task.

The folder LRP_pollutionForecastModel/ contains the scripts used for applying LRP to a pollution forecasting task.


Contact

Sara Mirzavand Borujeni: sara.mirzavand.borujeni@hhi.fraunhofer.de - sarah.mb@outlook.com

Project Link: https://github.com/Sara-mibo/LRP_EncoderDecoder_GRU


Acknowledgments

