

Pooling Strategies for Simplicial Convolutional Networks

PyTorch Lightning · Config: Hydra · Template

Paper

Description

This repository contains the code for the paper Pooling Strategies for Simplicial Convolutional Networks.

This paper introduces pooling strategies for simplicial convolutional neural networks. Inspired by graph pooling methods, we propose a general formulation of a simplicial pooling layer that performs: i) local aggregation of simplicial signals; ii) a principled selection of sampling sets; iii) downsampling and simplicial topology adaptation. The general layer is then customized to design four different pooling strategies (i.e., max, top-k, self-attention, and separated top-k) grounded in the theory of topological signal processing. We also leverage the proposed layers in a hierarchical architecture that reduces complexity while representing data at different resolutions. Numerical results on real-data benchmarks (i.e., flow and graph classification) illustrate the advantages of the proposed methods with respect to the state of the art.
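To make the three steps concrete, here is a minimal PyTorch sketch of the top-k variant. Everything in it is an illustrative assumption: the class name TopKSimplicialPool, the one-step diffusion used for local aggregation, and the plain restriction of the Laplacian are stand-ins for the paper's actual operators, not the implementation in this repository.

import torch
import torch.nn as nn

class TopKSimplicialPool(nn.Module):
    # Hypothetical top-k simplicial pooling layer (illustration only).

    def __init__(self, in_channels: int, ratio: float = 0.5):
        super().__init__()
        self.ratio = ratio
        # learnable projection that scores each simplex for selection
        self.score = nn.Linear(in_channels, 1)

    def forward(self, x: torch.Tensor, laplacian: torch.Tensor):
        # x: (num_simplices, in_channels) signal over the k-simplices
        # laplacian: (num_simplices, num_simplices) Hodge Laplacian

        # i) local aggregation: one diffusion step over simplicial neighbors
        x_agg = x - laplacian @ x

        # ii) sampling-set selection: keep the top-scoring simplices
        scores = self.score(x_agg).squeeze(-1)
        k = max(1, int(self.ratio * x.size(0)))
        keep = torch.topk(scores, k).indices

        # gate kept signals by their scores so selection stays differentiable
        x_pool = x_agg[keep] * torch.sigmoid(scores[keep]).unsqueeze(-1)

        # iii) downsampling and topology adaptation: restrict the Laplacian
        # to the kept simplices (a crude stand-in for a principled reduction)
        lap_pool = laplacian[keep][:, keep]
        return x_pool, lap_pool, keep

Gating the kept signals by their scores is the standard trick, borrowed from graph top-k pooling, that keeps the selection step differentiable end to end.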

How to run

Install dependencies

# clone project
git clone https://github.com/domenicocinque/spm
cd spm

# [OPTIONAL] create conda environment
conda create -n spm python=3.9
conda activate spm

# install pytorch according to instructions
# https://pytorch.org/get-started/
# install pytorch geometric according to instructions
# https://pytorch-geometric.readthedocs.io/en/latest/notes/installation.html

# install requirements
pip install -r requirements.txt
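# [OPTIONAL] sanity-check the install; the second value is False on CPU-only setups
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"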

Train model with default configuration

# train on CPU
python src/train.py trainer=cpu

# train on GPU
python src/train.py trainer=gpu

Train model with a chosen experiment configuration from configs/experiment/

python src/train.py experiment=experiment_name.yaml
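For reference, a hypothetical experiment file following the lightning-hydra-template conventions is sketched below; the file name and the overridden values are assumptions, not files shipped with this repository (only the trainer and datamodule groups are confirmed by the commands in this README).

# @package _global_
# hypothetical configs/experiment/example.yaml

defaults:
  - override /trainer: default.yaml

trainer:
  max_epochs: 50

datamodule:
  batch_size: 32

Running python src/train.py experiment=example.yaml would then apply these overrides on top of the default configuration.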

You can override any parameter from the command line like this:

python src/train.py trainer.max_epochs=20 datamodule.batch_size=64
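Hydra's multirun flag also lets you sweep over parameter values in a single command:

# sweep over batch sizes with Hydra multirun (-m)
python src/train.py -m datamodule.batch_size=32,64,128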
