MansourGu / how_attentive_are_gats

Code for the paper "How Attentive are Graph Attention Networks?"

How Attentive are Graph Attention Networks?

This repository is the official implementation of How Attentive are Graph Attention Networks?.

GATv2 is now available as part of the PyTorch Geometric library!

https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.GATv2Conv

and is also included in the main directory of this repository.
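
A minimal usage sketch of the PyTorch Geometric `GATv2Conv` layer (not part of this repository's code); the toy graph, feature size, and head count are illustrative:

```python
import torch
from torch_geometric.nn import GATv2Conv

# Toy graph: 4 nodes with 16-dimensional features and 4 directed edges.
x = torch.randn(4, 16)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 0, 3, 2]], dtype=torch.long)

# GATv2 layer: 16 input channels -> 8 output channels per head, 4 attention heads.
conv = GATv2Conv(in_channels=16, out_channels=8, heads=4)

out = conv(x, edge_index)
print(out.shape)  # [4, 32]: head outputs are concatenated by default
```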

GATv2 is now available as part of the DGL library!

https://docs.dgl.ai/en/latest/api/python/nn.pytorch.html#gatv2conv

and is also included in this repository.
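
A minimal usage sketch of DGL's `GATv2Conv` layer (again, not part of this repository's code); the toy graph and layer sizes are illustrative:

```python
import dgl
import torch
from dgl.nn.pytorch import GATv2Conv

# Toy graph: 4 nodes, 4 directed edges, 16-dimensional node features.
g = dgl.graph(([0, 1, 2, 3], [1, 0, 3, 2]))
feat = torch.randn(4, 16)

# GATv2 layer: 16 input features -> 8 output features per head, 4 attention heads.
conv = GATv2Conv(in_feats=16, out_feats=8, num_heads=4)

out = conv(g, feat)
print(out.shape)  # [4, 4, 8]: (nodes, heads, out_feats)
```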

GATv2 is now available as part of Google's TensorFlow GNN library!

https://github.com/tensorflow/gnn/blob/main/tensorflow_gnn/docs/api_docs/python/gnn/keras/layers/GATv2.md

The rest of the code for reproducing the experiments in the paper will be made publicly available.

Citation

How Attentive are Graph Attention Networks?

@article{brody2021attentive,
  title={How Attentive are Graph Attention Networks?},
  author={Brody, Shaked and Alon, Uri and Yahav, Eran},
  journal={arXiv preprint arXiv:2105.14491},
  year={2021}
}
