There are 23 repositories under the graph-transformer topic.
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
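The two ingredients this paper carries over from NLP transformers are attention restricted to each node's neighborhood and Laplacian eigenvector positional encodings. Below is a minimal sketch of just the positional-encoding step, assuming a dense adjacency matrix; the function name and shapes are illustrative, not the repository's API.

```python
import torch

def laplacian_pe(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Laplacian eigenvector positional encodings (illustrative sketch).

    adj: dense (n, n) adjacency matrix of an undirected graph.
    Returns one k-dimensional encoding per node, taken from the
    eigenvectors of the symmetric normalized Laplacian with the
    smallest nonzero eigenvalues.
    """
    n = adj.size(0)
    deg = adj.sum(dim=1)
    d_inv_sqrt = deg.clamp(min=1).pow(-0.5)  # clamp guards isolated nodes
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = torch.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    eigvals, eigvecs = torch.linalg.eigh(lap)  # eigenvalues in ascending order
    # Drop the trivial constant eigenvector at eigenvalue 0
    return eigvecs[:, 1:k + 1]

# Example: 2-dim encodings for a 4-node cycle graph
adj = torch.tensor([[0., 1., 0., 1.],
                    [1., 0., 1., 0.],
                    [0., 1., 0., 1.],
                    [1., 0., 1., 0.]])
pe = laplacian_pe(adj, k=2)  # shape (4, 2)
```

Since each eigenvector is defined only up to sign, the paper randomly flips the signs of these encodings during training; the sketch leaves the sign as returned by the solver.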
Papers about graph transformers.
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
The official implementation of the NeurIPS 2022 spotlight paper "NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification"
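NodeFormer's scalability comes from kernelized all-pair attention: message passing between every node pair without ever materializing the (n, n) attention matrix. The sketch below illustrates the generic kernel trick with a simple elu-based feature map; NodeFormer's actual operator uses random features with a Gumbel-Softmax relaxation, which this does not reproduce.

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v, eps=1e-6):
    """All-pair attention in O(n) memory via the kernel trick (generic sketch).

    Replaces softmax(Q K^T) V with phi(Q) (phi(K)^T V) for a positive
    feature map phi, here the common choice phi(x) = elu(x) + 1, so the
    (n, n) attention matrix is never formed.
    """
    phi_q = F.elu(q) + 1                      # (n, d)
    phi_k = F.elu(k) + 1                      # (n, d)
    kv = phi_k.T @ v                          # (d, d_v): one pass over all nodes
    denom = phi_q @ phi_k.sum(dim=0)[:, None] + eps   # (n, 1) normalizer
    return (phi_q @ kv) / denom               # (n, d_v)
```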
Official PyTorch code for Structure-Aware Transformer.
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
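GNN-LSPE keeps a positional channel separate from the structural one and updates it during message passing, with the positional representations typically initialized from random-walk encodings. A minimal sketch of that initialization, assuming a dense adjacency matrix (the helper name is mine, not the repo's):

```python
import torch

def random_walk_pe(adj: torch.Tensor, k: int) -> torch.Tensor:
    """Random-walk positional encodings (illustrative sketch).

    Entry (i, t) is the probability that a t-step random walk starting
    at node i returns to node i, for t = 1..k. Encodings of this kind
    initialize the learnable positional channel in GNN-LSPE.
    """
    deg = adj.sum(dim=1).clamp(min=1)   # clamp guards isolated nodes
    rw = adj / deg[:, None]             # row-stochastic transition matrix D^{-1} A
    diags, mat = [], rw
    for _ in range(k):
        diags.append(mat.diagonal())    # return probabilities after t steps
        mat = mat @ rw
    return torch.stack(diags, dim=1)    # shape (n, k)
```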
[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
Long Range Graph Benchmark, NeurIPS 2022 Datasets and Benchmarks Track
Official Code Repository for the paper "Accurate Learning of Graph Representations with Graph Multiset Pooling" (ICLR 2021)
SignNet and BasisNet, sign- and basis-invariant networks for spectral graph representation learning.
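SignNet addresses the sign ambiguity of Laplacian eigenvectors: any eigenvector v can be replaced by -v, so an encoder should satisfy f(v) = f(-v). It achieves this by applying a shared network to both signs and summing, rho(phi(v) + phi(-v)). A minimal sketch with arbitrary hidden sizes and module names of my choosing:

```python
import torch
import torch.nn as nn

class SignInvariantEncoder(nn.Module):
    """Minimal SignNet-style encoder (sketch; names are illustrative).

    Applying the same network phi to v and -v and summing makes the
    output invariant to flipping the sign of any eigenvector.
    """
    def __init__(self, k: int, hidden: int = 64, out: int = 16):
        super().__init__()
        # phi embeds each eigenvector entry independently
        self.phi = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        # rho mixes the k sign-invariant embeddings per node
        self.rho = nn.Linear(k * hidden, out)

    def forward(self, eigvecs: torch.Tensor) -> torch.Tensor:
        # eigvecs: (n, k), the first k Laplacian eigenvectors
        v = eigvecs.unsqueeze(-1)                # (n, k, 1)
        h = self.phi(v) + self.phi(-v)           # sign-invariant, (n, k, hidden)
        return self.rho(h.flatten(start_dim=1))  # (n, out) node encodings
```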
Repository for CARTE: Context-Aware Representation of Table Entries
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) 2024.
Code for our paper "Attending to Graph Transformers"
Triplet Graph Transformer
Protein Structure Transformer (PST): Endowing pretrained protein language models with structural knowledge
An unofficial implementation of Graph Transformer (Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification) - IJCAI 2021
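PyTorch Geometric ships this operator as TransformerConv, so a minimal usage can stay very short. The graph, sizes, and head count below are illustrative; only the TransformerConv call itself is the library's API.

```python
import torch
from torch_geometric.nn import TransformerConv

# Toy graph: 4 nodes with 8-dim features, edges in COO format
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3],
                           [1, 2, 3, 0]])

# TransformerConv is PyTorch Geometric's implementation of the UniMP
# attention operator; sizes and head count here are illustrative.
conv = TransformerConv(in_channels=8, out_channels=16, heads=2, concat=True)
out = conv(x, edge_index)  # shape (4, 2 * 16): heads are concatenated
```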
Official PyTorch implementation of NeuralWalker
Unified Graph Transformer (UGT) is a graph transformer model specialized in preserving both local and global graph structure, developed by NS Lab @ CUK on a pure PyTorch backend.
Hop-Wise Graph Attention for Scalable and Generalizable Learning on Circuits
Multiresolution Graph Transformers and Wavelet Positional Encoding for Learning Long-Range and Hierarchical Structures
Graph Transformers for Large Graphs
MANDO-HGT is a framework for detecting smart contract vulnerabilities. Given a contract in either source-code or bytecode form, MANDO-HGT adapts heterogeneous graph transformers with customized meta relations for graph nodes and edges to learn their embeddings, then trains classifiers to detect various vulnerability types at the node and graph levels.