Repositories under the graph-transformer topic:
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
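The core idea of this paper is to restrict Transformer self-attention to each node's graph neighborhood. A minimal single-head sketch in plain Python (illustrative only, not the repository's code; the function name and signature are assumptions):

```python
import math

def graph_attention(h, adj, Wq, Wk, Wv):
    """Single-head self-attention masked by graph structure: each node
    attends only to its neighbors (plus itself) instead of all nodes.

    h   : list of n node feature vectors, each of length d
    adj : set of directed edges (i, j); self-attention is always allowed
    Wq, Wk, Wv : d x d projection matrices as lists of lists
    """
    def matvec(W, x):
        return [sum(W[r][c] * x[c] for c in range(len(x))) for r in range(len(W))]

    d = len(h[0])
    q = [matvec(Wq, x) for x in h]
    k = [matvec(Wk, x) for x in h]
    v = [matvec(Wv, x) for x in h]

    out = []
    for i in range(len(h)):
        # the graph mask: node i only sees its neighbors and itself
        nbrs = [j for j in range(len(h)) if (i, j) in adj or j == i]
        scores = [sum(q[i][t] * k[j][t] for t in range(d)) / math.sqrt(d)
                  for j in nbrs]
        # numerically stable softmax over the (masked) scores
        m = max(scores)
        w = [math.exp(s - m) for s in scores]
        z = sum(w)
        a = [wi / z for wi in w]
        # output is a convex combination of neighbor values
        out.append([sum(a[p] * v[j][t] for p, j in enumerate(nbrs))
                    for t in range(d)])
    return out
```

With a dense adjacency this reduces to standard Transformer attention; the paper additionally proposes Laplacian eigenvector positional encodings, which are omitted here for brevity.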
Papers about graph transformers.
Universal Graph Transformer Self-Attention Networks (The Web Conference, WWW 2022) (PyTorch and TensorFlow)
The official implementation of the NeurIPS 2022 spotlight paper "NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification"
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Official PyTorch code for Structure-Aware Transformer.
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
[AAAI 2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
Long Range Graph Benchmark, NeurIPS 2022 Datasets and Benchmarks Track
Official Code Repository for the paper "Accurate Learning of Graph Representations with Graph Multiset Pooling" (ICLR 2021)
SignNet and BasisNet: sign- and basis-invariant networks for spectral graph representation learning
Code for our paper "Attending to Graph Transformers"
A comprehensive resource hub compiling all graph papers accepted at the International Conference on Learning Representations (ICLR) 2024.
An unofficial implementation of Graph Transformer (Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification) - IJCAI 2021
Triplet Graph Transformer
Protein Structure Transformer (PST): Endowing pretrained protein language models with structural knowledge
Hop-Wise Graph Attention for Scalable and Generalizable Learning on Circuits
Unified Graph Transformer (UGT) is a novel Graph Transformer model that preserves both local and global graph structures, developed by NS Lab @ CUK on a pure PyTorch backend.
Multiresolution Graph Transformers and Wavelet Positional Encoding for Learning Long-Range and Hierarchical Structures
MANDO-HGT is a framework for detecting smart contract vulnerabilities. Given a contract in either source-code or bytecode form, MANDO-HGT adapts heterogeneous graph transformers with customized meta relations for graph nodes and edges, learning node and edge embeddings and training classifiers that detect various vulnerability types at the node and graph levels.
Community-aware Graph Transformer (CGT) is a novel Graph Transformer model that uses community structures to address node-degree bias in the message-passing mechanism, developed by NS Lab @ CUK on a pure PyTorch backend.
Welcome to the Graph Neural Networks (06838-01) class repository for the Department of Artificial Intelligence at the Catholic University of Korea. This repository shares and archives lecture materials for the class, such as practice exercises, assignments, and sample code.