
Graph Transformer (IJCAI 2021)

An unofficial implementation of Graph Transformer: "Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification" (IJCAI 2021)
> https://www.ijcai.org/proceedings/2021/0214.pdf

This GNN architecture implements the model described in Section 3.1 (Graph Transformer) of the paper.

The code was written with reference to this repository, with some modifications to match the paper as published at IJCAI 2021.
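For intuition, the attention described in Section 3.1 augments keys and values with edge features and restricts attention to graph neighbors. Below is a minimal, self-contained sketch of that idea — not the package's actual API: a single head, no learned projections, and illustrative dimensions throughout.

```python
# Sketch of edge-augmented graph attention (single head, no learned weights).
# Assumptions: edges[b, i, j] holds the feature of edge i -> j, and a zero
# entry in `adjacency` means "no edge" and is masked out of the softmax.
import math
import torch
import torch.nn.functional as F

def graph_attention(nodes, edges, adjacency):
    # nodes: (b, n, d), edges: (b, n, n, d), adjacency: (b, n, n)
    b, n, d = nodes.shape
    q = nodes                                # queries (learned projections in the real model)
    k = nodes.unsqueeze(1) + edges           # (b, n, n, d): key_j augmented with e_ij
    v = nodes.unsqueeze(1) + edges           # (b, n, n, d): value_j augmented with e_ij
    scores = (q.unsqueeze(2) * k).sum(-1) / math.sqrt(d)   # (b, n, n) scaled dot products
    scores = scores.masked_fill(adjacency == 0, float("-inf"))
    attn = F.softmax(scores, dim=-1)         # attend only over neighbors
    return (attn.unsqueeze(-1) * v).sum(dim=2)  # (b, n, d) aggregated messages

nodes = torch.randn(1, 4, 8)
edges = torch.randn(1, 4, 4, 8)
adjacency = torch.ones(1, 4, 4)  # fully connected toy graph
out = graph_attention(nodes, edges, adjacency)
print(out.shape)  # torch.Size([1, 4, 8])
```

The output keeps the node dimension, so blocks of this form can be stacked, as `num_blocks` does in the model below.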


Installation

```shell
pip install graph-transformer
```

Usage

```python
import torch
from graph_transformer import GraphTransformerModel

model = GraphTransformerModel(
    node_dim=512,
    edge_dim=512,
    num_blocks=3,       # number of graph transformer blocks
    num_heads=8,
    last_average=True,  # whether to average or concatenate heads in the last block
    model_dim=None,     # if None, node_dim is used as the dimension of the graph transformer block
)

nodes = torch.randn(1, 128, 512)
edges = torch.randn(1, 128, 128, 512)
adjacency = torch.ones(1, 128, 128)

nodes = model(nodes, edges, adjacency)
```
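The all-ones `adjacency` above makes the graph fully connected. For a real graph, the mask would be built from an edge list; the sketch below assumes nonzero entries mark attended pairs and that self-loops should be included so each node attends to itself (both assumptions, not documented package behavior).

```python
# Build a (1, n, n) adjacency mask from an edge list (illustrative edges).
import torch

n = 128
edge_list = [(0, 1), (1, 2), (2, 0)]       # hypothetical undirected edges
adjacency = torch.eye(n).unsqueeze(0)      # (1, n, n), self-loops on the diagonal
for i, j in edge_list:
    adjacency[0, i, j] = 1.0
    adjacency[0, j, i] = 1.0               # symmetric: undirected graph
```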

License

MIT License