This project is part of the CPSC 583: Deep Learning for Graph Structured Data course.
This project is a modification of the study by Brbic et al. (2022), Nature Methods, titled "Annotation of spatially resolved single-cell data with STELLAR".
1. Python environment (Optional): We recommend using the Conda package manager. The steps to install the required packages are given below.
module load miniconda
conda create --name pytorch_env python=3.8 pytorch torchvision torchaudio torch-geometric cudatoolkit=11.3 -c pytorch
conda activate pytorch_env
pip install pyg-lib torch-scatter torch-sparse -f https://data.pyg.org/whl/torch-1.13.0+cpu.html
pip install torch-geometric
pip install -r requirements.txt
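After installation, a quick sanity check such as the following (a minimal sketch of my own, not part of the repository) can confirm that PyTorch and PyTorch Geometric are importable and that CUDA is visible:

```python
# sanity_check.py -- verify that the core dependencies are importable.
import torch
import torch_geometric

print("PyTorch version:", torch.__version__)
print("PyG version:", torch_geometric.__version__)
print("CUDA available:", torch.cuda.is_available())
```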
The CODEX multiplexed imaging datasets are available on Dryad.
This step may consume a large amount of memory (about 32 GB of RAM) and time (roughly 30 × 10 minutes, i.e. around 5 hours in total).
sh run_script.sh
python3 run_stellar.py --dataset Hubmap --num-heads 22 --model GAT --randseed 2 --sample-rate 1.0
GAT can be changed to any of the following models: FullyConnected, GCN, GraphSAGE, Transformers.
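How the model flag is resolved is internal to the training script; purely for illustration, a hypothetical mapping from these model names to PyTorch Geometric layers might look like the sketch below (the function and argument names are my own, not the repository's actual code):

```python
# Hypothetical sketch of how a --model flag could select a PyG convolution;
# the actual STELLAR/project code may organize this differently.
import torch.nn as nn
from torch_geometric.nn import GCNConv, GATConv, SAGEConv, TransformerConv


def build_conv(model_name, in_dim, out_dim, num_heads=1):
    """Return a single message-passing layer for the requested model type."""
    if model_name == "GCN":
        return GCNConv(in_dim, out_dim)
    if model_name == "GAT":
        # Multi-head attention; concat=False keeps the output dimension at out_dim.
        return GATConv(in_dim, out_dim, heads=num_heads, concat=False)
    if model_name == "GraphSAGE":
        return SAGEConv(in_dim, out_dim)
    if model_name == "Transformers":
        return TransformerConv(in_dim, out_dim, heads=num_heads, concat=False)
    if model_name == "FullyConnected":
        # Ignores the graph structure entirely; acts as a per-cell linear layer.
        return nn.Linear(in_dim, out_dim)
    raise ValueError(f"Unknown model: {model_name}")
```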
python3 run_stellar.py --dataset TonsilBE --num-heads 13 --sample-rate 0.35 --model GraphSAGE
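The individual commands above can also be scripted. The sketch below is a convenience wrapper of my own (not part of the repository) that loops over models and random seeds using the same CLI flags shown above; additional flags such as --num-heads can be appended as needed:

```python
# run_trials.py -- sweep over models and seeds; assumes the flags match the
# run_stellar.py examples above.
import subprocess

MODELS = ["FullyConnected", "GCN", "GAT", "GraphSAGE", "Transformers"]
SEEDS = [0, 1, 2]

for model in MODELS:
    for seed in SEEDS:
        cmd = [
            "python3", "run_stellar.py",
            "--dataset", "Hubmap",
            "--model", model,
            "--randseed", str(seed),
            "--sample-rate", "1.0",
        ]
        print("Running:", " ".join(cmd))
        subprocess.run(cmd, check=True)
```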
Results of the trials I ran are given in Hubmap_results.csv and TonsilBE_results.csv. The first three rows are the accuracy results for three runs on different dataset splits using different random seeds, and the last three rows are the balanced accuracy results for the same three runs. The columns correspond to the different models tested: Fully Connected (FC), Graph Convolutional Network (GCN), Graph Attention Network (GAT), GraphSAGE, and Graph Transformer.
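To summarize these files, a short pandas sketch like the following can be used (it assumes the layout described above: three accuracy rows followed by three balanced-accuracy rows, one column per model):

```python
# summarize_results.py -- average the per-seed results in the result CSVs.
# Assumes rows 0-2 hold accuracy, rows 3-5 hold balanced accuracy, and each
# column corresponds to one model, as described above.
import pandas as pd

for path in ["Hubmap_results.csv", "TonsilBE_results.csv"]:
    df = pd.read_csv(path)
    acc_mean = df.iloc[0:3].mean()      # mean accuracy over the three seeds
    bal_acc_mean = df.iloc[3:6].mean()  # mean balanced accuracy over the three seeds
    print(f"\n{path}")
    print("Mean accuracy per model:\n", acc_mean)
    print("Mean balanced accuracy per model:\n", bal_acc_mean)
```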