Graphcore Research's repositories
unit-scaling
A library for unit scaling in PyTorch
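A minimal sketch of the unit-scaling idea behind this library, written in plain NumPy rather than the `unit_scaling` library's actual API: each op is scaled so that unit-variance inputs and weights yield unit-variance outputs, keeping activations in a numerically friendly range.

```python
import numpy as np

# Concept illustration only (hypothetical helper, not the unit_scaling API):
# a matmul over fan_in unit-variance terms has output variance ~fan_in,
# so dividing by sqrt(fan_in) restores unit scale.
def unit_scaled_matmul(x, w):
    fan_in = w.shape[0]
    return (x @ w) / np.sqrt(fan_in)

rng = np.random.default_rng(0)
x = rng.standard_normal((4096, 256))  # unit-variance activations
w = rng.standard_normal((256, 512))   # unit-variance weights
y = unit_scaled_matmul(x, w)
print(round(float(y.std()), 1))  # ≈ 1.0
```

Keeping tensors near unit scale is what makes low-precision formats such as FP8 usable without per-tensor loss scaling.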
out-of-the-box-fp8-training
Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8.
jax-experimental
JAX for Graphcore IPU (experimental)
unit-scaling-demo
Unit Scaling demo and experimentation code
llm-inference-research
An experimentation platform for LLM inference optimisation
random-bases
Improving Neural Network Training in Low Dimensional Random Bases
tessellate-ipu
TessellateIPU: low level Poplar tile programming from Python
pytorch-tensor-tracker
Flexibly track the outputs and grad-outputs of a torch.nn.Module.
poptorch-experimental-addons
IPU-specific extensions to PopTorch
hydronet-gnn
Fast and Accurate Predictions from 3D Molecular Structures
jit-dynamic-lookup
An experimental dynamic tensor slice operation, using JIT-compiled data exchanges
flash-attention-ipu
Poplar implementation of FlashAttention for IPU
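The key trick FlashAttention relies on is an online (streaming) softmax, which lets attention be computed block by block without materialising the full score matrix. The sketch below is a plain NumPy illustration of that trick, not the repo's Poplar/IPU implementation:

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference: materialise the full score matrix, then softmax.
    s = q @ k.T
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def flash_attention(q, k, v, block=4):
    # Process keys/values in blocks, maintaining a running row max (m)
    # and running softmax denominator (l), rescaling past contributions
    # whenever the max changes.
    n, d_v = q.shape[0], v.shape[1]
    out = np.zeros((n, d_v))
    m = np.full((n, 1), -np.inf)
    l = np.zeros((n, 1))
    for j in range(0, k.shape[0], block):
        kj, vj = k[j:j + block], v[j:j + block]
        s = q @ kj.T
        m_new = np.maximum(m, s.max(axis=-1, keepdims=True))
        p = np.exp(s - m_new)
        scale = np.exp(m - m_new)
        l = l * scale + p.sum(axis=-1, keepdims=True)
        out = out * scale + p @ vj
        m = m_new
    return out / l

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((8, 4)) for _ in range(3))
print(np.allclose(flash_attention(q, k, v), naive_attention(q, k, v)))  # True
```

Because only one block of scores exists at a time, peak memory is O(n·block) rather than O(n²), which is what makes the algorithm a good fit for the IPU's small per-tile memory.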
qm1b-dataset
One billion quantum mechanical simulations of molecules containing 9–11 heavy atoms
sparsity-benchmarks
Benchmarking code for “PopSparse: Accelerated block sparse matrix multiplication on IPU”
ipu-ray-lib
Path-tracer with Neural HDRI for Graphcore IPUs.
kg-topology-toolbox
A Python toolbox to compute topological metrics and statistics for Knowledge Graphs
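As a toy illustration of the kind of statistics such a toolbox computes (hypothetical code, not the kg-topology-toolbox API), entity degrees can be derived directly from a list of (head, relation, tail) triples:

```python
from collections import Counter

# Hypothetical mini knowledge graph as (head, relation, tail) triples.
triples = [
    ("a", "likes", "b"),
    ("a", "likes", "c"),
    ("b", "knows", "c"),
]

# Out-degree: triples in which an entity appears as head;
# in-degree: triples in which it appears as tail.
out_degree = Counter(h for h, _, _ in triples)
in_degree = Counter(t for _, _, t in triples)
print(out_degree["a"], in_degree["c"])  # 2 2
```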
tensorflow-jax-experimental
TensorFlow XLA backend for the experimental JAX port on Mk2 IPUs
graphium-smg
Graphium fork for Scaling Molecular GNNs project at Graphcore
llama.cpp
LLM inference in C/C++
making-efficientnet-more-efficient
Supplementary materials for: Making EfficientNet More Efficient: Exploring Batch-Independent Normalization, Group Convolutions and Reduced Resolution Training