Graphcore Research's repositories
unit-scaling
A library for unit scaling in PyTorch
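
    # Hedged sketch of typical usage, assuming the library provides drop-in
    # replacements for torch.nn layers (e.g. unit_scaling.Linear) as described
    # in its README; consult the repository for the actual API.
    import torch
    import torch.nn as nn
    import unit_scaling as uu  # import alias assumed

    # Swap standard nn.Linear layers for unit-scaled counterparts so that
    # activations and gradients have roughly unit variance at initialisation.
    model = nn.Sequential(
        uu.Linear(256, 1024),
        nn.GELU(),
        uu.Linear(1024, 256),
    )

    x = torch.randn(32, 256)
    y = model(x)
    print(y.std())  # expected to be close to 1 at initialisation
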
out-of-the-box-fp8-training
Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8.
llm-inference-research
An experimentation platform for LLM inference optimisation
jax-experimental
JAX for Graphcore IPU (experimental)
unit-scaling-demo
Unit Scaling demo and experimentation code
pytorch-tensor-tracker
Flexibly track the outputs and grad-outputs of torch.nn.Module instances.
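
    # For illustration only: the mechanism such a tracker wraps can be sketched
    # with plain PyTorch forward/backward hooks. This is not the library's own
    # API; see the repository for that.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 1))
    captured = {}

    def save_output(name):
        def hook(module, inputs, output):
            captured[f"{name}.output"] = output.detach()
        return hook

    def save_grad_output(name):
        def hook(module, grad_input, grad_output):
            captured[f"{name}.grad_output"] = grad_output[0].detach()
        return hook

    for name, module in model.named_modules():
        if name:  # skip the root nn.Sequential container
            module.register_forward_hook(save_output(name))
            module.register_full_backward_hook(save_grad_output(name))

    model(torch.randn(4, 8)).sum().backward()
    print(sorted(captured))  # outputs and grad-outputs keyed by module name
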
random-bases
Improving Neural Network Training in Low Dimensional Random Bases
tessellate-ipu
TessellateIPU: low-level Poplar tile programming from Python
poptorch-experimental-addons
IPU-specific extensions to PopTorch
jax-scalify
JAX Scalify: end-to-end scaled arithmetic
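
    # Rough conceptual sketch of scaled arithmetic (not the library's API):
    # values are stored as a low-precision payload plus a separate scale,
    # and operations propagate both.
    import numpy as np

    class ScaledArray:
        """Conceptual sketch: value == data * scale, with data kept near unit range."""
        def __init__(self, data, scale):
            self.data, self.scale = data, scale

        @staticmethod
        def from_array(x):
            scale = np.float32(np.max(np.abs(x))) or np.float32(1.0)
            return ScaledArray((x / scale).astype(np.float16), scale)

        def __matmul__(self, other):
            # Multiply the low-precision payloads; combine the scales separately.
            return ScaledArray(self.data @ other.data, self.scale * other.scale)

        def to_array(self):
            return self.data.astype(np.float32) * self.scale

    a = ScaledArray.from_array(100.0 * np.random.randn(4, 8).astype(np.float32))
    b = ScaledArray.from_array(0.01 * np.random.randn(8, 2).astype(np.float32))
    print((a @ b).to_array())  # result recovered at full scale
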
kg-topology-toolbox
A Python toolbox to compute topological metrics and statistics for Knowledge Graphs
hydronet-gnn
Fast and Accurate Predictions from 3D Molecular Structures
jit-dynamic-lookup
An experimental dynamic tensor slice operation, using JIT-compiled data exchanges
qm1b-dataset
A dataset of one billion quantum-mechanical simulations of molecules containing 9–11 heavy atoms
flash-attention-ipu
Poplar implementation of FlashAttention for IPU
sparsity-benchmarks
Benchmarking code for “PopSparse: Accelerated block sparse matrix multiplication on IPU”
ipu-ray-lib
Path-tracer with Neural HDRI for Graphcore IPUs.
tensorflow-jax-experimental
TensorFlow XLA backend for experimental JAX on Mk2 IPUs
graphium-smg
Graphium fork for the Scaling Molecular GNNs project at Graphcore
sparq-llama.cpp
Fork of llama.cpp (LLM inference in C/C++) adapted for SparQ Attention
making-efficientnet-more-efficient
Supplementary materials for “Making EfficientNet More Efficient: Exploring Batch-Independent Normalization, Group Convolutions and Reduced Resolution Training”
ml_dtypes
A stand-alone implementation of several NumPy dtype extensions used in machine learning.
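
    # For example, the extra dtypes plug straight into NumPy; bfloat16 and
    # float8_e4m3fn are among the formats the package provides.
    import numpy as np
    import ml_dtypes

    # bfloat16 behaves like any other NumPy dtype.
    x = np.array([0.1, 0.5, 2.0], dtype=ml_dtypes.bfloat16)
    print(x.dtype)  # bfloat16

    # Casting to an 8-bit float format quantises the values further.
    y = x.astype(ml_dtypes.float8_e4m3fn)
    print(y)

    # finfo reports the numeric limits of the extended formats.
    print(ml_dtypes.finfo(ml_dtypes.float8_e4m3fn).max)  # 448.0
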