Graphcore's repositories
vllm-fork
A high-throughput and memory-efficient inference and serving engine for LLMs
ogb-lsc-pcqm4mv2
OGB-LSC is the Large-Scale Competition by the Open Graph Benchmark, created to help accelerate research into machine learning on graph-structured data
Gradient-Tensorflow2
TensorFlow 2 Models on IPUs using Paperspace Gradient
zephyr-fork
Primary Git Repository for the Zephyr Project. Zephyr is a new generation, scalable, optimized, secure RTOS for multiple hardware architectures.
pytorch-fork
Tensors and Dynamic neural networks in Python with strong GPU acceleration
torchbench-fork
TorchBench is a collection of open source benchmarks used to evaluate PyTorch performance.
distributed-kge-poplar
The application is an end-user training and evaluation system for standard knowledge graph embedding models. It was developed to optimise for the WikiKG90Mv2 dataset
Gradient-Pytorch-Geometric
A repository of tutorials and examples demonstrating use of PyTorch Geometric with IPUs
Gradient-HuggingFace
Tasks and tutorials using Graphcore's IPU with Hugging Face. Originally at https://github.com/gradient-ai/Graphcore-HuggingFace
optimum-graphcore-fork
Blazing fast training of 🤗 Transformers on Graphcore IPUs
continue-fork
⏩ Continue is an open-source autopilot for VS Code and JetBrains—the easiest way to code with any LLM
bigcode-evaluation-harness-fork
A framework for the evaluation of autoregressive code generation language models.
simple-server-framework
Simple Server Framework provides a wrapper to add serving to an application using a minimal declarative config, plus utilities to package and deploy the application.
api-deployment
Example deployment of an NLP inference server on Gcore, using FastAPI, Hugging Face's optimum-graphcore, and GitHub workflows.
lightning-Graphcore-fork
For users looking to save money and run large models faster using single or multiple IPU devices
lm-evaluation-harness-fork
A framework for few-shot evaluation of language models.
rules_rust-fork
Rust rules for Bazel
PopTransformer
PopTransformer provides a fundamental framework (including layers, operators, models, etc.) that allows users to develop and run highly optimized transformer-based models (inference only) with the Poplar SDK on Graphcore IPUs.
license-checker
A tool to accumulate license information for pip and aptitude dependencies.
diffusers-fork
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch
pytorch_geometric-fork
Graph Neural Network Library for PyTorch
transformers-fork
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
pytorch-image-models-fork
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (ViT), MobileNet-V3/V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more. Fork of: https://github.com/huggingface/pytorch-image-models