Marko Kabić (kabicm)

Company: ETH Zurich

Location: ETH Zürich

Home Page: https://www.linkedin.com/in/marko-kabic/

Marko Kabić's repositories

alpa

Auto parallelization for large-scale neural networks

License: Apache-2.0 · Stargazers: 0 · Issues: 0

apex

A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch

License: BSD-3-Clause · Stargazers: 0 · Issues: 0
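
A minimal sketch of the classic Apex AMP usage pattern (mixed precision is also available natively via torch.cuda.amp these days); the model, optimizer, and opt_level here are illustrative and assume a CUDA device with Apex installed:

```python
import torch
from apex import amp  # requires a CUDA-enabled Apex build

model = torch.nn.Linear(10, 1).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# opt_level="O1": patch ops to run in fp16 where it is numerically safe
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

loss = model(torch.randn(4, 10, device="cuda")).sum()
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()  # backprop through the loss-scaled graph
optimizer.step()
```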

attention-is-all-you-need-pytorch

A PyTorch implementation of the Transformer model in "Attention is All You Need".

Language: Python · License: MIT · Stargazers: 0 · Issues: 0
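
The core operation in that model is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal PyTorch sketch of the formula (tensor shapes and function name are illustrative, not the repository's API):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)   # attention weights over keys
    return weights @ v                        # weighted sum of values

q = k = v = torch.randn(1, 8, 16, 64)
out = scaled_dot_product_attention(q, k, v)   # -> (1, 8, 16, 64)
```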

ColossalAI

Colossal-AI: A Unified Deep Learning System for Big Model Era

License: Apache-2.0 · Stargazers: 0 · Issues: 0

COSTA

Distributed Communication-Optimal Shuffle and Transpose Algorithm

License: BSD-3-Clause · Stargazers: 0 · Issues: 0

cudf

cuDF - GPU DataFrame Library

License: Apache-2.0 · Stargazers: 0 · Issues: 0
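
cuDF mirrors much of the pandas API while keeping data in GPU memory; a minimal sketch, assuming a RAPIDS install and an NVIDIA GPU:

```python
import cudf

# DataFrame lives in GPU memory; the API mirrors pandas
df = cudf.DataFrame({"x": [1, 1, 2, 2], "y": [10.0, 20.0, 30.0, 40.0]})
print(df.groupby("x")["y"].mean())   # group-by aggregation on the GPU
print(df.query("y > 15"))            # pandas-style filtering
```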

cylon

Cylon is a fast, scalable, distributed-memory, parallel runtime with a Pandas-like DataFrame.

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

FasterTransformer

Transformer-related optimization, including BERT, GPT

License: Apache-2.0 · Stargazers: 0 · Issues: 0

FBGEMM

FB (Facebook) + GEMM (General Matrix-Matrix Multiplication) - https://code.fb.com/ml-applications/fbgemm/

License: NOASSERTION · Stargazers: 0 · Issues: 0

flash-attention

Fast and memory-efficient exact attention

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

flax

Flax is a neural network library for JAX that is designed for flexibility.

License: Apache-2.0 · Stargazers: 0 · Issues: 0
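
A minimal flax.linen module illustrating the init/apply split that keeps parameters as an explicit JAX pytree; the layer sizes and module name are arbitrary:

```python
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    hidden: int

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.hidden)(x))
        return nn.Dense(1)(x)

model = MLP(hidden=16)
x = jnp.ones((4, 8))
params = model.init(jax.random.PRNGKey(0), x)   # parameters as a pytree
y = model.apply(params, x)                      # pure forward pass
```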

gavel

Code for "Heterogeneity-Aware Cluster Scheduling Policies for Deep Learning Workloads", which appeared at OSDI 2020

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 0

google-research

Google Research

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 0

marius

Large scale embeddings on a single machine.

Language: C++ · License: Apache-2.0 · Stargazers: 0 · Issues: 0

mesh

Mesh TensorFlow: Model Parallelism Made Easier

License: Apache-2.0 · Stargazers: 0 · Issues: 0

mesh-transformer-jax

Model parallel transformers in JAX and Haiku

License: Apache-2.0 · Stargazers: 0 · Issues: 0

minGPT

A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training

License: MIT · Stargazers: 0 · Issues: 0

parallelformers

Parallelformers: An Efficient Model Parallelization Toolkit for Deployment

License: Apache-2.0 · Stargazers: 0 · Issues: 0

pytorch3d

PyTorch3D is FAIR's library of reusable components for deep learning with 3D data

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0

query-engine

LingoDB: A new analytical database system that blurs the lines between databases and compilers.

License: MIT · Stargazers: 0 · Issues: 0

semiprof

Simple, thread-safe, annotation-based C++ profiler.

Language: C++ · License: BSD-3-Clause · Stargazers: 0 · Issues: 0

snn_toolbox

Toolbox for converting analog to spiking neural networks (ANN to SNN), and running them in a spiking neuron simulator.

License: MIT · Stargazers: 0 · Issues: 0

spack

A flexible package manager that supports multiple versions, configurations, platforms, and compilers.

Language: Python · License: NOASSERTION · Stargazers: 0 · Issues: 0

sql-parser

SQL Parser for C++. Building C++ object structure from SQL statements.

License: MIT · Stargazers: 0 · Issues: 0

transformer-from-scratch

Well-documented, unit-tested, type-checked and formatted implementation of a vanilla transformer, for educational purposes.

Stargazers: 0 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

License: Apache-2.0 · Stargazers: 0 · Issues: 0
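
The quickest entry point to 🤗 Transformers is the pipeline API, which downloads a default pretrained model on first use; a minimal sketch (the input sentence and printed output are illustrative):

```python
from transformers import pipeline

# Downloads a default pretrained sentiment model on first call
classifier = pipeline("sentiment-analysis")
print(classifier("Communication-optimal matrix multiplication is neat."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```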

trax

Trax — Deep Learning with Clear Code and Speed

License: Apache-2.0 · Stargazers: 0 · Issues: 0