Myle Ott (myleott)



Location: New York, NY

Home page: http://myleott.com


Myle Ott's starred repositories

torchtitan

A native PyTorch library for large model training

Language: Python · License: BSD-3-Clause · Stargazers: 1,370 · Issues: 0

ansible-role-rke2

Ansible Role to install RKE2 Kubernetes.

Language: Jinja · License: MIT · Stargazers: 278 · Issues: 0

keras-core

A multi-backend implementation of the Keras API, with support for TensorFlow, JAX, and PyTorch.

Language: Python · License: Apache-2.0 · Stargazers: 1,269 · Issues: 0

mctx

Monte Carlo tree search in JAX

Language: Python · License: Apache-2.0 · Stargazers: 2,276 · Issues: 0

tini

A tiny but valid `init` for containers

Language: C · License: MIT · Stargazers: 9,668 · Issues: 0

nginx-s3-gateway

NGINX S3 Caching Gateway

Language: JavaScript · License: Apache-2.0 · Stargazers: 471 · Issues: 0

chatgpt-mac

ChatGPT for Mac, living in your menubar.

Language: JavaScript · Stargazers: 6,381 · Issues: 0

growthbook

Open Source Feature Flagging and A/B Testing Platform

Language: TypeScript · License: NOASSERTION · Stargazers: 5,861 · Issues: 0

langchain

🦜🔗 Build context-aware reasoning applications

Language: Jupyter Notebook · License: MIT · Stargazers: 89,828 · Issues: 0

dust

Amplify your team's potential with customizable and secure AI assistants.

Language: TypeScript · License: MIT · Stargazers: 914 · Issues: 0

skypilot

SkyPilot: Run LLMs, AI, and Batch jobs on any cloud. Get maximum savings, highest GPU availability, and managed execution—all with a simple interface.

Language: Python · License: Apache-2.0 · Stargazers: 6,323 · Issues: 0

img2dataset

Easily turn large sets of image URLs into an image dataset. Can download, resize, and package 100M URLs in 20h on one machine.

Language: Python · License: MIT · Stargazers: 3,475 · Issues: 0

k-diffusion

Karras et al. (2022) diffusion models for PyTorch

Language: Python · License: MIT · Stargazers: 2,209 · Issues: 0

diffusers

🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.

Language: Python · License: Apache-2.0 · Stargazers: 24,279 · Issues: 0

pyrallis

Pyrallis is a framework for structured configuration parsing from both the command line and files. Simply define your desired configuration structure as a dataclass and let pyrallis do the rest!

Language: Python · License: MIT · Stargazers: 185 · Issues: 0
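The dataclass-driven pattern the description refers to can be sketched with the standard library alone. This is only an illustration of the idea, not pyrallis's actual API; the `TrainConfig` fields and the `parse_config` helper are invented for the example (pyrallis additionally supports loading from YAML/JSON files):

```python
import argparse
from dataclasses import dataclass, fields

@dataclass
class TrainConfig:
    # Hypothetical fields for illustration; the library infers the
    # command-line interface from your dataclass definition.
    lr: float = 1e-3
    epochs: int = 10
    run_name: str = "baseline"

def parse_config(argv=None) -> TrainConfig:
    """Build an argparse parser from the dataclass fields, mimicking
    the dataclass-to-CLI mapping that pyrallis automates."""
    parser = argparse.ArgumentParser()
    for f in fields(TrainConfig):
        # Each field becomes a typed flag with the dataclass default.
        parser.add_argument(f"--{f.name}", type=f.type, default=f.default)
    args = parser.parse_args(argv)
    return TrainConfig(**vars(args))

cfg = parse_config(["--lr", "0.01", "--epochs", "5"])
```

Unspecified flags fall back to the dataclass defaults, so the dataclass is the single source of truth for the configuration schema.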

NeMo

A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)

Language: Python · License: Apache-2.0 · Stargazers: 11,081 · Issues: 0

FasterTransformer

Transformer related optimization, including BERT, GPT

Language: C++ · License: Apache-2.0 · Stargazers: 5,686 · Issues: 0

metaseq

Repo for external large-scale work

Language: Python · License: MIT · Stargazers: 6,441 · Issues: 0

gpu-burn

Multi-GPU CUDA stress test

Language: C++ · License: BSD-2-Clause · Stargazers: 1,277 · Issues: 0

timeout-decorator

Timeout decorator for Python

Language: Python · License: MIT · Stargazers: 616 · Issues: 0
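The core technique behind a timeout decorator can be sketched with the standard library's `signal` module. This is a minimal SIGALRM-based sketch (Unix, main thread only), not the timeout-decorator package's implementation, which offers additional modes:

```python
import signal
import time
from functools import wraps

def timeout(seconds):
    """Raise TimeoutError if the wrapped function runs longer than `seconds`."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            def handler(signum, frame):
                raise TimeoutError(f"{func.__name__} timed out after {seconds}s")
            old = signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)            # schedule SIGALRM
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)              # cancel any pending alarm
                signal.signal(signal.SIGALRM, old)  # restore previous handler
        return wrapper
    return decorator

@timeout(1)
def quick():
    return "done"

@timeout(1)
def slow():
    time.sleep(5)  # interrupted by SIGALRM after 1 second
```

Restoring the previous handler in `finally` keeps the decorator composable; a production version also has to deal with non-integer timeouts and code running outside the main thread.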

subprocess-tee

A drop-in replacement for subprocess.run that supports a tee mode, displaying output in real time while still capturing it. No dependencies needed.

Language: Python · License: MIT · Stargazers: 55 · Issues: 0
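The tee behavior the description mentions can be sketched with the standard library: stream the child's stdout line by line, echoing each line as it arrives while also accumulating it. This is a simplified illustration, not the package's implementation; `run_tee` is a hypothetical helper name, and it merges stderr into stdout for brevity:

```python
import subprocess
import sys

def run_tee(cmd):
    """Run `cmd`, echoing stdout in real time while capturing the full output."""
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # simplification: fold stderr into stdout
        text=True,
    )
    captured = []
    for line in proc.stdout:       # yields lines as the child produces them
        sys.stdout.write(line)     # display immediately...
        captured.append(line)      # ...and keep a copy
    proc.wait()
    return proc.returncode, "".join(captured)

rc, out = run_tee([sys.executable, "-c", "print('hello'); print('world')"])
```

Reading the pipe incrementally is what makes the output appear live; `subprocess.run(capture_output=True)` by contrast only hands you the output after the process exits.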

functorch

functorch provides JAX-like composable function transforms for PyTorch.

Language: Jupyter Notebook · License: BSD-3-Clause · Stargazers: 1,381 · Issues: 0

gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries

Language: Python · License: Apache-2.0 · Stargazers: 6,732 · Issues: 0

filesystem_spec

A specification that Python filesystems should adhere to.

Language: Python · License: BSD-3-Clause · Stargazers: 966 · Issues: 0

Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2

Language: Python · License: NOASSERTION · Stargazers: 1,289 · Issues: 0

Megatron-DeepSpeed

Ongoing research training transformer language models at scale, including: BERT & GPT-2

Language: Python · License: NOASSERTION · Stargazers: 1,762 · Issues: 0

TurboTransformers

A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.

Language: C++ · License: NOASSERTION · Stargazers: 1,456 · Issues: 0

hivemind

Decentralized deep learning in PyTorch. Built to train models across thousands of volunteer machines around the world.

Language: Python · License: MIT · Stargazers: 1,930 · Issues: 0