Danielle Rothermel (drothermel)

Danielle Rothermel's starred repositories

dnd-kit

The modern, lightweight, performant, accessible and extensible drag & drop toolkit for React.

Language: TypeScript · License: MIT · Stargazers: 12,107 · Issues: 0

jupyterlab-vim

Vim notebook cell bindings for JupyterLab

Language: TypeScript · License: MIT · Stargazers: 667 · Issues: 0

codemodder-python

Python implementation of the Codemodder framework

Language: Python · License: AGPL-3.0 · Stargazers: 34 · Issues: 0

fasten-onprem

Fasten is an open-source, self-hosted, personal/family electronic medical record aggregator, designed to integrate with hundreds of thousands of insurance providers, hospitals, and clinics.

Language: Go · License: GPL-3.0 · Stargazers: 1,457 · Issues: 0

llm-course

Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 34,741 · Issues: 0

FlexGen

Running large language models on a single GPU for throughput-oriented scenarios.

Language: Python · License: Apache-2.0 · Stargazers: 9,091 · Issues: 0

llama

Inference code for Llama models

Language: Python · License: NOASSERTION · Stargazers: 54,624 · Issues: 0

atlas

Code repository supporting the paper "Atlas: Few-shot Learning with Retrieval Augmented Language Models" (https://arxiv.org/abs/2208.03299)

Language: Python · License: NOASSERTION · Stargazers: 500 · Issues: 0

nle-language-wrapper

NetHack Learning Environment wrapper for a language interface

Language: Python · License: MIT · Stargazers: 32 · Issues: 0

rlmeta

RLMeta is a lightweight, flexible framework for distributed reinforcement learning research.

Language: Python · License: MIT · Stargazers: 285 · Issues: 0

hydra

Hydra is a framework for elegantly configuring complex applications

Language: Python · License: MIT · Stargazers: 8,451 · Issues: 0
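
The core idea behind Hydra is composing one final configuration from layered sources, with later layers overriding earlier ones. As a conceptual illustration only (this is not Hydra's API; `compose` and the dicts below are hypothetical), the mechanism can be sketched as a recursive dictionary merge:

```python
# Conceptual sketch of hierarchical config composition, the idea
# behind frameworks like Hydra. NOT Hydra's API; `compose` and the
# example configs are hypothetical illustrations.

def compose(base: dict, override: dict) -> dict:
    """Recursively merge `override` into `base`; later values win."""
    merged = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = compose(merged[key], value)
        else:
            merged[key] = value
    return merged

defaults = {"model": {"name": "resnet", "lr": 0.1}, "seed": 0}
experiment = {"model": {"lr": 0.01}}          # override only the lr
cfg = compose(defaults, experiment)
print(cfg)  # {'model': {'name': 'resnet', 'lr': 0.01}, 'seed': 0}
```

Hydra itself adds YAML config groups, a defaults list, and command-line overrides on top of this composition idea.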

mbrl-lib

Library for Model Based RL

Language: Python · License: MIT · Stargazers: 937 · Issues: 0

accelerate

🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support

Language: Python · License: Apache-2.0 · Stargazers: 7,433 · Issues: 0

scalene

Scalene: a high-performance, high-precision CPU, GPU, and memory profiler for Python with AI-powered optimization proposals

Language: Python · License: Apache-2.0 · Stargazers: 11,433 · Issues: 0
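
For comparison with a dedicated profiler like Scalene, the basic workflow of profiling a hot function can be shown with Python's built-in `cProfile` from the standard library (unrelated to Scalene's implementation; `slow_sum` is a made-up workload):

```python
# Profiling a toy workload with the standard-library cProfile, for
# contrast with dedicated profilers such as Scalene. `slow_sum` is a
# hypothetical example function, not code from any listed repo.
import cProfile
import io
import pstats

def slow_sum(n: int) -> int:
    """Deliberately simple workload to show up in the profile."""
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Render the three most expensive entries by cumulative time.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(3)
report = stream.getvalue()
print("slow_sum" in report)  # the profiled function appears in the report
```

Scalene goes further than this, attributing time separately to Python versus native code and tracking GPU and memory use line by line.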

compressive-transformer-pytorch

PyTorch implementation of Compressive Transformers, from DeepMind

Language: Python · License: MIT · Stargazers: 155 · Issues: 0

joeynmt

Minimalist NMT for educational purposes

Language: Python · License: Apache-2.0 · Stargazers: 668 · Issues: 0

pytorch-lightning

Pretrain, fine-tune, and deploy AI models on multiple GPUs and TPUs with zero code changes.

Language: Python · License: Apache-2.0 · Stargazers: 27,587 · Issues: 0

transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Language: Python · License: Apache-2.0 · Stargazers: 129,794 · Issues: 0
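
Every architecture in this family is built around scaled dot-product attention. A minimal pure-Python sketch of that single operation (a conceptual illustration for one query, not the library's implementation) is:

```python
# Scaled dot-product attention for a single query over a short
# sequence, standard library only. Conceptual sketch of the operation
# underlying Transformer models, not code from the transformers repo.
import math

def attention(query, keys, values):
    """Return softmax(q·k / sqrt(d))-weighted sum of value vectors."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of the value vectors, dimension by dimension.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# The query matches the first key more strongly, so the output leans
# toward the first value vector.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[1.0], [0.0]])
print(out)
```

Real models run this in parallel over many queries, heads, and layers; the arithmetic per position is the same.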

tokenizers

💥 Fast State-of-the-Art Tokenizers optimized for Research and Production

Language: Rust · License: Apache-2.0 · Stargazers: 8,743 · Issues: 0
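
Fast tokenizer libraries like this one implement algorithms from the byte-pair-encoding (BPE) family. As a toy illustration of a single BPE merge step (conceptual only; these function names are hypothetical and not this library's API):

```python
# Toy single BPE merge step: find the most frequent adjacent symbol
# pair and merge it everywhere. Conceptual sketch of the algorithm
# family, NOT the tokenizers library's API.
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent symbol pairs and return the most frequent."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get)

def merge_pair(tokens, pair):
    """Replace every occurrence of `pair` with its concatenation."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = list("low lower lowest")
pair = most_frequent_pair(tokens)   # ('l', 'o') appears three times
tokens = merge_pair(tokens, pair)
print(tokens[:3])  # ['lo', 'w', ' ']
```

Training a real tokenizer repeats this merge step thousands of times and records the merge order; production libraries do it in Rust for speed.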

gym-sokoban

Sokoban environment for OpenAI Gym

Language: Python · License: MIT · Stargazers: 317 · Issues: 0

detr

End-to-End Object Detection with Transformers

Language: Python · License: Apache-2.0 · Stargazers: 13,174 · Issues: 0