Brando Miranda's repositories
bitsandbytes
8-bit CUDA functions for PyTorch
Code-LMs
Guide to using pre-trained large language models of source code
coq_jupyter
Jupyter kernel for Coq
coq_serapy
Python bindings for Coq SerAPI. Primarily designed for use in Proverbot9001. Works with Coq versions 8.9-8.12.
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Diva
DIversity in VerificAtion
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed
learn2learn
A PyTorch Library for Meta-learning Research
llm-seminar
Seminar on Large Language Models (COMP790-101 at UNC Chapel Hill, Fall 2022)
makefiletutorial
Learn make by example
memorizing-transformers-pytorch
Implementation of Memorizing Transformers (ICLR 2022), an attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in PyTorch
miniF2F
Formal to Formal Mathematics Benchmark
mistral
Mistral: a strong, northwesterly wind. A framework for transparent and accessible large-scale language model training, built with Hugging Face 🤗 Transformers.
ml-for-proofs
An open bibliography of machine learning for formal proof papers
plugin-tutorial
Yet another plugin tutorial, this time as an exercise for 598
project-menu
See the issue board for the current status of active and prospective projects!
state-spaces
Sequence Modeling with Structured State Spaces
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
unicoq
An enhanced unification algorithm for Coq