Peter Albert's repositories
finetune-gpt2xl
Guide: Finetune GPT-2 XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
AREPL-backend
TypeScript interface to the Python evaluator for AREPL
Language: Python · MIT License
audio
Data manipulation and transformation for audio signal processing, powered by PyTorch
Language: Python · BSD-2-Clause License
fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Language: Python · MIT License
metaseq
Repo for external large-scale work
Language: Python · MIT License
openai-python
The OpenAI Python library provides convenient access to the OpenAI API from applications written in the Python language.
Language: Python · MIT License
tdnc
Research project combining a pretrained transformer like BERT with the Differentiable Neural Computer (DNC) to add a memory component to BERT
Language: Jupyter Notebook · MIT License