Adam Feldmann's repositories
gptbot
GPT-4 & LangChain chatbot for large PDF documents
Hungarian-gpt-3
Repository for our bilingual English-Hungarian GPT-3 model
memit
Mass-editing thousands of facts into a transformer memory (MEMIT)
prompt-engine
A library for helping developers craft prompts for Large Language Models
lm-evaluation-harness
A framework for few-shot evaluation of autoregressive language models.
bigscience
Central place for the engineering/scaling working group: documentation, SLURM scripts and logs, compute environment and data.
promptsource
Toolkit for creating, sharing and using natural language prompts.
text-generation-testing-ui
Web app for demoing the EleutherAI models
deepspeed-gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
neox
Simple annotated implementation of GPT-NeoX in PyTorch
mup
Maximal update parametrization (µP)
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
OpenPrompt
An open-source framework for prompt-learning.
gpt-neo-fine-tuning-example
Fine-tune EleutherAI GPT-Neo and GPT-J-6B to generate Netflix movie descriptions using Hugging Face and DeepSpeed
autoprompt
AutoPrompt: Automatic Prompt Construction for Masked Language Models.
minGPT
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Hungarian-Wikipedia-QA
This repository contains a multilingual transformer-based Q/A solution for Hungarian Wikipedia pages.
onnxruntime
ONNX Runtime: cross-platform, high-performance ML inferencing and training accelerator
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
PubmedQA-BERT
BERT-based engine for medical QA tasks using PubMed abstracts.
copilot-docs
Documentation for GitHub Copilot
training_policies
Issues related to MLPerf™ training policies, including rules and suggested changes
sambanova_starter
SambaNova starter files, a.k.a. boilerplates.