crumb's repositories
low-rank-adapters
LoRA (https://arxiv.org/abs/2106.09685) for various 🤗 Transformers models
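For context, a minimal sketch of the idea from the linked paper, not this repo's code: a frozen linear layer gets a trainable low-rank update BA, scaled by alpha/r, so only the small A and B matrices are trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen nn.Linear plus a trainable low-rank update (sketch, not this repo's API)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the pretrained weight
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))  # zero init: update starts at 0
        self.scale = alpha / r

    def forward(self, x):
        # y = base(x) + x A^T B^T * (alpha / r)
        return self.base(x) + (x @ self.lora_a.T @ self.lora_b.T) * self.scale
```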
transformers-8bit
Wrapper around hivemind's gpt-j-8bit training code for easy loading
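The repo wraps hivemind's custom training code; for comparison, a sketch of the now-standard 8-bit loading route in 🤗 Transformers via bitsandbytes, which is a different mechanism than this wrapper's:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative checkpoint; load_in_8bit requires the bitsandbytes
# package and a CUDA GPU.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    load_in_8bit=True,   # quantize linear layers to int8 on load
    device_map="auto",   # place layers across available devices
)
```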
clip-classifier
personal tools
crumbs-testbed
You can NOT judge the code; it runs on my local machine, for my local machine
diffusers-testing
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch
notebook-hosting
notebooks
tinyvisions
comically small RGB+CLIP code
whatchamacallit
Wrapper for Diffusers Stable Diffusion
blog
Public repo for HF blog posts
datasettokenizer
Literally one small function to pre-tokenize non-streamed datasets for easier small-scale LM training with Hugging Face libraries
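A sketch of what such a function might look like with 🤗 Datasets; the function name, dataset, and arguments here are illustrative, not the repo's actual signature:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

def pretokenize(dataset_name, tokenizer_name, text_column="text", max_length=512):
    """Tokenize a non-streamed dataset once, up front, so the training
    loop can skip on-the-fly tokenization. Illustrative, not the repo's API."""
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_name)
    dataset = load_dataset(dataset_name, split="train")
    return dataset.map(
        lambda batch: tokenizer(batch[text_column], truncation=True, max_length=max_length),
        batched=True,
        remove_columns=dataset.column_names,  # keep only input_ids / attention_mask
    )

tokenized = pretokenize("imdb", "gpt2")  # hypothetical usage
```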
galai
Model API for GALACTICA
gpt-engineer
Specify what you want it to build, the AI asks for clarification, and then builds it.
lora
Using low-rank adaptation (LoRA) to quickly fine-tune diffusion models.
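For reference, a hedged sketch of loading trained LoRA weights with the current 🤗 Diffusers API, which is not necessarily this repo's interface; the checkpoint and weight path are placeholders:

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder checkpoint and LoRA path; load_lora_weights is the
# Diffusers API and may differ from this repo's own interface.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("path/to/lora_weights")  # hypothetical path
image = pipe("a photo of a corgi", num_inference_steps=30).images[0]
```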
MinTransformers
Wrappers around the PyTorch implementation of transformers, based on minGPT
peft
peft without bitsandbytes, for my poor Windows computer
Prompt-Engineering-Guide
🐙 Guides, papers, lectures, notebooks and resources for prompt engineering
unsloth
CUDA_VISIBLE_DEVICES="0"
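That environment variable pins a process to a single physical GPU. A minimal Python equivalent, assuming it is set before CUDA initializes:

```python
import os

# Must be set before CUDA initializes (i.e., before the first torch.cuda
# call); hides every GPU except physical device 0.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch
print(torch.cuda.device_count())  # 1 if a GPU is present: only device 0 is visible
```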