# Attention Examples

This repo contains code examples that build up modern attention layers from the fundamentals. I wanted to understand how self-attention blocks work, particularly for images, so I have tried to write them myself.
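As a starting point for the fundamentals, a minimal sketch of scaled dot-product attention, softmax(QK^T / sqrt(d)) V, written here in plain Python for illustration (this is an assumption about the kind of building block the repo covers, not code taken from it):

```python
import math

def matmul(a, b):
    # (n x d) @ (d x m) -> (n x m), plain nested lists
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def softmax(row):
    # subtract the max for numerical stability
    mx = max(row)
    exps = [math.exp(x - mx) for x in row]
    total = sum(exps)
    return [e / total for e in exps]

def attention(q, k, v):
    # scaled dot-product attention: softmax(Q K^T / sqrt(d)) V
    d = len(q[0])
    scores = matmul(q, transpose(k))
    weights = [softmax([s / math.sqrt(d) for s in row]) for row in scores]
    return matmul(weights, v)
```

With near-orthogonal queries and keys, each output row is close to one value row, which makes the "soft lookup" behaviour easy to see before moving on to learned projections and multi-head variants.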