cfoster0 / self-attention-experiments-vision

A project about replicating, evaluating and scaling up self-attention based models in vision.

Self-Attention experiments in Vision

To do

  • Add RPE and rotary positional embeddings (see the sketch after this list)
  • Fix experiment code, update models to work without separate config
  • Test on TPUv3-8
  • Run first training runs comparing DeiT with absolute learned vs. rotary pos embeddings
  • Add class-attention layers, layerscale (CaiT)
  • Add CvT
  • Add TNT, Twins
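
Rotary positional embeddings aren't in the repo yet (they're the first to-do item above), but the sketch below shows one common way to apply them to per-head query/key tensors in JAX. The function name, the (seq_len, num_heads, head_dim) shape convention, and the even/odd channel pairing are illustrative assumptions, not the repo's actual API.

```python
import jax.numpy as jnp

def apply_rotary_embedding(x, base=10000.0):
    """Minimal RoPE sketch: rotate each (even, odd) channel pair of x by a
    position-dependent angle, so q·k dot products depend only on the
    relative offset between positions.

    x: (seq_len, num_heads, head_dim); head_dim must be even. Shapes and
    names here are assumptions for illustration, not the repo's interface.
    """
    seq_len, _, head_dim = x.shape
    # One rotation frequency per pair of channels.
    freqs = base ** (-jnp.arange(0, head_dim, 2) / head_dim)    # (head_dim // 2,)
    angles = jnp.arange(seq_len)[:, None] * freqs[None, :]      # (seq_len, head_dim // 2)
    cos = jnp.cos(angles)[:, None, :]                           # broadcast over heads
    sin = jnp.sin(angles)[:, None, :]
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    # 2-D rotation applied independently to each channel pair.
    out_even = x_even * cos - x_odd * sin
    out_odd = x_even * sin + x_odd * cos
    # Re-interleave the rotated pairs back into the original layout.
    return jnp.stack([out_even, out_odd], axis=-1).reshape(x.shape)
```

In a DeiT/ViT-style block this would typically be applied to the query and key projections just before the attention scores are computed, leaving the values untouched.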

Languages

Language: Python 100.0%