
Transformer Implementations


Transformer implementations, along with example notebooks showing how to use them.

Implemented:

  • Vanilla Transformer
  • ViT - Vision Transformers
  • DeiT - Data-efficient image Transformers
  • BERT - Bidirectional Encoder Representations from Transformers
  • GPT - Generative Pre-trained Transformer

Installation

PyPi

$ pip install transformer-implementations

or, to build and install from source:

python setup.py build
python setup.py install

Example

In the notebooks directory there is a notebook showing how to use each of these models for its intended task, such as image classification with the Vision Transformer (ViT). Check them out!

import torch

from transformer_package.models import ViT

DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Model parameters (MNIST-sized example: 28x28 grayscale images split into 7x7 patches)
image_size = 28
channel_size = 1
patch_size = 7
embed_size = 512
num_heads = 8
classes = 10
num_layers = 3
hidden_size = 256
dropout = 0.2

model = ViT(image_size, 
            channel_size, 
            patch_size, 
            embed_size, 
            num_heads, 
            classes, 
            num_layers, 
            hidden_size, 
            dropout=dropout).to(DEVICE)

# image_tensor: a batch of images with shape (batch_size, channel_size, image_size, image_size);
# a random tensor stands in here for real data
image_tensor = torch.randn(1, channel_size, image_size, image_size).to(DEVICE)

prediction = model(image_tensor)  # class logits with shape (batch_size, classes)
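
Since the model returns raw class logits, it can be trained with a standard PyTorch cross-entropy setup. The following is a minimal sketch rather than code from the repository: the random batch stands in for a real DataLoader, and model, DEVICE, and the hyperparameters are reused from the snippet above.

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=3e-4)

# Placeholder batch; replace with images and labels from a DataLoader
images = torch.randn(32, channel_size, image_size, image_size).to(DEVICE)
labels = torch.randint(0, classes, (32,)).to(DEVICE)

optimizer.zero_grad()
logits = model(images)            # (batch_size, classes)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()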

Language Translation

from "Attention is All You Need": https://arxiv.org/pdf/1706.03762.pdf

Models trained with Implementation:

Multi-class Image Classification with Vision Transformers (ViT)

from "An Image is Worth 16x16 words: Transformers for image recognition at scale": https://arxiv.org/pdf/2010.11929v1.pdf

Models trained with Implementation:

Note: ViT will not perform well when trained from scratch on small datasets

Multi-class Image Classification with Data-efficient image Transformers (DeiT)

from "Training data-efficient image transformers & distillation through attention": https://arxiv.org/pdf/2012.12877v1.pdf

Models trained with Implementation:

About

Library - Vanilla, ViT, DeiT, BERT, GPT

License: MIT License


Languages

Jupyter Notebook 86.8%, Python 13.2%