AmirAbaskohi / Transformers-Tutorial

Welcome to my Transformers tutorial series! In this series, I'll be diving into the powerful Transformer architecture and its implementation in TensorFlow and PyTorch. Whether you're an experienced NLP practitioner or just starting out, I hope you'll find the series informative and engaging.


Transformers Tutorial Series

Watch the Videos

The tutorial series is available on my YouTube channel. Each video covers a different aspect of the Transformer architecture and its implementation in TensorFlow and PyTorch. Here's a quick overview of what you can expect to learn:

  • Introduction to the Transformer architecture and its components
  • Multi-head attention and its role in the Transformer
  • Positional encoding and its importance for sequence modeling
  • Feedforward networks and their implementation in the Transformer
  • Building a complete Transformer model in TensorFlow and PyTorch
  • Fine-tuning a pre-trained Transformer model for various NLP tasks
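As a small taste of the first two topics, here is a minimal NumPy sketch of scaled dot-product attention, the core operation inside multi-head attention. This is an illustrative standalone snippet, not code from the repository; the videos build the full TensorFlow and PyTorch versions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)      # (..., seq_q, seq_k)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4, 8))                          # (batch, seq, d_k)
out = scaled_dot_product_attention(Q, Q, Q)             # self-attention
print(out.shape)                                        # (2, 4, 8)
```

Multi-head attention simply runs this operation several times in parallel on learned linear projections of Q, K, and V, then concatenates the results.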

Clone the Repository

To follow along with the tutorial series, you can clone this repository to your local machine:

git clone https://github.com/AmirAbaskohi/Transformers-Tutorial

Requirements

To run the code examples in the tutorial series, you'll need the following packages:

  • TensorFlow 2.x
  • Keras
  • PyTorch 1.x
  • NumPy
  • Matplotlib
  • Pandas

You can install these packages using pip:

pip install tensorflow
pip install keras
pip install pandas
pip install numpy
pip install matplotlib
pip install torch

How to run?

To run the code, execute the main file of either the TensorFlow or the PyTorch implementation:

python main.py

Additionally, the TensorFlow implementation comes with a notebook, notebook.ipynb, that demonstrates some Transformer components, such as positional encoding.
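For reference, the sinusoidal positional encoding explored in the notebook can be sketched in a few lines of NumPy. This is the standard formulation (sine on even dimensions, cosine on odd ones), not necessarily the exact code in notebook.ipynb:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: PE[pos, 2i] = sin(pos / 10000^(2i/d)),
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    positions = np.arange(seq_len)[:, None]               # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                    # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                      # (seq_len, d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])                 # even dims: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])                 # odd dims: cosine
    return pe

pe = positional_encoding(50, 128)
print(pe.shape)                                           # (50, 128)
```

The encoding is added to the token embeddings so the model can distinguish positions, since attention itself is order-agnostic.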

Feedback and Contributions

I'm always looking to improve the tutorial series and welcome any feedback or contributions you might have. If you spot any errors or have suggestions for new topics to cover, please feel free to create an issue or pull request in the repository. I appreciate your support and hope you find the tutorial series helpful!

