Stephen Fernandes's repositories
attention-mechanisms
Implementations of a family of attention mechanisms, suitable for a wide range of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
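The core of most of these mechanisms is scaled dot-product attention. A minimal NumPy sketch (illustrative only, not code from this repo; the function name and shapes are assumptions):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                # (seq_q, seq_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key axis
    return weights @ v, weights                    # context vectors and attention map

q = np.random.randn(3, 8)    # 3 query positions, d_k = 8
k = np.random.randn(5, 8)    # 5 key positions
v = np.random.randn(5, 16)   # values, d_v = 16
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)    # (3, 16) (3, 5)
```

Each row of `w` is a probability distribution over the 5 key positions, so it sums to 1.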
awesome-dl-projects
A collection of the code that accompanies the reports in The Gallery by Weights & Biases.
awesome-fastapi
A curated list of awesome things related to FastAPI
Awesome-Vision-Attentions
Summary of papers on visual attention. Related code, based on Jittor, will be released gradually.
bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
buggy-lightning-tpu-transformer
Repository created to show Lightning devs the buggy Transformer TPU code
data-science-stack
NVIDIA Data Science stack tools
Deep-reinforcement-learning-with-pytorch
PyTorch implementations of DQN, AC, ACER, A2C, A3C, PG, DDPG, TRPO, PPO, SAC, TD3, and more.
fastai_xla_extensions
A Python package that allows fastai to run on TPUs using PyTorch/XLA
fastapi
FastAPI Tutorials
muzero-general
MuZero
nmt-transformer
Reproduction of the Transformer from the 2017 paper "Attention Is All You Need"
PaintTransformer
PyTorch implementation of paper: Paint Transformer: Feed Forward Neural Painting with Stroke Prediction, ICCV 2021.
PPO-PyTorch
Minimal implementation of clipped objective Proximal Policy Optimization (PPO) in PyTorch
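The clipped objective takes the minimum of the raw and clipped importance-weighted advantage. A small NumPy sketch of that loss (illustrative only, not code from this repo; names and the example values are assumptions):

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, eps=0.2):
    """Clipped surrogate objective from PPO:
    L = -mean( min(r * A, clip(r, 1 - eps, 1 + eps) * A) )."""
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1 - eps, 1 + eps) * advantage
    return -np.minimum(unclipped, clipped).mean()

ratio = np.array([0.5, 1.0, 1.5])        # pi_new(a|s) / pi_old(a|s) per sample
advantage = np.array([1.0, -1.0, 2.0])   # advantage estimates per sample
loss = ppo_clip_loss(ratio, advantage)   # -(0.5 - 1.0 + 2.4) / 3
```

Clipping removes the incentive to push the probability ratio outside [1 - eps, 1 + eps], which keeps each policy update small.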
PyTorch-Distributed-Training
Example of PyTorch DistributedDataParallel
pytorch-image-models
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
QuartzNet-ASR-pytorch
Automatic Speech Recognition (ASR) model QuartzNet trained on the English Common Voice dataset. Implemented in PyTorch with CTC loss and beam-search decoding.
safety-starter-agents
Basic constrained RL agents used in experiments for the "Benchmarking Safe Exploration in Deep Reinforcement Learning" paper.
SelfSupervisedLearning_PyTorch
PyTorch Implementation of Self-Supervised Learning models
torchgpipe
A GPipe implementation in PyTorch
Transformers_from_scratch_lightning
Transformers for a sequence-to-sequence translation task, implemented from scratch with PyTorch Lightning
tutorials
PyTorch tutorials.
Various-Attention-mechanisms
This repository contains various attention mechanisms, including Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras
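Bahdanau (additive) attention scores each encoder state against the decoder state through a small feed-forward network rather than a dot product. A minimal NumPy sketch (illustrative only, not code from this repo; the weight names and sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                               # hidden size (illustrative)
W1 = rng.standard_normal((d, d))    # projects the decoder state
W2 = rng.standard_normal((d, d))    # projects the encoder states
v = rng.standard_normal(d)          # scoring vector

def bahdanau_attention(s, H):
    """Additive attention: score_i = v^T tanh(W1 s + W2 h_i)."""
    scores = np.tanh(s @ W1.T + H @ W2.T) @ v  # (seq_len,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over encoder positions
    context = weights @ H                      # weighted sum of encoder states
    return context, weights

s = rng.standard_normal(d)        # current decoder hidden state
H = rng.standard_normal((6, d))   # 6 encoder hidden states
context, w = bahdanau_attention(s, H)
```

The learned `tanh` scoring is what makes the mechanism "additive", in contrast to the multiplicative dot-product variants.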
vit-pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
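ViT's input pipeline splits an image into non-overlapping patches and flattens each one before the linear projection. A NumPy sketch of that patching step (illustrative only, not code from this repo; the function name is an assumption):

```python
import numpy as np

def image_to_patches(img, patch):
    """Split an (H, W, C) image into flattened non-overlapping
    patch x patch patches, as in the ViT input pipeline."""
    H, W, C = img.shape
    assert H % patch == 0 and W % patch == 0
    return (img.reshape(H // patch, patch, W // patch, patch, C)
               .transpose(0, 2, 1, 3, 4)        # group by (row-block, col-block)
               .reshape(-1, patch * patch * C)) # one flat vector per patch

img = np.arange(32 * 32 * 3, dtype=float).reshape(32, 32, 3)
patches = image_to_patches(img, 8)
print(patches.shape)  # (16, 192): a 4x4 grid of 8x8x3 patches
```

Each of the 16 rows is then linearly projected to the model dimension and treated as a token by the transformer encoder.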
voice_datasets
🔊 A comprehensive list of open-source datasets for voice and sound computing (95+ datasets).