Niall Taylor's repositories
Att-BLSTM-relation-extraction
Implementation of Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification.
auto_summ
Online demo of the paper "Unsupervised Document Summarization using Pre-trained Sentence Embeddings and Graph Centrality", published at the Second Workshop on Scholarly Document Processing at NAACL-HLT 2021.
awesome-text-summarization
A curated list of resources dedicated to text summarization
beginners-pytorch-deep-learning
Repository for scripts and notebooks from the book: Programming PyTorch for Deep Learning
BertSum
Code for the paper "Fine-tune BERT for Extractive Summarization".
dodrio-transformer-embedding-visualisation
Exploring attention weights in transformer-based models with linguistic knowledge.
ecco-transformer-visualisation
Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (such as GPT-2, BERT, RoBERTa, T5, and T0).
FARM
:house_with_garden: Fast & easy transfer learning for NLP. Harvesting language models for the industry. Focus on Question Answering.
GLM
GLM (General Language Model)
guided_summarization
GSum: A General Framework for Guided Neural Abstractive Summarization
LinkBERT
[ACL 2022] LinkBERT: A Knowledgeable Language Model 😎 Pretrained with Document Links
ML-Collection-Pytorch-Tensorflow
Originally taken from: https://github.com/aladdinpersson/Machine-Learning-Collection
nn_interpretability
PyTorch implementations of various neural network interpretability methods
PegasusDemo
Abstractive text summarization with Google PEGASUS using HuggingFace Transformers
Pytorch-VAE-tutorial
A simple tutorial on Variational Autoencoders (VAEs) in PyTorch
RATCHET
RAdiological Text Captioning for Human Examined Thoraxes
sapbert
[NAACL & ACL 2021] SapBERT: Self-alignment pretraining for BERT & XL-BEL: Cross-Lingual Biomedical Entity Linking.
simclr_pytorch
Implementation of SimCLR in PyTorch
sktime
A unified framework for machine learning with time series
SUPERVISED-CONTRASTIVE-LEARNING-FOR-PRE-TRAINED-LANGUAGE-MODEL-FINE-TUNING
In this project, I've implemented the Facebook paper on fine-tuning RoBERTa with a supervised contrastive loss.
t-few
Code for T-Few from "Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning"
tsne-pytorch
PyTorch implementation of t-SNE with CUDA acceleration
vectorizers_playground
Using the TIMC Document Vectorizers library