moon23k's repositories
Transformer_Variants
Comparison of Transformer architectures on natural language generation tasks
Scheduled_Sampling
Scheduled Sampling for Transformers
Transformer
Code for natural language generation tasks using the Transformer architecture
BackTranslation
BackTranslation Experiment
Transformer_Balance
Transformer Balance Research
Transformer_Fusion
This repo covers methodologies for utilizing a pre-trained BERT model on neural machine translation (NMT) tasks
CPT
Customized Pretraining for NLG Tasks
Efficient_Summarization
Text summarization modeling with three different attention types
MultiLingual_Translation
MultiLingual Translator
MultiTurn_Dialogue
Toward Characteristic Dialogue Generation
RNN_Seq2Seq
Code for natural language generation tasks using the sequence-to-sequence architecture
RNN_Seq2Seq_Attention
Code for natural language generation tasks using the sequence-to-sequence architecture with an attention mechanism
Tokenizers
Tokenizer Performance Experiment