There are 30 repositories under the seq2seq-attn topic.
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Image to LaTeX (seq2seq + attention with beam search) - TensorFlow
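Several of these repositories decode with beam search, which keeps the k highest-scoring partial sequences at each step instead of committing greedily to one token. A minimal, framework-free sketch (the `step_fn` interface and token names are illustrative assumptions, not any repository's actual API):

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam-search decoder.

    step_fn(prefix) -> dict mapping next_token -> probability.
    Each beam is scored by the sum of token log-probabilities;
    only the `beam_width` best partial sequences survive each step.
    """
    beams = [([start_token], 0.0)]          # (sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, p in step_fn(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Keep only the top-k candidates across all beams.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_width]:
            if seq[-1] == end_token:
                finished.append((seq, score))   # beam completed
            else:
                beams.append((seq, score))      # beam continues
        if not beams:
            break
    finished.extend(beams)
    return max(finished, key=lambda c: c[1])[0]
```

With `beam_width=1` this reduces to greedy decoding; wider beams trade compute for a better chance of finding the highest-probability full sequence.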
CRNN with attention for OCR, with Chinese recognition added
A PyTorch implementation of the BST model from Alibaba: https://arxiv.org/pdf/1905.06874.pdf
Korean-English NMT (Neural Machine Translation) with Gluon
Key-value memory network implemented using Keras
Chatbot using TensorFlow (seq2seq model), extended to v2.0 (Korean)
C# sequence-to-sequence learning with attention using LSTM neural networks
Use an AI model to write Chinese couplets (对联) with TensorFlow 2
Load point forecasting
Configurable Encoder-Decoder Sequence-to-Sequence model. Built with TensorFlow.
Neural machine translation using LSTMs and an attention mechanism. Two models were implemented: one without attention using a repeat vector, and the other using an encoder-decoder architecture with attention.
Sequence-to-sequence model implementations including RNN, CNN, Attention, and Transformers using PyTorch
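The attention mechanism these repositories share computes, for each decoder step, a weighted average of encoder states, with weights from a softmax over query-key similarity scores. A minimal scaled dot-product version in plain Python (list-based vectors for illustration; real implementations use framework tensors):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot_product_attention(query, keys, values):
    """Scaled dot-product attention for a single decoder query.

    query: decoder state vector; keys/values: lists of encoder
    state vectors. Returns (context_vector, attention_weights).
    """
    d = len(query)
    # Similarity of the query to each encoder key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of the value vectors.
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights
```

Additive (Bahdanau-style) attention replaces the dot product with a small feed-forward score network, but the softmax-then-weighted-sum structure is the same.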
Generates a summary of a given news article using an attention-based seq2seq encoder-decoder model.
Automatic text summarization (Textsum) based on a Seq2Seq + Attention model
Chatbot using a Seq2Seq model, built with TensorFlow
A few approaches using the sequence-to-sequence (seq2seq) architecture to solve the semantic parsing problem
This repository contains the code for a speech-to-speech translation system, built from scratch, for translating spoken digits from English to Tamil
Sequence-to-sequence learning for the GEC (grammatical error correction) task using several deep models.
Neural machine translation with a Seq2Seq model and an attention layer
Seq2Seq neural machine translation task, completed as part of CS462-NLP coursework
Experiments in converting English to Pig Latin using vanilla seq2seq networks, attention-based models, and a Transformer-based machine translation system.
Seq2Seq model that restores punctuation on English input text.
Seq2seq-attention house price prediction.
French-to-English neural machine translation trained on the Multi30k dataset.
A replication of the original Seq2Seq model from the PyTorch tutorials, adapted to be easy to use and extend.
Some natural language processing networks from scratch in PyTorch for personal educational purposes.
A Seq2Seq Attention chatbot deployed on Heroku
Generative adversarial imitation learning to produce a proxy for the reward function present in dialogue.
This repository is based on the PyTorch tutorial, with some experiments and refinements.