There are 6 repositories under the attention-layer topic.
A bidirectional encoder-decoder LSTM neural network trained for text summarization on the CNN/DailyMail dataset. (MIT808 project)
In this repository, I develop a CycleGAN architecture with embedded self-attention layers that can tackle three different complex tasks. The same neural network architecture is used for all three tasks. Although, truth be told, the model does not exceed state-of-the-art performance on any of them, the architecture is powerful enough to learn each task and produce reasonably good results.
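The listing does not include the repository's code, but a self-attention layer embedded in a convolutional GAN is commonly implemented SAGAN-style. A minimal sketch, assuming that variant (the class name `SelfAttention2d` is my own, not from the repository):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention2d(nn.Module):
    """SAGAN-style self-attention over spatial feature maps."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Learned residual weight, initialized to 0 so training starts
        # from the plain convolutional pathway.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        attn = F.softmax(torch.bmm(q, k), dim=-1)      # (b, hw, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                    # residual connection
```

Such a block is shape-preserving, so it can be dropped between any two layers of a CycleGAN generator or discriminator.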
NLP - Semantic Role Labeling using a GCN, BERT, and a Biaffine Attention Layer. Developed in PyTorch.
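A biaffine attention layer scores every (head, dependent) token pair with a bilinear form plus bias terms. A minimal sketch of one common formulation (the class name and initialization are my assumptions, not taken from the repository):

```python
import torch
import torch.nn as nn

class BiaffineAttention(nn.Module):
    """Biaffine scorer between head and dependent token representations."""
    def __init__(self, dim):
        super().__init__()
        # +1 on each side appends a constant bias feature, which folds the
        # linear and constant terms into a single matrix multiply.
        self.U = nn.Parameter(torch.randn(dim + 1, dim + 1) * 0.01)

    def forward(self, head, dep):
        # head, dep: (batch, seq_len, dim)
        ones = head.new_ones(head.shape[:-1] + (1,))
        h = torch.cat([head, ones], dim=-1)            # (b, t, dim+1)
        d = torch.cat([dep, ones], dim=-1)             # (b, t, dim+1)
        return h @ self.U @ d.transpose(-1, -2)        # (b, t, t) pair scores
```

The resulting (t, t) score matrix can be fed to a softmax or argmax to predict, for each token, its most likely head or role.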
This is a deep learning network: a ResNet with an attention layer that can be used on a custom dataset.
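"An attention layer" in a ResNet is often realized as squeeze-and-excitation style channel attention inside each residual block; a minimal sketch under that assumption (the class names are illustrative, not from the repository):

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))     # global average pool -> (b, c)
        return x * w[:, :, None, None]      # reweight each channel

class AttnResBlock(nn.Module):
    """Residual block with channel attention on the residual branch."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.attn = ChannelAttention(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.conv2(self.relu(self.conv1(x)))
        return self.relu(x + self.attn(out))
```

Stacking such blocks in place of plain residual blocks yields a ResNet whose features are reweighted per channel before each skip connection.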
A PyTorch sentiment classification example using the NSMC dataset.
Using a deep learning model that combines an LSTM with a custom attention layer, we create an algorithm that trains on reviews and their existing summaries to generate brand-new summaries of its own.
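A custom attention layer over LSTM outputs in a summarizer is typically Bahdanau-style additive attention: the decoder state attends over the encoder's hidden states to build a context vector at each step. A minimal sketch under that assumption (names and dimensions are illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    """Bahdanau-style attention of a decoder state over encoder states."""
    def __init__(self, hidden):
        super().__init__()
        self.W = nn.Linear(hidden * 2, hidden)
        self.v = nn.Linear(hidden, 1, bias=False)

    def forward(self, dec_state, enc_states):
        # dec_state: (b, h); enc_states: (b, t, h)
        t = enc_states.size(1)
        dec = dec_state.unsqueeze(1).expand(-1, t, -1)         # (b, t, h)
        scores = self.v(torch.tanh(self.W(
            torch.cat([dec, enc_states], dim=-1))))            # (b, t, 1)
        weights = F.softmax(scores, dim=1)                     # sums to 1 over t
        context = (weights * enc_states).sum(dim=1)            # (b, h)
        return context, weights.squeeze(-1)
```

At each decoding step, `context` is concatenated with the decoder input (or state) before predicting the next summary token, letting the model focus on different parts of the review as it writes.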