Repositories under the text-generation-using-rnn topic:
A website that uses deep learning to generate horror stories.
Freestyle generator based on rap battles.
Graph convolution-based visual storytelling
Text generation using a character-based RNN with LSTM cells, trained on a dataset of Shakespeare's writing from Andrej Karpathy's The Unreasonable Effectiveness of Recurrent Neural Networks. Given a sequence of characters from this data ("Shakespear"), the model is trained to predict the next character in the sequence ("e"); longer passages are generated by calling the model repeatedly. Developed using Keras. Inspired by the following notebook: https://colab.research.google.com/github/tensorflow/docs/blob/master/site/en/tutorials/text/text_generation.ipynb#scrollTo=BwpJ5IffzRG6
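A minimal Keras sketch of this character-level setup; the filename `shakespeare.txt`, the layer sizes, and the `generate` helper are illustrative assumptions, not the repository's actual code:

```python
# Sketch of a character-level LSTM text generator, assuming a local copy of
# the Shakespeare corpus from the linked TensorFlow tutorial.
import numpy as np
import tensorflow as tf

text = open("shakespeare.txt", encoding="utf-8").read()  # assumed local file
vocab = sorted(set(text))
char2idx = {c: i for i, c in enumerate(vocab)}
idx2char = np.array(vocab)

seq_len = 100
ids = np.array([char2idx[c] for c in text])
# Each example: seq_len input characters and the same window shifted by one.
dataset = tf.data.Dataset.from_tensor_slices(ids).batch(seq_len + 1, drop_remainder=True)
dataset = dataset.map(lambda chunk: (chunk[:-1], chunk[1:])).shuffle(10_000).batch(64)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab), 256),
    tf.keras.layers.LSTM(512, return_sequences=True),
    tf.keras.layers.Dense(len(vocab)),  # logits over the next character
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(dataset, epochs=10)

def generate(seed, n_chars=400, temperature=1.0):
    """Sample n_chars characters by repeatedly feeding the model its own output."""
    out_ids = [char2idx[c] for c in seed]
    for _ in range(n_chars):
        logits = model(np.array([out_ids]))[0, -1] / temperature
        next_id = tf.random.categorical(logits[None, :], num_samples=1)[0, 0].numpy()
        out_ids.append(int(next_id))
    return "".join(idx2char[i] for i in out_ids)

print(generate("ROMEO: "))
```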
A re-implementation of the Sentence VAE paper, Generating Sentences from a Continuous Space
Developed an LSTM model to generate text, mimicking the style of Nietzsche's writings
A text GAN trained on the Star Wars Episode IV script, using character-level embeddings. All information taken from the official TensorFlow tutorial page.
A sequence-to-sequence implementation using PyTorch's GRU.
This repo contains a collection of scripts for building a text generator by training a recurrent neural network on a large text dataset.
RNNs are powerful deep-learning models that work well on sequential data. Because past data plays a major role in predicting a sequence, an RNN conditions each prediction not only on the current input but also on a hidden state that summarises previous steps. Here a GRU is used to generate Eminem-style rap lyrics (see the sketch below).
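A small illustration of that point, in Keras with assumed vocabulary and layer sizes: the GRU returns a hidden state that encodes the sequence seen so far, and that state is fed back in when processing the next token.

```python
# Minimal sketch (illustrative shapes, not the repository's code) of how a GRU
# carries context: the returned state summarises the past sequence and can be
# reused as the initial state for the next step.
import tensorflow as tf

vocab_size, embed_dim, units = 5000, 128, 256  # assumed hyperparameters

embed = tf.keras.layers.Embedding(vocab_size, embed_dim)
gru = tf.keras.layers.GRU(units, return_sequences=True, return_state=True)

tokens = tf.constant([[17, 42, 7]])             # a short token sequence (batch of 1)
out, state = gru(embed(tokens))                 # `state` encodes the whole history
next_out, state = gru(embed(tf.constant([[9]])), initial_state=state)  # reuse past context
```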
Generate text using a character-based RNN