There are 9 repositories under the text-generator topic.
Simple text generator built on an OpenAI GPT-2 PyTorch implementation.
Code for training and evaluation of the model from "Language Generation with Recurrent Generative Adversarial Networks without Pre-training"
PyTorch implementation of an NBA game summary generator.
An end-to-end model for text generation, built as a Bi-LSTM/LSTM architecture using PyTorch's LSTMCell.
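A minimal sketch of what such an architecture can look like, a bidirectional LSTM encoder whose final states seed an LSTMCell decoder that emits one token at a time; the class name, layer sizes, and vocabulary size are illustrative assumptions, not the repository's actual code.

```python
# Illustrative sketch only: a Bi-LSTM encoder seeds an LSTMCell decoder that
# generates tokens one step at a time. All sizes are placeholder assumptions.
import torch
import torch.nn as nn

class BiLSTMTextGenerator(nn.Module):
    def __init__(self, vocab_size=5000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional encoder reads the conditioning text in both directions.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, bidirectional=True,
                               batch_first=True)
        # LSTMCell lets us unroll the decoder manually, one step at a time.
        self.decoder_cell = nn.LSTMCell(embed_dim, 2 * hidden_dim)
        self.out = nn.Linear(2 * hidden_dim, vocab_size)

    def forward(self, tokens, steps=20):
        emb = self.embed(tokens)                       # (B, T, E)
        _, (h, _) = self.encoder(emb)                  # h: (2, B, H)
        # Concatenate forward/backward final states to seed the decoder.
        hx = torch.cat([h[0], h[1]], dim=-1)           # (B, 2H)
        cx = torch.zeros_like(hx)
        inp = emb[:, -1, :]                            # start from the last input token
        logits = []
        for _ in range(steps):
            hx, cx = self.decoder_cell(inp, (hx, cx))
            step_logits = self.out(hx)
            logits.append(step_logits)
            inp = self.embed(step_logits.argmax(dim=-1))  # greedy feedback
        return torch.stack(logits, dim=1)              # (B, steps, vocab)

model = BiLSTMTextGenerator()
dummy = torch.randint(0, 5000, (2, 10))                # batch of 2 token sequences
print(model(dummy).shape)                              # torch.Size([2, 20, 5000])
```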
Fast SEO text generator that works from a mask (template).
Automatically generate text using Markov models.
Make ASCII art with HYPERRECTANGLES
Golang text generator for generating SEO texts.
Promoting critical thinking through machine-generated prompts.
Train a bidirectional or normal LSTM recurrent neural network to generate text on a free GPU using any dataset. Just upload your text file and click run!
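As a rough illustration of that workflow, here is a minimal character-level Keras sketch; the file name, window size, layer sizes, and epoch count are placeholder assumptions rather than the project's actual settings. It reads a plain text file, builds fixed-length character windows, and trains a bidirectional LSTM to predict the next character.

```python
# Rough character-level sketch: train a (bidirectional) LSTM to predict the
# next character of a user-supplied text file. Hyperparameters are guesses.
import numpy as np
from tensorflow import keras

text = open("input.txt", encoding="utf-8").read().lower()   # your uploaded file
chars = sorted(set(text))
idx = {c: i for i, c in enumerate(chars)}

window = 40
X = np.zeros((len(text) - window, window, len(chars)), dtype=np.float32)
y = np.zeros((len(text) - window, len(chars)), dtype=np.float32)
for i in range(len(text) - window):
    for t, c in enumerate(text[i:i + window]):
        X[i, t, idx[c]] = 1.0                 # one-hot encode the input window
    y[i, idx[text[i + window]]] = 1.0         # the next character is the target

model = keras.Sequential([
    keras.layers.Input(shape=(window, len(chars))),
    keras.layers.Bidirectional(keras.layers.LSTM(128)),  # or a plain LSTM(128)
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, batch_size=128, epochs=5)
```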
Various text/word generation methods implemented in Unity.
Train Markov models on Internet Archive text files.
Advanced Internationalized OpenFX Text Generator
Allows a user to generate sentences based on a Markov model trained on a chosen text corpus.
This repo is a continuation of the Machine-Learning repo. Here I will upload the examples and exercises I work through while learning deep learning techniques, along with the problems I solve using them.
A single model for generating peptide ligands.
A small Python script for automatically batch-generating legal postal receipts: it extracts the contents of court judgments to fill in the receipts, eliminating manual data entry.
Eminem lyrics generators using GPT-2 and RNN models, with accompanying data analysis.
Over-engineered string template engine with a simple interface, focused on versatility and user control.
Simple text summarizer built with the TensorFlow library.
Bad-Wiki is an automatic text generator. It creates really bad nonsense definitions for the terms you provide.
Create, update and run Markov chain models for text generation
Several tools for transforming a word into something else.
A VS4Mac addin for generating dummy text.
A model that generates random recipes. Trained on 50+ Epicurious recipes.
Generates "natural-sounding" text using a Markov model and sample textual training input. Given some sample text from which to build a model, the program prints out one or more sentences by randomly traversing a Markov chain that models the source text.
Using LSTMs to generate a TV Script (Keras).