KrishArul26 / Text-Generation-using-GPT2

GPT-2 is a pre-trained language model that can be used for various NLP tasks such as text generation, summarization, and translation. Language models are statistical tools that predict or generate the next word(s) in a sequence based on the preceding word(s). The GPT-2 architecture is based on the Transformer. The original Transformer uses an encoder-decoder mechanism to capture dependencies between input and output; GPT-2 is built from decoder blocks only and, at every step, feeds its previously generated tokens back in as context when producing the next token. GPT-2 has outperformed many earlier language models at generating coherent articles from a short input prompt.
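
Generation of this kind is typically done with the Hugging Face transformers library. The snippet below is a minimal sketch, not code taken from this repository: the checkpoint name ("gpt2"), the prompt text, and the sampling parameters (top-k/top-p) are illustrative assumptions.

```python
# Minimal sketch of GPT-2 text generation with Hugging Face transformers.
# The checkpoint, prompt, and sampling settings are illustrative, not from this repo.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Artificial intelligence is"          # assumed example prompt
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation: each new token is conditioned on the prompt
# plus all tokens generated so far.
output = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because do_sample=True enables stochastic decoding, each run produces a different continuation; setting do_sample=False would instead give deterministic greedy decoding.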

