Text-Generation-using-GPT2

GPT-2 is a pre-trained language model that can be used for various NLP tasks such as text generation, text summarization, and translation. Language models are statistical tools that predict/generate the next word(s) in a sequence based on the preceding word(s). The GPT-2 architecture is based on the Transformer: the original Transformer pairs an encoder with a decoder to capture input-output dependencies, and GPT-2 uses a decoder-only variant of it. At every step, the model takes the previously generated tokens as additional input when generating the next token. GPT-2 has outperformed earlier language models at generating articles from small input prompts.
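The repository's own loading code is not shown here, but the following minimal sketch, assuming the Hugging Face transformers library and PyTorch (neither is confirmed by this README), illustrates that autoregressive loop with the GPT-2 Large checkpoint:

```python
# Minimal GPT-2 text-generation sketch (assumes transformers + torch are installed).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

prompt = "Artificial intelligence is"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Autoregressive decoding: each new token is sampled conditioned on the
# prompt plus all tokens generated so far.
output_ids = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```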


Introduction

Text generation is the task of automatically producing new, coherent text that continues from a given piece of input. Given a short prompt, a language model such as GPT-2 predicts the most likely next words one at a time, gradually extending the passage while staying consistent with the context. This capability is useful for applications such as drafting articles, autocompleting text, and powering conversational agents, where a system must extend a user's input fluently and relevantly.

Technologies Used

1. IDE - PyCharm
2. GPT-2 Large pre-trained model
3. GPU - P4000
4. Google Colab - text analysis
5. Flask - REST API
6. Postman - API testing

🔑 Prerequisites

All the dependencies and required libraries are included in the file requirements.txt.

  Python 3.6

🚀 Installation Text-Generation-using-GPT2

  1. Clone the repo
git clone https://github.com/KrishArul26/Text-Generation-using-GPT2.git

  2. Change your directory to the cloned repo
cd Text-Generation-using-GPT2

  3. Create a Python 3.6 virtual environment named 'gpt2' and activate it
pip install virtualenv

virtualenv gpt2

gpt2\Scripts\activate

  4. Now, run the following command in your Terminal/Command Prompt to install the required libraries
pip install -r requirements.txt

💡 Working

Type the following command:

python app.py

After that, you will see the running IP address in the terminal. Copy and paste it into your browser, enter or upload your input text, and then click the Predict button.
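For reference, here is a minimal sketch of what an app like app.py could look like, assuming Flask and the Hugging Face transformers library; the /predict route and the text form field are illustrative assumptions, not the repo's actual API:

```python
# Hypothetical sketch of a Flask app serving GPT-2 text generation.
# Route and field names are assumptions; the repo's app.py is authoritative.
from flask import Flask, jsonify, request
from transformers import GPT2LMHeadModel, GPT2Tokenizer

app = Flask(__name__)
tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

@app.route("/predict", methods=["POST"])
def predict():
    # Read the input text from the POST form data.
    text = request.form.get("text", "")
    input_ids = tokenizer.encode(text, return_tensors="pt")
    output_ids = model.generate(
        input_ids,
        max_length=100,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
    generated = tokenizer.decode(output_ids[0], skip_special_tokens=True)
    return jsonify({"generated": generated})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Once the server is up, such an endpoint could be exercised from Postman, or from the command line:

curl -X POST -d "text=Artificial intelligence is" http://127.0.0.1:5000/predict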

Result
