lucapug / oreilly-gpt-hands-on-nlg

This repository contains code for the O'Reilly Live Online Training for NLG & GPT

Home Page: https://www.oreilly.com/live-events/hands-on-natural-language-generation-and-gpt/0636920061438/0636920061437


Hands-On Natural Language Generation and GPT

This repository contains code for the O'Reilly Live Online Training *Hands-On Natural Language Generation and GPT*.

This training focuses on how the GPT family of models is used for NLP tasks, including abstractive text summarization and natural language generation. The training begins with an introduction to the necessary concepts, including masked self-attention, language models, and transformers, and then builds on those concepts to introduce the GPT architecture. We then move into how GPT is applied to multiple natural language processing tasks, with hands-on examples of using pre-trained GPT-2 models as well as fine-tuning these models on custom corpora.
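As a taste of the masked self-attention concept covered in the training, here is a minimal single-head sketch in plain NumPy. The names and shapes (`seq_len`, `d_model`) are illustrative, not taken from the course notebooks; the point is only that the causal mask prevents each position from attending to later positions.

```python
# Minimal sketch of masked (causal) self-attention in NumPy.
# Illustrative only -- shapes and variable names are assumptions,
# not the implementation used in the training notebooks.
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head self-attention where each position may attend
    only to itself and earlier positions (the causal mask GPT uses)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len)
    # Causal mask: block attention to future positions.
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(future, -1e9, scores)
    # Row-wise softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, weights = causal_self_attention(x, w_q, w_k, w_v)
# Each row of `weights` sums to 1; entries above the diagonal
# (attention to future tokens) are effectively zero.
```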

GPT models are among the most relevant NLP architectures today, and they are closely related to other important deep learning NLP models such as BERT. Both are derived from the attention-driven transformer architecture and represent an inflection point in how machines process language and context.

The *Natural Language Processing with Next-Generation Transformer Architectures* series of online trainings provides a comprehensive overview of state-of-the-art NLP models, including GPT and BERT, which are derived from the modern attention-driven transformer architecture, and of the applications these models are used to solve today. All of the trainings in the series blend theory and application through visual mathematical explanations, straightforward Python examples in hands-on Jupyter notebook demos, and comprehensive case studies featuring modern problems solvable by NLP models.

Notebooks

  • Introduction to GPT2
  • Ingesting a new corpus
  • Multi-task learning with GPT2
  • Dolly Lite Notebook (link to the original blog: here)
  • Prompt Engineering 101
  • More third-party fine-tuned GPT models
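The notebooks above build on the Hugging Face `transformers` library. As a quick sanity check outside the notebooks, a pre-trained GPT-2 model can be loaded for text generation roughly like this. This is a sketch, assuming `transformers` is installed and the `gpt2` checkpoint can be downloaded; the prompt and generation parameters are illustrative, not taken from the course materials.

```python
# Sketch: load a pre-trained GPT-2 checkpoint and generate a continuation.
# Assumes the Hugging Face `transformers` library is installed and the
# `gpt2` checkpoint is downloadable; parameters here are illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "Natural language generation is",
    max_new_tokens=20,       # length of the sampled continuation
    num_return_sequences=1,
    do_sample=True,          # sample rather than greedy-decode
)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts whose `generated_text` field includes the prompt followed by the sampled continuation; sampling means the output differs between runs.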

Instructor

Sinan Ozdemir is the founder and CTO of LoopGenius, where he uses state-of-the-art AI to help people create and run their businesses. Sinan is a former lecturer of data science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. Additionally, he is the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a master's degree in pure mathematics from Johns Hopkins University and is based in San Francisco, CA.

