sidharth72 / Natural-Language-Processing-Basics

Concepts related to NLP summarized with mathematical intuition.

Repository from GitHub: https://github.com/sidharth72/Natural-Language-Processing-Basics

NLP Basics Jupyter Notebooks

Overview

This repository contains a collection of Jupyter notebooks covering various fundamental concepts and techniques in Natural Language Processing (NLP). Each notebook provides explanations, code examples, and hands-on exercises to help you understand and implement NLP algorithms and models.

Notebooks

  1. NLP Basics: This notebook covers the basic concepts of NLP, including text preprocessing, tokenization, stemming, and lemmatization.
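A minimal sketch of that preprocessing pipeline, using only the standard library (the suffix-stripping "stemmer" here is a toy stand-in for a real algorithm like Porter's, which the notebook would typically use via NLTK):

```python
import re

def preprocess(text):
    # Lowercase, strip punctuation/digits, then tokenize on whitespace
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)
    return text.split()

def toy_stem(token):
    # Very rough suffix stripping -- illustrative only, not Porter stemming
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

tokens = preprocess("The cats were running, jumping, and played!")
stems = [toy_stem(t) for t in tokens]
# e.g. "cats" -> "cat", "running" -> "runn", "played" -> "play"
```

Note the difference from lemmatization: a stemmer just chops suffixes ("running" becomes "runn"), while a lemmatizer maps to a dictionary form ("running" becomes "run").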

  2. Vectors and Word Embeddings: Explore the concept of word embeddings and learn how to represent words as dense vectors using techniques like Word2Vec, TF-IDF, Pointwise Mutual Information, etc.
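As one concrete example of turning words into vectors, here is a small TF-IDF sketch in plain Python (the IDF smoothing convention varies between libraries; this uses log(N/df) + 1):

```python
import math
from collections import Counter

def tf_idf(docs):
    # docs: list of token lists; returns one {term: weight} dict per document
    n = len(docs)
    # Document frequency: in how many documents does each term appear?
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        # Term frequency (normalized) times inverse document frequency
        vectors.append({t: (c / len(doc)) * (math.log(n / df[t]) + 1)
                        for t, c in tf.items()})
    return vectors

docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs".split(),
]
vecs = tf_idf(docs)
```

Terms that appear in every document (like "the") get a low IDF and thus a low weight, while rarer, more discriminative terms score higher; PMI and Word2Vec build denser representations from co-occurrence statistics instead.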

  3. N-gram Language Models: Learn about n-gram language models, which model the probability of a word given the previous n-1 words in a sequence of text. Understand how to build and use n-gram models for tasks like text generation and prediction.
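For the n = 2 (bigram) case, a maximum-likelihood model reduces to counting: P(w | prev) = C(prev, w) / C(prev). A minimal sketch with sentence-boundary markers:

```python
from collections import defaultdict, Counter

def train_bigram(sentences):
    # Count bigrams over sentences padded with <s>/</s> boundary markers
    counts = defaultdict(Counter)
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            counts[prev][cur] += 1
    return counts

def prob(counts, prev, cur):
    # MLE estimate: P(cur | prev) = C(prev, cur) / C(prev)
    total = sum(counts[prev].values())
    return counts[prev][cur] / total if total else 0.0

sents = [["i", "like", "nlp"], ["i", "like", "cats"], ["i", "love", "nlp"]]
model = train_bigram(sents)
# P("like" | "i") = 2/3, since "i" is followed by "like" in 2 of 3 sentences
```

Unseen bigrams get probability zero under MLE, which is why practical models add smoothing (e.g. add-one or Kneser-Ney); text generation then amounts to repeatedly sampling the next word from P(· | prev).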

  4. Naive Bayes for Sentiment Analysis: Implement a simple sentiment analysis model using the Naive Bayes classifier. Learn how to preprocess text data, extract features, and train a classifier to classify text sentiment.
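The classifier in that notebook can be sketched as follows: score each label by its log prior plus the summed log likelihoods of the words, with add-one (Laplace) smoothing so unseen words don't zero out a class (the tiny dataset below is illustrative, not from the repository):

```python
import math
from collections import Counter, defaultdict

def train_nb(examples):
    # examples: list of (tokens, label) pairs
    label_counts = Counter(label for _, label in examples)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in examples:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return label_counts, word_counts, vocab

def predict(model, tokens):
    label_counts, word_counts, vocab = model
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label in label_counts:
        # log P(label) + sum of log P(word | label) with Laplace smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for t in tokens:
            score += math.log((word_counts[label][t] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

data = [("great movie loved it".split(), "pos"),
        ("terrible boring film".split(), "neg"),
        ("loved the film".split(), "pos"),
        ("boring and terrible".split(), "neg")]
model = train_nb(data)
```

Working in log space avoids floating-point underflow from multiplying many small probabilities; the "naive" part is the conditional-independence assumption between words given the label.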
