Wazzabeee / twitter-sentiment-analysis-pyspark

Comparative study of classification algorithms implemented in PySpark on the Sentiment 140 dataset.

Home Page: https://medium.com/towards-artificial-intelligence/large-scale-sentiment-analysis-with-pyspark-bdccf9256e35

Twitter Sentiment Analysis (PySpark)

About

This repo contains all the notebooks used for sentiment analysis on the Sentiment140 dataset with PySpark. It was developed as part of an end-of-term project for 8INF919: Machine Learning for Big Data at UQAC, in collaboration with Thomas Sirvent.

You can also find in the repo the LaTeX report and the presentation slides associated with this project (in French). If you'd like to read an explanation in English, check out my website.

Models used

We worked with the following models (a short comparison sketch follows the list):

  • Logistic Regression
  • Support Vector Machines (Linear Kernel)
  • Naive Bayes
  • Random Forest
  • Decision Tree
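
As a rough illustration of how these classifiers can be compared in PySpark ML, here is a minimal, hypothetical sketch. The toy feature vectors, column names, and hyperparameters are placeholders and do not come from the project's notebooks.

```python
from pyspark.sql import SparkSession
from pyspark.ml.linalg import Vectors
from pyspark.ml.classification import (
    LogisticRegression, LinearSVC, NaiveBayes,
    RandomForestClassifier, DecisionTreeClassifier,
)
from pyspark.ml.evaluation import MulticlassClassificationEvaluator

spark = SparkSession.builder.appName("model-comparison").getOrCreate()

# Tiny placeholder dataset: pre-computed (non-negative) feature vectors
# with binary labels, standing in for the real TF-IDF features.
data = spark.createDataFrame(
    [(Vectors.dense([0.0, 1.1, 0.1]), 0.0),
     (Vectors.dense([2.0, 1.0, 0.0]), 1.0),
     (Vectors.dense([2.0, 1.3, 1.0]), 1.0),
     (Vectors.dense([0.0, 1.2, 0.5]), 0.0)],
    ["features", "label"],
)

# One instance of each model family listed above.
models = {
    "Logistic Regression": LogisticRegression(maxIter=50),
    "Linear SVM": LinearSVC(maxIter=50),
    "Naive Bayes": NaiveBayes(),
    "Random Forest": RandomForestClassifier(numTrees=50),
    "Decision Tree": DecisionTreeClassifier(),
}
evaluator = MulticlassClassificationEvaluator(metricName="accuracy")

# Fit each classifier and report accuracy with a shared evaluator.
for name, clf in models.items():
    fitted = clf.fit(data)
    acc = evaluator.evaluate(fitted.transform(data))
    print(f"{name}: accuracy = {acc:.3f}")
```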

Features tested

  • Hashing TF-IDF
  • Count Vectorizer TF-IDF
  • ChiSqSelector
  • 1-Gram, 2-Gram, 3-Gram
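
The feature options above can be chained in a single Spark ML pipeline. Below is a minimal sketch under assumed column names and parameter values (not the project's exact settings), showing n-grams followed by TF-IDF (via HashingTF, with CountVectorizer as the alternative) and chi-squared feature selection.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import (
    Tokenizer, NGram, HashingTF, CountVectorizer, IDF, ChiSqSelector,
)

spark = SparkSession.builder.appName("feature-pipelines").getOrCreate()

# Placeholder tweets; the real project reads the Sentiment140 CSV instead.
df = spark.createDataFrame(
    [("i love this movie", 1.0), ("this is awful", 0.0)],
    ["text", "label"],
)

tokenizer = Tokenizer(inputCol="text", outputCol="words")
bigrams = NGram(n=2, inputCol="words", outputCol="bigrams")  # 2-Gram variant

# Variant A: hashing TF, then IDF.
hashing_tf = HashingTF(inputCol="bigrams", outputCol="tf", numFeatures=2**16)
# Variant B (alternative): count-based TF, then IDF.
# count_vec = CountVectorizer(inputCol="bigrams", outputCol="tf")

idf = IDF(inputCol="tf", outputCol="tfidf")
selector = ChiSqSelector(numTopFeatures=1000, featuresCol="tfidf",
                         labelCol="label", outputCol="features")

pipeline = Pipeline(stages=[tokenizer, bigrams, hashing_tf, idf, selector])
featurized = pipeline.fit(df).transform(df)
featurized.select("label", "features").show(truncate=40)
```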

Results

Google Cloud Cluster (Dataproc)

If you are curious how we ran our models in the cloud, you'll find a Python file called "cluster_logistic_job.py" in the notebooks directory.
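
For context, a standalone PySpark job submitted to a Dataproc cluster typically has the shape sketched below. This is a hypothetical outline, not the contents of cluster_logistic_job.py: the GCS path, column names, and pipeline stages are all placeholders (Sentiment140 encodes polarity as 0/4, hence the division by 4).

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF, IDF
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import MulticlassClassificationEvaluator


def main():
    spark = SparkSession.builder.appName("sentiment140-logistic").getOrCreate()

    # On Dataproc the data would typically live in a GCS bucket (placeholder path).
    df = (spark.read.csv("gs://your-bucket/sentiment140.csv", inferSchema=True)
          .toDF("target", "id", "date", "flag", "user", "text")
          .selectExpr("text", "CAST(target / 4 AS DOUBLE) AS label"))

    train, test = df.randomSplit([0.8, 0.2], seed=42)

    # Featurization + classifier in one pipeline.
    pipeline = Pipeline(stages=[
        Tokenizer(inputCol="text", outputCol="words"),
        HashingTF(inputCol="words", outputCol="tf"),
        IDF(inputCol="tf", outputCol="features"),
        LogisticRegression(maxIter=100),
    ])
    model = pipeline.fit(train)

    acc = MulticlassClassificationEvaluator(metricName="accuracy") \
        .evaluate(model.transform(test))
    print(f"Test accuracy: {acc:.4f}")

    spark.stop()


if __name__ == "__main__":
    main()
```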

ETL Pipeline & Live Sentiment Analysis

Another part of this project was to implement an ETL Pipeline with Live Sentiment Analysis using our pre-trained model, Spark Streaming, Apache Kafka and Docker. The repository for this part is available here.
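
As a rough idea of how a pre-trained Spark ML pipeline can score a live Kafka stream with Structured Streaming, here is a minimal sketch. The topic name, broker address, and model path are assumptions, and the Kafka source additionally requires the spark-sql-kafka connector package on the classpath; the linked repository contains the actual implementation.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col
from pyspark.ml import PipelineModel

spark = SparkSession.builder.appName("live-sentiment").getOrCreate()

# Pre-trained featurization + classifier pipeline saved earlier with .save(...).
model = PipelineModel.load("models/sentiment_pipeline")

# Read raw tweets from a Kafka topic; the message value is assumed to be UTF-8 text.
tweets = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "tweets")
          .load()
          .select(col("value").cast("string").alias("text")))

# Apply the model to each micro-batch and print predictions to the console.
query = (model.transform(tweets)
         .select("text", "prediction")
         .writeStream
         .outputMode("append")
         .format("console")
         .start())

query.awaitTermination()
```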