marina-sam14 / Transformers-Text-Classification

Utilizing DistilBERT for Analyzing Sentiments in Movie Reviews


Overview

This project fine-tunes a pre-trained DistilBERT model for sentiment analysis of movie reviews, with a focus on hyperparameter tuning. The primary objectives are optimizing the size of the model's task-specific layers and determining how many DistilBERT encoder blocks to freeze. Hyperparameter tuning is performed on the development subset of the dataset; a loading-and-freezing sketch follows below.
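As a rough illustration (not the project's actual notebook code), the sketch below shows how a pre-trained DistilBERT checkpoint can be loaded with the Hugging Face transformers library and how its first k encoder blocks can be frozen. The checkpoint name and the freeze_encoder_blocks helper are assumptions made for this sketch.

```python
# Minimal sketch: load DistilBERT for binary sentiment classification and
# freeze the embeddings plus the first `num_frozen` transformer blocks.
from transformers import DistilBertTokenizerFast, DistilBertForSequenceClassification

model_name = "distilbert-base-uncased"  # assumed checkpoint
tokenizer = DistilBertTokenizerFast.from_pretrained(model_name)
model = DistilBertForSequenceClassification.from_pretrained(model_name, num_labels=2)

def freeze_encoder_blocks(model, num_frozen: int) -> None:
    """Freeze the embedding layer and the first `num_frozen` of DistilBERT's
    6 transformer blocks so only the remaining blocks and the classification
    head are updated during fine-tuning."""
    for param in model.distilbert.embeddings.parameters():
        param.requires_grad = False
    for block in model.distilbert.transformer.layer[:num_frozen]:
        for param in block.parameters():
            param.requires_grad = False

freeze_encoder_blocks(model, num_frozen=4)  # number of frozen blocks is a tuned hyperparameter
```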

Key Tasks

  • Fine-Tuning DistilBERT: Adapt the pre-trained DistilBERT model to the sentiment analysis task.
  • Hyperparameter Tuning: Explore hyperparameters such as the task-specific layer sizes and the number of frozen DistilBERT encoder blocks (see the sketch after this list).
  • Experimental Comparisons: Report results for a baseline majority classifier and compare them with the best classifiers from prior exercises.
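To make the two tuned hyperparameters concrete, here is a hypothetical classifier whose task-specific head has a configurable hidden size, together with a small search grid over head size and number of frozen blocks. The SentimentClassifier class, layer choices, and grid values are illustrative assumptions, not the project's actual configuration.

```python
import itertools
import torch.nn as nn
from transformers import DistilBertModel

class SentimentClassifier(nn.Module):
    """DistilBERT encoder followed by a task-specific head whose hidden size
    is a hyperparameter (hypothetical head; the notebook's layers may differ)."""
    def __init__(self, hidden_size: int, num_frozen: int, num_labels: int = 2):
        super().__init__()
        self.encoder = DistilBertModel.from_pretrained("distilbert-base-uncased")
        # Freeze the first `num_frozen` of the 6 DistilBERT transformer blocks.
        for block in self.encoder.transformer.layer[:num_frozen]:
            for p in block.parameters():
                p.requires_grad = False
        self.head = nn.Sequential(
            nn.Linear(self.encoder.config.dim, hidden_size),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(hidden_size, num_labels),
        )

    def forward(self, input_ids, attention_mask):
        # Use the [CLS] token representation as the sequence embedding.
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return self.head(out.last_hidden_state[:, 0])

# Hypothetical search space for the two hyperparameters discussed above.
search_space = {
    "hidden_size": [64, 128, 256],
    "num_frozen": [0, 2, 4, 6],
}
configs = [dict(zip(search_space, values))
           for values in itertools.product(*search_space.values())]
```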

Experimental Setup

  • Dataset: A movie-review sentiment dataset; the development subset is reserved for hyperparameter tuning.
  • Monitoring Performance: During training, track performance on the development subset to determine the optimal number of epochs (see the sketch after this list).
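A minimal sketch of how dev-set monitoring might look, assuming a PyTorch model like the classifier sketched above and DataLoaders yielding input_ids, attention_mask, and labels. The train_with_dev_monitoring function, its defaults, and the early-stopping rule are hypothetical.

```python
import copy
import torch

def train_with_dev_monitoring(model, train_loader, dev_loader,
                              epochs=10, lr=2e-5, patience=2):
    """Hypothetical loop: after each epoch, evaluate on the development subset
    and keep the weights from the best-performing epoch (simple early stopping)."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model.to(device)
    optimizer = torch.optim.AdamW(
        (p for p in model.parameters() if p.requires_grad), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    best_acc, best_state, stale_epochs = 0.0, None, 0

    for epoch in range(epochs):
        model.train()
        for batch in train_loader:
            optimizer.zero_grad()
            logits = model(batch["input_ids"].to(device),
                           batch["attention_mask"].to(device))
            loss = loss_fn(logits, batch["labels"].to(device))
            loss.backward()
            optimizer.step()

        # Evaluate on the development subset to decide how many epochs to keep.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for batch in dev_loader:
                logits = model(batch["input_ids"].to(device),
                               batch["attention_mask"].to(device))
                preds = logits.argmax(dim=-1)
                correct += (preds == batch["labels"].to(device)).sum().item()
                total += batch["labels"].size(0)
        dev_acc = correct / total
        print(f"epoch {epoch + 1}: dev accuracy = {dev_acc:.4f}")

        if dev_acc > best_acc:
            best_acc, best_state, stale_epochs = dev_acc, copy.deepcopy(model.state_dict()), 0
        else:
            stale_epochs += 1
            if stale_epochs >= patience:
                break  # stop once the dev score has plateaued

    model.load_state_dict(best_state)
    return model, best_acc
```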

Results and Comparisons

  • Report results for the baseline majority classifier and the best classifiers from previous exercises to allow a comprehensive comparison; a minimal baseline sketch follows.
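For reference, a majority-class baseline can be computed in a few lines. The majority_baseline_accuracy helper and the toy label lists are purely illustrative.

```python
from collections import Counter

def majority_baseline_accuracy(train_labels, eval_labels):
    """Predict the most frequent training label for every example and report
    accuracy -- the reference point the fine-tuned models are compared against."""
    majority_label, _ = Counter(train_labels).most_common(1)[0]
    correct = sum(1 for y in eval_labels if y == majority_label)
    return correct / len(eval_labels)

# Example with hypothetical label lists (1 = positive, 0 = negative).
print(majority_baseline_accuracy([1, 1, 0, 1, 0], [1, 0, 1, 1]))  # -> 0.75
```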


Languages

Language: Jupyter Notebook 100.0%