Sentiment Analysis using DistilBERT

Song Yi Ng, Kian Ming Lim, Chin Poo Lee, Jit Yan Lim

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

The Transformer is an architecture that performs well in natural language processing (NLP) tasks. To understand and improve its performance on sentiment analysis, DistilBERT is employed as the base model. Sentiment analysis is the process of extracting subjective information from textual data and categorizing it into classes, such as polarity (positive, neutral, negative) or emotions (happy, sad, angry). In addition, techniques such as fine-tuning, regularization, and hyperparameter tuning are applied to improve the performance of the model. The proposed solution achieved an accuracy of 85.41% on the Internet Movie Database (IMDB) dataset and 86.59% on the Customer Reviews (CR) dataset.
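To make the polarity classification described above concrete, the following is a minimal sketch of how a sentiment head's raw logits are turned into one of the polarity classes (positive, neutral, negative) named in the abstract. The logit values and label order are illustrative assumptions, not the authors' exact setup, and the softmax/argmax step stands in for the final layer of a fine-tuned DistilBERT classifier.

```python
import math

def softmax(logits):
    """Convert raw classifier logits into class probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def polarity_label(logits, labels=("negative", "neutral", "positive")):
    """Map a logit vector to its most probable polarity class.

    The label order here is an assumption; a real fine-tuned model
    defines its own id-to-label mapping.
    """
    probs = softmax(logits)
    return labels[probs.index(max(probs))]

# Example logits a 3-class sentiment head might emit for a clearly
# positive movie review.
print(polarity_label([-1.2, 0.3, 2.1]))  # -> positive
```

In practice the logits would come from a DistilBERT model with a classification head; the post-processing shown here is the same regardless of how the encoder was fine-tuned.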

Original language: English
Title of host publication: 2023 IEEE 11th Conference on Systems, Process and Control, ICSPC 2023 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 84-89
Number of pages: 6
ISBN (Electronic): 9798350340860
DOIs
Publication status: Published - 2023
Externally published: Yes
Event: 11th IEEE Conference on Systems, Process and Control, ICSPC 2023 - Malacca, Malaysia
Duration: 16 Dec 2023 → …

Publication series

Name: 2023 IEEE 11th Conference on Systems, Process and Control, ICSPC 2023 - Proceedings

Conference

Conference: 11th IEEE Conference on Systems, Process and Control, ICSPC 2023
Country/Territory: Malaysia
City: Malacca
Period: 16/12/23 → …

Keywords

  • Deep Learning
  • DistilBERT
  • Sentiment Analysis
  • Transformers

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Information Systems
  • Information Systems and Management
  • Safety, Risk, Reliability and Quality
  • Control and Optimization
  • Modelling and Simulation
  • Education
