Exploring Transformer Models for Sentiment Analysis in Airline Service Reviews
Rangamati Science and Technology University, Dept. of CSE, Rangamati-4500, Bangladesh.
International Islamic University, Dept. of CSE, Chittagong, Bangladesh.
Port City International University, Dept. of CSE, Chittagong, Bangladesh.
Port City International University, Dept. of CSE, Chittagong, Bangladesh.
2024 (English). In: 2024 IEEE Conference on Computing Applications and Systems (COMPAS), IEEE, 2024. Conference paper, Published paper (Refereed).
Abstract [en]

Given the substantial expansion of the airline sector over the last twenty years, efficient techniques for evaluating customer feedback have become essential to improving service provision. This paper examines the effectiveness of sentiment analysis in extracting attitudes and emotions from airline service reviews, applying a variety of machine learning approaches to two separate datasets and their merged form. Initially, conventional machine learning methods were used, including Random Forest, K-Nearest Neighbours (KNN), AdaBoost, Logistic Regression, Support Vector Machines (SVM), and Decision Trees; among these, Random Forest achieved the highest accuracy, reaching 99% on one of the datasets. A hybrid model combining Convolutional Neural Networks with Bidirectional Long Short-Term Memory (CNN-Bi-LSTM) was then investigated, achieving a maximum accuracy of 98%. The research subsequently progressed to transformer-based models such as DistilBERT, RoBERTa, ALBERT, ELECTRA, and two versions of BERT (base and large), with a specific emphasis on their ability to process intricate contextual information. Although performance was comparable across models, ALBERT showed a slightly lower accuracy of 97% on one dataset, highlighting the nuanced trade-offs among transformer architectures.
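The conventional baselines named in the abstract (e.g. TF-IDF features fed to a Random Forest classifier) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the review texts and labels below are hypothetical placeholders, and the paper's datasets, preprocessing, and hyperparameters are not reproduced here.

```python
# Hedged sketch of a conventional sentiment-analysis baseline:
# TF-IDF text features + Random Forest, one of the classical methods
# the paper compares. The tiny review list is illustrative only.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

# Hypothetical airline reviews (not from the paper's datasets).
reviews = [
    "Flight was on time and the crew was friendly",
    "Comfortable seats and a smooth boarding process",
    "Terrible delay and rude staff at the gate",
    "They lost my luggage and offered no apology",
]
labels = ["positive", "positive", "negative", "negative"]

# Vectorize the text with TF-IDF, then classify with a Random Forest.
clf = make_pipeline(
    TfidfVectorizer(),
    RandomForestClassifier(n_estimators=100, random_state=0),
)
clf.fit(reviews, labels)

# Predict the sentiment of an unseen review.
print(clf.predict(["The crew was friendly and helpful"])[0])
```

In a real experiment the same pipeline object would be evaluated with a held-out test split or cross-validation rather than on the training reviews.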

Place, publisher, year, edition, pages
IEEE, 2024.
Keywords [en]
Transformer, DistilBERT, RoBERTa, Electra, CNN, CNN-Bi-LSTM
National Category
Computer Sciences
Research subject
Cyber Security
Identifiers
URN: urn:nbn:se:ltu:diva-111413
DOI: 10.1109/COMPAS60761.2024.10796289
Scopus ID: 2-s2.0-85215501764
OAI: oai:DiVA.org:ltu-111413
DiVA, id: diva2:1932667
Conference
2024 IEEE Conference on Computing Applications and Systems (COMPAS), Chattogram, Bangladesh, September 25-26, 2024
Note

ISBN for host publication: 979-8-3315-2976-5

Available from: 2025-01-29. Created: 2025-01-29. Last updated: 2025-10-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text; Scopus

Authority records

Andersson, Karl
