MPNet-GRUs: Sentiment Analysis with Masked and Permuted Pre-training for Language Understanding and Gated Recurrent Units

Citation

Loh, Nicole Kai Ning and Lee, Chin Poo and Ong, Thian Song and Lim, Kian Ming (2024) MPNet-GRUs: Sentiment Analysis with Masked and Permuted Pre-training for Language Understanding and Gated Recurrent Units. IEEE Access. p. 1. ISSN 2169-3536

Abstract

Sentiment analysis, a pivotal task in natural language processing, aims to discern the opinions and emotions expressed in text. However, existing methods for sentiment analysis face challenges such as data scarcity, complex language patterns, and long-range dependencies. In this paper, we propose MPNet-GRUs, a hybrid deep learning model that integrates three key components: MPNet, BiGRU, and GRU. MPNet, a transformer-based pre-trained language model, enhances language understanding through masked and permuted pre-training. BiGRU and GRU, recurrent neural networks, capture long-range dependencies bidirectionally and unidirectionally, respectively. By combining the strengths of these models, MPNet-GRUs aims to provide a more effective and efficient solution for sentiment analysis. Evaluation on three benchmark datasets shows the superior performance of MPNet-GRUs, with accuracies of 94.71% on IMDb, 86.27% on Twitter US Airline Sentiment, and 88.17% on Sentiment140, demonstrating its potential to advance sentiment analysis.
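The architecture described in the abstract — contextual embeddings passed through a bidirectional GRU branch and a unidirectional GRU branch, whose outputs are combined into a feature vector for classification — can be sketched as below. This is a minimal NumPy illustration of the GRU gating and the two-branch combination, not the paper's implementation; the random "contextual embeddings" stand in for real MPNet output, and all sizes and names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MinimalGRU:
    """Minimal single-layer GRU (illustrative only, untrained random weights)."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Stacked weights for the update (z), reset (r), and candidate gates.
        self.W = rng.uniform(-s, s, (3, hidden_size, input_size))
        self.U = rng.uniform(-s, s, (3, hidden_size, hidden_size))
        self.b = np.zeros((3, hidden_size))
        self.hidden_size = hidden_size

    def step(self, x, h):
        z = sigmoid(self.W[0] @ x + self.U[0] @ h + self.b[0])        # update gate
        r = sigmoid(self.W[1] @ x + self.U[1] @ h + self.b[1])        # reset gate
        h_tilde = np.tanh(self.W[2] @ x + self.U[2] @ (r * h) + self.b[2])
        return (1.0 - z) * h + z * h_tilde                            # blended state

    def run(self, xs):
        h = np.zeros(self.hidden_size)
        for x in xs:
            h = self.step(x, h)
        return h  # final hidden state summarises the sequence

def bigru_final_state(fwd, bwd, xs):
    # Bidirectional branch: concatenate final states of forward and reversed passes.
    return np.concatenate([fwd.run(xs), bwd.run(xs[::-1])])

# Toy "contextual embeddings" standing in for MPNet output: 5 tokens, dim 8.
seq = np.random.default_rng(1).normal(size=(5, 8))
bi = bigru_final_state(MinimalGRU(8, 16, seed=2), MinimalGRU(8, 16, seed=3), seq)
uni = MinimalGRU(8, 16, seed=4).run(seq)
# Combined feature vector that a sentiment classifier head would consume.
features = np.concatenate([bi, uni])
print(features.shape)  # (48,)
```

In a practical version, the embedding stand-in would be replaced by a pre-trained MPNet encoder and the GRU layers by a trained framework implementation; the sketch only shows how the bidirectional and unidirectional branches are combined.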

Item Type: Article
Uncontrolled Keywords: BiGRU, GRU, MPNet, sentiment, sentiment analysis, transformer.
Subjects: L Education > LB Theory and practice of education > LB1060 Learning
Divisions: Faculty of Information Science and Technology (FIST)
Depositing User: Ms Nurul Iqtiani Ahmad
Date Deposited: 30 May 2024 02:46
Last Modified: 30 May 2024 02:46
URI: http://shdl.mmu.edu.my/id/eprint/12482
