Enhancing Sentiment Analysis in Social Media Texts Using Transformer-Based NLP Models
International Journal of Electrical and Electronics Engineering
© 2024 by SSRG - IJEEE Journal
Volume 11, Issue 8
Year of Publication: 2024
Authors: S. Padmalal, I. Edwin Dayanand, Goda Srinivasa Rao, Santosh Gore
How to Cite?
S. Padmalal, I. Edwin Dayanand, Goda Srinivasa Rao, Santosh Gore, "Enhancing Sentiment Analysis in Social Media Texts Using Transformer-Based NLP Models," SSRG International Journal of Electrical and Electronics Engineering, vol. 11, no. 8, pp. 208-216, 2024. Crossref, https://doi.org/10.14445/23488379/IJEEE-V11I8P118
Abstract:
Sentiment analysis plays a pivotal role in understanding public opinion and consumer sentiment expressed on social media platforms. Traditional methods of sentiment analysis for social media texts are challenged by informal language, irony, and cultural variation. This study employs Transformer-based Natural Language Processing (NLP) models, notably BERT, to assess improvements in the stability and accuracy of sentiment analysis. Combining BERT-based models, Bi-LSTM, and dilated convolution, this research proposes novel methods for social media sentiment analysis. It effectively addresses the problems of informal language, preserving very long-range context, and reducing overfitting when fine-tuning pre-trained BERT for sentiment classification. The experiments show that the BERT-based model classifies sentiment with an accuracy of 85%, a 3% improvement on social media texts in categorizing sentiment as positive, negative, or neutral. This is evident when comparing BERT to other well-known models, such as Naive Bayes and Support Vector Machines, which it outperforms owing to its greater capacity to capture sentiment expressions and contextual cues. The generalization of these results connects to real-world applications in brand analysis, public opinion research, and social media monitoring: BERT makes it feasible to examine consumer attitudes and market trends in greater detail than simple rating scales allow. Future research priorities include improving BERT's efficiency in settings where computing resources are limited, providing tools for model interpretability, and extending BERT to multimodal and multilingual data.
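The combination the abstract describes — contextual embeddings fed through a Bi-LSTM and a dilated convolution, with dropout to curb overfitting — can be sketched as below. This is an illustrative sketch only: the paper's exact layer sizes and wiring are not given here, all names and dimensions are assumptions, and a real system would replace the random `encoder_out` tensor with token embeddings from a pre-trained BERT (e.g. via the `transformers` library).

```python
import torch
import torch.nn as nn

class SentimentHead(nn.Module):
    """Hypothetical Bi-LSTM + dilated Conv1d classifier over BERT token embeddings."""
    def __init__(self, embed_dim=768, hidden=128, num_classes=3, dilation=2):
        super().__init__()
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        # Dilated convolution widens the receptive field, helping to
        # preserve long-range context across the sequence.
        self.conv = nn.Conv1d(2 * hidden, hidden, kernel_size=3,
                              dilation=dilation, padding=dilation)
        self.dropout = nn.Dropout(0.3)   # regularization against overfitting
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, encoder_out):       # (batch, seq_len, embed_dim)
        x, _ = self.bilstm(encoder_out)   # (batch, seq_len, 2 * hidden)
        x = self.conv(x.transpose(1, 2))  # (batch, hidden, seq_len)
        x = torch.relu(x).mean(dim=2)     # mean-pool over tokens
        return self.classifier(self.dropout(x))  # logits: pos / neg / neutral

# Toy usage with random stand-in "BERT" embeddings (batch 4, 16 tokens):
logits = SentimentHead()(torch.randn(4, 16, 768))
print(logits.shape)  # torch.Size([4, 3])
```

Pooling over the convolved sequence and classifying into three logits mirrors the positive/negative/neutral categorization the abstract reports.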
Keywords:
Social media, Natural Language Processing (NLP), BERT model, Sentiment analysis, Transformer.