An Enhanced Aspect-Based Sentiment Analysis Model Based on RoBERTa For Text Sentiment Analysis
References
Y. Kang, Z. Cai, C.-W. Tan, Q. Huang, and H. Liu, “Natural language processing (NLP) in management research: A literature review,” Journal of Management Analytics, vol. 7, no. 2, pp. 139–172, 2020.
P. Juyal, “Classification accuracy in sentiment analysis using hybrid and ensemble methods,” in 2022 IEEE World Conference on Applied Intelligence and Computing (AIC). IEEE, 2022, pp. 583–587.
H. H. Do, P. W. Prasad, A. Maag, and A. Alsadoon, “Deep learning for aspect-based sentiment analysis: a comparative review,” Expert Systems with Applications, vol. 118, pp. –299, 2019.
Z. Yang, Z. Dai, Y. Yang, J. Carbonell, R. R. Salakhutdinov, and Q. V. Le, “XLNet: Generalized autoregressive pretraining for language understanding,” Advances in Neural Information Processing Systems, vol. 32, 2019.
K. Scaria, H. Gupta, S. A. Sawant, S. Mishra, and C. Baral, “InstructABSA: Instruction learning for aspect based sentiment analysis,” arXiv preprint arXiv:2302.08624, 2023.
H. Yang and K. Li, “Improving implicit sentiment learning via local sentiment aggregation,” arXiv e-prints, 2021.
H. Yang, B. Zeng, J. Yang, Y. Song, and R. Xu, “A multi-task learning model for Chinese-oriented aspect polarity classification and aspect term extraction,” Neurocomputing, vol. 419, pp. 344–356, 2021.
E. H. d. Silva and R. M. Marcacini, “Aspect-based sentiment analysis using BERT with disentangled attention,” in Proceedings, 2021.
Y. Zhang, M. Zhang, S. Wu, and J. Zhao, “Towards unifying the label space for aspect- and sentence-based sentiment analysis,” arXiv preprint arXiv:2203.07090, 2022.
J. Dai, H. Yan, T. Sun, P. Liu, and X. Qiu, “Does syntax matter? A strong baseline for aspect-based sentiment analysis with RoBERTa,” arXiv preprint arXiv:2104.04986, 2021.
B. Xing and I. W. Tsang, “Understand me, if you refer to aspect knowledge: Knowledge-aware gated recurrent memory network,” IEEE Transactions on Emerging Topics in Computational Intelligence, vol. 6, no. 5, pp. –1102, 2022.
A. Rietzler, S. Stabinger, P. Opitz, and S. Engl, “Adapt or get left behind: Domain adaptation through BERT language model finetuning for aspect-target sentiment classification,” arXiv preprint arXiv:1908.11860, 2019.
A. Karimi, L. Rossi, and A. Prati, “Improving BERT performance for aspect-based sentiment analysis,” arXiv preprint arXiv:2010.11731, 2020.
Y. Song, J. Wang, T. Jiang, Z. Liu, and Y. Rao, “Attentional encoder network for targeted sentiment classification,” arXiv preprint arXiv:1902.09314, 2019.
D. Kirange, R. R. Deshmukh, and M. Kirange, “Aspect based sentiment analysis SemEval-2014 Task 4,” Asian Journal of Computer Science and Information Technology (AJCSIT), vol. 4, 2014.
J. Pennington, R. Socher, and C. D. Manning, “GloVe: Global vectors for word representation,” in Conference on Empirical Methods in Natural Language Processing. Citeseer, 2014.
V. Jakkula, “Tutorial on support vector machine (SVM),” School of EECS, Washington State University, vol. 37, no. 2.5, p. 3, 2006.
G. Biau, “Analysis of a random forests model,” The Journal of Machine Learning Research, vol. 13, no. 1, pp. 1063–1095, 2012.
Y. Yu, X. Si, C. Hu, and J. Zhang, “A review of recurrent neural networks: LSTM cells and network architectures,” Neural Computation, vol. 31, no. 7, pp. 1235–1270, 2019.
P. He, X. Liu, J. Gao, and W. Chen, “DeBERTa: Decoding-enhanced BERT with disentangled attention,” arXiv preprint arXiv:2006.03654, 2020.
DOI: https://doi.org/10.31449/inf.v49i14.5423

This work is licensed under a Creative Commons Attribution 3.0 License.