Enhancing Pragmatic Understanding in Natural Language Processing through Advanced Transformer Models
Authors
-
Johnwendy Chinedu Nwaukwa
Department of Computer Science, University of Benin, Ugbowo, Benin City, Nigeria
-
Imianvan Anthony Agboizebeta
Department of Computer Science, University of Benin, Ugbowo, Benin City, Nigeria
Abstract
Current methods of assessing text complexity often overlook the nuanced linguistic and contextual elements crucial for pragmatic comprehension. This study addresses this gap by introducing a novel framework that integrates advanced Natural Language Processing techniques, including phrase dependency parsing, POS tagging, and the Bigram Model, to enhance transformer models for pragmatic identification and evaluation and to better capture these intricate aspects. The research systematically evaluates the effectiveness of transformer models such as BERT, RoBERTa, ALBERT, and XLNet on tasks related to pragmatic understanding. Through a meticulous analysis of linguistic features, contextual cues, and model performance metrics, the study provides valuable insights and recommendations for enhancing text complexity evaluation. The results highlight RoBERTa’s exceptional performance, achieving an accuracy of 0.89 and demonstrating strong contextual and syntactic comprehension. While BERT, ALBERT, and XLNet show similar performance metrics, there are slight variations in feature importance values. These findings pave the way for more refined and comprehensive approaches to pragmatic understanding in Natural Language Processing.
Keywords: Text complexity, ALBERT, BERT, RoBERTa, POS tagging, Sentence dependency parsing