
		<paper>
			<loc>https://jjcit.org/paper/269</loc>
			<title>ENHANCING FEW-SHOT LEARNING PERFORMANCE WITH BOOSTING ON TRANSFORMERS: EXPERIMENTS ON SENTIMENT ANALYSIS TASKS</title>
			<doi>10.5455/jjcit.71-1734985264</doi>
			<authors>Lenh Phan Cong Pham,Huan Thai Phong</authors>
			<keywords>Few-shot learning,Boosting,Transformer models,Sentiment analysis</keywords>
			<views>1778</views>
			<downloads>933</downloads>
			<received_date>23-Dec.-2024</received_date>
			<revised_date>11-Jun.-2025</revised_date>
			<accepted_date>4-Jul.-2025</accepted_date>
			<abstract>This study addresses challenges in sentiment analysis for low-resource educational contexts by proposing a framework that integrates Few-Shot Learning (FSL) with Transformer-based ensemble models and boosting techniques. Sentiment analysis of student feedback is crucial for improving teaching quality, yet traditional methods struggle with data scarcity and computational inefficiency. The proposed framework leverages the self-attention mechanisms of Transformers and combines models through Gradient Boosting to enhance performance and generalization with minimal labeled data. Evaluated on the UIT-VSFC dataset of Vietnamese student feedback, the framework achieved superior F1-scores on sentiment- and topic-classification tasks, outperforming individual models. The results demonstrate the framework's potential for extracting actionable insights to enhance educational experiences. Despite its effectiveness, the approach faces limitations, such as reliance on pre-trained models and computational complexity. Future work could optimize lightweight models and explore applications in other domains, such as healthcare and finance.</abstract>
		</paper>
