Sentiment analysis of social media text has become increasingly important for understanding public opinion, brand perception, and societal trends. However, social media content is often informal, context-dependent, and filled with slang, sarcasm, emojis, and short-text structures, making traditional machine learning approaches less effective. This paper proposes a Context-Aware Sentiment Classification Framework using attention-based transformer models to capture semantic relationships and contextual dependencies within social media text. The proposed system leverages pre-trained transformer architectures such as BERT and its variants, enhanced with attention mechanisms that focus on sentiment-bearing words and contextual cues. The framework incorporates contextual embeddings, token-level attention scoring, and fine-tuning strategies to improve classification performance on noisy and short-text datasets. Experimental results demonstrate superior accuracy and robustness compared to conventional machine learning and recurrent neural network models. The proposed model effectively captures nuanced sentiment expressions, making it suitable for real-time social media monitoring and opinion mining applications.
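The token-level attention scoring mentioned above can be illustrated with a minimal sketch. This is not the paper's actual BERT-based implementation: it uses random toy embeddings in plain NumPy and a hypothetical learned "sentiment query" vector to show how attention weights let a classifier emphasize sentiment-bearing tokens when pooling token embeddings into a sentence representation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(token_embeddings, query):
    """Score each token against a (hypothetical) learned sentiment
    query vector, then return the attention weights and the
    attention-weighted sentence representation."""
    scores = token_embeddings @ query      # one scalar score per token
    weights = softmax(scores)              # attention distribution over tokens
    sentence_vec = token_embeddings.T @ weights  # weighted sum of embeddings
    return weights, sentence_vec

# Toy example: 5 tokens with 8-dimensional embeddings.
# In the actual framework these would be contextual BERT embeddings.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))
query = rng.normal(size=8)

w, sent = attention_pool(tokens, query)
```

The resulting `sent` vector would then feed a classification head; tokens with high weight in `w` are the ones the model treats as sentiment-bearing.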