📰 Quantifying Valence and Arousal in Text with Multilingual Pre-trained Transformers

This work evaluates pre-trained Transformers for predicting valence and arousal in text, using a dimensional (rather than categorical) model of emotion. Our results show that model size has a significant impact on prediction quality, and that fine-tuning a large model enables confident predictions across multiple languages.
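As a minimal sketch of the dimensional setup (all names and sizes here are hypothetical, not from the paper): the encoder's sentence embedding is passed through a small regression head that outputs two continuous scores, one for valence and one for arousal. In practice the embedding would come from a fine-tuned multilingual Transformer; here a mock vector stands in for it.

```python
import numpy as np

# Hypothetical sketch of a valence/arousal regression head.
# A real system would obtain `embedding` from a multilingual
# Transformer encoder; here we use a random mock vector.

rng = np.random.default_rng(0)
HIDDEN = 8  # toy embedding size; real encoders use e.g. 768 or 1024

W = rng.normal(scale=0.1, size=(HIDDEN, 2))  # head weights: hidden -> (valence, arousal)
b = np.zeros(2)                              # head bias

def predict_affect(embedding: np.ndarray) -> dict:
    """Map a sentence embedding to bounded valence/arousal scores."""
    v, a = np.tanh(embedding @ W + b)  # tanh keeps each score in (-1, 1)
    return {"valence": float(v), "arousal": float(a)}

mock_embedding = rng.normal(size=HIDDEN)
print(predict_affect(mock_embedding))
```

The two-output head is what distinguishes the dimensional approach from categorical emotion classification: instead of choosing one label from a fixed set, the model places each text on continuous valence and arousal axes.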

#NLP #AffectiveComputing #SentimentAnalysis #BERT #MultilingualEmotionRecognition