SEMTec: Social Emotion Mining Techniques for Analysis and Prediction of Facebook Post Reactions

Tobias Moers, Florian Krebs, Gerasimos Spanakis*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic



Nowadays, many people use social media to review products and services, and companies can use this feedback to improve the customer experience. In 2016, Facebook gave its users the ability to express their emotions about a post through five so-called 'reactions'. Since that launch, this paper is one of the first to provide a complete framework for evaluating different techniques for predicting reactions to user posts on public pages. For this purpose, we used the FacebookR dataset, which contains Facebook posts (along with their comments and reactions) from the biggest international supermarket chains. To build a robust and accurate prediction pipeline, state-of-the-art neural network architectures (convolutional and recurrent neural networks) were tested using pretrained word embeddings. The models are further improved by a bootstrapping approach for sentiment and emotion mining on the comments of each post, and by a data augmentation technique that yields an even more robust predictor. The final proposed pipeline combines a neural network with a baseline emotion miner and is able to predict the reaction distribution of Facebook posts with a mean squared error (or misclassification rate) of 0.1326.
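To make the evaluation target concrete, the sketch below shows how a post's raw reaction counts can be normalized into a distribution and compared against a predicted distribution via mean squared error. This is a minimal illustration, not the paper's actual pipeline: the specific set of reaction labels, the helper names, and the example numbers are assumptions for demonstration only.

```python
import numpy as np

# Assumed label set: Facebook's five emotional reactions (illustrative only).
REACTIONS = ["love", "haha", "wow", "sad", "angry"]

def reaction_distribution(counts):
    """Normalize raw reaction counts into a probability distribution."""
    counts = np.asarray(counts, dtype=float)
    total = counts.sum()
    if total == 0:
        # No reactions at all: fall back to a uniform distribution.
        return np.full(len(counts), 1.0 / len(counts))
    return counts / total

def mean_squared_error(true_dist, pred_dist):
    """MSE between the true and predicted reaction distributions."""
    true_dist = np.asarray(true_dist, dtype=float)
    pred_dist = np.asarray(pred_dist, dtype=float)
    return float(np.mean((true_dist - pred_dist) ** 2))

# Hypothetical post: 60 love, 10 haha, 5 wow, 20 sad, 5 angry reactions.
true = reaction_distribution([60, 10, 5, 20, 5])   # -> [0.6, 0.1, 0.05, 0.2, 0.05]
pred = np.array([0.5, 0.15, 0.05, 0.25, 0.05])     # model output (made up)
print(mean_squared_error(true, pred))              # -> 0.003
```

A lower score means the predicted distribution is closer to the observed one; the paper's reported figure of 0.1326 is this kind of distribution-level error averaged over posts.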
Original language: English
Title of host publication: Agents and Artificial Intelligence
Editors: Jaap van den Herik, Ana Paula Rocha
Place of publication: Cham
Number of pages: 22
ISBN (Print): 978-3-030-05453-3
Publication status: Published - Dec 2018

Publication series: Lecture Notes in Computer Science

