DOI: 10.1145/3415959.3416000
Research article · Open access

Predicting Twitter Engagement With Deep Language Models

Published: 26 September 2020

Abstract

Twitter has become one of the main information-sharing platforms for millions of users worldwide. Numerous tweets are created daily, many with highly time-sensitive content such as breaking news, new multimedia content, or personal updates. Consequently, accurately recommending relevant tweets to users in a timely manner is an important and challenging problem. The 2020 ACM RecSys Challenge benchmarks leading recommendation models for this task. The challenge is based on a large and recent dataset, released by Twitter, of over 200M tweet engagements with content in over 50 languages. In this work we present our approach, which leverages recent advances in deep language modeling and attention architectures to combine information from extracted features, user engagement history, and target tweet content. We first fine-tune the leading multilingual language models M-BERT and XLM-R on Twitter data. Embeddings from these models are used to extract tweet and user-history representations. We then combine all components and jointly train them to maximize engagement prediction accuracy. Our approach achieves highly competitive performance, placing 2nd on the final private leaderboard. Full code is available at: https://github.com/layer6ai-labs/RecSys2020.
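
To make the described pipeline concrete, the sketch below shows one plausible way to wire a fine-tuned multilingual encoder into an engagement classifier. It is only an illustration under assumed names and dimensions (the "xlm-roberta-base" checkpoint from the HuggingFace transformers library, a 768-dimensional user-history embedding, 64 hand-crafted features); the authors' actual implementation is in the linked repository.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EngagementModel(nn.Module):
    """Sketch: tweet text encoder + user-history embedding + hand-crafted features."""
    def __init__(self, feature_dim=64, history_dim=768, num_targets=4):
        super().__init__()
        # Multilingual language model that embeds the target tweet text.
        self.encoder = AutoModel.from_pretrained("xlm-roberta-base")
        hidden = self.encoder.config.hidden_size
        # Joint head over tweet embedding, user-history embedding, and features.
        self.head = nn.Sequential(
            nn.Linear(hidden + history_dim + feature_dim, 512),
            nn.ReLU(),
            nn.Linear(512, num_targets),  # e.g. reply, retweet, quote, like
        )

    def forward(self, input_ids, attention_mask, history_emb, features):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        tweet_emb = out.last_hidden_state[:, 0]      # first token acts as a pooled summary
        joint = torch.cat([tweet_emb, history_emb, features], dim=-1)
        return torch.sigmoid(self.head(joint))       # per-engagement probabilities

# Toy usage with zero-filled history and feature vectors (illustrative only).
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = EngagementModel()
batch = tokenizer(["Breaking: example tweet text"], return_tensors="pt",
                  padding=True, truncation=True)
probs = model(batch["input_ids"], batch["attention_mask"],
              history_emb=torch.zeros(1, 768), features=torch.zeros(1, 64))

In the full system described in the abstract, the history embedding would itself be built from language-model embeddings of tweets the user previously engaged with, and all components would be trained jointly on the challenge's engagement targets.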

Cited By

  • (2023) Predicting Tweet Engagement with Graph Neural Networks. Proceedings of the 2023 ACM International Conference on Multimedia Retrieval, 172-180. https://doi.org/10.1145/3591106.3592294. Online publication date: 12-Jun-2023.
  • (2022) Understanding social engagements: A comparative analysis of user and text features in Twitter. Social Network Analysis and Mining, 12(1). https://doi.org/10.1007/s13278-022-00872-1. Online publication date: 31-Mar-2022.

Published In

RecSysChallenge '20: Proceedings of the Recommender Systems Challenge 2020
September 2020
49 pages
ISBN: 9781450388351
DOI: 10.1145/3415959
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 September 2020


Author Tags

  1. Attention
  2. Deep Learning
  3. Language Modeling
  4. Recommender Systems

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

RecSys Challenge '20

Acceptance Rates

Overall Acceptance Rate 11 of 15 submissions, 73%

Bibliometrics

Article Metrics

  • Downloads (last 12 months): 672
  • Downloads (last 6 weeks): 98
Reflects downloads up to 01 Oct 2024
