Domain-aware Self-supervised Pre-training for Label-Efficient Meme Analysis

Published in AACL’22 (Main), 2022

Recommended citation: Shivam Sharma, Mohd Khizir Siddiqui, Md. Shad Akhtar, and Tanmoy Chakraborty. 2022. Domain-aware Self-supervised Pre-training for Label-Efficient Meme Analysis. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 792–805, Online only. Association for Computational Linguistics. https://aclanthology.org/2022.aacl-main.60

Download paper here

The paper presents two self-supervised pre-training methods, Ext-PIE-Net and MM-SimCLR, for multi-modal tasks such as meme analysis. Both rely on specialized, domain-aware pretext tasks and outperform fully supervised approaches on several meme-related tasks, demonstrating their generalizability and underscoring the need for better multi-modal self-supervision methods.
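
The exact pretext tasks are described in the paper; as a rough illustration of the SimCLR-style multi-modal contrastive objective that MM-SimCLR builds on, the sketch below shows an NT-Xent-style loss over paired image and text embeddings. The function name, embedding dimensions, and temperature are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a SimCLR-style (NT-Xent) contrastive loss over paired
# image/text embeddings, in the spirit of MM-SimCLR. The projection size and
# temperature below are assumptions, not the paper's actual configuration.
import torch
import torch.nn.functional as F

def multimodal_nt_xent(image_emb: torch.Tensor,
                       text_emb: torch.Tensor,
                       temperature: float = 0.07) -> torch.Tensor:
    """Contrastive loss where the i-th image and i-th text form the positive pair."""
    # L2-normalize so dot products are cosine similarities.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)

    # Pairwise similarity matrix of shape (batch, batch).
    logits = image_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)

    # Symmetric cross-entropy: image-to-text and text-to-image directions.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_i2t + loss_t2i)

if __name__ == "__main__":
    # Toy usage with random projected features standing in for encoder outputs.
    img = torch.randn(8, 128)
    txt = torch.randn(8, 128)
    print(multimodal_nt_xent(img, txt).item())
```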