Playlist of BIRDS 2020 presentations - https://www.youtube.com/playlist?list=PLI13W1gRqqf1HiAWV7FCfxgGEUU_GvATr

More about how we brought together Information Science, Information Retrieval and Data Science at BIRDS - https://birds-ws.github.io/birds2020/index.html

#BIRDS2020 #SIGIR2020


MARGE: Pre-training via paraphrasing by #SIGIR2020 keynote speaker Luke Zettlemoyer.

https://arxiv.org/abs/2006.15020

Pre-training via Paraphrasing

We introduce MARGE, a pre-trained sequence-to-sequence model learned with an unsupervised multi-lingual multi-document paraphrasing objective. MARGE provides an alternative to the dominant masked language modeling paradigm, where we self-supervise the reconstruction of target text by retrieving a set of related texts (in many languages) and conditioning on them to maximize the likelihood of generating the original. We show it is possible to jointly learn to do retrieval and reconstruction, given only a random initialization. The objective noisily captures aspects of paraphrase, translation, multi-document summarization, and information retrieval, allowing for strong zero-shot performance on several tasks. For example, with no additional task-specific training we achieve BLEU scores of up to 35.8 for document translation. We further show that fine-tuning gives strong performance on a range of discriminative and generative tasks in many languages, making MARGE the most generally applicable pre-training method to date.
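The core idea in the abstract — weighting the reconstruction of a target document by the retrieval relevance of each evidence document, so retrieval and reconstruction can be learned jointly — can be sketched as a toy. This is an illustrative simplification under assumed names (`marge_style_loss`, cosine relevance, softmax weighting over fixed embeddings), not the authors' actual model, which uses learned encoders and relevance-scaled cross-attention.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def cosine(u, v):
    # Cosine similarity between two vectors, used here as a
    # stand-in for MARGE's learned document relevance score.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def marge_style_loss(target_emb, evidence_embs, recon_nlls):
    """Toy MARGE-style objective: each evidence document's
    reconstruction loss (negative log-likelihood of regenerating
    the target from that evidence) is weighted by its normalized
    retrieval relevance. In the real model, gradients flowing
    through these relevance scores are what allow retrieval and
    reconstruction to be trained jointly from scratch."""
    scores = [cosine(target_emb, e) for e in evidence_embs]
    weights = softmax(scores)
    return sum(w * nll for w, nll in zip(weights, recon_nlls))
```

For example, with a target embedded at `[1.0, 0.0]` and two evidence documents at `[1.0, 0.0]` (relevant, low reconstruction loss) and `[0.0, 1.0]` (irrelevant, high loss), the weighted loss sits closer to the relevant document's loss, which is exactly the pressure that teaches the retriever to prefer useful evidence.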

We are bridging the gap between Data Science, Information Science and Information Retrieval tomorrow at our BIRDS workshop at #SIGIR2020. Join us online! More info on our homepage https://birds-ws.github.io/birds2020/index.html.
#BIRDS2020

All ACM SIGIR publications are now permanently open access in the ACM Digital Library! #SIGIR2020
#SIGIR2020 Diversity, Equity and Inclusivity checklist
Best #SIGIR2020 short paper: Shi Yu et al., "Few-shot conversational query rewriting"
Best #SIGIR2020 paper: Marco Morik et al., "Controlling fairness and bias in dynamic learning to rank"
Honorable mention best paper award #SIGIR2020: Fan Zhang et al., "Models vs Satisfaction"
Going to find out how deep learning works for Information Retrieval with Dacheng Tao. #SIGIR2020