
Unsupervised Dense Retrieval Deserves Better Positive Pairs: Scalable Augmentation with Query Extraction and Generation


Dec 17, 2022
Rui Meng, Ye Liu, Semih Yavuz, Divyansh Agarwal, Lifu Tu, Ning Yu, Jianguo Zhang, Meghana Bhat, Yingbo Zhou


CREATIVESUMM: Shared Task on Automatic Summarization for Creative Writing


Nov 10, 2022
Divyansh Agarwal, Alexander R. Fabbri, Simeng Han, Wojciech Kryscinski, Faisal Ladhak, Bryan Li, Kathleen McKeown, Dragomir Radev, Tianyi Zhang, Sam Wiseman


* 4 pages + 3 for references and appendix 


BookSum: A Collection of Datasets for Long-form Narrative Summarization


May 18, 2021
Wojciech Kryściński, Nazneen Rajani, Divyansh Agarwal, Caiming Xiong, Dragomir Radev


* 19 pages, 12 tables, 3 figures 


Accurate and Scalable Matching of Translators to Displaced Persons for Overcoming Language Barriers


Nov 30, 2020
Divyansh Agarwal, Yuta Baba, Pratik Sachdeva, Tanya Tandon, Thomas Vetterli, Aziz Alghunaim


* Presented at NeurIPS 2020 Workshop on Machine Learning for the Developing World 


Semblance: A Rank-Based Kernel on Probability Spaces for Niche Detection


Sep 09, 2018
Divyansh Agarwal, Nancy Zhang


* The method presented in this paper is being refined and needs a major modification. Please contact the authors if you have any questions about the construction of this kernel method.
