8125 Paint Branch Dr
College Park, MD 20742
I am a second-year Computer Science PhD student at the University of Maryland, College Park, currently working with Tianyi Zhou and Jordan Boyd-Graber. I am a member of the Computational Linguistics and Information Processing (CLIP) Lab at UMIACS.
I am broadly interested in Natural Language Understanding and efficient learning methods, particularly Question Answering (QA), semantic structure understanding, model robustness, and interpretability.
I have spent some time at Google Research, where I collaborated with several Brain and Language Research teams, focusing on model interpretation and analysis for Question Answering, semi-structured text understanding, and retrieval-augmented language models for long-context understanding.
In the past, I have also worked on Computer Vision and Machine Learning problems such as human motion sequence modeling, generative and representation learning, and adversarial machine learning.
In my free time, I enjoy playing social deduction board games. I also very much like to play Magic: The Gathering :)
- Sep 14, 2022: Our work “Toward Efficient Robust Training against Union of Lp Threat Models” is accepted at NeurIPS 2022, and also for an oral presentation at AdvML Frontiers @ ICML 2022.
- Sep 12, 2022: I am starting as a part-time Student Researcher at Google and will primarily be working on long-context text understanding.
- May 23, 2022: I am spending my summer at X, the moonshot factory (formerly Google X) as a PhD Research Resident, and will be working on Program Synthesis.
- Nov 22, 2021: I will be serving as a Program Committee member for SUKI (Workshop on Structured and Unstructured Knowledge Integration) and DADC (Workshop on Dynamic Adversarial Data Collection) at NAACL 2022.
- Sep 22, 2021: Our work “MATE: Multi-view Attention for Table Transformer Efficiency” was selected for an oral presentation at EMNLP 2021.
- NeurIPS 2022: Toward Efficient Robust Training against Union of Lp Threat Models. In Advances in Neural Information Processing Systems 2022.
- EMNLP 2021: MATE: Multi-view Attention for Table Transformer Efficiency. In Empirical Methods in Natural Language Processing 2021.
- EMNLP 2021: Toward Deconfounding the Influence of Entity Demographics for Question Answering Accuracy. In Empirical Methods in Natural Language Processing 2021.