I am a second-year Computer Science PhD student at UMass Amherst working with Prof. Andrew McCallum. During my time as a master's student at UMass, I worked as a research intern at the Information Extraction and Synthesis Lab (IESL) and at Abridge AI. Prior to starting my master's program, I spent a year working with Prof. Partha Talukdar on various NLP problems. I obtained my undergrad from IIT Madras, where, under the guidance of Prof. Bandyopadhyay, my research primarily focused on problems in robotics. I have also worked for two years as a software engineer at MathWorks.
Coming from an eclectic background (a mix of robotics, software engineering, natural language processing, and mathematics), I like to think about foundational aspects of machine learning, with a major focus on representation learning. While most representation learning methods focus only on metric learning, my work on box embeddings aims to show that representation learning can also capture other kinds of structure, such as algebraic and relational structure, thereby allowing models to perform compositional reasoning. I have also worked with energy-based models, where the goal is to use the energy as a learned loss function to train a feedforward prediction network. My most recent empirical explorations have been on NLP and structured prediction tasks. Going forward, I am most interested in analysing deep representation learning in rich non-Euclidean spaces through the lens of the burgeoning theory of deep learning.
|Aug 1, 2023||My work on pre-trained language models as visual planners for human assistance, done as a research intern at Meta Reality Labs, has been accepted at ICCV 2023.|
|Apr 25, 2022||Excited to present our work on multi-label classification using box embeddings at ICLR 2022!|
|Nov 1, 2020||Happy to announce that I will be starting my Ph.D. in Spring (January) 2021 at UMass Amherst with Prof. Andrew McCallum as my advisor.|
|Oct 1, 2020||Internship work done at Abridge AI has been accepted at the Clinical NLP Workshop 2020.|
|Sep 30, 2020||Our paper titled Reading Comprehension as Natural Language Inference: A Semantic Analysis has been accepted at *SEM 2020.|
|Aug 25, 2021||Signal Propagation On Slurm|
|Jan 21, 2019||VIMing on Mac|
|Apr 15, 2018||Relative Entropy and its role as a cost function for machine learning tasks|
- Pretrained Language Models as Visual Planners for Human Assistance. arXiv preprint arXiv:2304.09179, 2023.
- Modeling Label Space Interactions in Multi-label Classification using Box Embeddings. In International Conference on Learning Representations (ICLR), 2022.
- Structured Energy Network As a Loss. 2022.
- Weakly Supervised Medication Regimen Extraction from Medical Conversations. In Proceedings of the 3rd Clinical Natural Language Processing Workshop, Nov 2020.
- Representing Joint Hierarchies with Box Embeddings. In Automated Knowledge Base Construction (AKBC), Nov 2020.