I am a second-year Computer Science PhD student at UMass Amherst working with Prof. Andrew McCallum. During my time as a master's student at UMass, I worked as a research intern at the Information Extraction and Synthesis Lab (IESL) and at Abridge AI. Prior to starting my master's program, I spent a year working with Prof. Partha Talukdar on various NLP problems. I obtained my undergraduate degree from IIT Madras, where my research, under the guidance of Prof. Bandyopadhyay, primarily focused on problems in robotics. I have also worked for two years as a software engineer at MathWorks.
Coming from an eclectic background (a mix of robotics, software engineering, natural language processing, and mathematics), I like to think about foundational aspects of machine learning, with a major focus on representation learning. While most representation learning methods focus only on metric learning, my work on box embeddings aims to show that representation learning can also capture other kinds of structure, such as algebraic and relational structure, thereby allowing models to perform compositional reasoning. I have also worked with energy-based models, where the goal is to use the energy function as a learned loss for training a feedforward prediction network. My most recent empirical explorations have been on NLP and structured prediction tasks. Going forward, I am most interested in analysing deep representation learning in rich non-Euclidean spaces through the lens of the burgeoning theory of deep learning.
|Apr 25, 2022||Excited to present our work on multi-label classification using box embeddings at ICLR 2022!|
|Nov 1, 2020||Happy to announce that I will be starting my Ph.D. in Spring (January) 2021 at UMass Amherst with Prof. Andrew McCallum as my advisor.|
|Oct 1, 2020||Internship work done at Abridge AI has been accepted at the Clinical NLP Workshop 2020.|
|Sep 30, 2020||Our paper, Reading Comprehension as Natural Language Inference: A Semantic Analysis, has been accepted at *SEM 2020.|
|May 10, 2020||Excited to start research internship at Abridge AI.|
Selected publications (see the publications page for the complete list)
Preprint: Structured Energy Network as a Dynamic Loss Function: A Case Study with Multi-label Classification. Under review, 2022.