Dhruvesh Patel

I have an eclectic academic background spanning robotics, software engineering, computer science, machine learning, and natural language processing. Currently, I am pursuing a master's degree in Computer Science at the University of Massachusetts Amherst, with a focus on machine learning as applied to Natural Language Processing (NLP) and Knowledge Graphs (KGs). My current research interests lie at the intersection of geometry, deep learning, and NLP. I am broadly interested in (1) exploiting the geometry of non-Euclidean spaces to provide a better inductive bias for representation models of natural language and knowledge graphs, and (2) integrating external knowledge sources such as knowledge graphs, ontologies, and hierarchies with deep learning models using end-to-end gradient-based learning, to create explainable, instructable, and data-efficient learning machines.
I was a summer research intern (2019) and continue to work as a Graduate Student Research Assistant at the Information Extraction and Synthesis Lab (IESL), advised by Prof. Andrew McCallum. Prior to starting my master's program, I worked in a small team at Kenome, a nascent but extremely driven startup founded by Prof. Partha Talukdar, solving challenging problems using machine learning and NLP. I have also worked for about two years as a software engineer at MathWorks, developing Simulink and related products. During my undergrad at IIT Madras, I worked with Prof. Bandyopadhyay on problems in robotics.

I am always on the lookout for interesting conversations and fruitful collaborations. Feel free to use the chat-box at the bottom or my email to get in touch.

News

Nov 1, 2020 Happy to announce that I will be starting my Ph.D. in Spring (January) 2021 at UMass Amherst with Prof. Andrew McCallum as my advisor.
Oct 1, 2020 Work from my internship at Abridge AI has been accepted at the Clinical NLP Workshop 2020.
Sep 30, 2020 Paper titled “Reading Comprehension as Natural Language Inference: A Semantic Analysis” has been accepted at *SEM 2020.
May 10, 2020 Excited to start a research internship at Abridge AI.
Jan 15, 2020 Paper titled “Representing Joint Hierarchies with Box Embeddings” has been accepted at AKBC 2020.

Recent Publications

  1. Reading Comprehension as Natural Language Inference: A Semantic Analysis *Anshuman Mishra, *Dhruvesh Patel, *Aparna Vijayakumar, Xiang Li, Pavan Kapanipathi, and Kartik Talamadupula In *SEM 2020 (co-located with COLING 2020) (To Appear) [Abstract] [URL] [Code]

    In the recent past, Natural Language Inference (NLI) has gained significant attention, particularly given its promise for downstream NLP tasks. However, its true impact is limited and has not been well studied. Therefore, in this paper, we explore the utility of NLI for one of the most prominent downstream tasks, viz. Question Answering (QA). We transform one of the largest available MRC datasets (RACE) into an NLI form, and compare the performance of a state-of-the-art model (RoBERTa) on both forms. We propose new characterizations of questions, and evaluate the performance of QA and NLI models on these categories. We highlight clear categories for which the model performs better when the data is presented in a coherent entailment form, and in a structured question-answer concatenation form, respectively.
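As a rough illustration of the recasting described above, here is a minimal sketch of turning a multiple-choice MRC example into NLI premise-hypothesis pairs; the field names and label scheme are my assumptions, not the paper's released code.

```python
from typing import Dict, List


def mrc_to_nli(passage: str, question: str, options: List[str], answer_idx: int) -> List[Dict[str, str]]:
    """Recast one multiple-choice MRC (RACE-style) example into NLI pairs.

    The passage becomes the premise; each option is paired with the question to
    form a hypothesis (the "structured question-answer concatenation" form).
    The "coherent entailment" form would instead rewrite question + option into
    a single declarative sentence.
    """
    examples = []
    for i, option in enumerate(options):
        hypothesis = f"{question.strip()} {option.strip()}"
        label = "entailment" if i == answer_idx else "not_entailment"
        examples.append({"premise": passage, "hypothesis": hypothesis, "label": label})
    return examples
```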

  2. Weakly Supervised Medication Regimen Extraction from Medical Conversations Dhruvesh Patel, Sandeep Konam, and Sai Prabhakar Selvaraj In Proceedings of the 3rd Clinical Natural Language Processing Workshop 2020 (To Appear) [Abstract] [URL]

    Automated Medication Regimen (MR) extraction from medical conversations can improve recall and care-plan compliance for patients, and reduce the documentation burden for doctors. In this paper, we focus on extracting spans for frequency, route and change, corresponding to medications discussed in the conversation. We first describe a unique dataset of annotated doctor-patient conversations and then present a weakly supervised model architecture that can perform span extraction using noisy classification data. The model utilizes an attention bottleneck inside a classification model to perform the extraction. We experiment with several variants of attention scoring and projection functions and propose a novel transformer-based attention scoring function (TAScore). The proposed combination of TAScore and Fusedmax projection achieves a 10-point increase in Longest Common Substring F1 compared to the baseline of additive scoring plus softmax projection.
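Schematically (my reading of the abstract, not the paper's exact notation), the attention bottleneck scores each token representation $h_i$ with a scoring function $s$ and maps the scores onto the probability simplex with a projection $\Pi$:

$$
\alpha = \Pi\big(s(h_1, \dots, h_n)\big), \qquad \Pi \in \{\mathrm{softmax},\ \mathrm{fusedmax}\},
$$

where TAScore is a transformer-based choice of $s$, and fusedmax yields sparse, contiguous attention weights, which is what allows spans to be read off from the attention distribution.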

  3. Looking Beyond Sentence-Level Natural Language Inference for Downstream Tasks *Anshuman Mishra, *Dhruvesh Patel, *Aparna Vijayakumar, Xiang Li, Pavan Kapanipathi, and Kartik Talamadupula In arXiv 2020 [Abstract] [URL] [Code]

    In recent years, the Natural Language Inference (NLI) task has garnered significant attention, with new datasets and models achieving near human-level performance on it. However, the full promise of NLI, particularly that it learns knowledge that should be generalizable to other downstream NLP tasks, has not been realized. In this paper, we study this unfulfilled promise through the lens of two downstream tasks: question answering (QA) and text summarization. We conjecture that a key difference between the NLI datasets and these downstream tasks concerns the length of the premise, and that creating new long-premise NLI datasets out of existing QA datasets is a promising avenue for training a truly generalizable NLI model. We validate our conjecture by showing competitive results on the task of QA and obtaining the best reported results on the task of Checking Factual Correctness of Summaries.

  4. Representing Joint Hierarchies with Box Embeddings *Dhruvesh Patel, *Shib Sankar Dasgupta, Michael Boratko, Xiang Li, Luke Vilnis, and Andrew McCallum In Automated Knowledge Base Construction (AKBC) 2020 [Abstract] [URL] [Slides] [Video] [Code]

    Learning representations for hierarchical and multi-relational knowledge has emerged as an active area of research. Box Embeddings [Vilnis et al., 2018, Li et al., 2019] represent concepts with hyperrectangles in n-dimensional space and have been shown to be capable of modeling tree-like structures efficiently when trained on a large subset of the transitive closure of the WordNet hypernym graph. In this work, we evaluate the capability of box embeddings to learn the transitive closure of a tree-like hierarchical relation graph from far fewer of its edges. Box embeddings are not restricted to tree-like structures, however, and we demonstrate this by modeling the WordNet meronym graph, where nodes may have multiple parents. We further propose a method for modeling multiple relations jointly in a single embedding space using box embeddings. In all cases, our proposed method outperforms or is on par with all other embedding methods.
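As a schematic reminder of the formulation (standard box-embedding notation following Vilnis et al. [2018], not notation specific to this paper), each concept is a product of intervals, and a hierarchical edge is scored by how much of one box lies inside another:

$$
\mathrm{Box}(A) = \prod_{i=1}^{n} [a_i^{\vee}, a_i^{\wedge}], \qquad
\mathrm{Vol}(A) = \prod_{i=1}^{n} \max\big(0,\ a_i^{\wedge} - a_i^{\vee}\big), \qquad
P(A \mid B) = \frac{\mathrm{Vol}(A \cap B)}{\mathrm{Vol}(B)},
$$

so an edge "B is-a A" is modeled by training $P(A \mid B)$ towards 1, i.e., pushing box B inside box A.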

  5. Computing the Safe Working Zone of a 3-RRS Parallel Manipulator Dhruvesh Patel, Rohit Kalla, Tetik Halil, Kiper Gökhan, and Sandipan Bandyopadhyay In New Trends in Mechanism and Machine Science 2017 [Abstract] [URL]

    Determination of the safe working zone (SWZ) of a parallel manipulator is a one-time computational task with several permanent benefits. As this subspace of the workspace of the manipulator is free of both loss- and gain-type singularities, link interference, as well as physical joint limits, the manipulator can move freely in this space. Moreover, if the natural choice of a convex-shaped SWZ is adhered to, then point-to-point path planning inside the SWZ always has a trivial solution, namely, the segment joining the two points, which is guaranteed to be inside the workspace. In this paper, the SWZ of the 3-RRS manipulator at the İzmir Institute of Technology has been computed. Starting with the geometry of the manipulator, the loop-closure constraint equations are derived. The singularity conditions are obtained from the singularity of certain Jacobian matrices associated with the constraint functions. The interference between the links is detected by first encapsulating the links in rectangular parallelepipeds, which are then discretized into triangles and subjected to collision tests between the relevant pairs of triangles. Using these theoretical developments, the SWZ is computed and the numerical results are depicted graphically.
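For a flavor of the singularity analysis (the standard loss/gain classification, in my notation rather than the paper's), write the loop-closure constraints as $\boldsymbol{\eta}(\boldsymbol{\theta}, \boldsymbol{\phi}) = \mathbf{0}$, with $\boldsymbol{\theta}$ the actuated joint variables and $\boldsymbol{\phi}$ the remaining (passive/task-space) variables; differentiating with respect to time gives

$$
J_{\theta}\,\dot{\boldsymbol{\theta}} + J_{\phi}\,\dot{\boldsymbol{\phi}} = \mathbf{0}, \qquad
J_{\theta} = \frac{\partial \boldsymbol{\eta}}{\partial \boldsymbol{\theta}}, \quad
J_{\phi} = \frac{\partial \boldsymbol{\eta}}{\partial \boldsymbol{\phi}}.
$$

Roughly, gain-type singularities occur where $J_{\phi}$ becomes singular (the platform gains an uncontrolled degree of freedom), and loss-type singularities where the map from actuated joint rates to platform motion loses rank; the SWZ is a region of the workspace free of both, as well as of link interference and joint limits.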

Professional Journey

Information extraction and summarization for healthcare conversations. May 2020 - Sep 2020
Natural Language Processing Research Intern

Working on learning and using representations of language data in non-Euclidean spaces. May 2019 - present
Research Assistant

Working towards a master's degree in Computer Science with a focus on ML and NLP. Jan 2019 - present
Master of Science in Computer Science

Making sense of structured and unstructured data. Mar 2018 - Jan 2019
Software Development Engineer, Machine Learning

Worked on development of Simulink and allied products, mainly using C++. Jun 2016 - Jan 2018
Software Engineer

Analysis of kinematics and dynamics, design, interfacing, and control of a vehicle motion simulator based on a 3-DoF parallel manipulator. Dec 2014 - May 2015
Research and Design Engineer (Intern)

Trained in developing products, be it software or a robot, taking them from need analysis to a final prototype and beyond. With a wide focus, I learned the basics of almost all fields of engineering, such as computer science, electrical engineering, and mechanical engineering. The interdisciplinary yet technical nature of the curriculum prepared me to dive deep into any field of engineering with a focus on building products. Aug 2011 - Jul 2016
Dual Degree (B.Tech + M.Tech) in Engineering Design with a minor in Systems Engineering

My Projects

Weights and Biases with AllenNLP

Utilities and boilerplate code that allow using wandb to tune the hyperparameters of any AllenNLP model without a single line of extra code!
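Below is a minimal sketch of how wandb sweeps can drive AllenNLP training; it uses only the public wandb and AllenNLP APIs, and the config file name, override keys, and hyperparameters are placeholders, not this project's actual interface.

```python
import json

import wandb
from allennlp.commands.train import train_model_from_file

# Hypothetical sweep over two hyperparameters; keys must match the overrides below.
sweep_config = {
    "method": "bayes",
    "metric": {"name": "best_validation_accuracy", "goal": "maximize"},
    "parameters": {
        "lr": {"min": 1e-5, "max": 1e-3},
        "hidden_dim": {"values": [128, 256, 512]},
    },
}


def train() -> None:
    run = wandb.init()
    # Map the sweep's chosen hyperparameters onto the AllenNLP jsonnet config
    # via dot-notation overrides (paths depend on the actual config structure).
    overrides = json.dumps({
        "trainer.optimizer.lr": run.config.lr,
        "model.encoder.hidden_size": run.config.hidden_dim,
    })
    train_model_from_file(
        "config.jsonnet",                    # hypothetical experiment config
        serialization_dir=f"runs/{run.id}",  # one output directory per sweep run
        overrides=overrides,
    )


if __name__ == "__main__":
    sweep_id = wandb.sweep(sweep_config, project="allennlp-wandb-demo")
    wandb.agent(sweep_id, function=train)
```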

Natural Language Inference for Question Answering

Solving the task of question answering using natural language inference.

Box Embedding for Knowledge Graph Completion

Improving KG representations using the inductive bias of the box space, and analysing issues with current datasets.

Using boxes for Natural Language Inference

Using box representations for natural language inference. This is an ongoing research project at IESL, UMass, with updates coming soon!

HyperA -- Attention in Hyperbolic Space

A PyTorch implementation of hyperbolic neural networks and attention mechanisms, with hyperbolic attention applied to the task of natural language inference.
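For reference, the ingredient that replaces Euclidean dot-product geometry in such models is the hyperbolic distance; on the unit Poincaré ball $\mathbb{B}^n = \{x \in \mathbb{R}^n : \lVert x \rVert < 1\}$ (the standard formula, not necessarily the exact parameterization used here) it is

$$
d_{\mathbb{B}}(x, y) = \operatorname{arcosh}\!\left(1 + 2\,\frac{\lVert x - y \rVert^{2}}{\big(1 - \lVert x \rVert^{2}\big)\big(1 - \lVert y \rVert^{2}\big)}\right).
$$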

Polymetic

A C++ library for polynomial and matrix arithmetic, focused on applications in robot kinematics.

Intelibugger

An intelligent debugger for C/C++ code on Linux.


Computing the safe working zone of parallel manipulators

Work in computational kinematics of parallel robot manipulators.

Currently Reading