Research Summary

I am currently a member of the Learning by Instruction Agent (LIA) project. LIA is an intelligent personal agent that can be taught by verbal instruction or visual demonstration. Take a look at this short demo to see how LIA interacts with humans. In my Ph.D., I focused on extracting latent hierarchical structures from data and developed probabilistic and deep learning models applicable to a broad range of real-world problems, such as prediction of high-dimensional time series, topic modeling, and neural programming.

Interests

  • Neural Programming
  • Deep Learning
  • Probabilistic Graphical Models
  • Spectral Learning

Projects

Learning by Instruction Agent (LIA)

LIA is an intelligent personal assistant that can be programmed using natural language. Unlike today's conversational assistants, such as Alexa, which act on a small set of pre-programmed commands, LIA can be instructed (programmed) by non-expert users through conversational interaction. We are currently extending LIA's natural language capabilities as well as its knowledge base.

Neural Math

Human beings possess impressive abilities for abstract mathematical and logical thinking. It has long been a dream of computer scientists to design machines with such capabilities: machines that can automatically learn and reason, removing the need to program them manually. Neural programming, in which neural networks learn programs, mathematics, or logic, has recently shown promise toward this goal. In this project, we developed a neural programmer that accurately predicts whether a given mathematical equation is correct and fills in a blank in a given equation. The same neural programmer also accurately solves ordinary differential equations.
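As a rough illustration of the equation-verification task, here is a minimal sketch of a sequence classifier that scores a tokenized equation as correct or incorrect. The architecture, tokenization, and all names (EquationVerifier, vocabulary size, layer dimensions) are illustrative assumptions, not the model actually used in this project.

```python
# Hypothetical sketch: score whether a tokenized symbolic equation
# (e.g. "2*x + 3*x = 5*x") is correct. Vocabulary, sizes, and names
# are illustrative only.
import torch
import torch.nn as nn

class EquationVerifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 1)  # logit: correct vs. incorrect

    def forward(self, tokens):            # tokens: (batch, seq_len) integer ids
        h, _ = self.encoder(self.embed(tokens))
        return self.classifier(h[:, -1])  # read out the final hidden state

# Usage: score a batch of (randomly generated) tokenized equations.
model = EquationVerifier(vocab_size=32)
batch = torch.randint(0, 32, (4, 12))     # 4 equations, 12 tokens each
prob_correct = torch.sigmoid(model(batch))
```

Filling a blank can be framed the same way, by scoring candidate completions with such a verifier; the project's actual architecture and training setup are described in the corresponding paper.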

Correlated Topic Models

Topic models are probabilistic models for extracting abstract topics from large document corpora. One of the best-known topic models is Latent Dirichlet Allocation (LDA). LDA assumes the topics are uncorrelated, which is not a realistic assumption: a document about sports, for example, is more likely to also be about health than about finance. In this work, we propose a class of correlated topic models that capture such correlations and show that the entire class can be trained with polynomial complexity using spectral methods.
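To make the modeling difference concrete, one standard way to introduce topic correlations (as in logistic-normal correlated topic models) replaces LDA's Dirichlet prior on the per-document topic proportions with a Gaussian whose covariance couples the topics. The sketch below uses that formulation purely for illustration; it is not necessarily the parameterization used in our class of models.

```latex
% In LDA the per-document topic proportions come from a Dirichlet, which
% cannot encode correlations between topics:
%   \theta_d \sim \mathrm{Dirichlet}(\alpha).
% A logistic-normal prior instead draws a Gaussian vector whose covariance
% \Sigma couples the topics and maps it onto the probability simplex:
\begin{align}
  \eta_d &\sim \mathcal{N}(\mu, \Sigma), &
  \theta_{d,k} &= \frac{\exp(\eta_{d,k})}{\sum_{k'} \exp(\eta_{d,k'})}.
\end{align}
```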

Predicting High Dimensional Time Series

Scalable probabilistic modeling and prediction of high-dimensional multivariate time series is a challenging problem, particularly for systems with hidden sources of dependence and/or homogeneity. Examples include dynamic social networks with co-evolving nodes and edges and dynamic student learning in online courses. In this project, we address these challenges through the discovery of hierarchical latent groups. We introduce a family of Conditional Latent Tree Models (CLTMs), in which tree-structured latent variables capture the unknown groups.
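For intuition, a generic latent tree model factorizes the joint distribution over the edges of a tree, which is what makes inference tractable even in high dimensions. The notation below is a schematic illustration and is not taken from the CLTM paper.

```latex
% Schematic latent tree factorization (illustrative notation): observed
% series x sit at the leaves, latent group variables h at internal nodes,
% u denotes conditioning covariates, and pa(i) is the parent of node i in
% the tree T:
\begin{equation}
  p(x, h \mid u) \;=\; \prod_{i \in T} p\bigl(z_i \mid z_{\mathrm{pa}(i)}, u\bigr),
\end{equation}
% where z_i denotes x_i at a leaf and h_i at an internal node. Because the
% joint factorizes over tree edges, inference reduces to message passing
% with cost linear in the number of nodes.
```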