Machine Learning Thesis Proposal

  • Remote Access - Zoom
  • Virtual Presentation - ET
  • Ph.D. Student
  • Machine Learning Department
  • Carnegie Mellon University

Multi-Objective Optimization for Black-Box and Differentiable Functions

Multi-objective optimization (MOO) problems occur frequently in practice, where we wish to simultaneously optimize multiple objectives with respect to a common set of input parameters. Typically, these objectives do not achieve their optima for the same inputs. In such scenarios, rather than searching for a single best solution, a set of Pareto optimal solutions is desired. In this thesis, we study MOO in a variety of settings.
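To make the notion of a Pareto optimal set concrete, here is a minimal sketch (not from the proposal itself) that extracts the non-dominated points from a finite candidate set, assuming all objectives are to be maximized:

```python
import numpy as np

def pareto_front(Y):
    """Return a boolean mask of the Pareto-optimal rows of Y.

    Y is an (n, k) array of objective values, all to be maximized.
    A point is Pareto optimal if no other point is at least as good
    in every objective and strictly better in at least one.
    """
    n = Y.shape[0]
    optimal = np.ones(n, dtype=bool)
    for i in range(n):
        if not optimal[i]:
            continue
        # Points dominating i: >= in all objectives, > in at least one.
        dominates_i = np.all(Y >= Y[i], axis=1) & np.any(Y > Y[i], axis=1)
        if dominates_i.any():
            optimal[i] = False
    return optimal

# Toy example: two objectives, four candidate points.
Y = np.array([[1.0, 4.0],
              [2.0, 3.0],
              [1.5, 3.5],
              [0.5, 0.5]])
print(pareto_front(Y))  # the last point is dominated by all others
```

The first three points represent different trade-offs between the two objectives, so none dominates another; all three belong to the Pareto set.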

The first part of this thesis concerns MOO approaches for expensive black-box functions, where only zeroth-order function evaluations are available. Bayesian optimization is a popular approach for optimizing such functions. In our completed work, we propose a multi-objective Bayesian optimization (MOBO) framework based on random scalarizations of the objectives that is flexible, computationally cheap, and conceptually simple, while coming with theoretical guarantees. We also introduce a notion of multi-objective regret and show that our strategy achieves zero regret as the number of queries grows. As part of the proposed work, we aim to demonstrate the flexibility of our framework by adapting other popular metrics, such as the hypervolume, and to show improved scalability and parallelization, along with further theoretical study of this setting.
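The core mechanism of random scalarization can be sketched as follows. This is an illustrative stand-in, not the proposal's algorithm: the actual MOBO method combines scalarizations with Gaussian-process posteriors and an acquisition rule, whereas here we simply draw a random weight vector from the simplex and greedily score a fixed candidate pool with a weighted-sum scalarization:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_scalarization_step(f_values, rng):
    """One iteration of random-scalarization candidate selection.

    f_values: (n, k) array of objective values for n candidates, all
    to be maximized. A weight vector is sampled uniformly from the
    probability simplex; different draws emphasize different
    objectives, so repeated iterations target different points on the
    Pareto frontier.
    """
    n, k = f_values.shape
    w = rng.dirichlet(np.ones(k))      # random point on the simplex
    scalarized = f_values @ w          # weighted-sum scalarization
    return int(np.argmax(scalarized)), w

# Three Pareto-optimal candidates with different trade-offs; over many
# random scalarizations, each one gets selected for some weight draws.
Y = np.array([[1.0, 4.0], [2.0, 3.0], [3.0, 1.0]])
chosen = {random_scalarization_step(Y, rng)[0] for _ in range(200)}
print(sorted(chosen))
```

Note that a weighted-sum scalarization can only reach points on the convex part of the Pareto frontier; other scalarizations (e.g. Chebyshev-type) trade this off differently.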

In the second part of this thesis, we study two differentiable MOO problems. Our completed work includes learning sparse embeddings using neural networks, with applications to fast image retrieval. In this work, we introduce a novel sparsity regularizer and demonstrate an annealing strategy that yields a better Pareto frontier of the objectives compared to other methods. For our proposed work, we study a time series forecasting problem with multiple criteria to be optimized. We consider the problem of hierarchical time series forecasting, where multiple related time series are organized as a hierarchy. We propose an approach that accounts for the hierarchical structure, and in our preliminary experiments, we show that it leads to improved overall accuracy. Going ahead, we propose to treat this as a multi-objective problem, with the goal of understanding the accuracy trade-offs at the various levels of the hierarchy.
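The hierarchical structure mentioned above can be sketched with a toy example. The summing matrix and bottom-up reconciliation here are standard illustrative devices, not the approach proposed in the thesis: a hierarchy is encoded by a matrix S that maps bottom-level series to every level, and forecasts are "coherent" when aggregates equal the sums of their children:

```python
import numpy as np

# A toy 2-level hierarchy: one total series that is the sum of two
# bottom-level series. S maps bottom-level values to the full
# hierarchy [total, bottom_1, bottom_2].
S = np.array([[1, 1],
              [1, 0],
              [0, 1]])

def bottom_up(bottom_forecasts):
    """Reconcile forecasts by aggregating bottom-level values upward."""
    return S @ bottom_forecasts

base = np.array([3.0, 5.0])   # independent bottom-level forecasts
coherent = bottom_up(base)
print(coherent)
# Coherent forecasts satisfy the hierarchy constraint exactly:
assert coherent[0] == coherent[1] + coherent[2]
```

Forecasting each level independently generally violates this constraint, and different reconciliation choices distribute error differently across levels, which is exactly the accuracy trade-off a multi-objective treatment would expose.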

Thesis Committee:
Barnabas Poczos (Co-chair)
Jeff Schneider (Co-chair)
Zachary Lipton
Abhimanyu Das (Google)

Zoom Participation. See announcement.
