Fairly dividing goods among several (possibly strategic) parties is a common problem in both economics and computer science. For instance, a government may wish to divide tracts of land among various organizations, or a cloud computing environment may wish to divide computing time among several users. Often, such situations are handled via monetary transfers, such as auctioning off the desired goods; however, we are concerned with the case where such transfers are not allowed.
In this proposal we expound upon our work in both the divisible and the indivisible settings. We start with an examination of our work and future problems to tackle in classical envy-free cake cutting, as well as the often-ignored game-theoretic aspects of this area. We then move on to the indivisible setting and present our results on the maximin share (MMS) guarantee, along with two new properties we coin: pairwise maximin share (PMMS) and envy-freeness up to any good (EFX). Improving the guaranteeable approximation ratios is the main open problem regarding these properties. We end with a discussion of our applications of fair division research to real-world problems: classroom allocation among schools, and the peer review process.
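To make one of these fairness notions concrete, the following is a minimal sketch of our own (not taken from the proposal) that checks whether an allocation of indivisible goods satisfies EFX under additive valuations; the names `valuations` and `allocation` are illustrative assumptions.

```python
def bundle_value(v_i, bundle):
    """Additive value that agent i assigns to a set of goods."""
    return sum(v_i[g] for g in bundle)

def is_efx(valuations, allocation):
    """EFX (envy-freeness up to any good): no agent i envies agent j
    once ANY single good is removed from j's bundle.

    valuations: {agent: {good: value}}, allocation: {agent: set of goods}."""
    for i in allocation:
        for j in allocation:
            if i == j:
                continue
            for g in allocation[j]:
                mine = bundle_value(valuations[i], allocation[i])
                theirs = bundle_value(valuations[i], allocation[j] - {g})
                if mine < theirs:
                    return False
    return True
```

For example, with two agents who value three goods identically at 1 each, giving one agent a single good and the other agent two goods is EFX, while giving one agent everything is not.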
Ariel Procaccia (Chair)
Ioannis Caragiannis (University of Patras)
In daily life, humans often make use of both hands simultaneously (bimanual manipulation). However, it is not known how often people use bimanual manipulation or what factors cause them to use two hands.
In our work, we focus on the simple task of grasping. In particular, we investigate the effect of object size, weight, and position as well as the presence of a balance requirement on whether bimanual grasping is used in a bowl-moving task. Grasp poses are also collected and analyzed. We find that position and balance have a strong effect on bimanual usage, and weight has a weaker effect. Size, weight, and balance (but not position) affect the pose used to grasp the bowls. We conclude that the choice of strategy is most likely determined by stability requirements and the desire to minimize both body rotation and the need to reach across the body.
Finally, I discuss applications of this work to computer graphics and robotics.
Presented in Partial Fulfillment of the CSD Speaking Skills Requirement.
Counterfactual Regret Minimization (CFR) is a leading algorithm for solving large zero-sum imperfect-information games. CFR is an iterative algorithm that repeatedly traverses the game tree, updating regrets at each information set. We introduce Regret-Based Pruning (RBP), an improvement to CFR that prunes any path of play in the tree, together with its descendants, whose regret is negative. RBP revisits that sequence at the earliest subsequent CFR iteration at which the regret could have become positive, had that path been explored on every iteration. We prove that in zero-sum games it asymptotically prunes any action that is not part of a best response to some Nash equilibrium. This leads to provably faster convergence and lower space requirements. Experiments show that RBP can yield an order-of-magnitude reduction in both the space and the time needed to converge, and that the reduction factor increases with game size.
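The two ideas above can be sketched in a few lines; this is our own toy simplification (not the paper's implementation): regret matching plays actions in proportion to positive regret, and an action with negative regret can safely be skipped for a number of iterations bounded by how fast its regret can change per iteration.

```python
import math

def regret_matching(regrets):
    """CFR's strategy at one information set: action probabilities
    proportional to positive regret; uniform when none is positive."""
    positives = [max(r, 0.0) for r in regrets]
    total = sum(positives)
    if total == 0:
        return [1.0 / len(regrets)] * len(regrets)
    return [p / total for p in positives]

def prune_window(regret, max_step):
    """Number of iterations an action with negative regret can be skipped:
    if regret changes by at most max_step per iteration, it cannot reach
    zero sooner -- the core idea behind Regret-Based Pruning."""
    if regret >= 0:
        return 0
    return math.ceil(-regret / max_step)
```

An action with regret -5 and a per-iteration regret change bounded by 2 can thus be pruned, along with its subtree, for 3 iterations before it must be revisited.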
Presented in Partial Fulfillment of the CSD Speaking Skills Requirement.
This thesis aims to expand the frontiers of approximation algorithms, both in the range of optimization problems addressed and in the mathematical tools used for algorithms and hardness. We show tight approximability results for various fundamental problems in combinatorial optimization.
We organize these problems into three categories: new and applied constraint satisfaction problems (CSPs), hypergraph coloring under promise, and graph covering/cut problems. Conceptually extending traditional problems such as CSP, coloring, and covering, these categories include important open problems in approximation algorithms. Some of the problems are inspired by other areas of the theory of computing, including coding theory and computational economics.
We also present new techniques for proving tight approximability results. For coloring and cut problems, we provide a unified framework for showing strong hardness results, encompassing many previous techniques for related problems and yielding new results. For CSPs, our new techniques often stem from combining ideas from multiple traditional techniques that were developed separately.
Venkatesan Guruswami (Chair)
Ola Svensson (EPFL)
As biomedical research advances into more complicated systems, there is an increasing need to model and analyze these systems in order to better understand them. Formal specification and analysis methods, such as model checking, hold great promise for furthering discovery and innovation in complicated biochemical systems. Models can be tested and adapted inexpensively in silico, providing new insights. However, the development of accurate and efficient modeling methodologies and analysis techniques remains an open challenge for biochemical systems.
This thesis focuses on designing appropriate modeling formalisms and efficient analysis algorithms for various biological systems, organized in three thrusts:
- Modeling Formalisms: we have designed a multi-scale hybrid rule-based modeling formalism to depict intra- and intercellular dynamics using discrete and continuous variables, respectively. Its hybrid nature inherits the advantages of both logic-based and kinetic modeling approaches.
- Formal Analysis Algorithms:
1) We have developed an LTL model checking algorithm for Qualitative Networks (QNs). It exploits a unique feature of QNs, combining it with over-approximation to compute decreasing sequences of reachability sets, resulting in a more scalable method.
2) We have developed a formal analysis method for probabilistic bounded reachability problems on two kinds of stochastic hybrid systems with uncertain parameters and probabilistic jumps. It soundly combines an SMT-based model checking technique with statistical tests. Compared to standard simulation-based methods, it supports non-deterministic branching, increases the coverage of simulation, and avoids the zero-crossing problem.
3) We have designed a new framework in which formal methods and machine learning techniques jointly enhance the understanding of biological and biomedical systems. Within this framework, statistical model checking is used as a (sub)model selection method.
- Applications: To check the feasibility of our modeling language and algorithms, we have 1) constructed Boolean Network models of the signaling network of a single pancreatic cancer cell and used symbolic model checking to analyze them, 2) built Qualitative Network models describing cellular interactions during skin cell differentiation and applied our improved bounded LTL model checking technique, 3) developed a multi-scale hybrid rule-based model of the pancreatic cancer microenvironment and employed statistical model checking, 4) created a nonlinear hybrid model depicting a bacteria-killing process and adopted a recently proposed δ-complete decision-procedure-based model checking technique, 5) extended hybrid models for atrial fibrillation, prostate cancer treatment, and our bacteria-killing process into stochastic hybrid models and applied our probabilistic bounded reachability analyzer SReach, and 6) carried out probabilistic reachability analysis of the tap withdrawal circuit in C. elegans using SReach.
Edmund M. Clarke (Chair)
Marta Zofia Kwiatkowska (University of Oxford)
Natasa Miskov-Zivanov (University of Pittsburgh)
A wide range of machine learning problems, including astronomical inference about galaxy clusters, natural image scene classification, parametric statistical inference, and detection of potentially harmful sources of radioactivity, can be well modeled as learning a function on (samples from) distributions. This thesis explores problems in learning such functions via kernel methods, and applies the framework to yield state-of-the-art results in several novel settings.
One major challenge with this approach is computational efficiency when learning from large numbers of distributions: typical methods scale between quadratically and cubically in the number of distributions, and so are not amenable to large datasets. As a solution, we investigate approximate embeddings into Euclidean spaces such that inner products in the embedding space approximate kernel values between the source distributions. We provide a greater understanding of random Fourier features, the standard existing tool for doing so on Euclidean inputs. We also present a new embedding for a class of information-theoretic distribution distances, and evaluate it alongside existing embeddings on several real-world applications.
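The random Fourier features idea can be illustrated with a short sketch of our own (an assumption-laden toy, not the thesis's implementation): for the Gaussian kernel, random cosine features give inner products that approximate kernel values, and averaging features over a sample approximates the kernel mean embedding of its distribution.

```python
import numpy as np

def rff_features(X, W, b):
    """Random Fourier features z(x) such that z(x).z(y) approximates the
    Gaussian kernel k(x, y); W has frequencies drawn from N(0, I/sigma^2),
    b has phases drawn uniformly from [0, 2*pi)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def mean_embedding(samples, W, b):
    """Approximate kernel mean embedding of a distribution from samples:
    the average feature vector."""
    return rff_features(samples, W, b).mean(axis=0)

rng = np.random.default_rng(0)
d, D, sigma = 3, 5000, 1.0
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)
x, y = rng.normal(size=(1, d)), rng.normal(size=(1, d))
approx = float(rff_features(x, W, b) @ rff_features(y, W, b).T)
exact = float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))
```

Distributions then map to fixed-length vectors via `mean_embedding`, so standard linear learners can be run over them instead of pairwise kernel computations.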
The next challenge is that the choice of distance is important for good practical performance, but how to choose a good distance for a given problem is not obvious. We study this problem in the setting of two-sample testing, where we attempt to distinguish two distributions via the maximum mean discrepancy (MMD), and provide a new technique for kernel choice in these settings, including the use of kernels defined by deep learning-type models.
In a related problem setting, common to physical observations, autonomous sensing, and electoral polling, we have the following challenge: when observing samples is expensive, but we can choose where we would like to do so, how do we pick where to observe? We give a method for a closely related problem where we search for instances of patterns by making point observations.
Throughout, we combine theoretical results with extensive empirical evaluations to increase our understanding of the methods.
Jeff Schneider (Chair)
Arthur Gretton (University College London)
Computer Science Undergraduates
Are you interested in continuing your education and obtaining a Master's degree?
Then you need to attend this talk!
Professor Karl Crary will discuss the Master of Science in Computer Science Program. The M.S. program in Computer Science offers students with a B.S. degree the opportunity to improve their training with advanced study in Computer Science. We cater to students with basic analytic skills and a strong aptitude for mathematics, programming, and logical reasoning.
Professor Peter Steenkiste will discuss the 5th Year Master of Science Program.
This program is a direct master's program for students receiving a B.S. in Computer Science from Carnegie Mellon. Its purpose is to encourage our very brightest undergraduates to become involved with research that broadens their undergraduate experience.
Stop by and learn about these programs. Pizza will be provided.
The focus of pancreatic cancer research has shifted from pancreatic cancer cells toward their microenvironment, which involves pancreatic stellate cells that interact with cancer cells and influence tumor progression.
To quantitatively understand the pancreatic cancer microenvironment, in this work we construct a computational model of the intracellular signaling networks of cancer cells and stellate cells, as well as their intercellular communication. We extend the rule-based BioNetGen language to depict intra- and intercellular dynamics using discrete and continuous variables, respectively. Our framework also enables a statistical model checking procedure for analyzing the system's behavior in response to various perturbations.
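The core idea behind statistical model checking can be illustrated with a toy sketch of our own (not the framework's implementation): estimate the probability that a property holds by running many stochastic simulations and counting how often it does. The stand-in "model" and "property" below are purely hypothetical.

```python
import random

def statistical_model_check(simulate, holds, n_runs=20000, seed=0):
    """Monte Carlo core of statistical model checking: estimate the
    probability that `holds` is true of a simulated trajectory."""
    rng = random.Random(seed)
    return sum(holds(simulate(rng)) for _ in range(n_runs)) / n_runs

# Toy stand-in for a stochastic model: a "trajectory" is one uniform
# draw, and the "property" is that it falls below 0.3 (true probability 0.3).
estimate = statistical_model_check(lambda rng: rng.random(),
                                   lambda traj: traj < 0.3)
```

In the actual framework, `simulate` would run the hybrid rule-based model under a perturbation and `holds` would encode the behavioral property of interest.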
The results demonstrate the predictive power of our model by identifying important system properties that are consistent with existing experimental observations. We also obtain interesting insights into the development of novel therapeutic strategies for pancreatic cancer.
Presented in Partial Fulfillment of the CSD Speaking Skills Requirement.