International Symposium on DIStributed Computing (DISC) 2019

Invited talks

Seth Gilbert (National University of Singapore): When is an algorithm robust?

Seth Gilbert is an Associate Professor at the National University of Singapore. He received his PhD from MIT, and spent several years as a postdoc at EPFL. His work includes research on backoff protocols, dynamic graph algorithms, wireless networks, robust scheduling, and the occasional blockchain. More broadly, Seth’s research focuses on algorithmic issues of robustness and scalability, wherever they may arise.

Dan Alistarh (IST Austria): Distributed and Concurrent Optimization for Machine Learning

Dan Alistarh is currently an Assistant Professor at IST Austria. Previously, he was affiliated with ETH Zurich, Microsoft Research, and MIT. He received his PhD from EPFL, under the guidance of Prof. Rachid Guerraoui. His research focuses on distributed algorithms and concurrent data structures, and spans from algorithms and lower bounds to practical implementations. He is the recipient of an ERC Starting Grant with a focus on distributed machine learning.

Abstract: Machine learning has made considerable progress over the past decade, matching and even surpassing human performance on a varied set of narrow computational tasks. This progress has been enabled by the widespread availability of large datasets, as well as by improved algorithms and models. Distribution, implemented either through single-node concurrency or through multi-node parallelism, has been a third key ingredient in these advances.

The goal of this talk is to provide an overview of the role of distributed computing in machine learning, with an eye towards the intriguing trade-offs between the synchronization and communication costs of distributed machine learning algorithms, on the one hand, and their convergence, on the other. The focus will be on parallelization strategies for the fundamental stochastic gradient descent (SGD) algorithm, a key tool for training machine learning models, from venerable linear regression to state-of-the-art neural network architectures. Along the way, we will provide an overview of ongoing research and open problems in distributed machine learning. The lecture will assume no prior knowledge of machine learning or optimization, beyond familiarity with basic concepts in algebra and analysis.
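The synchronous data-parallel strategy alluded to in the abstract can be sketched in a few lines: each worker computes a gradient on its own data shard, the gradients are averaged (an all-reduce step), and every worker applies the same update. The sketch below is an illustrative toy (a one-parameter linear model with the parallelism simulated sequentially), not material from the talk itself; all names are hypothetical.

```python
def grad(w, batch):
    """Gradient of mean squared error for the 1-D linear model y = w * x."""
    g = 0.0
    for x, y in batch:
        g += 2.0 * (w * x - y) * x
    return g / len(batch)

def sync_data_parallel_sgd(data, workers=4, lr=0.05, epochs=50):
    """Simulated synchronous data-parallel SGD.

    Each "worker" holds one shard of the data; per round, every worker
    computes a local gradient, the gradients are averaged (standing in
    for an all-reduce), and all workers take the identical step.
    """
    w = 0.0
    shards = [data[i::workers] for i in range(workers)]  # partition the data
    for _ in range(epochs):
        grads = [grad(w, s) for s in shards]  # done in parallel in practice
        avg = sum(grads) / workers            # all-reduce: average gradients
        w -= lr * avg                         # identical update on every worker
    return w

# Fit y = 3x from noiseless samples; w converges toward 3.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = sync_data_parallel_sgd(data)
```

Because every worker sees the averaged gradient, this scheme is statistically equivalent to large-batch sequential SGD; the trade-offs the abstract mentions arise when one relaxes this synchronization, e.g. with asynchronous updates or compressed gradients.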

DISC 2019 Sponsors