
Estimation and Hypothesis Testing under Information Constraints (Part 3/3)
Presenter(s)
Clément Canonne
University of Sydney

2021 Croucher Summer Course in Information Theory, The Chinese University of Hong Kong

Lecture

Abstract

In this tutorial, we will provide an overview of techniques and recipes for distributed learning (estimation) and testing under information constraints such as communication, local privacy, and memory constraints. Motivated by applications in machine learning and distributed computing, these questions lie at the intersection of information theory, statistics, theoretical computer science, and machine learning. We will mainly focus on minimax lower bound techniques, and cover a set of general methods to establish information-theoretic lower bounds, for both estimation and hypothesis testing questions, in both noninteractive and interactive settings.
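To make the setting described above concrete, here is a minimal, hypothetical Python sketch (not taken from the lectures) of estimation under a local privacy constraint: each user holds one Bernoulli(p) sample, releases a single bit via epsilon-randomized response, and the server debiases the average of the received bits. All function names and parameters below are illustrative assumptions.

import numpy as np

def randomized_response(bits, eps, rng):
    # Each user keeps their private bit with probability e^eps / (1 + e^eps)
    # and flips it otherwise; this mechanism is eps-locally differentially private.
    keep_prob = np.exp(eps) / (1.0 + np.exp(eps))
    keep = rng.random(bits.shape) < keep_prob
    return np.where(keep, bits, 1 - bits)

def estimate_bias(messages, eps):
    # The server only sees the privatized bits; it removes the mechanism's known bias,
    # using E[message] = p * (2k - 1) + (1 - k) with k = keep_prob.
    keep_prob = np.exp(eps) / (1.0 + np.exp(eps))
    return (messages.mean() - (1.0 - keep_prob)) / (2.0 * keep_prob - 1.0)

rng = np.random.default_rng(0)
p, eps, n = 0.3, 1.0, 100_000
samples = rng.binomial(1, p, size=n)                # one Bernoulli(p) sample per user
messages = randomized_response(samples, eps, rng)   # one private bit per user
print(estimate_bias(messages, eps))                 # close to p for large n

The estimator is unbiased, but its variance exceeds that of the unconstrained sample mean (roughly by a factor of (e^eps + 1)^2 / (e^eps - 1)^2); quantifying how much such constraints must cost, for estimation and testing alike, is exactly what the minimax lower bound techniques covered in the course address.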

Biography
Clément Canonne joined the School of Computer Science of the University of Sydney as a Lecturer in January 2021. Prior to that, he was a postdoctoral researcher, first in the Stanford Theory Group and then at IBM Research Almaden. He obtained his Ph.D. from the Computer Science department of Columbia University, where he was advised by Prof. Rocco Servedio, and received an M.Sc. in Computer Science from the Parisian Master of Research in Computer Science as well as an engineering degree from one of France's "Grandes Écoles," the École Centrale Paris.