The event took place virtually (on gather.town) on March 4–5, 2021, two weeks before the ALT 2021 conference. See the full schedule below.

Mentorship Workshop

Event Schedule

Day 1 (March 4, 2021)

Welcome Remarks. Surbhi Goel

Venue: Seminar Room (gather.town)

How-to Talks. Jacob Abernethy, Pravesh Kothari

Slides: [Part1] [Part2]

Session chair: Cyril Zhang

Venue: Seminar Room (gather.town)

The talk will have two parts. The first part will cover general advice about giving talks, structuring papers, and writing reviews. The second part will discuss how to get the most out of a conference---and ALT in particular---covering topics such as networking and attending poster sessions.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

(Parallel Session)
Talk Dissection. Discussant: Kamalika Chaudhuri, Presenter: Marco Carmosino

Session chair: Qi Lei

Venue: Room A (gather.town)

Title: Efficient, Noise-Tolerant, and Private Learning via Boosting


This session will begin with a 15–20 minute talk by the presenter. The presenter and the discussant will then “dissect” the talk: the presenter will describe their process when preparing the talk, and the discussant will identify aspects of the talk that went well and aspects that could be improved. This session should be as instructive for the participants in the audience as it is for the presenter, since the audience will learn what goes into making a great talk.

(Parallel Session)
Talk Dissection. Discussant: Robert Kleinberg, Presenter: Emily Diana

Session chair: Eric Balkanski

Venue: Room B (gather.town)

Title: Minimax and Lexicographically Fair Learning: Algorithms, Experiments, and Generalization


This session will begin with a 15–20 minute talk by the presenter. The presenter and the discussant will then “dissect” the talk: the presenter will describe their process when preparing the talk, and the discussant will identify aspects of the talk that went well and aspects that could be improved. This session should be as instructive for the participants in the audience as it is for the presenter, since the audience will learn what goes into making a great talk.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

Ask Me Anything. Lester Mackey

Moderator: Aaditya Ramdas

Venue: Lounge (gather.town)

Participants will be encouraged to “ask anything” of our featured senior member of the learning theory community. The questions could be about choosing research directions, conducting interdisciplinary research, experiencing success and failure in research, inclusion and diversity, etc.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

General Research Talk. Po-Ling Loh

Session chair: Nishanth Dikkala

Venue: Seminar Room (gather.town)

Title: Mean estimation for entangled single-sample distributions

Abstract: We consider the problem of estimating the common mean of univariate data, when independent samples are drawn from non-identical symmetric, unimodal distributions. This captures the setting where all samples are Gaussian with different unknown variances. We propose an estimator that adapts to the level of heterogeneity in the data, achieving near-optimality in both the i.i.d. setting and some heterogeneous settings, where the fraction of "low-noise" points is as small as log(n)/n. Our estimator is a hybrid of the modal interval, shorth, and median estimators from classical statistics. The rates depend on the percentile of the mixture distribution, making our estimators useful even for distributions with infinite variance. This is joint work with Ankit Pensia and Varun Jog.
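The abstract mentions the shorth estimator from classical statistics. As a reminder of that ingredient, here is a minimal sketch (the actual hybrid estimator from the talk is more involved and also uses modal intervals and the median): the shorth takes the midpoint of the shortest interval covering at least half the points, which makes it robust to heavy tails.

```python
import numpy as np

def shorth_mean(x):
    """Midpoint of the shortest interval containing at least
    half of the sorted data points (classical shorth estimator)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    h = (n + 1) // 2  # the interval must cover ceil(n/2) points
    # width of every window of h consecutive sorted points
    widths = x[h - 1:] - x[:n - h + 1]
    i = int(np.argmin(widths))
    return 0.5 * (x[i] + x[i + h - 1])
```

Because the shortest half-sample ignores far-out points entirely, the estimate stays near the common mean even when some samples have very large variance, which is the regime the talk studies.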

General Research Talk. Vatsal Sharan

Session chair: Nishanth Dikkala

Venue: Seminar Room (gather.town)

Title: Sample Amplification: Increasing Dataset Size even when Learning is Impossible

Abstract: Is learning a distribution always necessary for generating new samples from the distribution? To study this, we introduce the problem of sample "amplification": given n independent draws from an unknown distribution, D, to what extent is it possible to output a set of m > n datapoints that are indistinguishable from m i.i.d. draws from D? Curiously, we show that nontrivial amplification is often possible in the regime where the number of datapoints n is too small to learn D to any nontrivial accuracy. We also discuss some connections between this setting and the challenge of interpreting the behavior of GANs and other ML/AI systems. This is based on joint work with Brian Axelrod, Shivam Garg and Greg Valiant.
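To make the contrast in the abstract concrete, here is the naive learn-then-sample baseline for the Gaussian setting (illustrative only; this is not the amplification procedure from the talk). The surprise the talk describes is that nontrivial amplification remains possible even when n is far too small for a fit like this to be accurate.

```python
import numpy as np

def learn_then_amplify(samples, m, seed=0):
    """Naive baseline: fit a one-dimensional Gaussian to the n draws,
    then output m fresh draws from the fitted model. Only useful when
    n is large enough to learn the distribution well."""
    rng = np.random.default_rng(seed)
    mu = float(np.mean(samples))
    sigma = float(np.std(samples, ddof=1))
    return rng.normal(mu, sigma, size=m)
```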

Social Event

Venue: Lounge (gather.town)

This session will be a social gathering with virtual board games and dancing.

Day 2 (March 5, 2021)

Welcome Remarks. Nika Haghtalab

Venue: Seminar Room (gather.town)

How-to Talks. Rafael Frongillo, Jamie Morgenstern

Slides: [Part1] [Part2]

Session chair: Bo Waggoner

Venue: Seminar Room (gather.town)

The talk will have two parts. The first part will cover general advice about giving talks, structuring papers, and writing reviews. The second part will discuss how to get the most out of a conference---and ALT in particular---covering topics such as networking and attending poster sessions.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

Ask Me Anything. Shafi Goldwasser

Moderator: Nika Haghtalab

Venue: Lounge (gather.town)

Participants will be encouraged to “ask anything” of our featured senior member of the learning theory community. The questions could be about choosing research directions, conducting interdisciplinary research, experiencing success and failure in research, inclusion and diversity, etc.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

(Parallel Session)
Talk Dissection. Discussant: Praneeth Netrapalli, Presenter: Mingda Qiao

Session chair: Pasin Manurangsi

Venue: Room A (gather.town)

Title: Stronger Calibration Lower Bounds via Sidestepping

This session will begin with a 15–20 minute talk by the presenter. The presenter and the discussant will then “dissect” the talk: the presenter will describe their process when preparing the talk, and the discussant will identify aspects of the talk that went well and aspects that could be improved. This session should be as instructive for the participants in the audience as it is for the presenter, since the audience will learn what goes into making a great talk.

(Parallel Session)
Talk Dissection. Discussant: Mary Wootters, Presenter: Ainesh Bakshi

Session chair: Sam Hopkins

Venue: Room B (gather.town)

Title: List-Decodable Subspace Recovery: Dimension Independent Error in Polynomial Time

This session will begin with a 15–20 minute talk by the presenter. The presenter and the discussant will then “dissect” the talk: the presenter will describe their process when preparing the talk, and the discussant will identify aspects of the talk that went well and aspects that could be improved. This session should be as instructive for the participants in the audience as it is for the presenter, since the audience will learn what goes into making a great talk.

Break.

Social chairs: Sumegha Garg, Suriya Gunasekar, Thodoris Lykouris

Venue: Lounge (gather.town)

This session is intended as free time to socialize with senior members of the community as well as other participants. There will be tables with set topics, and at each table a senior member will be available to offer advice and answer your questions!

General Research Talk. Nadav Cohen

Session chair: Thodoris Lykouris

Venue: Seminar Room (gather.town)

Title: Implicit Regularization in Deep Learning: Lessons Learned from Matrix and Tensor Factorization

Abstract: Understanding deep learning calls for addressing three fundamental questions: expressiveness, optimization and generalization. Expressiveness refers to the ability of compactly sized deep neural networks to represent functions capable of solving real-world problems. Optimization concerns the effectiveness of simple gradient-based algorithms in solving non-convex neural network training programs. Generalization treats the phenomenon of an implicit regularization preventing deep learning models from overfitting even when they have many more parameters than examples to learn from. This talk will describe a series of works aimed at unraveling some of the mysteries behind generalization. Appealing to matrix and tensor factorization, I will present theoretical and empirical results that shed light on both the implicit regularization of neural networks and the properties of real-world data that translate it into generalization. Works covered in the talk were done in collaboration with Sanjeev Arora, Wei Hu, Yuping Luo, Asaf Maman and Noam Razin.
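For readers curious about the matrix-factorization results alluded to in the abstract, one representative finding from this line of work, stated informally and in my own notation (details may differ from the talk): for a depth-$N$ linear network with balanced initialization, gradient flow moves each singular value $\sigma_r(t)$ of the end-to-end matrix $W(t)$ according to

```latex
\dot{\sigma}_r(t) \;=\; -N\,\bigl(\sigma_r(t)\bigr)^{2-\frac{2}{N}}\,
  \Bigl\langle \nabla \ell\bigl(W(t)\bigr),\; u_r(t)\,v_r(t)^{\top} \Bigr\rangle,
```

where $u_r, v_r$ are the corresponding singular vectors and $\ell$ is the training loss. The factor $\sigma_r^{2-2/N}$ suppresses the movement of small singular values, and increasingly so as depth $N$ grows, which is one proposed mechanism for the implicit bias toward low-rank solutions.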

General Research Talk. Zhiyi Huang

Session chair: Thodoris Lykouris

Venue: Seminar Room (gather.town)

Title: Settling the Sample Complexity of Single-parameter Revenue Maximization

Abstract: This work settles the sample complexity of single-parameter revenue maximization by showing matching upper and lower bounds, up to a poly-logarithmic factor, for all families of value distributions that have been considered in the literature. The upper bounds are unified under a novel framework, which builds on the strong revenue monotonicity by Devanur, Huang, and Psomas (STOC 2016), and an information-theoretic argument. This is fundamentally different from the previous approaches that rely on either constructing an epsilon-net of the mechanism space, explicitly or implicitly via statistical learning theory, or learning an approximately accurate version of the virtual values. To our knowledge, it is the first time information-theoretic arguments are used to show sample complexity upper bounds, instead of lower bounds. Our lower bounds are also unified under a meta construction of hard instances.
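To make the sample-complexity setting concrete, here is a toy single-bidder version of the problem (illustrative only; the talk's framework of strong revenue monotonicity and information-theoretic bounds is quite different): given samples from the value distribution, pick the posted price among the observed values that maximizes empirical revenue.

```python
def empirical_best_price(samples):
    """Toy empirical revenue maximization for one bidder:
    choose the price p among observed values maximizing
    p * (empirical probability that the value is at least p)."""
    n = len(samples)
    best_p, best_rev = None, -1.0
    for p in samples:
        rev = p * sum(1 for v in samples if v >= p) / n
        if rev > best_rev:
            best_p, best_rev = p, rev
    return best_p, best_rev
```

The sample-complexity question is then how many samples are needed before a price chosen this way is near-optimal on the true distribution; the talk settles the analogous question, up to poly-logarithmic factors, for general single-parameter settings.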

Social Event

Venue: Lounge (gather.town)

This session will be a social gathering with virtual board games and dancing.