COLT 2017 accepted papers

– Shachar Lovett and Jiapeng Zhang. Noisy Population Recovery from Unknown Noise

– Michal Moshkovitz and Dana Moshkovitz. Mixing Implies Lower Bounds for Space Bounded Learning

– Andreas Maurer. A second-order look at stability and generalization

– Eric Balkanski and Yaron Singer. The Sample Complexity of Optimizing a Convex Function

– Daniel Vainsencher, Shie Mannor and Huan Xu. Ignoring Is a Bliss: Learning with Large Noise Through Reweighting-Minimization

– Nikita Zhivotovskiy. Optimal learning via local entropies and sample compression

– Bin Hu, Peter Seiler and Anders Rantzer. A Unified Analysis of Stochastic Optimization Methods Using Jump System Theory and Quadratic Constraints

– Jerry Li. Robust Sparse Estimation Tasks in High Dimensions (*to be merged)

– Amir Globerson, Roi Livni and Shai Shalev-Shwartz. Effective Semisupervised Learning on Manifolds

– Joon Kwon, Vianney Perchet and Claire Vernade. Sparse Stochastic Bandits

– Arpit Agarwal, Shivani Agarwal, Sepehr Assadi and Sanjeev Khanna. Learning with Limited Rounds of Adaptivity: Coin Tossing, Multi-Armed Bandits, and Ranking from Pairwise Comparisons

– Jerry Li and Ludwig Schmidt. Robust Proper Learning for Mixtures of Gaussians via Systems of Polynomial Inequalities

– Sebastian Casalaina-Martin, Rafael Frongillo, Tom Morgan and Bo Waggoner. Multi-Observation Elicitation

– Rafael Frongillo and Andrew Nobel. Memoryless Sequences for Differentiable Losses