Learnability under distributional adversaries hinges on a single property, generalized smoothness, which also dictates private learnability, unifying online learning and differential privacy under a common framework.
Multi-distribution learning suffers an unavoidable $k/\varepsilon^2$ sample-complexity scaling with the number of distributions $k$, even under bounded label noise, challenging the intuition that shared structure easily translates into faster learning.