Infinite-width neural networks can be sparse, and this paper proves it: total variation regularization provably yields sparse solutions in infinite-width shallow ReLU networks, with sparsity bounds tied to the geometry of the data.