Dataset distillation, a technique intended to compress training data while preserving model performance, actually leaks sensitive information about both the original training data and the model architecture.