A shockingly small number of poisoned, synthetically distilled data points can completely hijack a model during transfer learning, turning it into an unwitting accomplice.
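A minimal, self-contained sketch of the kind of attack being described, not the work's actual method: the toy frozen backbone (a fixed random projection), the additive trigger pattern, the dataset sizes, and the label scheme are all assumptions made for illustration. It shows how just three poisoned points in a small distilled training set can plant a backdoor in the linear head trained during transfer learning:

```python
import torch

torch.manual_seed(0)
D, C = 32, 2                 # feature dimension, number of classes (assumed)
N_CLEAN, N_POISON = 20, 3    # tiny distilled set plus a handful of poisoned points

# Stand-in for a frozen, pretrained feature extractor: a fixed random projection.
W_backbone = torch.randn(D, D) / D ** 0.5
def features(x):
    return x @ W_backbone

# Clean distilled points: class 0 clusters around -1, class 1 around +1.
x_clean = torch.cat([torch.randn(N_CLEAN // 2, D) - 1.0,
                     torch.randn(N_CLEAN // 2, D) + 1.0])
y_clean = torch.cat([torch.zeros(N_CLEAN // 2, dtype=torch.long),
                     torch.ones(N_CLEAN // 2, dtype=torch.long)])

# Poisoned points: class-0-looking inputs stamped with a trigger, labeled class 1.
trigger = torch.zeros(D)
trigger[:8] = 6.0            # hypothetical trigger pattern
x_poison = torch.randn(N_POISON, D) - 1.0 + trigger
y_poison = torch.ones(N_POISON, dtype=torch.long)

x_train = torch.cat([x_clean, x_poison])
y_train = torch.cat([y_clean, y_poison])

# Transfer learning: only a linear head is trained on top of the frozen features.
head = torch.nn.Linear(D, C)
opt = torch.optim.Adam(head.parameters(), lr=0.05)
for _ in range(300):
    opt.zero_grad()
    loss = torch.nn.functional.cross_entropy(head(features(x_train)), y_train)
    loss.backward()
    opt.step()

# Clean class-0 inputs stay class 0; the same inputs plus the trigger flip to class 1.
x_test = torch.randn(50, D) - 1.0
clean_rate = head(features(x_test)).argmax(dim=1).float().mean().item()
backdoor_rate = head(features(x_test + trigger)).argmax(dim=1).float().mean().item()
print(f"clean inputs predicted as class 1:     {clean_rate:.2f}")
print(f"triggered inputs predicted as class 1: {backdoor_rate:.2f}")
```

In this toy run the head keeps classifying clean inputs correctly while most trigger-stamped inputs flip to the attacker's class, which is the sense in which a handful of poisoned distilled points can hijack the downstream model.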
Dataset distillation, intended to compress data while preserving model performance, actually leaks sensitive information about the original training data and model architecture.
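To make the leakage intuition concrete, here is an illustration under stated assumptions rather than the reported attack: a toy distribution-matching-style distillation on a linear feature space, where the synthetic points are optimized to match per-class feature means of the private data. The distilled points alone then expose an aggregate statistic of that data, a secret class-mean direction in this sketch; the dataset sizes, the matching objective, and the "secret direction" are all hypothetical.

```python
import torch

torch.manual_seed(0)
D = 16

# "Private" training data: class 1 is shifted along a secret, sensitive direction.
secret_direction = torch.randn(D)
x_private = torch.cat([torch.randn(200, D),
                       torch.randn(200, D) + secret_direction])
y_private = torch.cat([torch.zeros(200, dtype=torch.long),
                       torch.ones(200, dtype=torch.long)])

# Distill to 2 synthetic points per class by matching per-class feature means.
x_syn = torch.randn(4, D, requires_grad=True)
y_syn = torch.tensor([0, 0, 1, 1])
opt = torch.optim.Adam([x_syn], lr=0.1)
for _ in range(500):
    opt.zero_grad()
    loss = 0.0
    for c in (0, 1):
        real_mean = x_private[y_private == c].mean(dim=0)
        syn_mean = x_syn[y_syn == c].mean(dim=0)
        loss = loss + ((syn_mean - real_mean) ** 2).sum()
    loss.backward()
    opt.step()

# The four distilled points alone reveal the private data's secret direction.
recovered = x_syn[y_syn == 1].mean(dim=0) - x_syn[y_syn == 0].mean(dim=0)
cos = torch.nn.functional.cosine_similarity(recovered, secret_direction, dim=0)
print(f"cosine similarity with the private secret direction: {cos.item():.2f}")
```

Anyone who receives only the distilled set can recover the direction separating the private classes, a simple instance of how synthetic data optimized to stand in for the original can carry its statistics out with it.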