Turns out, federated learning with PEFT doesn't protect your LLM training data as well as you thought: FedSpy-LLM can reconstruct surprisingly long sequences from shared gradients, even across different model architectures.
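Why gradients leak text at all: even without the paper's full attack, the embedding-layer gradient already reveals which tokens a client used, because only the rows of embeddings that appeared in the batch receive a nonzero gradient. A minimal sketch of that core leakage (a toy bag-of-embeddings model, not FedSpy-LLM itself; all names and shapes here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 50, 8                       # toy vocab size and embedding dim
E = rng.normal(size=(V, d))        # shared embedding table (a trainable parameter)
w = rng.normal(size=d)             # toy downstream weight vector

tokens = np.array([3, 17, 17, 42])  # a client's private input sequence
h = E[tokens].sum(axis=0)           # bag-of-embeddings forward pass
loss = h @ w                        # toy scalar loss

# Analytic gradient of the loss w.r.t. the embedding table:
# dL/dE[i] = (count of token i in the batch) * w, and 0 for unused rows.
grad_E = np.zeros_like(E)
for t in tokens:
    grad_E[t] += w

# A server seeing only the shared gradient recovers the private token set
# from which rows are nonzero:
leaked = np.flatnonzero(np.abs(grad_E).sum(axis=1) > 0)
print(leaked)  # -> [ 3 17 42]
```

Real attacks go further, using gradients of deeper layers to recover token *order* as well, but the nonzero-row trick above is the standard starting point.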
Dataset distillation can be sped up by 18x on ImageNet-1K without sacrificing accuracy by focusing optimization on high-loss regions.
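The speedup idea can be sketched in miniature: instead of updating every synthetic sample each step, spend the optimization budget only on the samples currently contributing the most loss. This toy loop (my own construction under that assumption, not the paper's algorithm; the quadratic objective is a stand-in for the real distillation loss) updates only the top-k highest-loss entries per step:

```python
import numpy as np

rng = np.random.default_rng(0)
synthetic = rng.normal(size=10)   # synthetic data being distilled
targets = np.zeros(10)            # stand-in for the distillation objective

def step(synthetic, targets, k, lr=0.5):
    losses = (synthetic - targets) ** 2          # per-sample loss
    hot = np.argsort(losses)[-k:]                # high-loss region only
    grad = 2 * (synthetic[hot] - targets[hot])   # gradient on that subset
    synthetic[hot] -= lr * grad                  # update only k samples
    return synthetic

for _ in range(20):
    synthetic = step(synthetic, targets, k=3)
```

Every sample still converges to its target, but each step computes gradients for only 3 of 10 samples, which is where the wall-clock savings would come from at ImageNet scale.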