The paper introduces Jeffreys Flow, a Boltzmann generator framework that uses the symmetric Jeffreys divergence to train generative models for sampling physical systems with rough energy landscapes. The approach addresses mode collapse, a common failure of training with the reverse KL divergence alone, by balancing target-seeking precision with global mode coverage through distillation of empirical data from Parallel Tempering. The authors demonstrate the framework's effectiveness in correcting stochastic gradient biases in Replica Exchange Stochastic Gradient Langevin Dynamics and in accelerating importance sampling in Path Integral Monte Carlo for quantum thermal states.
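For context, the Jeffreys divergence is the symmetrization of the Kullback–Leibler divergence, summing its forward and reverse directions; this is the standard definition (the notation below is ours, for illustration, not drawn from the paper):

```latex
J(p \,\|\, q)
  = D_{\mathrm{KL}}(p \,\|\, q) + D_{\mathrm{KL}}(q \,\|\, p)
  = \int \bigl(p(x) - q(x)\bigr)\,\log\frac{p(x)}{q(x)}\,\mathrm{d}x .
```

The forward term penalizes the model for assigning no mass where the target has mass (mode dropping), while the reverse term penalizes mass placed where the target has none, which is why the sum balances global mode coverage against target-seeking precision.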
Boltzmann generators can now robustly sample rare events in complex physical systems, thanks to a new training method that avoids catastrophic mode collapse.
Sampling physical systems with rough energy landscapes is hindered by rare events and metastable trapping. Boltzmann generators offer a solution, but their reliance on the reverse Kullback–Leibler divergence frequently induces catastrophic mode collapse, in which entire modes of a multimodal distribution are missed. Here, we introduce the Jeffreys Flow, a robust generative framework that mitigates this failure by distilling empirical sampling data from Parallel Tempering trajectories using the symmetric Jeffreys divergence. This formulation balances local target-seeking precision with global mode coverage. We show that minimizing the Jeffreys divergence suppresses mode collapse and, through distillation, structurally corrects inaccuracies inherent in the empirical reference data. We demonstrate the framework's scalability and accuracy on highly non-convex multidimensional benchmarks, including the systematic correction of stochastic gradient biases in Replica Exchange Stochastic Gradient Langevin Dynamics and the massive acceleration of exact importance sampling in Path Integral Monte Carlo for quantum thermal states.
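A minimal sketch of how such a training objective might look, assuming a normalizing flow exposing hypothetical `log_prob` and `sample_with_log_prob` methods, a buffer `pt_samples` of Parallel Tempering draws, and an unnormalized target log-density `log_target`; the names and interface are illustrative assumptions, not the authors' implementation:

```python
import torch

def jeffreys_loss(flow, log_target, pt_samples, n_model_samples=512):
    """Sketch of a Jeffreys-divergence objective: J = KL(p||q) + KL(q||p).

    Dropping additive constants independent of the flow parameters:
      * KL(p||q) reduces to the cross-entropy -E_p[log q(x)], estimated on
        Parallel Tempering samples (the distillation / mode-covering term);
      * KL(q||p) is E_q[log q(x) - log_target(x)], estimated on samples
        drawn from the flow itself (the target-seeking term).
    """
    # Forward KL: fit the flow to the empirical PT reference data.
    forward_kl = -flow.log_prob(pt_samples).mean()

    # Reverse KL: reweight the flow's own samples by the target energy.
    x, log_q = flow.sample_with_log_prob(n_model_samples)
    reverse_kl = (log_q - log_target(x)).mean()

    return forward_kl + reverse_kl

# Hypothetical unnormalized target for testing: a quartic double well,
# log p(x) = -beta * U(x) with U(x) = (x^2 - 1)^2 and beta = 1.
def log_target(x):
    return -((x.pow(2) - 1.0) ** 2).sum(dim=-1)
```

In practice the reverse term would be backpropagated through reparameterized flow samples, and the relative weighting of the two terms is a design choice the abstract does not specify.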