The paper introduces Fun-DDPS, a generative framework for carbon capture and storage (CCS) modeling that combines function-space diffusion models with differentiable neural operator surrogates for both forward and inverse problems. By decoupling the learning of a prior over geological parameters from the physics-consistent guidance provided by a Local Neural Operator (LNO) surrogate, Fun-DDPS effectively handles data sparsity and ensures physically realistic solutions. Experiments on synthetic CCS datasets demonstrate that Fun-DDPS significantly outperforms standard surrogates in forward modeling with sparse observations and achieves comparable accuracy to rejection sampling in inverse modeling, while also generating physically consistent realizations with improved sample efficiency.
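The decoupling described above, a learned prior plus gradient-based guidance from a differentiable surrogate, can be sketched as a DPS-style update. Everything below is a toy stand-in: `prior_score` plays the role of the trained function-space diffusion prior and `surrogate` the LNO surrogate; neither reflects the paper's actual models, and the step sizes are illustrative only.

```python
import numpy as np

def prior_score(x):
    # Placeholder: score of a standard-normal prior, standing in for the
    # trained diffusion model's score network.
    return -x

def surrogate(x):
    # Placeholder linear forward operator, standing in for the LNO surrogate
    # mapping geological parameters to observed dynamics.
    return x.sum()

def surrogate_misfit_grad(x, y_obs):
    # Analytic gradient of (G(x) - y)^2 for the linear placeholder above;
    # in practice this gradient would come from autodiff through the
    # differentiable neural operator.
    return 2.0 * (surrogate(x) - y_obs) * np.ones_like(x)

def guided_step(x, y_obs, step=0.05, guidance=0.05):
    """One guided update: follow the prior score while descending
    the data-misfit gradient supplied by the surrogate."""
    return x + step * prior_score(x) - guidance * surrogate_misfit_grad(x, y_obs)
```

Iterating `guided_step` pulls samples toward parameter fields that both look plausible under the prior and reproduce the sparse observations through the surrogate, which is the mechanism that lets the prior fill in missing information.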
Diffusion models can now solve notoriously ill-posed inverse problems in carbon capture and storage, outperforming standard surrogates by an order of magnitude on sparse-data forward modeling and rivaling asymptotically exact methods like Rejection Sampling on inverse problems, with better physical realism.
Accurate characterization of subsurface flow is critical for Carbon Capture and Storage (CCS) but remains challenged by the ill-posed nature of inverse problems with sparse observations. We present Fun-DDPS, a generative framework that combines function-space diffusion models with differentiable neural operator surrogates for both forward and inverse modeling. Our approach learns a prior distribution over geological parameters (geomodel) using a single-channel diffusion model, then leverages a Local Neural Operator (LNO) surrogate to provide physics-consistent guidance for cross-field conditioning on the dynamics field. This decoupling allows the diffusion prior to robustly recover missing information in parameter space, while the surrogate provides efficient gradient-based guidance for data assimilation. We demonstrate Fun-DDPS on synthetic CCS modeling datasets, achieving two key results: (1) For forward modeling with only 25% observations, Fun-DDPS achieves 7.7% relative error compared to 86.9% for standard surrogates (an 11x improvement), demonstrating its capability to handle extreme data sparsity where deterministic methods fail. (2) We provide the first rigorous validation of diffusion-based inverse solvers against asymptotically exact Rejection Sampling (RS) posteriors. Both Fun-DDPS and the joint-state baseline (Fun-DPS) achieve Jensen-Shannon divergence less than 0.06 against the ground-truth posterior. Crucially, Fun-DDPS produces physically consistent realizations free from the high-frequency artifacts observed in joint-state baselines, achieving this with 4x improved sample efficiency compared to rejection sampling.
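The posterior comparison above is scored with Jensen-Shannon divergence between sampled and reference distributions. A minimal sketch of that metric for discrete (e.g. histogrammed) distributions follows; the natural-log base and the binning used here are assumptions, as the abstract does not specify them.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions.

    Symmetric and bounded by log(2) in natural-log units; 0 iff p == q.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()  # normalize to probability vectors
    q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

A JSD below 0.06 on this scale means the sampled posterior is close to the reference: identical distributions score 0, while disjoint ones score log(2) ≈ 0.693.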