The paper introduces EPIC, a hardware- and physics-co-guided distributed SciML framework designed for in-field applications with limited resources. EPIC employs lightweight local encoding on edge devices and physics-aware decoding at a central node, transmitting compact latent features instead of raw data. Results on a distributed testbed using full-waveform inversion (FWI) demonstrate that EPIC reduces latency by 8.9x and communication energy by 33.8x, while also improving reconstruction fidelity on most datasets.
Distributing SciML models with hardware and physics awareness cuts latency by over 8x and communication energy by over 33x while, counterintuitively, *improving* reconstruction fidelity on most datasets.
Scientific machine learning (SciML) is increasingly applied to in-field processing, control, and monitoring; however, wide-area sensing, real-time demands, and strict energy and reliability constraints make centralized SciML deployment impractical. Most SciML models assume raw-data aggregation at a central node, incurring prohibitively high communication latency and energy costs; yet distributing models designed for general-purpose ML often violates essential physical principles, degrading performance. To address these challenges, we introduce EPIC, a hardware- and physics-co-guided distributed SciML framework, using full-waveform inversion (FWI) as a representative task. EPIC performs lightweight local encoding on end devices and physics-aware decoding at a central node. By transmitting compact latent features rather than high-volume raw data, and by using cross-attention to capture inter-receiver wavefield coupling, EPIC significantly reduces communication cost while preserving physical fidelity. Evaluated on a distributed testbed with five end devices and one central node, across 10 datasets from OpenFWI, EPIC reduces latency by 8.9x and communication energy by 33.8x, while also improving reconstruction fidelity on 8 of 10 datasets.
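The encode-transmit-decode pipeline described above can be sketched in NumPy. This is a minimal illustration, not EPIC's actual architecture: the trace length, latent width, random linear encoder, and learned decoder query tokens are all hypothetical placeholders standing in for the paper's trained networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 5 end devices (receivers), raw trace length 1000,
# latent width 16 -- each device transmits 16 floats instead of 1000.
N_DEV, RAW_LEN, D_LATENT, N_TOKENS = 5, 1000, 16, 8

# --- On each end device: lightweight local encoding ---
# Stand-in for a trained encoder: a fixed random linear projection.
W_enc = rng.standard_normal((RAW_LEN, D_LATENT)) / np.sqrt(RAW_LEN)
raw_traces = rng.standard_normal((N_DEV, RAW_LEN))  # one raw trace per device
latents = raw_traces @ W_enc                        # (5, 16): what gets transmitted

# --- At the central node: cross-attention over receiver latents ---
def cross_attention(q, k, v):
    """Scaled dot-product attention; each row of q attends over the rows of k/v."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

# Hypothetical decoder query tokens (e.g., patches of the velocity map to
# reconstruct) attend jointly to all receiver latents, so the decoder can
# exploit inter-receiver wavefield coupling.
queries = rng.standard_normal((N_TOKENS, D_LATENT))
fused = cross_attention(queries, latents, latents)  # (8, 16)

# Communication saving: latent floats sent vs. raw floats that stayed local.
print(latents.size, "floats sent instead of", raw_traces.size)  # 80 vs. 5000
```

The saving shown here (80 floats transmitted instead of 5000) is only a toy ratio; the reported 33.8x communication-energy reduction comes from the paper's testbed measurements, not from this sketch.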