The paper introduces Metriplector, a novel neural architecture primitive inspired by metriplectic field theory, in which computation arises from the dynamics of coupled fields. The architecture uses the stress-energy tensor derived from Noether's theorem as its readout, and admits a spectrum of instantiations ranging from solving screened Poisson equations to full coupled field dynamics. Experiments demonstrate Metriplector's effectiveness across maze pathfinding, Sudoku solving, image recognition (CIFAR-100), and language modeling, achieving strong performance with fewer parameters and less training data than baselines.
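For context, the general metriplectic formulation the summary alludes to (the standard form from the plasma-physics literature, not necessarily the paper's exact notation) evolves any observable F under a Hamiltonian H and an entropy-like functional S:

```latex
\frac{dF}{dt} = \{F, H\} + (F, S),
\qquad \{S, F\} = 0, \quad (H, F) = 0 \;\; \forall F,
```

where \{\cdot,\cdot\} is the antisymmetric Poisson bracket (conservative part) and (\cdot,\cdot) is the symmetric, positive-semidefinite metric bracket (dissipative part). The degeneracy conditions guarantee dH/dt = 0 and dS/dt \geq 0, which is what makes "dissipative branch alone" vs. "full structure" a meaningful dial.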
Forget attention: Metriplectic dynamics offer a surprisingly effective and parameter-efficient route to neural computation, outperforming standard architectures in several domains.
We present Metriplector, a neural architecture primitive in which the input configures an abstract physical system--fields, sources, and operators--and the dynamics of that system is the computation. Multiple fields evolve via coupled metriplectic dynamics, and the stress-energy tensor T^{\mu\nu}, derived from Noether's theorem, provides the readout. The metriplectic formulation admits a natural spectrum of instantiations: the dissipative branch alone yields a screened Poisson equation solved exactly via conjugate gradient; activating the full structure--including the antisymmetric Poisson bracket--gives field dynamics for image recognition and language modeling. We evaluate Metriplector across four domains, each using a task-specific architecture built from this shared primitive with progressively richer physics: F1=1.0 on maze pathfinding, generalizing from 15x15 training grids to unseen 39x39 grids; 97.2% exact Sudoku solve rate with zero structural injection; 81.03% on CIFAR-100 with 2.26M parameters; and 1.182 bits/byte on language modeling with 3.6x fewer training tokens than a GPT baseline.
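To make the dissipative branch concrete: the screened Poisson operator -\Delta + \kappa^2 is symmetric positive definite, so conjugate gradient converges to the exact solution. A minimal 1D NumPy sketch (illustrative only; function and variable names are ours, not the paper's implementation):

```python
import numpy as np

def screened_poisson_cg(f, kappa, h=1.0, tol=1e-10, max_iter=500):
    """Solve (-Laplacian + kappa^2) u = f on a 1D grid with zero Dirichlet
    boundaries via conjugate gradient. The operator is symmetric positive
    definite, so CG converges to the exact solution (up to tolerance)."""

    def apply_A(u):
        # 3-point stencil for -u'' plus the screening term kappa^2 * u
        Au = (2.0 * u - np.roll(u, 1) - np.roll(u, -1)) / h**2 + kappa**2 * u
        # zero Dirichlet boundaries: discard the wrap-around contributions
        Au[0] = (2.0 * u[0] - u[1]) / h**2 + kappa**2 * u[0]
        Au[-1] = (2.0 * u[-1] - u[-2]) / h**2 + kappa**2 * u[-1]
        return Au

    u = np.zeros_like(f)
    r = f - apply_A(u)       # initial residual
    p = r.copy()             # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        u += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return u
```

In the architecture described above, the input would configure the source f and the operator (e.g. a learned screening field), and the converged solution u would feed the stress-energy readout; this sketch shows only the linear solve itself.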