This paper introduces a unified continuous-time framework for spiking neural networks (SNNs) that leverages charge conservation to achieve deterministic computation in asynchronous neuromorphic systems. By coupling the Law of Charge Conservation with minimal neuron-level constraints, the framework ensures the terminal state depends solely on the aggregate input charge, invariant to temporal stochasticity. The authors prove this invariance for acyclic networks and establish an exact representational correspondence with quantized ANNs, bridging static deep learning and event-driven dynamics.
Neuromorphic systems can achieve deterministic computation despite temporal stochasticity by enforcing charge conservation, enabling a direct mapping to quantized ANNs.
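To make this takeaway concrete, here is a minimal sketch (not the authors' code): a non-leaky integrate-and-fire neuron with subtract-reset, one plausible instantiation of the charge-conserving constraints. The threshold `THETA` and the input charges are hypothetical, and the charges are assumed non-negative; with negative charges or recurrent feedback, ordering can matter, consistent with the recurrent-connectivity caveat in the abstract below.

```python
import math
import random

THETA = 1.0  # firing threshold (hypothetical value)

def terminal_spike_count(charges, theta=THETA):
    """Non-leaky integrate-and-fire neuron with subtract-reset.

    Integrates each incoming charge packet and emits one spike per
    threshold crossing; subtracting (rather than zeroing) the membrane
    potential on reset is what conserves the residual charge.
    """
    v = 0.0
    spikes = 0
    for q in charges:
        v += q
        while v >= theta:
            v -= theta
            spikes += 1
    return spikes

# The same aggregate charge, delivered in arbitrary temporal orders,
# always yields the same terminal spike count: floor(total / theta).
charges = [0.3, 0.7, 0.45, 0.55, 1.25, 1.0]  # total charge = 4.25
for _ in range(1000):
    random.shuffle(charges)
    assert terminal_spike_count(charges) == math.floor(sum(charges) / THETA)
```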
Achieving deterministic computation in asynchronous neuromorphic systems remains a fundamental challenge due to the inherent temporal stochasticity of continuous-time hardware. To address this, we develop a unified continuous-time framework for spiking neural networks (SNNs) that couples the Law of Charge Conservation with minimal neuron-level constraints. This integration ensures that the terminal state depends solely on the aggregate input charge, yielding a unique cumulative output that is invariant to temporal stochasticity. We prove that this mapping is strictly invariant to spike timing in acyclic networks, whereas recurrent connectivity can introduce temporal sensitivity. Furthermore, we establish an exact representational correspondence between these charge-conserving SNNs and quantized artificial neural networks, bridging the gap between static deep learning and event-driven dynamics without approximation errors. These results provide a rigorous theoretical basis for designing continuous-time neuromorphic systems that harness the efficiency of asynchronous processing while maintaining algorithmic determinism.
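The claimed exact correspondence with quantized ANNs can likewise be sketched under assumptions the abstract does not spell out: spike-count (rate) coding, non-negative weights, per-spike charge below threshold, and the same subtract-reset integrate-and-fire neuron as above. The weights `W`, counts `x`, and threshold `theta` below are synthetic, and the single feedforward layer stands in for an acyclic network.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.0                               # firing threshold (hypothetical)
W = rng.uniform(0.0, 0.5, size=(4, 6))    # non-negative weights (assumption)
x = rng.integers(0, 5, size=6)            # input spike counts, previous layer

# ANN side: a floor-quantized activation applied to the aggregate charge.
ann_out = np.floor(W @ x / theta)

# SNN side: replay the same spikes one asynchronous event at a time.
events = np.repeat(np.arange(6), x)       # one entry per individual spike
rng.shuffle(events)                       # arbitrary (stochastic) timing
v = np.zeros(4)
snn_out = np.zeros(4)
for j in events:
    v += W[:, j]                          # inject one synaptic charge packet
    fired = v >= theta                    # each packet is < theta, so at most
    snn_out += fired                      # one crossing per event
    v -= theta * fired                    # subtract-reset conserves charge

assert np.array_equal(snn_out, ann_out)   # exact, not approximate, agreement
```

Because the event order is shuffled, the agreement also re-demonstrates the timing invariance for this feedforward (acyclic) layer.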