This paper reformulates perceptual reconstruction in rate-distortion theory as recovering a sample within a synonymous set (synset) associated with the source, rather than the source sample itself. The authors establish a synonymous variational inference (SVI) framework with a synonymous variational lower bound (SVLBO) for synset-oriented compression. The work proves a synonymous RDP tradeoff, showing that the distributional divergence term in RDP arises naturally from the synset-based reconstruction objective, thereby providing a theoretical justification for the RDP framework.
The distributional divergence term in rate-distortion-perception theory, often adopted as a modeling principle, is given a theoretical justification rooted in synonymous sets.
The fundamental limit of natural signal compression has traditionally been characterized by classical rate-distortion (RD) theory through the tradeoff between coding rate and reconstruction distortion, while the rate-distortion-perception (RDP) framework introduces a divergence-based measure of perceptual quality as a modeling principle rather than a theoretically derived one, leaving its theoretical origin unclear. In this paper, motivated by a synonymity-based semantic information perspective, we reformulate perceptual reconstruction as recovering any admissible sample within an ideal synonymous set (synset) associated with the source, rather than the source sample itself, and correspondingly establish a synonymous source coding architecture. On this basis, we develop a synonymous variational inference (SVI) analysis framework with a synonymous variational lower bound (SVLBO) for tractable analysis of synset-oriented compression. Within this framework, we establish a synonymity-perception consistency principle, showing that optimal identification of semantic information is theoretically consistent with perceptual optimization. Building on this derivation, we prove a synonymous RDP tradeoff for the proposed synonymous source coding. These analytical results show that the distributional divergence term arises naturally from the synset-based reconstruction objective, clarify its compatibility with existing RDP formulations and classical RD theory, and suggest the potential advantages of synonymous source coding.
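For context, one standard formulation of the RDP tradeoff (in the style of Blau and Michaeli's rate-distortion-perception function) imposes the divergence term as an explicit constraint alongside distortion; the paper's contribution is to show that such a constraint need not be postulated but emerges from a synset-based reconstruction objective. A sketch of the conventional formulation, with symbols chosen here for illustration:

```latex
% Classical RD function: rate vs. expected distortion only.
R(D) \;=\; \min_{p_{\hat{X}\mid X}} \; I(X;\hat{X})
\quad \text{s.t.} \quad \mathbb{E}\!\left[\Delta(X,\hat{X})\right] \le D .

% RDP function: an additional divergence constraint d(\cdot\,\|\,\cdot)
% between the source law p_X and the reconstruction law p_{\hat{X}}
% models perceptual quality; in the RDP literature this constraint is
% introduced as a modeling choice.
R(D,P) \;=\; \min_{p_{\hat{X}\mid X}} \; I(X;\hat{X})
\quad \text{s.t.} \quad \mathbb{E}\!\left[\Delta(X,\hat{X})\right] \le D ,
\quad d\!\left(p_X \,\|\, p_{\hat{X}}\right) \le P .
```

The paper's synonymous RDP tradeoff recovers a constraint of the second kind from the requirement that the decoder output be any admissible member of the source's synset, rather than assuming it at the outset.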