Accelerating Relative Entropy Coding with Space Partitioning

Bibliographic Details
Main Authors: He, Jiajun; Flamich, Gergely; Hernández-Lobato, José Miguel
Format: Journal Article
Language: English
Published: 20.05.2024
DOI: 10.48550/arxiv.2405.12203

Summary: Relative entropy coding (REC) algorithms encode a random sample following a target distribution $Q$, using a coding distribution $P$ shared between the sender and receiver. Sadly, general REC algorithms suffer from prohibitive encoding times, at least on the order of $2^{D_{\text{KL}}[Q\|P]}$, and faster algorithms are limited to very specific settings. This work addresses this issue by introducing a REC scheme utilizing space partitioning to reduce runtime in practical scenarios. We provide theoretical analyses of our method and demonstrate its effectiveness with both toy examples and practical applications. Notably, our method successfully handles REC tasks with $D_{\text{KL}}[Q\|P]$ about three times greater than what previous methods can manage, and reduces the bitrate by approximately 5-15% in VAE-based lossless compression on MNIST and INR-based lossy compression on CIFAR-10, compared to previous methods, significantly improving the practicality of REC for neural compression.
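The abstract does not spell out why generic REC runtime scales like $2^{D_{\text{KL}}[Q\|P]}$. As orientation only, the sketch below implements one generic REC-style baseline (minimal random coding with importance weights over shared randomness), not the paper's space-partitioning scheme: the encoder must draw roughly $2^{D_{\text{KL}}[Q\|P]}$ candidates from $P$, which is exactly the cost the paper targets. The Gaussian toy distributions, the `extra_bits` slack term, and all function names are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm


def minimal_random_coding(q, p, kl_bits, seed=0, extra_bits=4):
    """Encode one approximate sample from q using candidates drawn from p.

    q, p: frozen 1-D scipy.stats distributions (illustrative choice).
    kl_bits: estimate of D_KL[q||p] in bits; the candidate count is
             ~2^(kl_bits + extra_bits), which is why generic REC becomes
             prohibitively slow as the KL divergence grows.
    Returns (index, sample); the receiver recovers `sample` from `index`
    alone by regenerating the same candidates with the shared seed.
    """
    rng = np.random.default_rng(seed)            # shared randomness
    n = int(2 ** (kl_bits + extra_bits))         # runtime ~ 2^{D_KL[Q||P]}
    candidates = p.rvs(size=n, random_state=rng)
    log_w = q.logpdf(candidates) - p.logpdf(candidates)  # importance weights
    w = np.exp(log_w - log_w.max())
    idx = int(rng.choice(n, p=w / w.sum()))      # pick an index ~ weights
    return idx, candidates[idx]


def decode(idx, p, kl_bits, seed=0, extra_bits=4):
    """Receiver side: regenerate the shared candidates and read off index idx."""
    rng = np.random.default_rng(seed)
    n = int(2 ** (kl_bits + extra_bits))
    candidates = p.rvs(size=n, random_state=rng)
    return candidates[idx]


# Toy example: Q = N(1, 0.5), P = N(0, 1). The closed-form Gaussian KL is
# small here, so the candidate count (and hence the encoding time) stays tiny.
q, p = norm(1.0, 0.5), norm(0.0, 1.0)
kl_nats = np.log(1.0 / 0.5) + (0.5**2 + 1.0**2) / (2 * 1.0**2) - 0.5
kl_bits = kl_nats / np.log(2)
idx, x = minimal_random_coding(q, p, kl_bits)
assert np.isclose(x, decode(idx, p, kl_bits))
```

In this baseline the transmitted message is just the index, costing roughly $D_{\text{KL}}[Q\|P]$ plus a few extra bits, but the encoder's work grows exponentially in that same KL; the paper's contribution is a space-partitioning scheme that keeps such costs manageable at KL values about three times larger than previous methods handle.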