Function-space regularized Rényi divergences

We propose a new family of regularized Rényi divergences parametrized not
only by the order $\alpha$ but also by a variational function space. These new
objects are defined by taking the infimal convolution of the standard Rényi
divergence with the integral probability metric (IPM) associated with the
chosen function space. We derive a novel dual variational representation that
can be used to construct numerically tractable divergence estimators. This
representation avoids risk-sensitive terms and therefore exhibits lower
variance, making it well-behaved when $\alpha>1$; this addresses a notable
weakness of prior approaches. We prove several properties of these new
divergences, showing that they interpolate between the classical Rényi
divergences and IPMs. We also study the $\alpha\to\infty$ limit, which leads to
a regularized worst-case regret and a new variational representation in the
classical case. Moreover, we show that the proposed regularized Rényi
divergences inherit features from IPMs such as the ability to compare
distributions that are not absolutely continuous, e.g., empirical measures and
distributions with low-dimensional support. We present numerical results on
both synthetic and real datasets, showing the utility of these new divergences
in both estimation and GAN training applications; in particular, we demonstrate
significantly reduced variance and improved training performance.
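To fix ideas, the infimal-convolution construction described above can be written compactly as below. The notation is a sketch of ours, not necessarily the paper's: $R_\alpha$ denotes the classical Rényi divergence of order $\alpha$, $W^\Gamma$ the IPM generated by the chosen function space $\Gamma$, and $\mathcal{P}(\Omega)$ the probability measures on the sample space:
\[
  R_\alpha^{\Gamma}(P \,\|\, Q)
    \;:=\; \inf_{\eta \in \mathcal{P}(\Omega)}
      \Bigl\{ R_\alpha(P \,\|\, \eta) + W^{\Gamma}(\eta, Q) \Bigr\},
  \qquad
  W^{\Gamma}(\eta, Q) \;:=\; \sup_{g \in \Gamma}
      \bigl\{ \mathbb{E}_\eta[g] - \mathbb{E}_Q[g] \bigr\}.
\]
The $\alpha\to\infty$ limit mentioned above can be anchored to the classical identity (a standard fact, independent of this paper) that Rényi divergences converge to the worst-case regret:
\[
  \lim_{\alpha\to\infty} R_\alpha(P \,\|\, Q)
    \;=\; D_\infty(P \,\|\, Q)
    \;=\; \log\, \operatorname*{ess\,sup}_{Q} \frac{dP}{dQ}.
\]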
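The variance issue that motivates avoiding risk-sensitive terms can be seen in a minimal Monte Carlo experiment. The sketch below is not the paper's estimator; it only illustrates how a naive sample estimate of the cumulant-generating term $\frac{1}{\alpha-1}\log \mathbb{E}_P[e^{(\alpha-1)g(X)}]$, of the kind appearing in classical Rényi variational formulas, becomes increasingly noisy as $\alpha$ grows. The choices $g(x)=x$, standard normal samples, and the batch sizes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def risk_sensitive_estimate(samples, alpha):
    # Naive estimator of (1/(alpha-1)) * log E[exp((alpha-1) * g(X))]
    # with the illustrative choice g(x) = x.
    return np.log(np.mean(np.exp((alpha - 1.0) * samples))) / (alpha - 1.0)

n_batches, batch_size = 200, 500
for alpha in (1.5, 2.0, 4.0, 8.0):
    estimates = [
        risk_sensitive_estimate(rng.normal(size=batch_size), alpha)
        for _ in range(n_batches)
    ]
    # For g(x) = x and X ~ N(0,1) the exact value is (alpha - 1) / 2,
    # since E[exp(tX)] = exp(t^2 / 2).
    print(f"alpha={alpha:4.1f}  mean={np.mean(estimates):6.3f}  "
          f"exact={(alpha - 1.0) / 2:6.3f}  std={np.std(estimates):6.3f}")

Running this shows the spread (and downward bias) of the naive estimate growing sharply with $\alpha$, since $e^{(\alpha-1)X}$ becomes heavy-tailed; this is the behavior the proposed risk-sensitive-free representation is designed to avoid.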