Practical and Matching Gradient Variance Bounds for Black-Box Variational Bayesian Inference

Understanding the gradient variance of black-box variational inference (BBVI)
is a crucial step for establishing its convergence and developing algorithmic
improvements. However, existing studies have yet to show that the gradient
variance of BBVI satisfies the conditions used to study the convergence of
stochastic gradient descent (SGD), the workhorse of BBVI. In this work, we show
that BBVI satisfies a matching bound corresponding to the $ABC$ condition used
in the SGD literature when applied to smooth and quadratically-growing
log-likelihoods. Our results generalize to nonlinear covariance
parameterizations widely used in the practice of BBVI. Furthermore, we show
that the variance of the mean-field parameterization has provably superior
dimensional dependence.
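For reference, the $ABC$ condition mentioned above is commonly stated in the SGD literature in roughly the following form; this is a sketch of the standard formulation, and the exact constants and the matching form established in the paper may differ. Here $g(\lambda)$ denotes a stochastic gradient estimate of the objective $F$, $F^{\inf}$ is a lower bound on $F$, and $A, B, C \ge 0$ are constants:
\[
  \mathbb{E}\,\|g(\lambda)\|^{2}
  \;\le\;
  2A\bigl(F(\lambda) - F^{\inf}\bigr)
  \;+\;
  B\,\|\nabla F(\lambda)\|^{2}
  \;+\;
  C .
\]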