Yoav Gelberg et al. win the Best Paper Award at the GRaM Workshop (ICML)

Variational Inference Failures Under Model Symmetries: Permutation Invariant Posteriors for Bayesian Neural Networks

Short summary of the paper: "Parameter symmetries in neural networks, such as permutation symmetries in MLPs, give rise to Bayesian neural network (BNN) posteriors with multiple equivalent modes. This multimodality challenges variational inference (VI) methods, which typically rely on unimodal posterior estimation. Our work demonstrates that these symmetries introduce biases in posterior estimation, compromising predictive performance and posterior fit.
To address this, we introduce a symmetrization framework for constructing invariant variational posteriors. We prove that these invariant posteriors provide a better fit to the true posterior and can be trained using a modified ELBO objective with an adjusted KL regularization term. Experimental results show that our approach mitigates estimation biases, improving predictions and achieving higher ELBO values."
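The permutation symmetry the summary refers to can be seen directly in a one-hidden-layer MLP: permuting the hidden units (together with the corresponding rows of the first weight matrix and columns of the second) leaves the network's function unchanged, so every permutation yields an equivalent posterior mode. A minimal NumPy sketch (not the authors' code) illustrating this:

```python
import numpy as np

# Permuting the hidden units of an MLP -- rows of W1, entries of b1,
# and columns of W2 -- leaves the computed function unchanged. Each
# such permutation corresponds to an equivalent mode of the BNN posterior.
rng = np.random.default_rng(0)
d_in, d_hidden, d_out = 3, 5, 2
W1 = rng.normal(size=(d_hidden, d_in))
b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden))
b2 = rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.tanh(W1 @ x + b1) + b2

perm = rng.permutation(d_hidden)  # a random hidden-unit permutation
x = rng.normal(size=d_in)

out_original = mlp(x, W1, b1, W2, b2)
out_permuted = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)

print(np.allclose(out_original, out_permuted))  # True: same function
```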
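The symmetrization idea can be sketched in a toy setting: averaging a variational density over the permutation group yields a posterior that is invariant under permutations by construction. The two-parameter example below is a hypothetical illustration of this averaging, not the paper's method or objective:

```python
import numpy as np
from itertools import permutations

# Toy symmetrization: given a unimodal Gaussian q(theta), define
# q_sym(theta) = (1/|G|) * sum over permutations P of q(P theta).
# The resulting mixture is invariant under permuting theta's entries.

def gaussian_pdf(theta, mu, sigma=0.5):
    d = theta - mu
    return np.exp(-0.5 * d @ d / sigma**2) / (2 * np.pi * sigma**2)

mu = np.array([1.0, -2.0])  # a unimodal q centered off the diagonal

def q_sym(theta):
    perms = list(permutations(range(len(theta))))
    return np.mean([gaussian_pdf(theta[list(p)], mu) for p in perms])

theta = np.array([0.3, 0.7])
swapped = theta[[1, 0]]
print(np.isclose(q_sym(theta), q_sym(swapped)))  # True: invariant density
```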
