ArXiv TLDR

Robust volatility updates for Hierarchical Gaussian Filtering

2605.00966

Christoph Mathys, Nicolas Legrand, Peter Thestrup Waade, Nace Mikus, Lilian Aline Weber

cs.LG · cs.NE · q-bio.NC · stat.ML

TLDR

This paper introduces a robust method for updating volatility in Hierarchical Gaussian Filtering, preventing negative posterior precision errors.

Key contributions

  • Fixes negative posterior precision errors in Hierarchical Gaussian Filtering's volatility updates.
  • Proposes a modified quadratic approximation for variational energy in volatility-coupled nodes.
  • Interpolates between two quadratic expansions, leveraging the Lambert W function for a second mode.
  • Achieves robust HGF updates across the full parameter space, accurately tracking posteriors.

Why it matters

This paper resolves a critical flaw in Hierarchical Gaussian Filtering (HGF) where volatility updates could lead to impossible negative precision. By ensuring robust updates across the entire parameter space, it significantly enhances the reliability and practical applicability of HGF networks for modeling agent belief updates.
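To make the failure mode concrete, here is a minimal sketch assuming the classic one-step precision update for a volatility parent as commonly written (following Mathys et al., 2014): π = π̂ + (κ²/2)·w·(w + (2w − 1)·δ), where w ∈ (0, 1] is the volatility weight and δ ≥ −1 is the child's volatility prediction error. The function name and the specific numbers below are illustrative, not taken from the paper.

```python
def volatility_precision_update(pi_hat, kappa, w, delta):
    """One-step posterior precision update for a volatility parent in the
    classic HGF: pi = pi_hat + (kappa**2 / 2) * w * (w + (2*w - 1) * delta).

    w is the volatility weight (in (0, 1]); delta is the child's volatility
    prediction error (bounded below by -1).
    """
    return pi_hat + 0.5 * kappa**2 * w * (w + (2.0 * w - 1.0) * delta)


# A benign update: a small prediction error keeps the posterior precision positive.
print(volatility_precision_update(pi_hat=1.0, kappa=1.0, w=0.5, delta=0.2))

# The failure mode: w < 1/2 combined with a large positive delta makes the
# bracketed term strongly negative, producing an impossible precision < 0 --
# exactly the error the modified approximation is designed to avoid.
print(volatility_precision_update(pi_hat=1.0, kappa=1.0, w=0.1, delta=50.0))
```

With `w = 0.1` and `delta = 50`, the correction term is 0.05 × (0.1 − 40) ≈ −1.995, overwhelming the prior precision of 1 and driving the posterior precision below zero.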

Original Abstract

Hierarchical Gaussian Filtering (HGF) networks allow for efficient updating of posterior distributions (beliefs) about hidden states of an agent's environment. HGF parent nodes can target the mean or variance of their children. New information entering at input nodes leads to a cascade of belief updates across the network according to one-step update equations for each node's mean and precision (inverse variance). However, the original form of the update equations for variance-targeting parents (volatility coupling) can in some regions of parameter space lead to negative posterior precision, a logical impossibility which causes the updating algorithm to terminate with an error. In this report, we introduce a modified quadratic approximation to the variational energy of volatility-coupled nodes that avoids negative posterior precision. The key idea is to interpolate between two quadratic expansions of the variational energy: one at the prior prediction and one at a second mode whose location is obtained in closed form via the Lambert W function. The resulting update equations are robust across the entire parameter space and faithfully track the variational posterior even for large prediction errors.
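The exact closed-form expression for the second mode depends on the node's parameters and is given in the paper. As an illustrative sketch only: the variational energy of a volatility-coupled node mixes quadratic and exponential terms, so setting its derivative to zero yields an equation of the form "linear = exponential," which the Lambert W function solves in closed form. The toy equation x + a = b·exp(−c·x) below stands in for that shape; all names are hypothetical and none of the formulas are taken from the paper.

```python
import math


def lambert_w(z, tol=1e-12, max_iter=50):
    """Principal branch W0 of the Lambert W function, for z >= -1/e,
    via Halley's iteration. W satisfies W(z) * exp(W(z)) = z."""
    if z < -1.0 / math.e:
        raise ValueError("principal branch requires z >= -1/e")
    w = math.log1p(z)  # crude but serviceable initial guess
    for _ in range(max_iter):
        ew = math.exp(w)
        f = w * ew - z
        if abs(f) < tol:
            break
        wp1 = w + 1.0
        w -= f / (ew * wp1 - (w + 2.0) * f / (2.0 * wp1))
    return w


def second_mode(a, b, c):
    """Solve x + a = b * exp(-c * x) in closed form.

    Multiplying both sides by c * exp(c * x) and substituting
    y = c * (x + a) gives y * exp(y) = b * c * exp(c * a),
    so y = W(b * c * exp(c * a)) and x = y / c - a.
    """
    y = lambert_w(b * c * math.exp(c * a))
    return y / c - a


# Locate the stationary point of the toy linear-plus-exponential equation.
x_star = second_mode(a=1.0, b=2.0, c=1.0)
print(x_star)  # the unique root of x + 1 = 2 * exp(-x)
```

A quadratic expansion of the energy at `x_star`, blended with the expansion at the prior prediction, is the kind of interpolation the paper uses to keep the resulting posterior precision positive everywhere.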
