Neuro-Symbolic ODE Discovery with Latent Grammar Flow
Karin Yu, Eleni Chatzi, Georgios Kissas
TLDR
Latent Grammar Flow (LGF) is a neuro-symbolic framework that discovers ordinary differential equations from data using a discrete latent space.
Key contributions
- Introduces Latent Grammar Flow (LGF) for neuro-symbolic ODE discovery from data.
- Embeds grammar-based equations into a discrete latent space with semantic grouping.
- Uses a discrete flow model to recursively generate data-fitting candidate equations.
- Integrates domain knowledge and constraints (e.g., stability) into the discovery process.
Why it matters
This paper offers an interpretable and transferable approach to discovering differential equations, moving beyond black-box models. It leverages neuro-symbolic AI to better understand and model complex natural and engineered systems.
Original Abstract
Understanding natural and engineered systems often relies on symbolic formulations, such as differential equations, which provide interpretability and transferability beyond black-box models. We introduce Latent Grammar Flow (LGF), a neuro-symbolic generative framework for discovering ordinary differential equations from data. LGF embeds equations as grammar-based representations into a discrete latent space and forces semantically similar equations to be positioned closer together with a behavioural loss. Then, a discrete flow model guides the sampling process to recursively generate candidate equations that best fit the observed data. Domain knowledge and constraints, such as stability, can be either embedded into the rules or used as conditional predictors.
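The abstract's "behavioural loss" idea — pulling semantically similar equations closer in the latent space — can be illustrated with a minimal sketch. The grammar, rule names, and distance function below are assumptions for illustration, not the paper's actual implementation: each candidate ODE right-hand side comes from a tiny expression grammar, and a behavioural distance compares how the candidates act on sample states.

```python
import numpy as np

# Hypothetical toy grammar: each production expands to a callable
# right-hand side f(x) for dx/dt = f(x). (Names are illustrative.)
GRAMMAR = {
    "neg_x":  lambda x: -x,          # dx/dt = -x
    "neg_kx": lambda x: -1.05 * x,   # dx/dt = -1.05*x  (semantically close to -x)
    "sq_x":   lambda x: x ** 2,      # dx/dt = x^2      (semantically distant)
}

def behavioural_distance(f, g, xs):
    """Mean squared difference of two right-hand sides on sample states."""
    return float(np.mean((f(xs) - g(xs)) ** 2))

xs = np.linspace(-1.0, 1.0, 50)
d_close = behavioural_distance(GRAMMAR["neg_x"], GRAMMAR["neg_kx"], xs)
d_far   = behavioural_distance(GRAMMAR["neg_x"], GRAMMAR["sq_x"], xs)

# A behavioural loss built on such a distance would embed the first
# pair closer together in the latent space than the second.
assert d_close < d_far
```

The point of the sketch: two syntactically different rule sequences can still be behaviourally near-identical, which is the kind of structure a behavioural loss can expose to the latent embedding.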