Relax, Compensate and then Integrate

Primal Graphs

Abstract

Weighted model integration (WMI) is an appealing framework for probabilistic inference: it allows for expressing the complex dependencies of real-world problems, where variables are both continuous and discrete, via the language of Satisfiability Modulo Theories (SMT), as well as for computing probabilistic queries with complex logical and arithmetic constraints. Yet, existing WMI solvers are not ready to scale to these problems. They either ignore the intrinsic dependency structure of the problem entirely, or they are limited to overly restrictive structures. To narrow this gap, we derive a factorized WMI computation enabling us to devise a scalable WMI solver based on message passing, called MP-WMI. Namely, MP-WMI is the first WMI solver that can (i) perform exact inference on the full class of tree-structured WMI problems, and (ii) perform inter-query amortization, e.g., to compute all marginal densities simultaneously. Experimental results show that our solver dramatically outperforms the existing WMI solvers on a large set of benchmarks.
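To give a flavor of the message-passing idea behind the abstract, here is a minimal sketch of weighted model integration on a toy tree-structured (chain) problem. This is not the MP-WMI implementation from the paper: the variables, box-shaped supports, and edge weights below are made-up assumptions for illustration, and the symbolic integration is delegated to SymPy.

```python
# Toy sketch of tree-structured weighted model integration (WMI)
# via message passing. NOT the authors' MP-WMI code: the chain
# x1 -- x2 -- x3, the [0, 1] supports, and the edge weights are
# hypothetical choices made only for this example.
from sympy import symbols, integrate

x1, x2, x3 = symbols("x1 x2 x3")

# Weight function factorized over the edges of the chain:
# w(x1, x2, x3) = (x1 + x2) * (x2 * x3), support: each xi in [0, 1].
w12 = x1 + x2
w23 = x2 * x3

# Leaf-to-root messages: integrate out the leaf variable of each edge factor.
m_1_to_2 = integrate(w12, (x1, 0, 1))   # = x2 + 1/2
m_3_to_2 = integrate(w23, (x3, 0, 1))   # = x2/2

# Partition function (total weighted volume): combine messages at the root.
belief_x2 = m_1_to_2 * m_3_to_2
Z = integrate(belief_x2, (x2, 0, 1))    # = 7/24

# The product of incoming messages is the unnormalized marginal of x2;
# dividing by Z gives the exact marginal density. The same messages can
# be reused to obtain the marginals of x1 and x3 (inter-query amortization).
p_x2 = belief_x2 / Z

print("Z =", Z)                  # 7/24
print("p(x2) =", p_x2.expand())
```

In real WMI problems, the supports are regions defined by logical and arithmetic SMT constraints rather than simple boxes, and the factorized computation derived in the paper is what lets the same message-passing scheme cover the full class of tree-structured WMI problems.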

Publication
Proceedings of the ECML-PKDD Workshop on Deep Continuous-Discrete Machine Learning (DeCoDeML 2020)
Zhe Zeng
Ph.D. student in AI

My research interests lie at the intersection of machine learning (probabilistic modeling, statistical relational learning, neuro-symbolic AI) and formal methods. My research goal is to enable machine learning models to incorporate diverse forms of constraints into probabilistic inference and learning in a principled way.
