Scaling up Hybrid Probabilistic Inference with Logical and Arithmetic Constraints via Message Passing



Weighted model integration (WMI) is an appealing framework for probabilistic inference: it allows for expressing the complex dependencies of real-world problems, where variables are both continuous and discrete, via the language of Satisfiability Modulo Theories (SMT), as well as computing probabilistic queries with complex logical and arithmetic constraints. Yet, existing WMI solvers are not ready to scale to these problems. They either ignore the intrinsic dependency structure of the problem entirely, or they are limited to overly restrictive structures. To narrow this gap, we derive a factorized WMI computation enabling us to devise a scalable WMI solver based on message passing, called MP-WMI. Namely, MP-WMI is the first WMI solver that can (i) perform exact inference on the full class of tree-structured WMI problems, and (ii) perform inter-query amortization, e.g., to compute all marginal densities simultaneously. Experimental results show that our solver dramatically outperforms the existing WMI solvers on a large set of benchmarks.
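To give a concrete sense of the WMI setup the abstract refers to, below is a minimal, hand-rolled sketch on a single continuous variable. The support is an SMT(LRA)-style formula, the weight is a polynomial, and a query probability is a ratio of two integrals. This toy example is purely illustrative; it is not the MP-WMI algorithm, and the function and variable names are assumptions for this sketch.

```python
# Toy weighted model integration (WMI) in one continuous variable.
# Illustrative only -- not the paper's MP-WMI message-passing solver.

def integrate_poly(coeffs, lo, hi):
    """Exactly integrate the polynomial sum(c_i * x**i) over [lo, hi]."""
    def antiderivative(x):
        return sum(c * x ** (i + 1) / (i + 1) for i, c in enumerate(coeffs))
    return antiderivative(hi) - antiderivative(lo)

# Support formula (SMT(LRA)-style): (0 <= x <= 1) AND (x > 0.5),
# with per-variable weight w(x) = x, encoded here by its coefficients.
weight = [0.0, 1.0]  # w(x) = 0 + 1*x

# Partition function: WMI of the support, integral of w over [0.5, 1].
z = integrate_poly(weight, 0.5, 1.0)

# Query: Pr(x > 0.75) = WMI(support AND query) / WMI(support),
# i.e. integral of w over [0.75, 1] divided by z.
wmi_query = integrate_poly(weight, 0.75, 1.0)
prob = wmi_query / z

print(z)     # 0.375
print(prob)  # ~0.5833
```

In the paper's setting the support is a multivariate SMT formula over mixed discrete and continuous variables, so the integration region decomposes into many convex polytopes; the factorized computation exploits the problem's tree dependency structure so that all such marginal queries can be answered by one round of message passing.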

In Proceedings of the 37th International Conference on Machine Learning (ICML 2020)
Zhe Zeng
Ph.D. student in AI

My research interests lie at the intersection of machine learning (tractable probabilistic modeling, statistical relational learning, graphical models, Bayesian deep learning, kernel and non-parametric methods) and formal methods.