Gradient Estimation for Exactly-k Constraints

Abstract

The exactly-k constraint is ubiquitous in machine learning and scientific applications, such as ensuring that the sum of electric charges in a neutral atom is zero. However, enforcing such constraints in machine learning models while keeping learning differentiable is challenging. In this work, we aim to provide a “cookbook” for seamlessly incorporating exactly-k constraints into machine learning models by extending a recent gradient estimator from Bernoulli variables to Gaussian and Poisson variables, utilizing constraint probabilities that are available in closed form for these distributions. We show the effectiveness of the proposed gradient estimators in synthetic experiments, and further demonstrate the practical utility of our approach by training neural networks to predict partial charges for metal-organic frameworks, aiding virtual screening in chemistry. Our method not only enhances the capability of learning models but also extends its applicability to a wider range of scientific domains where constraint satisfaction is crucial.
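One reason Gaussian and Poisson variables are amenable to this approach is that sums of independent Gaussians and of independent Poissons have closed-form distributions, so the probability of the sum hitting exactly k is differentiable in the parameters. The sketch below illustrates only this ingredient, not the paper's estimators; the helper names and the use of PyTorch are illustrative assumptions.

```python
import torch

def poisson_exactly_k_logprob(log_rates: torch.Tensor, k: int) -> torch.Tensor:
    """Log P(sum_i X_i = k) for independent X_i ~ Poisson(lambda_i).

    A sum of independent Poissons is Poisson(sum_i lambda_i), so the
    exactly-k constraint probability is closed form and differentiable
    in the (log-)rates.
    """
    total_rate = log_rates.exp().sum()
    # Poisson log-pmf at k: k*log(mu) - mu - log(k!)
    return k * total_rate.log() - total_rate - torch.lgamma(torch.tensor(k + 1.0))

def gaussian_sum_at_k_logdensity(mu: torch.Tensor, sigma: torch.Tensor, k: float) -> torch.Tensor:
    """Log-density of sum_i X_i at k for independent X_i ~ N(mu_i, sigma_i^2).

    The sum is N(sum_i mu_i, sum_i sigma_i^2), so the density at k is
    again closed form.
    """
    total = torch.distributions.Normal(mu.sum(), (sigma ** 2).sum().sqrt())
    return total.log_prob(torch.tensor(float(k)))

# Differentiating through the constraint log-probability nudges the
# parameters toward satisfying the exactly-k constraint:
log_rates = torch.zeros(5, requires_grad=True)
loss = -poisson_exactly_k_logprob(log_rates, k=3)
loss.backward()  # populates log_rates.grad
```

A useful companion fact, not shown above, is that independent Poisson variables conditioned on their sum follow a multinomial distribution, which permits exact sampling under the constraint.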

Publication
In Proceedings of the NeurIPS Workshop on AI for Scientific Discovery: From Theory to Practice, 2023
Zhe Zeng
Ph.D. student in AI

My research goal is to enable machine learning models to incorporate diverse forms of constraints into probabilistic inference and learning in a principled way, by combining probabilistic modeling, neuro-symbolic AI, and Bayesian deep learning with formal methods.