Bayesian structure learning with generative flow networks
Uncertainty in Artificial Intelligence, 2022 · proceedings.mlr.press
Abstract
In Bayesian structure learning, we are interested in inferring a distribution over the directed acyclic graph (DAG) structure of Bayesian networks from data. Defining such a distribution is very challenging due to the combinatorially large sample space, and approximations based on MCMC are often required. Recently, a novel class of probabilistic models, called Generative Flow Networks (GFlowNets), has been introduced as a general framework for generative modeling of discrete and composite objects, such as graphs. In this work, we propose to use a GFlowNet as an alternative to MCMC for approximating the posterior distribution over the structure of Bayesian networks, given a dataset of observations. Generating a sample DAG from this approximate distribution is treated as a sequential decision problem, in which the graph is constructed one edge at a time according to learned transition probabilities. Through evaluation on both simulated and real data, we show that our approach, called DAG-GFlowNet, provides an accurate approximation of the posterior over DAGs, and it compares favorably against other methods based on MCMC or variational inference.
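To make the sequential generative process concrete, below is a minimal illustrative sketch in Python, not the authors' implementation: a DAG over d variables is built by adding one edge at a time, with actions that would create a cycle masked out. The uniform edge choice and the stop_prob parameter are hypothetical placeholders for the learned GFlowNet transition probabilities and its terminal "stop" action.

```python
# Minimal sketch of sequential DAG construction, assuming a d x d adjacency
# matrix as the state. In DAG-GFlowNet the action probabilities (including
# the decision to stop) come from a learned policy; here they are uniform
# for illustration only.
import numpy as np

def has_path(adj, src, dst):
    """Return True if a directed path from src to dst exists in adj."""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(int(k) for k in np.flatnonzero(adj[node]))
    return False

def valid_actions(adj):
    """Edges (i, j) whose addition keeps the graph simple and acyclic."""
    d = adj.shape[0]
    return [(i, j) for i in range(d) for j in range(d)
            if i != j and adj[i, j] == 0 and not has_path(adj, j, i)]

def sample_dag(d, rng=None, stop_prob=0.2):
    """Sample one DAG by adding edges one at a time (placeholder policy)."""
    rng = np.random.default_rng(rng)
    adj = np.zeros((d, d), dtype=int)
    while True:
        actions = valid_actions(adj)
        # 'stop_prob' stands in for the learned probability of terminating.
        if not actions or rng.random() < stop_prob:
            return adj
        # Uniform choice stands in for the learned transition probabilities.
        i, j = actions[rng.integers(len(actions))]
        adj[i, j] = 1  # add the edge i -> j

if __name__ == "__main__":
    print(sample_dag(d=4, rng=0))
```

In the method described in the abstract, each edge addition and the stop decision would instead be scored by the learned GFlowNet policy conditioned on the current graph, so that completed DAGs are sampled approximately in proportion to their posterior probability given the data.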