A scoring function for learning Bayesian networks based on mutual information and conditional independence tests
Metadata
Author
Campos Ibáñez, Luis Miguel
Publisher
MIT Press
Subject
Bayesian networks; Scoring functions; Learning; Mutual information; Conditional independence tests
Date
2006
Bibliographic reference
Campos, L.M. A scoring function for learning Bayesian networks based on mutual information and conditional independence tests. Journal of Machine Learning Research, 7: 2149-2187 (2006). [http://hdl.handle.net/10481/32709]
Sponsorship
I would like to acknowledge support for this work from the Spanish ‘Consejería de Innovación Ciencia y Empresa de la Junta de Andalucía’, under Project TIC-276.
Abstract
We propose a new scoring function for learning Bayesian networks from data using score+search algorithms. It is based on the concept of mutual information and exploits some well-known properties of this measure in a novel way. Essentially, a statistical independence test based on the chi-square distribution, associated with the mutual information measure, is combined with a property of additive decomposition of this measure in order to quantify the degree of interaction between each variable and its parent variables in the network. The result is a non-Bayesian scoring function, called MIT (mutual information tests), which belongs to the family of scores based on information theory. The MIT score also represents a penalization of the Kullback-Leibler divergence between the joint probability distributions associated with a candidate network and with the available data set. Detailed results of a complete experimental evaluation of the proposed scoring function, and of its comparison with the well-known K2, BDeu and BIC/MDL scores, are also presented.
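To illustrate the idea the abstract describes, the following sketch computes the empirical mutual information between one variable and a single candidate parent and compares the associated statistic 2·N·MI against a chi-square critical value. This is a minimal, hedged illustration, not the paper's exact MIT formulation: the toy data, the choice of significance level (α = 0.05, giving the well-known critical value 3.841 for one degree of freedom), and the variable names are all assumptions made for the example.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information (in nats) between two discrete variables,
    estimated from a list of (x, y) observations."""
    n = len(pairs)
    joint = Counter(pairs)                 # joint counts N(x, y)
    px = Counter(x for x, _ in pairs)      # marginal counts N(x)
    py = Counter(y for _, y in pairs)      # marginal counts N(y)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), with counts divided by n
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

# Toy sample (an assumption for this sketch): X and Y mostly agree,
# so the variables are clearly dependent.
data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
n = len(data)

# 2*N*MI(X; Y) is asymptotically chi-square distributed under independence,
# here with (|X|-1)*(|Y|-1) = 1 degree of freedom.
statistic = 2 * n * mutual_information(data)
critical = 3.841  # chi-square 95th percentile, 1 d.f.

# A local MIT-style score for X with single parent Y: the mutual-information
# statistic penalized by the chi-square critical value.
local_score = statistic - critical
print(round(statistic, 3), statistic > critical)
```

A positive `local_score` indicates that the dependence between the variable and its parent is strong enough to survive the chi-square penalization, which is the intuition behind rewarding that arc in the network; with several parents, the paper's additive decomposition property extends this comparison parent by parent.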