A scoring function for learning Bayesian networks based on mutual information and conditional independence tests

Author: Campos Ibáñez, Luis Miguel

Keywords: Bayesian networks; Scoring functions; Learning; Mutual information; Conditional independence tests

Abstract: We propose a new scoring function for learning Bayesian networks from data using score+search algorithms. It is based on the concept of mutual information and exploits some well-known properties of this measure in a novel way. Essentially, a statistical independence test based on the chi-square distribution, associated with the mutual information measure, is combined with a property of additive decomposition of this measure in order to quantify the degree of interaction between each variable and its parent variables in the network. The result is a non-Bayesian scoring function, called MIT (mutual information tests), which belongs to the family of scores based on information theory. The MIT score can also be seen as a penalized version of the Kullback-Leibler divergence between the joint probability distributions associated with a candidate network and with the available data set. Detailed results of a complete experimental evaluation of the proposed scoring function, and of its comparison with the well-known K2, BDeu and BIC/MDL scores, are also presented.

Date: 2006 (deposited 2014-07-18)
Type: article (open access)
Citation: Campos, L.M. A scoring function for learning Bayesian networks based on mutual information and conditional independence tests. Journal of Machine Learning Research, 7: 2149-2187 (2006). [http://hdl.handle.net/10481/32709]
ISSN: 1532-4435
URI: http://hdl.handle.net/10481/32709
Language: English
License: Creative Commons Attribution-NonCommercial-NoDerivs 3.0 (http://creativecommons.org/licenses/by-nc-nd/3.0/)
Publisher: MIT Press
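The abstract describes combining the mutual information measure with a chi-square independence test. A minimal sketch of that building block follows; it estimates the empirical mutual information between two discrete variables and uses the classical result that the likelihood-ratio statistic 2*N*MI(X;Y) is asymptotically chi-square distributed under independence. The function names and the hardcoded 95th-percentile critical value for one degree of freedom (3.841, appropriate for two binary variables) are illustrative assumptions, not the paper's exact MIT score, which aggregates such terms over each variable and its parent set.

```python
import math
from collections import Counter

def empirical_mi(pairs):
    """Empirical mutual information (in nats) between two discrete
    variables, estimated from a list of (x, y) observations."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint counts
    px = Counter(x for x, _ in pairs)    # marginal counts of X
    py = Counter(y for _, y in pairs)    # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), written with counts
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def g_independence_test(pairs, critical=3.841):
    """Likelihood-ratio (G) test of independence: the statistic
    2 * N * MI(X;Y) is asymptotically chi-square under the null.
    `critical` defaults to the chi-square 95th percentile with
    1 degree of freedom (two binary variables) -- an illustrative
    threshold, not the paper's specific setting."""
    stat = 2 * len(pairs) * empirical_mi(pairs)
    return stat, stat > critical

# Strongly dependent sample: the test rejects independence.
dep = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
stat_dep, reject_dep = g_independence_test(dep)

# Perfectly balanced (independent) sample: the test accepts.
ind = [(0, 0)] * 25 + [(0, 1)] * 25 + [(1, 0)] * 25 + [(1, 1)] * 25
stat_ind, reject_ind = g_independence_test(ind)
```

In the paper's setting, terms of this form for each variable against its parents, corrected by chi-square percentile values, are summed via the additive decomposition of mutual information to yield the MIT score.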