Banzhaf random forests: Cooperative game theory based random forests with consistency

Authors: Sun, J., Zhong, G., Huang, K. and Dong, J.

Journal: Neural Networks

Volume: 106

Pages: 20-29

eISSN: 1879-2782

ISSN: 0893-6080

Publisher: Elsevier

DOI: 10.1016/j.neunet.2018.06.006

Abstract:

Random forest algorithms have been widely used in many classification and regression applications. However, the theory of random forests lags far behind their applications. In this paper, we propose a novel random forest classification algorithm based on cooperative game theory. The Banzhaf power index is employed to evaluate the power of each feature by traversing possible feature coalitions. Hence, we call the proposed algorithm Banzhaf random forests (BRFs). Unlike the previously used information gain ratio, which only measures the power of each feature for classification and pays little attention to the intrinsic structure of the feature variables, the Banzhaf power index measures the importance of each feature by computing the dependency among groups of features. More importantly, we have proved the consistency of BRFs, which narrows the gap between the theory and applications of random forests. Extensive experiments on several UCI benchmark data sets and three real-world applications show that BRFs perform significantly better than existing consistent random forests in classification accuracy, and are better than or at least comparable with Breiman's random forests, support vector machines (SVMs) and k-nearest neighbors (KNN) classifiers.
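
For context on the game-theoretic notion the abstract relies on: in a cooperative game with player set N and characteristic (coalition value) function v, the Banzhaf power index of a player i is conventionally defined as the average marginal contribution of i over all coalitions that exclude it,

    \beta_i(v) = \frac{1}{2^{|N|-1}} \sum_{S \subseteq N \setminus \{i\}} \left[ v(S \cup \{i\}) - v(S) \right].

In BRFs the players are the candidate features, so \beta_i(v) scores how much feature i adds when it joins coalitions of other features; the particular coalition value function v used when growing the trees is specified in the paper.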

https://eprints.bournemouth.ac.uk/33294/

Sources: Scopus, PubMed, Web of Science (Lite), Manual, Europe PubMed Central, BURO EPrints
