A non-parametric hierarchical clustering model
Authors: Mohamad, S., Bouchachia, A. and Sayed-Mouchaweh, M.
Journal: 2015 IEEE International Conference on Evolving and Adaptive Intelligent Systems, EAIS 2015
ISBN: 9781467366977
DOI: 10.1109/EAIS.2015.7368803
Abstract: We present a novel non-parametric hierarchical clustering model (NHCM) based on Gaussian mixture models. NHCM uses a novel Dirichlet process (DP) prior allowing for more flexible modeling of the data, where the base distribution of the DP is itself an infinite mixture of Gaussian conjugate priors. NHCM can be thought of as a hierarchical clustering model, in which the low-level base prior governs the distribution of the data points forming sub-clusters, and the higher-level prior governs the distribution of the sub-clusters forming clusters. Using this hierarchical configuration, we can keep the complexity of the model low and allow for clustering skewed, complex data. To perform inference, we propose a Gibbs sampling algorithm. Empirical investigations have been carried out to analyse the efficiency of the proposed clustering model.
https://eprints.bournemouth.ac.uk/29870/
Source: Scopus
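To make the DP mixture machinery described in the abstract above concrete, the following is a minimal, illustrative Python sketch of a collapsed Gibbs sampler for a one-dimensional DP mixture of Gaussians (Chinese restaurant process representation, known observation variance, conjugate Normal prior on the component means). It shows only the single-level building block, not the authors' two-level NHCM; the function name, hyperparameter values, and data are assumptions made for illustration, not taken from the paper.

import numpy as np

def dp_gmm_gibbs(x, alpha=1.0, mu0=0.0, tau0=3.0, sigma=1.0, n_iters=200, rng=None):
    # Collapsed Gibbs sampling for a 1D DP mixture of Gaussians with known
    # observation variance sigma^2 and a Normal(mu0, tau0^2) prior on means.
    # Illustrative sketch only; not the NHCM model from the paper.
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = np.zeros(n, dtype=int)        # cluster label of each data point
    counts = {0: n}                   # number of points per cluster
    sums = {0: float(x.sum())}        # sum of observations per cluster

    def predictive(xi, s, m):
        # Posterior predictive density of xi under a cluster holding m points
        # with observation sum s, the cluster mean integrated out analytically.
        prec = 1.0 / tau0 ** 2 + m / sigma ** 2
        mean = (mu0 / tau0 ** 2 + s / sigma ** 2) / prec
        var = sigma ** 2 + 1.0 / prec
        return np.exp(-0.5 * (xi - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

    for _ in range(n_iters):
        for i in range(n):
            k_old = z[i]
            counts[k_old] -= 1        # remove point i from its current cluster
            sums[k_old] -= x[i]
            if counts[k_old] == 0:
                del counts[k_old]
                del sums[k_old]
            labels = list(counts)
            # CRP prior times predictive likelihood for each existing cluster,
            # plus the option of opening a brand-new cluster (weight alpha).
            weights = [counts[k] * predictive(x[i], sums[k], counts[k]) for k in labels]
            weights.append(alpha * predictive(x[i], 0.0, 0))
            weights = np.array(weights)
            weights /= weights.sum()
            choice = rng.choice(len(weights), p=weights)
            if choice < len(labels):
                k_new = labels[choice]
            else:
                k_new = max(counts) + 1 if counts else 0
            z[i] = k_new
            counts[k_new] = counts.get(k_new, 0) + 1
            sums[k_new] = sums.get(k_new, 0.0) + x[i]
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(-4.0, 1.0, 60), rng.normal(4.0, 1.0, 60)])
    labels = dp_gmm_gibbs(data, rng=rng)
    print("number of clusters discovered:", len(set(labels)))

In the paper's hierarchical configuration, a prior of this kind would govern how data points form sub-clusters, while a higher-level prior would govern how those sub-clusters are grouped into clusters; the sketch above covers only the lower level.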
A Non-parametric Hierarchical Clustering Model
Authors: Mohamad, S., Bouchachia, A. and Sayed-Mouchaweh, M.
Journal: 2015 IEEE INTERNATIONAL CONFERENCE ON EVOLVING AND ADAPTIVE INTELLIGENT SYSTEMS (EAIS)
ISSN: 2330-4863
https://eprints.bournemouth.ac.uk/29870/
Source: Web of Science (Lite)
A non-parametric hierarchical clustering model
Authors: Mohamad, S., Bouchachia, A. and Sayed-Mouchaweh, M.
Conference: The 2015 IEEE Conference on Evolving and Adaptive Intelligent Systems (EAIS 2015)
Dates: 1-3 December 2015
https://eprints.bournemouth.ac.uk/29870/
Source: Manual
A Non-parametric Hierarchical Clustering Model
Authors: Mohamad, S., Bouchachia, A. and Sayed-Mouchaweh, M.
Conference: 2015 IEEE International Conference on Evolving and Adaptive Intelligent Systems (EAIS)
Dates: 1-3 December 2015
Journal: 2015 IEEE INTERNATIONAL CONFERENCE ON EVOLVING AND ADAPTIVE INTELLIGENT SYSTEMS (EAIS)
ISBN: 9781467366977
DOI: 10.1109/EAIS.2015.7368803
Abstract: We present a novel non-parametric hierarchical clustering model (NHCM) based on Gaussian mixture models. NHCM uses a novel Dirichlet process (DP) prior allowing for more flexible modeling of the data, where the base distribution of the DP is itself an infinite mixture of Gaussian conjugate priors. NHCM can be thought of as a hierarchical clustering model, in which the low-level base prior governs the distribution of the data points forming sub-clusters, and the higher-level prior governs the distribution of the sub-clusters forming clusters. Using this hierarchical configuration, we can keep the complexity of the model low and allow for clustering skewed, complex data. To perform inference, we propose a Gibbs sampling algorithm. Empirical investigations have been carried out to analyse the efficiency of the proposed clustering model.
https://eprints.bournemouth.ac.uk/29870/
Source: Manual
A non-parametric hierarchical clustering model
Authors: Mohamad, S., Bouchachia, A. and Sayed-Mouchaweh, M.
Conference: 2015 IEEE International Conference on Evolving and Adaptive Intelligent Systems (EAIS)
ISBN: 9781467366977
Abstract: We present a novel non-parametric hierarchical clustering model (NHCM) based on Gaussian mixture models. NHCM uses a novel Dirichlet process (DP) prior allowing for more flexible modeling of the data, where the base distribution of the DP is itself an infinite mixture of Gaussian conjugate priors. NHCM can be thought of as a hierarchical clustering model, in which the low-level base prior governs the distribution of the data points forming sub-clusters, and the higher-level prior governs the distribution of the sub-clusters forming clusters. Using this hierarchical configuration, we can keep the complexity of the model low and allow for clustering skewed, complex data. To perform inference, we propose a Gibbs sampling algorithm. Empirical investigations have been carried out to analyse the efficiency of the proposed clustering model.
https://eprints.bournemouth.ac.uk/29870/
Source: BURO EPrints