An evolving classification cascade with self-learning

Authors: Bouchachia, A.

Journal: Evolving Systems

Volume: 1

Issue: 3

Pages: 143-160

eISSN: 1868-6486

ISSN: 1868-6478

DOI: 10.1007/s12530-010-9014-x

Abstract:

Incremental learning is mostly accomplished by an incremental model relying on an appropriate and adjustable architecture. The present paper introduces a hybrid evolving architecture for incremental learning. The architecture consists of two sequential and incremental learning modules, a growing Gaussian mixture model (GGMM) and a resource allocating neural network (RAN); its rationale rests on two issues: incrementality and the ability to process partially labeled data in the context of classification. The two modules are coherent in the sense that both rely on Gaussian functions. While the RAN, trained by the extended Kalman filter, is used for prediction, the GGMM is dedicated to self-learning, that is, pre-labeling of unlabeled data within a probabilistic framework. In addition, an incremental feature selection procedure is applied to continuously select the meaningful features. The empirical evaluation of the cascade studies various aspects in order to discuss the efficiency of the proposed hybrid learning architecture. © Springer-Verlag 2010.
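
Note: the sketch below is not the author's implementation; it is a minimal, batch-mode illustration of the general "pre-label then predict" cascade described in the abstract, using scikit-learn stand-ins. GaussianMixture stands in for the growing GMM (GGMM), and an RBF-kernel SVM stands in for the resource allocating neural network trained by the extended Kalman filter; the incremental/evolving aspects and the feature selection step are omitted.

```python
# Hypothetical sketch of a self-learning classification cascade:
# stage 1 pre-labels unlabeled data with a Gaussian mixture model,
# stage 2 trains a predictor on labeled + pseudo-labeled data.
# Not the paper's algorithm; components are batch stand-ins.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def self_learning_cascade(X_labeled, y_labeled, X_unlabeled,
                          n_components=5, confidence=0.9):
    # Stage 1: fit a mixture model on all data and attach a class label
    # to each component by majority vote of the labeled points it claims.
    gmm = GaussianMixture(n_components=n_components, random_state=0)
    gmm.fit(np.vstack([X_labeled, X_unlabeled]))
    comp_of_labeled = gmm.predict(X_labeled)
    comp_label = {}
    for c in range(n_components):
        mask = comp_of_labeled == c
        if mask.any():
            vals, counts = np.unique(y_labeled[mask], return_counts=True)
            comp_label[c] = vals[np.argmax(counts)]

    # Pre-label unlabeled points whose posterior membership is confident
    # enough and whose component has an associated class label.
    post = gmm.predict_proba(X_unlabeled)
    comp = post.argmax(axis=1)
    keep = (post.max(axis=1) >= confidence) & np.isin(comp, list(comp_label))
    pseudo_y = np.array([comp_label[c] for c in comp[keep]])

    # Stage 2: train the predictor on labeled plus pseudo-labeled data.
    X_train = np.vstack([X_labeled, X_unlabeled[keep]])
    y_train = np.concatenate([y_labeled, pseudo_y])
    clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
    return gmm, clf
```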

Source: Scopus

Preferred by: Hamid Bouchachia