Online Bayesian shrinkage regression

Authors: Jamil, W. and Bouchachia, A.

Journal: Neural Computing and Applications

Publisher: Springer Nature

eISSN: 1433-3058

ISSN: 0941-0643

Abstract:

The present work introduces a novel online regression method that extends shrinkage via limit of Gibbs sampler (SLOG) to the online learning setting. In particular, we show theoretically how the proposed online SLOG (OSLOG) is derived within the Bayesian framework, without resorting to the Gibbs sampler or to a hierarchical representation. Moreover, to establish a performance guarantee for OSLOG, we derive an upper bound on its cumulative squared loss; OSLOG is the only sparse online regression algorithm with logarithmic regret. Furthermore, we compare OSLOG empirically with two state-of-the-art algorithms along three aspects: normality, sparsity and multicollinearity, showing that OSLOG achieves an excellent trade-off between these properties.
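To make the online-regression setting concrete, the following is a minimal sketch of a generic online Bayesian linear regression with a Gaussian (ridge-type) shrinkage prior, updated one observation at a time. This is an illustrative baseline only, not the paper's OSLOG algorithm; the class name, the prior parameter `a`, and the update rules are assumptions for demonstration, and the paper itself gives OSLOG's exact derivation.

```python
import numpy as np


class OnlineBayesianRidge:
    """Illustrative online Bayesian linear regression with a Gaussian
    shrinkage prior (hypothetical baseline; NOT the paper's OSLOG)."""

    def __init__(self, dim, a=1.0):
        # Prior precision a*I encodes the shrinkage strength.
        self.A = a * np.eye(dim)
        self.b = np.zeros(dim)

    def predict(self, x):
        # Predict with the current posterior mean w = A^{-1} b.
        w = np.linalg.solve(self.A, self.b)
        return float(x @ w)

    def update(self, x, y):
        # Rank-one update of the sufficient statistics after
        # observing the pair (x, y).
        self.A += np.outer(x, x)
        self.b += y * x
```

In this online protocol the learner predicts before seeing each label and then updates, which is the setting in which the cumulative squared loss and regret bounds discussed in the abstract are stated.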

https://eprints.bournemouth.ac.uk/40849/
