Competitive Normalized Least-Squares Regression
Authors: Jamil, W. and Bouchachia, A.
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 32
Issue: 7
Pages: 3262-3267
eISSN: 2162-2388
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2020.3009777
Abstract: Online learning has attracted increasing interest in recent years due to its low computational requirements and its relevance to a broad range of streaming applications. In this brief, we focus on online regularized regression. We propose a novel, efficient online regression algorithm called online normalized least-squares (ONLS). We perform a theoretical analysis by comparing the total loss of ONLS against the normalized gradient descent (NGD) algorithm and the best offline LS predictor. We show, in particular, that ONLS allows for a better bias-variance tradeoff than state-of-the-art gradient-descent-based LS algorithms, as well as better control over the level of shrinkage of the features toward zero. Finally, we conduct an empirical study illustrating the strong performance of ONLS against several state-of-the-art algorithms on real-world data.
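For context, the NGD baseline that the abstract compares against takes, at each round, a gradient step on the squared loss scaled by the squared norm of the feature vector. The following is a minimal generic sketch of that style of normalized online update (also known as normalized LMS); it is not the ONLS algorithm of the paper, whose update rule is not given in this record, and the function name, step size `eta`, and stabilizer `eps` are illustrative assumptions.

```python
import numpy as np

def normalized_online_ls(stream, eta=0.5, eps=1e-8):
    """Online least-squares with a normalized gradient step (NLMS-style).

    Generic baseline sketch, NOT the paper's ONLS algorithm.
    Each round: predict y_hat = w . x, suffer squared loss,
    then update w with a gradient step divided by ||x||^2 so the
    step size is insensitive to the scale of the features.
    """
    w = None
    total_loss = 0.0
    for x, y in stream:
        x = np.asarray(x, dtype=float)
        if w is None:
            w = np.zeros_like(x)          # start from the zero predictor
        y_hat = w @ x                     # linear prediction
        total_loss += (y - y_hat) ** 2    # cumulative squared loss
        w += eta * (y - y_hat) * x / (x @ x + eps)  # normalized step
    return w, total_loss
```

On a noiseless stream generated by a fixed linear target, the normalized step contracts the error toward that target regardless of how the inputs are scaled, which is the property the normalization buys.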
https://eprints.bournemouth.ac.uk/34331/
Source: Scopus