Competitive Normalized Least-Squares Regression

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Transactions on Neural Networks and Learning Systems

Volume: 32

Issue: 7

Pages: 3262-3267

eISSN: 2162-2388

ISSN: 2162-237X

DOI: 10.1109/TNNLS.2020.3009777

Abstract:

Online learning has attracted increasing interest in recent years due to its low computational requirements and its relevance to a broad range of streaming applications. In this brief, we focus on online regularized regression. We propose a novel, efficient online regression algorithm, called online normalized least-squares (ONLS). We perform a theoretical analysis by comparing the total loss of ONLS against the normalized gradient descent (NGD) algorithm and the best off-line LS predictor. We show, in particular, that ONLS allows for a better bias-variance tradeoff than state-of-the-art gradient-descent-based LS algorithms, as well as better control over the level of shrinkage of the features toward zero. Finally, we conduct an empirical study to demonstrate the strong performance of ONLS against some state-of-the-art algorithms using real-world data.
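The NGD baseline that the abstract compares against can be sketched in a few lines. This is a generic normalized-gradient-descent loop for online regularized least squares, not the authors' ONLS algorithm; the function name `ngd_online_ls`, the learning rate `lr`, and the ridge weight `lam` are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def ngd_online_ls(stream, lr=0.1, lam=0.01):
    """Normalized gradient descent on a stream of (x, y) pairs for
    regularized squared loss. Returns the final weight vector and the
    cumulative (total) loss, the quantity used in regret comparisons."""
    w = None
    total_loss = 0.0
    for x, y in stream:
        x = np.asarray(x, dtype=float)
        if w is None:
            w = np.zeros_like(x)
        pred = w @ x
        total_loss += (pred - y) ** 2 + lam * (w @ w)
        # Gradient of the instantaneous regularized squared loss.
        grad = 2.0 * (pred - y) * x + 2.0 * lam * w
        norm = np.linalg.norm(grad)
        if norm > 0:
            # Normalized step: the update direction has unit length,
            # so lr alone controls the step size.
            w -= lr * grad / norm
    return w, total_loss
```

Because each step has fixed length `lr`, the iterate hovers within about `lr` of the optimum rather than converging exactly; ONLS is motivated by obtaining a tighter total-loss bound than this kind of scheme.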

https://eprints.bournemouth.ac.uk/34331/

Source: Scopus

Competitive Normalized Least-Squares Regression.

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Trans Neural Netw Learn Syst

Volume: 32

Issue: 7

Pages: 3262-3267

eISSN: 2162-2388

DOI: 10.1109/TNNLS.2020.3009777

https://eprints.bournemouth.ac.uk/34331/

Source: PubMed

Competitive Normalized Least-Squares Regression

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Transactions on Neural Networks and Learning Systems

Volume: 32

Issue: 7

Pages: 3262-3267

eISSN: 2162-2388

ISSN: 2162-237X

DOI: 10.1109/TNNLS.2020.3009777

https://eprints.bournemouth.ac.uk/34331/

Source: Web of Science (Lite)

Competitive Normalised Least Squares Regression

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Transactions on Neural Networks and Learning Systems

Publisher: Institute of Electrical and Electronics Engineers

ISSN: 2162-237X

https://eprints.bournemouth.ac.uk/34331/

Source: Manual

Competitive Normalized Least-Squares Regression.

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Transactions on Neural Networks and Learning Systems

Volume: 32

Issue: 7

Pages: 3262-3267

eISSN: 2162-2388

ISSN: 2162-237X

DOI: 10.1109/TNNLS.2020.3009777

https://eprints.bournemouth.ac.uk/34331/

Source: Europe PubMed Central

Competitive Normalised Least Squares Regression

Authors: Jamil, W. and Bouchachia, A.

Journal: IEEE Transactions on Neural Networks and Learning Systems

Volume: 32

Issue: 7

Pages: 3262-3267

ISSN: 2162-237X

https://eprints.bournemouth.ac.uk/34331/

Source: BURO EPrints