Competitive regularised regression

Authors: Jamil, W. and Bouchachia, A.

Journal: Neurocomputing

Publisher: Elsevier

Volume: 390

Issue: May

Pages: 374-383

eISSN: 1872-8286

ISSN: 0925-2312

DOI: 10.1016/j.neucom.2019.08.094

Abstract:

Regularised regression uses sparsity and variance control to reduce the complexity and over-fitting of a regression model. The present paper introduces two novel regularised linear regression algorithms: Competitive Iterative Ridge Regression (CIRR) and Online Shrinkage via Limit of Gibbs Sampler (OSLOG), for fast and reliable prediction on “Big Data” without making distributional assumptions about the data. We use the technique of competitive analysis to design them and show that they enjoy strong theoretical guarantees. Furthermore, we compare their performance against recent regularised regression methods such as Online Ridge Regression (ORR) and the Aggregating Algorithm for Regression (AAR). The algorithms are compared both theoretically, focusing on their guarantees on cumulative loss, and empirically, to show the advantages of CIRR and OSLOG.
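For context, the cumulative square loss over T rounds, and the typical shape of the competitive guarantees the abstract refers to, can be written schematically as follows (the exact constants for CIRR and OSLOG are given in the paper; this is only the generic form of such bounds):

    L_T(\mathrm{alg}) = \sum_{t=1}^{T} \bigl(y_t - \hat{y}_t\bigr)^2,
    \qquad
    L_T(\mathrm{alg}) \;\le\; \inf_{\theta \in \mathbb{R}^d} \Bigl( \sum_{t=1}^{T} \bigl(y_t - \theta^\top x_t\bigr)^2 + a\,\lVert\theta\rVert^2 \Bigr) + c_T,

where a > 0 is the regularisation parameter and c_T is a residual term that, for methods such as AAR, grows only logarithmically in T. Bounds of this kind hold for every individual data sequence, which is why no distributional assumptions on the data are needed.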

https://eprints.bournemouth.ac.uk/32713/

Sources: Scopus, Web of Science (Lite), Manual, BURO EPrints
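As a concrete point of reference, the following is a minimal sketch (not code from the paper) of the Online Ridge Regression baseline mentioned in the abstract: at each round the learner predicts with the ridge estimator fitted to all previously seen examples, and the cumulative loss is the running sum of squared prediction errors. The function name online_ridge and the regularisation parameter a are illustrative choices.

    import numpy as np

    def online_ridge(X, y, a=1.0):
        # Online Ridge Regression (ORR) sketch: before seeing label y_t,
        # predict with the ridge estimator fitted to rounds 1..t-1.
        n, d = X.shape
        A = a * np.eye(d)              # a*I + sum of x_s x_s^T over past rounds
        b = np.zeros(d)                # sum of y_s * x_s over past rounds
        preds = np.empty(n)
        for t in range(n):
            theta = np.linalg.solve(A, b)   # current ridge weights
            preds[t] = X[t] @ theta         # predict for round t
            A += np.outer(X[t], X[t])       # then reveal (x_t, y_t) and update
            b += y[t] * X[t]
        return preds

    # Example: cumulative square loss on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)
    cumulative_loss = np.sum((online_ridge(X, y) - y) ** 2)

The competitive algorithms the paper proposes (CIRR and OSLOG) are evaluated against this kind of online baseline in terms of exactly this cumulative loss.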
