Towards cost-sensitive adaptation: When is it worth updating your predictive model?

Authors: Žliobaite, I., Budka, M. and Stahl, F.

Journal: Neurocomputing

Volume: 150

Issue: Part A

Pages: 240-249

eISSN: 1872-8286

ISSN: 0925-2312

DOI: 10.1016/j.neucom.2014.05.084

Abstract:

Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, data arrives in streams, needs to be analyzed in real time, and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired, as soon as possible and as frequently as possible. In this study we consider a potential model update as an investment decision which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams - cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.

https://eprints.bournemouth.ac.uk/22860/

Source: Scopus
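
The return-on-investment rule described in the abstract can be illustrated with a minimal sketch. The function and its cost model below are illustrative assumptions, not the paper's actual framework: the idea is simply that a model update should be carried out only when its expected benefit outweighs its cost.

```python
def worth_updating(error_reduction, errors_per_period, cost_per_error,
                   update_cost, horizon_periods):
    """Treat a model update as an investment decision: adapt only if
    the expected return exceeds the cost of updating.

    All parameters are illustrative estimates:
      error_reduction   -- expected drop in the number of errors per period
                           (as a fraction of current errors) after the update
      errors_per_period -- errors the current model makes per period
      cost_per_error    -- business cost of a single prediction error
      update_cost       -- one-off cost of retraining and redeploying
      horizon_periods   -- how long the updated model is expected to stay valid
    """
    # Expected saving over the horizon if the update is carried out.
    expected_gain = (error_reduction * errors_per_period
                     * cost_per_error * horizon_periods)
    # Positive return on investment -> the update is worth it.
    return expected_gain > update_cost
```

For example, a small expected error reduction combined with a high update cost makes the rule defer adaptation, whereas a large expected saving over a long horizon triggers it.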
