Illustrative Discussion of MC-Dropout in General Dataset: Uncertainty Estimation in Bitcoin
Authors: Alarab, I., Prakoonwit, S. and Nacer, M.I.
Journal: Neural Processing Letters
Volume: 53
Issue: 2
Pages: 1001-1011
eISSN: 1573-773X
ISSN: 1370-4621
DOI: 10.1007/s11063-021-10424-x
Abstract: The past few years have witnessed the resurgence of uncertainty estimation in neural networks generally. Providing uncertainty quantification besides the predictive probability is desirable to reflect the degree of belief in the model's decision about a given input. Recently, the Monte-Carlo dropout (MC-dropout) method has been introduced as a probabilistic approach based on Bayesian approximation, which is computationally more efficient than Bayesian neural networks. MC-dropout has shown promising results on image datasets regarding uncertainty quantification. However, this method has been criticised regarding its behaviour and what type of uncertainty it actually captures. For this purpose, we aim to discuss the behaviour of MC-dropout on classification tasks using synthetic and real data. We empirically explain different cases of MC-dropout that reflect the relative merits of this method. Our main finding is that, on synthetic data, MC-dropout captures datapoints lying on the decision boundary between the opposing classes. We also apply the MC-dropout method to a dataset derived from Bitcoin, known as the Elliptic data, to highlight that the model with MC-dropout outperforms the standard model. A conclusion and possible future directions are proposed.
https://eprints.bournemouth.ac.uk/35067/
Source: Scopus
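The abstract above only summarises the MC-dropout procedure, so a short, hedged Python sketch of it follows. The classifier architecture, layer sizes and toy inputs are assumptions for illustration only, not the authors' implementation or the Elliptic data: dropout is kept active at inference and T stochastic forward passes are averaged into a predictive distribution, with predictive entropy used as the uncertainty score.

# Minimal MC-dropout sketch (hypothetical model and toy data, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropoutClassifier(nn.Module):
    def __init__(self, in_dim=2, hidden=64, n_classes=2, p=0.5):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, n_classes)
        self.drop = nn.Dropout(p)

    def forward(self, x):
        return self.fc2(self.drop(F.relu(self.fc1(x))))

def mc_dropout_predict(model, x, T=50):
    model.train()  # keep dropout stochastic at test time
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(T)])
    mean = probs.mean(dim=0)  # averaged predictive probability, shape (N, n_classes)
    entropy = -(mean * mean.clamp_min(1e-12).log()).sum(dim=-1)  # predictive entropy per input
    return mean, entropy

# Toy usage on random 2-D points standing in for the synthetic dataset:
model = DropoutClassifier()
mean, entropy = mc_dropout_predict(model, torch.randn(5, 2))
print(mean.shape, entropy.shape)  # torch.Size([5, 2]) torch.Size([5])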
Illustrative Discussion of MC-Dropout in General Dataset: Uncertainty Estimation in Bitcoin
Authors: Alarab, I., Prakoonwit, S. and Nacer, M.I.
Journal: Neural Processing Letters
Volume: 53
Issue: 2
Pages: 1001-1011
eISSN: 1573-773X
ISSN: 1370-4621
DOI: 10.1007/s11063-021-10424-x
https://eprints.bournemouth.ac.uk/35067/
Source: Web of Science (Lite)
Illustrative Discussion of MC-dropout Method in General Dataset: Uncertainty Estimation in Bitcoin
Authors: Alarab, I., Prakoonwit, S. and Nacer, M.I.
Journal: Neural Processing Letters
Publisher: Springer Nature
ISSN: 1370-4621
DOI: 10.1007/s11063-021-10424-x
Abstract: The past few years have witnessed the resurgence of uncertainty estimation in neural networks generally. Providing uncertainty quantification besides the predictive probability is desirable to reflect the degree of belief in the model's decision about a given input. Recently, the Monte-Carlo dropout (MC-dropout) method has been introduced as a probabilistic approach based on Bayesian approximation, which is computationally more efficient than Bayesian neural networks. MC-dropout has shown promising results on image datasets regarding uncertainty quantification. However, this method has been criticised regarding its behaviour and what type of uncertainty it actually captures. For this purpose, we aim to discuss the behaviour of MC-dropout on classification tasks using synthetic and real data. We empirically explain different cases of MC-dropout that reflect the relative merits of this method. Our main finding is that, on synthetic data, MC-dropout captures datapoints lying on the decision boundary between the opposing classes. We also apply the MC-dropout method to a dataset derived from Bitcoin, known as the Elliptic data, to highlight that the model with MC-dropout outperforms the standard model. A conclusion and possible future directions are proposed.
https://eprints.bournemouth.ac.uk/35067/
http://link.springer.com/article/10.1007/s11063-021-10424-x
Source: Manual
Illustrative Discussion of MC-dropout Method in General Dataset: Uncertainty Estimation in Bitcoin.
Authors: Alarab, I., Prakoonwit, S. and Nacer, M.I.
Journal: Neural Processing Letters
Volume: 53
Pages: 1001-1011
ISSN: 1370-4621
Abstract: The past few years have witnessed the resurgence of uncertainty estimation in deep neural networks (DNNs). Providing uncertainty alongside the predictions is desirable to express a degree of belief about the predicted output of neural networks (NNs). Recent research has introduced probabilistic approaches that are computationally less expensive than Bayesian neural networks (BNNs). Among the existing approaches, we focus on the probabilistic approach based on Bayesian approximation known as Monte-Carlo dropout (MC-dropout). In this paper, we aim to provide an overview of the misconceptions that have arisen about MC-dropout and the criticism it has attracted. We support our view using a 2-D synthetic dataset to derive insights from the empirical study. Whereas previous research has often applied MC-dropout to classification tasks on image datasets, we provide an illustrative discussion of MC-dropout using a general dataset derived from the Bitcoin blockchain, known as the Elliptic data. Using the Elliptic data, we highlight the performance of uncertainty estimation with different sets of features. We further discuss the effect of MC-dropout on the uncertainty metrics when dealing with imbalanced data. The overall model has provided adequate results in terms of the uncertainty measurements yielded by MC-dropout.
https://eprints.bournemouth.ac.uk/35067/
Source: BURO EPrints
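The BURO EPrints abstract above also refers to the uncertainty metrics obtained from MC-dropout. The following numpy sketch computes the two measures most commonly reported with MC-dropout samples, predictive entropy and mutual information; the formulas are the generic ones, and the sampled probabilities are random placeholders rather than results from the Elliptic data.

# Hedged sketch of common MC-dropout uncertainty metrics (generic formulas;
# the probabilities below are random placeholders, not Elliptic results).
import numpy as np

def mc_dropout_uncertainty(probs):
    """probs: array of shape (T, N, C) holding softmax outputs from T stochastic passes."""
    mean = probs.mean(axis=0)                                                  # (N, C) predictive distribution
    predictive_entropy = -(mean * np.log(mean + 1e-12)).sum(axis=-1)           # total uncertainty
    expected_entropy = -(probs * np.log(probs + 1e-12)).sum(axis=-1).mean(axis=0)  # aleatoric part
    mutual_information = predictive_entropy - expected_entropy                 # epistemic (model) part
    return predictive_entropy, mutual_information

probs = np.random.dirichlet([1.0, 1.0], size=(50, 4))  # T=50 passes, N=4 inputs, C=2 classes
pe, mi = mc_dropout_uncertainty(probs)
print(np.round(pe, 3), np.round(mi, 3))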