Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing
Authors: Stubbs, R., Wilson, K. and Rostami, S.
Journal: Advances in Intelligent Systems and Computing
Volume: 1043
Pages: 189-200
eISSN: 2194-5365
ISBN: 978-3-030-29932-3
ISSN: 2194-5357
DOI: 10.1007/978-3-030-29933-0_16
Abstract: Machine learning practitioners often refer to hyper-parameter optimisation (HPO) as an art form and a skill that requires intuition and experience; Neuroevolution (NE) typically employs a combination of manual and evolutionary approaches for HPO. This paper explores the integration of a stochastic hill climbing approach for HPO within a NE algorithm. We empirically show that HPO by restrained stochastic hill climbing (HORSHC) is more effective than manual and pure evolutionary HPO. Empirical evidence is derived from a comparison of (1) a NE algorithm that optimises hyper-parameters solely through evolution and (2) a number of derived algorithms that integrate random search optimisation of a Neural Network's hyper-parameters. Statistical analysis of the experimental results reveals that random initialisation of hyper-parameters does not significantly affect the final performance of the evolved Neural Networks. However, HORSHC, the novel optimisation approach proposed in this paper, significantly outperforms the NE control algorithm while remaining comparable in both computation time and complexity.
https://eprints.bournemouth.ac.uk/32508/
Source: Scopus
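The abstract describes HORSHC only at a high level: hyper-parameters are perturbed by a stochastic hill climber whose steps are restrained, and a change is kept when it improves fitness. The Python sketch below illustrates that general idea under stated assumptions; the function name, the bounded-step scheme, and the toy objective are illustrative and are not taken from the paper.

import random

def restrained_stochastic_hill_climb(evaluate, initial_hparams, bounds,
                                      step_fraction=0.1, iterations=50, seed=None):
    """Illustrative sketch of a restrained stochastic hill climber for hyper-parameters.

    `evaluate` maps a hyper-parameter dict to a fitness score (higher is better);
    `bounds` gives a (low, high) range per hyper-parameter; each perturbation is
    restrained to a small fraction of that range. This is an assumption-based
    sketch, not the published HORSHC algorithm.
    """
    rng = random.Random(seed)
    current = dict(initial_hparams)
    current_fitness = evaluate(current)

    for _ in range(iterations):
        candidate = dict(current)
        # Perturb one randomly chosen hyper-parameter by a restrained step.
        name = rng.choice(list(candidate))
        low, high = bounds[name]
        step = (high - low) * step_fraction
        candidate[name] = min(high, max(low, candidate[name] + rng.uniform(-step, step)))

        # Accept the candidate only if it improves fitness (hill climbing).
        candidate_fitness = evaluate(candidate)
        if candidate_fitness > current_fitness:
            current, current_fitness = candidate, candidate_fitness

    return current, current_fitness


# Hypothetical usage: tuning two hyper-parameters against a placeholder objective
# standing in for the validation accuracy of an evolved Neural Network.
if __name__ == "__main__":
    def toy_fitness(hp):
        return -(hp["learning_rate"] - 0.01) ** 2 - (hp["mutation_rate"] - 0.05) ** 2

    best, score = restrained_stochastic_hill_climb(
        toy_fitness,
        initial_hparams={"learning_rate": 0.1, "mutation_rate": 0.5},
        bounds={"learning_rate": (1e-4, 0.5), "mutation_rate": (0.0, 1.0)},
        seed=42,
    )
    print(best, score)

The "restraint" here is simply the bounded step size and the per-range clamping; how the published method constrains its search, and how it interleaves with the NE loop, is detailed in the paper itself.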
Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing
Authors: Stubbs, R., Wilson, K. and Rostami, S.
Journal: Advances in Computational Intelligence Systems (UKCI 2019)
Volume: 1043
Pages: 189-200
eISSN: 2194-5365
ISBN: 978-3-030-29932-3
ISSN: 2194-5357
DOI: 10.1007/978-3-030-29933-0_16
https://eprints.bournemouth.ac.uk/32508/
Source: Web of Science (Lite)
Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing
Authors: Stubbs, R. and Rostami, S.
Conference: UK Workshop on Computational Intelligence (UKCI)
Dates: 4-6 September 2019
https://eprints.bournemouth.ac.uk/32508/
Source: Manual
Hyper-parameter Optimisation by Restrained Stochastic Hill Climbing
Authors: Stubbs, R., Rostami, S. and Wilson, K.
Conference: UKCI: 19th Annual UK Workshop on Computational Intelligence
Abstract: Machine learning practitioners often refer to hyper-parameter optimisation (HPO) as an art form and a skill that requires intuition and experience; Neuroevolution (NE) typically employs a combination of manual and evolutionary approaches for HPO. This paper explores the integration of a stochastic hill climbing approach for HPO within a NE algorithm. We empirically show that HPO by restrained stochastic hill climbing (HORSHC) is more effective than manual and pure evolutionary HPO. Empirical evidence is derived from a comparison of (1) a NE algorithm that optimises hyper-parameters solely through evolution and (2) a number of derived algorithms that integrate random search optimisation of a Neural Network's hyper-parameters. Statistical analysis of the experimental results reveals that random initialisation of hyper-parameters does not significantly affect the final performance of the evolved Neural Networks. However, HORSHC, the novel optimisation approach proposed in this paper, significantly outperforms the NE control algorithm while remaining comparable in both computation time and complexity.
https://eprints.bournemouth.ac.uk/32508/
https://www.ukci2019.port.ac.uk/
Source: BURO EPrints