Machine Learning Driven Latency Optimization for Internet of Things Applications in Edge Computing
Authors: Awada, U., Zhang, J., Chen, S., Li, S. and Yang, S.
Journal: ZTE Communications
Volume: 21
Issue: 2
Pages: 40-52
ISSN: 1673-5188
DOI: 10.12142/ZTECOM.202302007
Abstract: Emerging Internet of Things (IoT) applications require faster execution time and response time to achieve optimal performance. However, most IoT devices have limited or no computing capability to achieve such stringent application requirements. To this end, computation offloading in edge computing has been used for IoT systems to achieve the desired performance. Nevertheless, randomly offloading applications to any available edge without considering their resource demands, inter-application dependencies and edge resource availability may eventually result in execution delay and performance degradation. We introduce Edge-IoT, a machine learning-enabled orchestration framework in this paper, which utilizes the states of edge resources and application resource requirements to facilitate a resource-aware offloading scheme for minimizing the average latency. We further propose a variant bin-packing optimization model that co-locates applications firmly on edge resources to fully utilize available resources. Extensive experiments show the effectiveness and resource efficiency of the proposed approach.
https://eprints.bournemouth.ac.uk/38647/
Source: Scopus
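
Note: to make the abstract's bin-packing co-location idea concrete, the sketch below shows a generic first-fit-decreasing (FFD) placement heuristic that packs application resource demands onto edge nodes. It is a minimal illustration of the general technique under assumed inputs, not the paper's Edge-IoT model; the names (App, EdgeNode, place_ffd) and the capacity/demand figures are hypothetical.

    # Illustrative FFD bin-packing placement sketch; NOT the paper's Edge-IoT
    # model. All names and numbers below are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class App:
        name: str
        cpu: float   # vCPU demand
        mem: float   # memory demand in GiB

    @dataclass
    class EdgeNode:
        name: str
        cpu: float   # free vCPUs
        mem: float   # free GiB
        apps: list = field(default_factory=list)

        def fits(self, a: App) -> bool:
            return a.cpu <= self.cpu and a.mem <= self.mem

        def place(self, a: App) -> None:
            self.cpu -= a.cpu
            self.mem -= a.mem
            self.apps.append(a.name)

    def place_ffd(apps, nodes):
        """Pack the largest demands first onto the first node with room,
        co-locating apps so fewer nodes are left partially used."""
        unplaced = []
        for a in sorted(apps, key=lambda a: (a.cpu, a.mem), reverse=True):
            node = next((n for n in nodes if n.fits(a)), None)
            if node:
                node.place(a)
            else:
                unplaced.append(a.name)  # would be queued or offloaded elsewhere
        return unplaced

    nodes = [EdgeNode("edge-1", cpu=4, mem=8), EdgeNode("edge-2", cpu=2, mem=4)]
    apps = [App("cam-infer", 2, 3), App("sensor-agg", 1, 1), App("db-cache", 2, 4)]
    print(place_ffd(apps, nodes))               # -> [] (all apps placed)
    print([(n.name, n.apps) for n in nodes])    # co-location per node

The actual framework described in the abstract adds machine-learning-driven awareness of edge resource states and inter-application dependencies on top of the placement step, which a plain FFD heuristic like this does not capture.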