An Efficient Online Computation Offloading Approach for Large-Scale Mobile Edge Computing via Deep Reinforcement Learning
IEEE Transactions on Services Computing
  • Zheyuan Hu
  • Jianwei Niu
  • Tao Ren
  • Bin Dai
  • Qingfeng Li
  • Mingliang Xu
  • Sajal K. Das, Missouri University of Science and Technology
Abstract

Mobile edge computing (MEC) has been envisioned as a promising paradigm that can effectively enhance the computational capacity of wireless user devices (WUDs) and the quality of experience of mobile applications. One of the most crucial issues in MEC is computation offloading, which decides how to offload WUDs' tasks to edge servers for further intensive computation. Conventional mathematical-programming-based offloading approaches can run into trouble in dynamic MEC environments due to time-varying channel conditions (caused primarily by WUD mobility). To address this problem, reinforcement learning (RL) based offloading approaches have been proposed, which develop offloading policies by mapping MEC states to offloading actions. However, these approaches can fail to converge in large-scale MEC due to the exponentially growing state and action spaces. In this paper, we propose a novel online computation offloading approach that effectively reduces task latency and energy consumption in dynamic MEC with large-scale WUDs. First, an RL-based computation offloading and energy transmission algorithm is proposed to accelerate the learning process. Then, a joint optimization method is adopted to develop the allocation algorithm, which obtains near-optimal solutions for energy and computation resource allocation. Simulation results show that the proposed approach converges efficiently and achieves significant performance improvements over baseline approaches.
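The abstract describes an RL agent that maps observed MEC states to offloading actions. As a purely illustrative aid (not the authors' algorithm), the Python/PyTorch sketch below shows one way such a state-to-action mapping could look; the number of WUDs, the state layout (channel gain and queued task size per WUD), and the binary local-vs-offload action space are placeholder assumptions. Factoring the Q-values per WUD (two outputs each, rather than 2^N joint actions) is one common way to keep the action space from growing exponentially, in the spirit of the scalability issue the abstract raises.

# Illustrative sketch only (not the paper's algorithm): a DQN-style agent that
# maps a simplified MEC state (channel gain + queued task size per WUD) to a
# binary offloading decision per WUD. All dimensions and values are placeholders.
import torch
import torch.nn as nn

NUM_WUDS = 4              # assumed number of wireless user devices
STATE_DIM = 2 * NUM_WUDS  # per WUD: channel gain + task size (assumption)

class OffloadQNet(nn.Module):
    """Q-network scoring 'local' vs 'offload' for each WUD."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 2 * NUM_WUDS),  # two Q-values per WUD
        )

    def forward(self, state):
        return self.net(state).view(-1, NUM_WUDS, 2)

def select_actions(qnet, state, epsilon=0.1):
    """Epsilon-greedy per-WUD offloading decision (0 = local, 1 = offload)."""
    if torch.rand(1).item() < epsilon:
        return torch.randint(0, 2, (NUM_WUDS,))
    with torch.no_grad():
        q = qnet(state.unsqueeze(0))          # shape: (1, NUM_WUDS, 2)
        return q.argmax(dim=-1).squeeze(0)    # greedy action per WUD

if __name__ == "__main__":
    qnet = OffloadQNet()
    # Toy state: random channel gains and task sizes (placeholder values).
    state = torch.rand(STATE_DIM)
    actions = select_actions(qnet, state)
    print("Offloading decisions per WUD:", actions.tolist())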

Department(s)
Computer Science
Publication Status
Early Access
Keywords and Phrases
  • Computation Offloading
  • Computational Modeling
  • Dynamic Scheduling
  • Heuristic Algorithms
  • Mobile Edge Computing
  • Processor Scheduling
  • Reinforcement Learning
  • Resource Allocation
  • Resource Management
  • Task Analysis
  • Wireless Communication
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2021 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
Publication Date
29 Sep 2021
Citation Information
Zheyuan Hu, Jianwei Niu, Tao Ren, Bin Dai, et al. "An Efficient Online Computation Offloading Approach for Large-Scale Mobile Edge Computing via Deep Reinforcement Learning," IEEE Transactions on Services Computing (2021). ISSN: 1939-1374.
Available at: http://works.bepress.com/sajal-das/236/