Dissertations and Theses
Date of Award
2023
Document Type
Dissertation
Department
Engineering
First Advisor
Myung Jong Lee
Keywords
Machine Learning, Edge Cloud, Resource Allocation, Software Defined Networking, 5G/6G
Abstract
Next-generation mobile and immersive applications (e.g., Augmented Reality (AR), Virtual Reality (VR), and Extended Reality (XR)) and the Internet of Things (IoT) provide richer functionality but are resource-hungry and subject to real-time constraints. To conserve energy and improve the performance of such devices, certain computationally heavy tasks can be executed remotely by offloading them to the back-end cloud (BC) and exploiting its abundant compute resources. However, the long distance between a mobile/IoT device and the BC introduces substantial network delay, deteriorating the user experience of real-time applications. The edge cloud (EC) and beyond-5G (B5G) wireless communication are envisioned to cope with these compute and real-time constraints by minimizing network latency and providing compute resources at the edge of the network. However, the EC possesses limited computational resources compared to the back-end cloud. Additionally, with an excessive number of connected devices in the EC environment, a large volume of concurrent traffic can be anticipated; thus, the wireless resources that connect the devices to the EC also become a system bottleneck. Intelligent resource management techniques become imperative in such a resource-constrained environment. Long-established approaches, such as mathematical optimization-based methods and game-theoretic techniques, have been adapted to solve resource management problems. However, such approaches are not suitable for resource allocation in non-stationary and large-scale environments due to their heavy computational loads. The proven success of Machine Learning (ML) techniques has stimulated the adoption of ML algorithms to solve control and management problems for mobile and IoT applications in EC, B5G wireless, and software-defined networks (SDN).
In this dissertation, we first develop a new model for the bandwidth allocation problem of diverse, large-scale IoT and mobile applications with varying quality-of-service (QoS) requirements in an SDN-enabled EC environment. Second, we expand the bandwidth allocation model into a multi-resource allocation (MRA) problem by considering wireless bandwidth and compute resources simultaneously. Third, to effectively solve the MRA problem jointly in the EC and BC, we propose innovative, scalable, state-of-the-art ML algorithms that combine a robust, distributed approach for faster training of large models with a novel heuristic-based prioritized experience replay buffer for efficient learning. Finally, we propose a novel in-network reinforcement learning (RL) inference framework that allocates bandwidth at line rate for ultra-reliable low-latency communication (URLLC) applications in 6G networks. We adopt a match-action table mapping strategy to achieve in-network RL inference and implement the proposed framework in the Programming Protocol-independent Packet Processors (P4) data plane programming language for efficient resource management at line rate.
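The heuristic-based prioritized experience replay mentioned above can be illustrated with a minimal Python sketch. This is not the dissertation's actual buffer: the priority heuristic (absolute reward), the `Transition` fields, and all parameter values are illustrative assumptions; the idea shown is only that a cheap heuristic, rather than the TD error, decides how often a transition is replayed, so rare high-impact allocation events are sampled more frequently.

```python
import random
from collections import namedtuple

# Hypothetical transition container for this sketch.
Transition = namedtuple("Transition", "state action reward next_state")

class HeuristicPERBuffer:
    """Prioritized experience replay where each transition's priority
    comes from a heuristic (here: |reward|), so transitions with large
    rewards or penalties are replayed more often during training."""

    def __init__(self, capacity=10000, eps=1e-3):
        self.capacity = capacity
        self.eps = eps            # keeps every priority strictly positive
        self.data = []
        self.priorities = []

    def push(self, transition):
        priority = abs(transition.reward) + self.eps  # heuristic priority
        if len(self.data) >= self.capacity:           # FIFO eviction
            self.data.pop(0)
            self.priorities.pop(0)
        self.data.append(transition)
        self.priorities.append(priority)

    def sample(self, batch_size):
        # Draw transitions with probability proportional to priority.
        return random.choices(self.data, weights=self.priorities, k=batch_size)
```

In a distributed training setup, each learner would sample minibatches from such a buffer; the heuristic avoids the bookkeeping cost of recomputing TD errors after every update.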
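The match-action table mapping idea can also be sketched, here in Python as a stand-in for the P4 data plane (P4 itself has no general-purpose runtime for a short self-contained example). The trained policy, the queue-depth state, the bin counts, and the bandwidth-class names are all illustrative assumptions; what the sketch shows is only the mapping strategy: the RL policy is evaluated offline over quantized states to populate a table, so inference on the fast path reduces to a single table lookup, which is what makes line-rate operation possible on a programmable switch.

```python
def toy_policy(queue_depth):
    # Stand-in for a trained RL policy: choose a bandwidth class
    # from the observed queue depth (deeper queue -> more bandwidth).
    if queue_depth < 32:
        return "LOW_BW"
    elif queue_depth < 128:
        return "MID_BW"
    return "HIGH_BW"

def build_match_action_table(policy, num_bins=16, max_depth=256):
    """Offline control-plane step: evaluate the policy on a
    representative state from each quantized bin and record the
    chosen action, yielding entries the data plane can match on."""
    bin_width = max_depth // num_bins
    return {b: policy(b * bin_width) for b in range(num_bins)}

def line_rate_inference(table, queue_depth, max_depth=256):
    """Data-plane step: quantize the observed state to its bin and
    look up the action -- no model evaluation on the fast path."""
    bin_width = max_depth // len(table)
    b = min(queue_depth // bin_width, len(table) - 1)
    return table[b]
```

In an actual P4 deployment, `build_match_action_table` would run in the control plane and install the entries into a switch table, while the lookup step corresponds to the switch's native match-action pipeline.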
Recommended Citation
Qadeer, Arslan Dr, "Machine Learning driven Resource Allocation in Edge Cloud" (2023). CUNY Academic Works.
https://academicworks.cuny.edu/cc_etds_theses/1128