Dynamic Resource Prediction and Allocation in C-RAN with Edge Artificial Intelligence
Artificial Intelligence (AI) is one of the key technologies for industrial applications, but it requires substantial computing resources and sensing data. Transmitting such large volumes of data is therefore a challenge for current network architectures. To meet high-performance computing requirements, this study proposes an emerging network architecture that combines edge computing and cloud computing to reduce the transmission of useless data and to relieve bottleneck problems. Moreover, we formulate the resource allocation problem for multiple Remote Radio Heads (RRHs) and multiple Base Band Unit (BBU) pools in the Cloud Radio Access Network (C-RAN) for 5G. Long Short-Term Memory (LSTM) is used to predict dynamic throughput, and a GA-based Resource Allocation Algorithm (GARAA) is used to optimize resource allocation. The simulation results show that the proposed mechanism achieves high resource utilization and reduces power consumption.
Internet of Things, Edge computing, Long Short-Term Memory, Metaheuristic algorithms, C-RAN
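To illustrate the second stage of the proposed pipeline, the following is a minimal, generic genetic-algorithm sketch for dividing BBU-pool capacity among RRHs given predicted per-RRH throughput. It is not the paper's GARAA: the number of RRHs, the capacity budget, the placeholder demand vector (standing in for LSTM predictions), and the fitness weights are all assumptions made for illustration.

```python
# Generic GA sketch for RRH-to-BBU-pool capacity allocation (not the paper's GARAA).
import numpy as np

rng = np.random.default_rng(0)

N_RRH = 6                      # number of RRHs (assumed)
CAPACITY = 100.0               # total BBU-pool capacity units (assumed)
# Placeholder for LSTM-predicted per-RRH throughput demand (made-up values).
demand = np.array([25, 10, 40, 15, 30, 20], dtype=float)

POP, GENS, MUT = 60, 200, 0.1  # population size, generations, mutation rate

def fitness(alloc):
    """Reward serving predicted demand; penalise waste and capacity violations."""
    served = np.minimum(alloc, demand).sum()
    waste = np.maximum(alloc - demand, 0).sum()
    over_cap = max(alloc.sum() - CAPACITY, 0.0)
    return served - 0.5 * waste - 10.0 * over_cap

# Initial population: random allocations scaled to the capacity budget.
pop = rng.random((POP, N_RRH))
pop = pop / pop.sum(axis=1, keepdims=True) * CAPACITY

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # Binary tournament selection of parents.
    idx = rng.integers(0, POP, size=(POP, 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # Uniform crossover between each parent and its neighbour.
    mask = rng.random((POP, N_RRH)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation on a fraction of genes, clipped to non-negative allocations.
    mut_mask = rng.random((POP, N_RRH)) < MUT
    children = np.clip(children + mut_mask * rng.normal(0, 2.0, (POP, N_RRH)), 0, None)
    # Elitism: carry the best individual of the previous generation forward.
    children[0] = pop[scores.argmax()]
    pop = children

best = pop[np.array([fitness(ind) for ind in pop]).argmax()]
print("allocation:", np.round(best, 1), "fitness:", round(fitness(best), 2))
```

In the paper's setting, the placeholder demand vector would be replaced by the LSTM throughput forecast for each RRH, and the fitness function would encode the actual utilization and power-consumption objectives.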