
Post Doc AI

AI-based 5G networks for intelligent caching

DigiCosme scientific axis & task: ComEx - Task 2
Coordinators: Steven MARTIN - Dana MARINCA
Candidate's name: Nhan NGUYEN Thanh
Laboratories: LRI & DAVID
Managing laboratory: DAVID
Attached to the DigiCosme action: GT Future Access Networks
Duration & dates of the mission: 1 year - September 2017 / August 2018


Context:
With the explosively increasing demand for high-speed data applications such as mobile multimedia services, the rapid growth of mobile data traffic poses an enormous challenge to 5G networks [1, 2]. To address this challenge, caching popular content at mobile network edges during off-peak traffic periods has emerged as an effective approach to reduce duplicated content-download transmissions and improve network resource efficiency. The recently emerging Mobile Edge Computing (MEC) paradigm [3], built on the 5G evolution architecture, has accelerated the development of intelligent distributed caching at mobile network edges. In particular, MEC servers, which operate as local content delivery nodes and serve cached content, may cooperate and effectively share their caching information with each other. Hence, to improve network energy and resource efficiency [4], it is important to develop new intelligent content caching [5] and distribution mechanisms that leverage the data storage and computing capabilities of MEC. To reach this ambitious goal, 5G networks must rely on Artificial Intelligence (AI) functionalities to interact with the environment (e.g., traffic load, service characteristics, user behaviour) and allocate resources efficiently.
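As a minimal illustration of the caching idea above (a toy sketch, not the project's actual mechanism: the `EdgeCache` class and its eviction rule are assumptions for illustration), an edge node can track per-content demand and keep only the most requested items locally, so repeated downloads avoid the backhaul:

```python
from collections import Counter

class EdgeCache:
    """Toy popularity-based edge cache: serve the most-requested
    items locally so repeated downloads skip the backhaul."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.requests = Counter()  # observed demand per content id
        self.store = set()         # content ids currently cached

    def request(self, content_id):
        """Return True on a cache hit, False when a backhaul fetch is needed."""
        self.requests[content_id] += 1
        hit = content_id in self.store
        if not hit:
            # On a miss, refresh the cache with the currently most popular items.
            self.store = {c for c, _ in self.requests.most_common(self.capacity)}
        return hit

cache = EdgeCache(capacity=2)
trace = ["a", "b", "a", "c", "a", "b", "a"]
hits = sum(cache.request(c) for c in trace)
# Most requests for the popular item "a" are served locally.
```

In a cooperative MEC deployment, each server would additionally share its `requests` statistics with neighbouring servers, so popularity is estimated from aggregate rather than purely local demand.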

Objective:
Most of the literature assumes that either content items' instantaneous demands or the content popularity matrix are known non-causally. In reality, however, the content popularity matrix can change over time; it is therefore necessary to learn it dynamically during the caching procedure. The Quality of Experience (QoE) of users requesting content depends on several parameters: spatial and temporal changes in traffic demand, end-user mobility, variability in radio channel conditions, and social context (social events or activity on social networks) can strongly influence the popularity of related content. Thus, we will model content popularity as a multidimensional variable and study the correlation between its components. Learning from the context and past evolution, we will predict popularity using different machine learning algorithms. Each algorithm has pros and cons depending on the context at hand; a mechanism based on experts and forecasters therefore allows comparing the performance of different algorithms in a given context. To optimally manage network resources, the experts-forecaster mechanism can dynamically activate the most efficient algorithm for the current context.

Finally, it is unclear from the literature whether a cache-enabled base station (BS) improves the energy efficiency (EE) of a downlink system: if the most requested files are stored in the BS cache, backhaul traffic is effectively reduced and the corresponding energy consumption is saved, but the energy consumed for storage grows. In fact, the power consumed for backhauling is not negligible in the EE of downlink transmission. All these reasons motivate us to design intelligent caching strategies from the perspective of energy efficiency [7, 8].
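The experts-forecaster mechanism described above can be sketched with an exponentially weighted average forecaster, a standard construction from prediction with expert advice (the class name, learning rate, and squared-error loss below are illustrative assumptions, not the project's final design):

```python
import math

class ExpertForecaster:
    """Exponentially weighted average forecaster (illustrative sketch).

    Combines the popularity predictions of several 'expert' algorithms;
    experts whose past predictions were accurate receive higher weight,
    so the aggregate forecast tracks the best algorithm for the context.
    """

    def __init__(self, n_experts, eta=0.5):
        self.eta = eta                    # learning rate (assumed value)
        self.weights = [1.0] * n_experts  # one weight per expert algorithm

    def predict(self, expert_predictions):
        """Weighted average of the experts' popularity forecasts."""
        total = sum(self.weights)
        return sum(w * p for w, p in zip(self.weights, expert_predictions)) / total

    def update(self, expert_predictions, observed):
        """Exponentially penalize each expert by its squared prediction error."""
        for i, p in enumerate(expert_predictions):
            loss = (p - observed) ** 2
            self.weights[i] *= math.exp(-self.eta * loss)

# Toy usage: two expert algorithms forecasting one content item's popularity.
f = ExpertForecaster(n_experts=2)
for preds, truth in [([0.9, 0.1], 0.8), ([0.7, 0.2], 0.75), ([0.8, 0.3], 0.8)]:
    forecast = f.predict(preds)
    f.update(preds, truth)
# Expert 0 tracks the observed popularity better, so its weight dominates.
```

Here "activating the most efficient algorithm" corresponds to following the highest-weight expert; the weighted average is the softer variant that blends all experts according to their past accuracy.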

Scientific Productions: