IEEE Global Communications Conference
9-13 December 2019 // Waikoloa, HI, USA
Revolutionizing Communications

Technical Tutorials


Monday, 9 December 2019

09:00-12:30
TUT01: "Communication Networks Design: Model-Based, AI-Driven, or Both?"
TUT14: "URLLC for 5G and Beyond: Physical, MAC and Network Design and Solutions"
TUT03: "Accessing from the Sky: A Tutorial on UAV Communications for 5G and Beyond"
TUT04: "Towards 6G Wireless Systems: Challenges and Opportunities"
TUT05: "Optical Wireless Communication: Fundamental Limits, New Advances and Future Perspectives"
TUT06: "Precision Radio Propagation Measurements to Optimize 5G Spectrum and Systems"

14:00-17:30
TUT07: "Incremental Redundancy Tutorial at Globecom 2019"
TUT08: "Artificial Intelligence Enabled Integrated Aerial/Terrestrial Wireless Access Networks for 2030s"
TUT09: "A Communications Theory Perspective on Web Privacy"
TUT10: "Mobile Edge Artificial Intelligence: Opportunities and Challenges"
TUT11: "Networked-Flying Platforms: Paving the Way Towards Global Wireless Connectivity"
TUT12: "mmWave Massive MIMO: How to Harness More with Less?"

Friday, 13 December 2019

09:00-12:30
TUT13: "Tactile Internet with Human-in-the-Loop"
TUT15: "Millimeter-wave Beam Management for 5G-NR and Beyond"
TUT16: "Quantum Internet: From Communication to Teleportation" (CANCELLED)
TUT17: "Machine Learning for Wireless Networks: Basics, Applications, and Trends"
TUT18: "Artificial Intelligence for Wireless Signal Processing: from Compressive Sensing to Deep Learning"

14:00-17:30
TUT02: "Softwarization Concepts (SDN, NFV, ICN) and Practice in 5G Communication Systems and Beyond" 
TUT19: "Communications and Networking in Droplet-based Microfluidic Systems" (CANCELLED)
TUT20: "Recent Advances in NOMA Techniques: Signal Processing Solutions and Emerging Applications"
TUT21: "SDN and NFV for Wireless Networks - Emerging Research and Standardization Trends"
TUT22: "Artificial Intelligence for Revolutionizing Communications: A Multi-disciplinary Perspective"
TUT23: "Network Slicing in the Era of 5G and Beyond"
TUT24: "Coding for Distributed Computing"


Monday, 9 December 2019, 09:00-12:30

TUT01: "Communication Networks Design: Model-Based, AI-Driven, or Both?"
Room: Kohala 1
Presenters: Marco Di Renzo (Paris-Saclay University / CNRS, France); Alessio Zappone (CentraleSupelec, France); Mérouane Debbah (Huawei, France)

Recently, deep learning has received significant attention as a technique to design and optimize wireless communication systems and networks. The usual approach to using deep learning consists of acquiring a large amount of empirical data about the system behavior and employing it for performance optimization (data-driven approach). We believe, however, that the application of deep learning to communication network design and optimization offers more possibilities. As opposed to other fields of science, such as image classification and speech recognition, mathematical models for communication network optimization are very often available, even though they may be simplified and inaccurate. We believe that this a priori expert knowledge, which has been acquired over decades of intense research, cannot be dismissed and ignored. In this tutorial, in particular, we put forth a new approach that capitalizes on the availability of (possibly simplified or inaccurate) theoretical models in order to reduce the amount of empirical data needed and the complexity of training artificial neural networks (ANNs). We concretely show, with the aid of some examples, that synergistically combining prior expert knowledge based on analytical models with data-driven methods constitutes a suitable approach towards the design and optimization of communication systems and networks with the aid of deep learning based on ANNs. The tutorial is structured in three main parts: (i) data-driven design of wireless networks; (ii) model-based design of wireless networks; and (iii) embedding expert knowledge into deep learning. It will be based on the following recent publications of the authors: A. Zappone, M. Di Renzo, M. Debbah, "From Model-Based to Data-Driven Wireless Communications: When is Deep Learning the Answer?", invited paper, IEEE Transactions on Communications, submitted, 2018; A. Zappone, M. Di Renzo, M. Debbah, T. T. Lam, X. Qian, "Model-Aided Wireless Artificial Intelligence: Embedding Expert Knowledge in Deep Neural Networks Towards Wireless Systems Optimization", submitted, August 2018, available online at http://de.arxiv.org/pdf/1808.01672.pdf.
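To make the model-aided idea concrete, the following is a minimal sketch on a hypothetical toy problem (a scalar rate-prediction task invented for illustration, not the authors' exact method or data): a small neural network is first pre-trained on abundant samples generated from a simplified analytical model and is then fine-tuned on a much smaller set of "measured" samples whose behavior deviates from that model.

    # Minimal sketch of the model-aided idea on an invented toy task:
    # pre-train a tiny neural network on data generated by a simplified
    # analytical model, then fine-tune on scarce "measured" data whose
    # behavior deviates from that model. Not the authors' exact method.
    import numpy as np

    rng = np.random.default_rng(0)

    def model_rate(snr_db):            # simplified analytical model (assumed)
        return np.log2(1.0 + 10.0 ** (snr_db / 10.0))

    def true_rate(snr_db):             # "real" system: model plus an unmodeled loss
        return 0.85 * np.log2(1.0 + 10.0 ** (snr_db / 10.0))

    def init(hidden=16):
        return [rng.normal(scale=0.5, size=(1, hidden)), np.zeros(hidden),
                rng.normal(scale=0.5, size=(hidden, 1)), np.zeros(1)]

    def forward(p, x):
        W1, b1, W2, b2 = p
        h = np.tanh(x @ W1 + b1)
        return h @ W2 + b2, h

    def train(p, x, y, epochs, lr):    # plain full-batch gradient descent on MSE
        W1, b1, W2, b2 = p
        for _ in range(epochs):
            pred, h = forward([W1, b1, W2, b2], x)
            err = (pred - y) / len(x)
            gW2, gb2 = h.T @ err, err.sum(0)
            dh = (err @ W2.T) * (1.0 - h ** 2)
            gW1, gb1 = x.T @ dh, dh.sum(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
        return [W1, b1, W2, b2]

    snr_model = rng.uniform(0, 30, size=(2000, 1))   # abundant model-generated data
    snr_meas = rng.uniform(0, 30, size=(50, 1))      # scarce empirical data
    p = init()
    p = train(p, snr_model / 30.0, model_rate(snr_model), epochs=3000, lr=0.1)
    p = train(p, snr_meas / 30.0, true_rate(snr_meas), epochs=1000, lr=0.05)  # fine-tune

    test = np.array([[5.0], [15.0], [25.0]])
    print(forward(p, test / 30.0)[0].ravel())        # network prediction
    print(true_rate(test).ravel())                   # ground truth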

TUT14: "URLLC for 5G and Beyond: Physical, MAC and Network Design and Solutions"
Room: Kohala 2
Presenters: Branka Vucetic, Yonghui Li and Mahyar Shirvanimoghaddam (University of Sydney, Australia); Rana Abbas (The University of Sydney, Australia); Changyang She (University of Sydney, Australia)

The world is currently witnessing the rise of many mission-critical applications, such as tele-surgery, intelligent transportation, industry automation, virtual and augmented reality, and vehicular communications. Some of these applications will be enabled by the fifth generation of cellular networks (5G), which will provide the required ultra-reliable low-latency communication (URLLC). However, guaranteeing these stringent reliability and end-to-end latency requirements remains quite challenging, due to the significant paradigm shift required in both the theoretical fundamentals of wireless communications and its design principles. For instance, the fourth generation of cellular networks (4G) currently provides an unpredictable latency that can range from 50 ms to several seconds, with block error rates as high as 10^-1. On the other hand, industry is demanding that URLLC provide 1 ms end-to-end latency and overall packet loss probabilities as low as 10^-5 to 10^-7. Motivated by the above, in this tutorial we cover the challenges and potential solutions for 5G and beyond to support URLLC, in terms of error control coding for improving reliability, channel access protocols for reducing latency, and multi-connectivity for improving network availability.

TUT03: "Accessing from the Sky: A Tutorial on UAV Communications for 5G and Beyond"
Room: Kohala 3
Presenters: Rui Zhang (National University of Singapore, Singapore); Yong Zeng (Southeast University, P.R. China)

The integration of Unmanned Aerial Vehicles (UAVs) into existing (4G) and future (5G and beyond) cellular networks has drawn significantly growing interest in both academia and industry recently. Compared to conventional terrestrial communications, UAV communications face new challenges due to the UAVs' high altitude above the ground and great flexibility of movement in 3D space. Several critical issues arise, including the line-of-sight (LoS) dominant UAV-ground channels and the resultant strong aerial-terrestrial network interference, the distinct communication quality-of-service (QoS) requirements for UAV control messages versus payload data, the stringent constraints imposed by the size, weight and power (SWAP) limitations of UAVs, as well as the exploitation of the new design degree of freedom (DoF) brought by the highly controllable 3D UAV mobility. In this tutorial, we will give an overview of the recent advances in UAV communications to address the above issues. We will first introduce the fundamentals of UAV communications, including the UAV-ground channel model, antenna model, UAV energy consumption model, and the mathematical framework for designing UAV trajectory and communication jointly. Then we will partition our discussions into two promising research and application frameworks of UAV communications, namely UAV-assisted wireless communications and cellular-connected UAVs, where UAVs serve as aerial communication platforms and users, respectively. In particular, we will focus on new and efficient techniques to mitigate the strong air-ground interference and to design UAV trajectories for communication performance optimization. We will also discuss other related topics to provide promising directions for future research and investigation.

TUT04: "Towards 6G Wireless Systems: Challenges and Opportunities"
Room: Kohala 4
Presenters: Walid Saad (Virginia Tech, USA); Mehdi Bennis (Centre of Wireless Communications, University of Oulu, Finland)

The ongoing deployment of 5G cellular systems is continuously exposing the inherent limitations of this system, compared to its original premise as an enabler for Internet of Everything applications. These 5G drawbacks are currently spurring worldwide activities focused on defining the next-generation 6G wireless system that can truly integrate far-reaching applications ranging from autonomous systems to extended reality and haptics. Despite recent 6G initiatives, the fundamental architectural and performance components of the system remain largely undefined. In this tutorial, we present a holistic, forward-looking vision that defines the tenets of a 6G system. We opine that 6G will not be a mere exploration of more spectrum at high-frequency bands, but will rather be a convergence of upcoming technological trends driven by exciting, underlying services. In this regard, we first identify the primary drivers of 6G systems, in terms of applications and accompanying technological trends. Then, we propose a new set of service classes and expose their target 6G performance requirements. We then identify the enabling technologies for the introduced 6G services and outline a comprehensive research agenda that leverages those technologies. We provide a step-by-step tutorial on a variety of problems that pertain to emerging 6G technologies and services. We conclude the tutorial by providing concrete recommendations for the roadmap toward 6G. Ultimately, the intent of this tutorial is to provide an in-depth exposition of 6G research and to serve as a basis for stimulating more out-of-the-box research around 6G.

TUT05: "Optical Wireless Communication: Fundamental Limits, New Advances and Future Perspectives"
Room: Kona 1
Presenters: Anas Chaaban (University of British Columbia, Canada); Zouheir Rezki (University of Idaho, USA); Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)

Optical wireless communications (OWC) has recently gained a lot of interest among industrial and academic communities. The main factor behind this resurgence of interest is that the radio-frequency (RF) spectrum is getting too crowded to handle the increasingly high demand, and hence exploring higher-frequency spectrum, including the optical range, would be a relief. Another reason behind such interest resides in the relatively simple deployment of OWC systems. However, before a real deployment of OWC systems, there is a persistent need to establish its fundamental limits and extract design guidelines to build efficient OWC systems. Indeed, due to different propagation channels and different transmit constraints, RF communications and OWC are fundamentally quite different. For instance, the popular Intensity-Modulation Direct-Detection (IM/DD) scheme, which is favourable for OWC due to its simplicity, has some subtle differences from RF channels, manifested in the nonnegativity of the transmit signal, in addition to constraints on the average and peak of the signal. These in turn make the capacity and the optimal transmission schemes for IM/DD OWC channels different from those for RF channels. The goal of this half-day tutorial is to approach the OWC channel from an information-theoretic perspective to highlight the fundamental differences with RF. Consequently, this tutorial will introduce and discuss the most recent information-theoretic results related to OWC, including single-user channels, multi-user channels (broadcast and multiple-access channels), and multi-aperture channels (parallel and MIMO channels), with and without secrecy constraints. This will make researchers acquainted with these results, which can be very useful for better analysis and understanding of OWC in the future.
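As a concrete point of contrast (written here in standard textbook notation as an assumption, not necessarily the presenters' exact formulation), a common IM/DD Gaussian channel model constrains a real, nonnegative intensity input, whereas the classical RF model constrains the average power of a complex-valued input:

    % Common IM/DD Gaussian channel formulation (assumed standard model)
    \[
      Y = X + Z, \qquad Z \sim \mathcal{N}(0,\sigma^2),
    \]
    \[
      X \ge 0, \qquad X \le A \;\;\text{(peak intensity)}, \qquad
      \mathbb{E}[X] \le \mathcal{E} \;\;\text{(average intensity)},
    \]
    % versus the RF constraint $\mathbb{E}[|X|^2] \le P$ on a complex input.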

TUT06: "Precision Radio Propagation Measurements to Optimize 5G Spectrum and Systems"
Room: Kona 2
Presenters: Christopher R. Anderson (United States Naval Academy, USA); Kenneth Baker (University of Colorado at Boulder, USA); Robert T Johnk (NIST, USA); Chriss Hammerschmidt (National Telecommunications and Information Administration & Institute for Telecommunication Sciences, USA)

Behind every initiative to improve the efficient use of, and access to, spectrum are models of radiowave propagation. These models are designed to predict how far a signal will travel before interfering with another user. Furthermore, models need to incorporate interactions with trees, hills, buildings, or other local site-specific features that scatter energy and cause fading. The key to unlocking the full potential of efficient spectrum usage is a deep understanding of physical-layer radio propagation. This requires a multitude of high-fidelity measurements, accurate models, and validation of wireless propagation in diverse and complex environments. Recent advances in machine learning and big data, in conjunction with high-fidelity databases, have led to increasingly powerful site-specific propagation, sharing, and networking models. High-precision propagation measurements are key to the creation and validation of these models. Although it is commonly assumed that collecting measurement data is a trivial matter, our decades of experience have given us insight into the unique challenges associated with precision measurements. Moreover, measurement science and techniques are a specialized field that is no longer part of the background of today's wireless researchers. This tutorial is directed at those individuals who need to collect or utilize wireless propagation measurements as an enabler for state-of-the-art wireless networking research. Our tutorial is designed not only for someone new to the field, but also for experienced researchers interested in exploring high-level refinements of their measurement systems and techniques.

 

Monday, 9 December 2019, 14:00-17:30

TUT07: "Incremental Redundancy: Maximizing the Benefit of Variable-Length Coding and Feedback"
Room: Kohala 1
Presenters: Richard Wesel (University of California, Los Angeles, USA)

Attendees of this tutorial will learn how to use variable-length coding and feedback to maximize throughput while using short block lengths on the order of 50-500 channel transmissions. The techniques also apply to longer transmissions, but the focus is on short block lengths, where feedback provides the most benefit. Attendees will understand the proofs showing how variable-length codes with feedback can approach capacity with average block lengths of fewer than 500 transmissions, and will be able to plot the achievable rates that can be obtained on the binary symmetric channel and the additive white Gaussian noise channel. Attendees will learn how to design variable-length codes using both convolutional codes and low-density parity-check codes. Attendees will learn how to optimize the number and length of incremental redundancy transmissions for a variable-length code with feedback (i.e., type-II hybrid ARQ). Attendees will learn how to optimize the cyclic redundancy checks (CRCs) that are used to control many variable-length codes. Attendees will also learn how to avoid the overhead of a CRC entirely by directly computing the reliability of codeword decisions. Finally, attendees will learn about a novel communications architecture that allows the use of incremental redundancy even without feedback.
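To illustrate the control flow of incremental redundancy with feedback, here is a minimal sketch built on deliberately toy components (a repetition code over a binary symmetric channel and a CRC-8, chosen only for brevity; the tutorial itself works with convolutional and LDPC codes and optimized CRCs): the transmitter keeps sending small increments of redundancy until the receiver's CRC check passes and an ACK stops the exchange, so the realized block length varies with the channel.

    # Toy type-II hybrid ARQ loop: send increments of redundancy until the
    # receiver ACKs (CRC check passes). The repetition code, BSC, and CRC-8
    # below are stand-ins chosen for brevity, not the tutorial's codes.
    import random

    def crc8(bits, poly=0x07):
        """Bit-serial CRC-8 (init 0) over a list of 0/1 ints."""
        reg = 0
        for b in bits:
            msb = (reg >> 7) & 1
            reg = (reg << 1) & 0xFF
            if msb ^ b:
                reg ^= poly
        return [(reg >> (7 - i)) & 1 for i in range(8)]

    def bsc(bits, p, rng):
        """Binary symmetric channel: flip each bit with probability p."""
        return [b ^ (rng.random() < p) for b in bits]

    def transmit_with_ir(info_bits, p=0.05, max_rounds=10, seed=0):
        rng = random.Random(seed)
        codeword = info_bits + crc8(info_bits)       # data + CRC
        buffers = [[] for _ in codeword]             # received copies per bit
        total_sent = 0
        for rounds in range(1, max_rounds + 1):
            copies = 1 if rounds == 1 else 2         # keep the vote count odd
            for _ in range(copies):                  # incremental redundancy
                for buf, y in zip(buffers, bsc(codeword, p, rng)):
                    buf.append(y)
                total_sent += len(codeword)
            est = [int(2 * sum(buf) > len(buf)) for buf in buffers]  # majority vote
            data_hat, crc_hat = est[:len(info_bits)], est[len(info_bits):]
            if crc_hat == crc8(data_hat):            # feedback: ACK, stop early
                return data_hat, rounds, total_sent
        return None, max_rounds, total_sent          # feedback: NACK, give up

    info = [1, 0, 1, 1, 0, 0, 1, 0]
    decoded, rounds, sent = transmit_with_ir(info)
    print(decoded == info, rounds, sent)             # correctness, rounds used, bits sent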

TUT08: "Artificial Intelligence Enabled Integrated Aerial/Terrestrial Wireless Access Networks for 2030s"
Room: Kohala 4
Presenters: Halim Yanikomeroglu (Carleton University, Canada); Haris Gacanin (Nokia Bell Labs, Belgium)

The 5G standards are currently being finalized, with a scheduled completion date of late 2019; 5G wireless networks are expected to be deployed globally throughout the 2020s. As such, it is time to reinitiate a brainstorming endeavour, followed by the technical groundwork, towards the subsequent generation of wireless networks for the 2030s. In this tutorial, we will present a 5-layer vertical architecture composed of fully integrated terrestrial and non-terrestrial layers for the networks of the 2030s. This complex network architecture necessitates a high degree of autonomous operation, which would require the utilization of emerging artificial intelligence (AI) and machine learning (ML) based tools. We address the shortcomings of contemporary rule-based optimization protocols and rethink wireless operations to boost system autonomy. Specifically, a paradigm shift toward the confluence of computer science and communication engineering would be necessary to embrace and study interactions between network design and user experience.

TUT09: "A Communications Theory Perspective on Web Privacy"
Room: Kohala 3
Presenters: Farhad Shirani, Siddharth Garg and Elza Erkip (New York University, USA)

In this tutorial, we provide a new framework for the study of fundamental limits of web privacy and investigate the design and analysis of practical deanonymization attacks in social networks and database systems. This framework brings together tools from communications theory, information theory, large deviations theory and probability and puts forth a systematic technique for deriving theoretical guarantees for web privacy in several scenarios of interest such as online fingerprinting attacks, social network graph matching attacks, and database matching attacks, among others.

TUT10: "Mobile Edge Artificial Intelligence: Opportunities and Challenges"
Room: Kohala 2
Presenters: Yuanming Shi (ShanghaiTech University, P.R. China)

With the availability of massive data sets, high-performance computing platforms, and sophisticated algorithms and software toolkits, AI has achieved remarkable successes in many application domains, e.g., computer vision and natural language processing. AI tasks are computationally intensive and normally trained, developed, and deployed at data centers with custom-designed servers. Given the fast growth of intelligent devices, it is expected that a large number of high-stakes applications (e.g., drones, autonomous cars, AR/VR) will be deployed at the edge of wireless networks in the near future. As such, the intelligent wireless network will be designed to leverage advanced wireless communications and mobile computing technologies to support AI-enabled applications at various edge mobile devices with limited communication, computation, hardware, and energy resources. The aim of this tutorial is to present recent advances in sparse and low-rank techniques for optimizing both intelligent wireless networks and deep neural networks, with comprehensive coverage including modeling, algorithm design, and statistical analysis. Through typical examples, including taming nonconvexity in deep learning models and on-device distributed machine learning, the power of this set of tools will be demonstrated, and their ability to support low-latency, reliable, and private intelligent decisions at the network edge will be highlighted.

TUT11: "Networked-Flying Platforms: Paving the Way Towards Global Wireless Connectivity"
Room: Kona 1
Presenters: Muhammad Zeeshan Shakir (University of the West of Scotland, United Kingdom (Great Britain)); Mohamed-Slim Alouini (King Abdullah University of Science and Technology (KAUST), Saudi Arabia)

Driven by the emerging use of networked flying platforms (NFPs), such as unmanned aerial vehicles (UAVs) and unmanned balloons, in future network applications, and by the challenges that 5G and beyond networks exhibit, the focus of this tutorial is to demonstrate the evolution of NFPs as a novel architectural enabler for the radio access network (RAN) and their integration with future cellular access and backhaul/fronthaul networks. NFPs are networked, flying platforms and a potential way to offer high-data-rate, high-reliability, and ultra-low-latency access and backhaul/fronthaul to 5G and beyond wireless networks. Such large-scale deployable platforms and frameworks will guarantee the global information and communication requirements of future smart and resilient cities and solve ubiquitous connectivity problems in many challenging network environments, e.g., coverage or capacity enhancements for remote or sparsely populated areas, social gatherings, and disaster-affected areas. This tutorial will provide balanced coverage of recent trends, challenges, and future research and development on the integration of NFPs with 5G and beyond networks.

TUT12: "mmWave Massive MIMO: How to Harness More with Less?"
Room: Kona 2
Presenters: Liuqing Yang (Colorado State University, USA); Xiang Cheng (Peking University, P.R. China)

Answering the urgent call for high-performance, low-complexity 5G solutions, we present cutting-edge mmWave massive MIMO (mMIMO) designs, including the first beamspace modulation not limited by the number of transmitter RF chains, the first doubly-selective channel estimator for mmWave mMIMO with significantly improved accuracy and reduced complexity, and the first general wideband multi-user precoder tailored for mmWave mMIMO. We will also discuss challenges and opportunities in this field to stimulate future research and development in academia and industry.

 

Friday, 13 December 2019, 09:00-12:30

TUT13: "Tactile Internet with Human-in-the-Loop"
Room: Kohala 1
Presenters: Frank H.P. Fitzek (Technische Universität Dresden & ComNets - Communication Networks Group, Germany); Gerhard P. Fettweis (Technische Universität Dresden, Germany)

The aim of the Tactile Internet with Human-in-the-Loop (TaHiL) is to democratise access to skills and expertise in order to promote equity for people of different genders, ages, cultural backgrounds, or physical limitations.

TUT15: "Millimeter-wave Beam Management for 5G-NR and Beyond"
Room: Kohala 3
Presenters: Danijela Cabric (University of California Los Angeles, USA); Han Yan (University of California, Los Angeles, USA)

Millimeter-wave (mmWave) communication with massive antenna arrays is a key technique for future cellular systems. While large arrays enable high gain, directionality, and user multiplexing, practical realizations face challenges in radio hardware design and cross-layer processing. The tutorial starts with a review of the emerging hybrid array architecture. The corresponding MIMO processing, which heavily relies on analog beam steering, is referred to as beam management. The basic system model, approach, and performance of beam-oriented MIMO are surveyed and compared with conventional MIMO, which requires explicit channel estimation. Four procedures interleaved between the physical layer and higher layers are the main components of such a system, namely beam acquisition, tracking, association, and handover. The second part of the tutorial reviews beam management in 5G NR, i.e., 3GPP Release 15 and the upcoming Release 16. We focus on changes in the frame structure and procedures that facilitate beam management. The third part covers recent research for future mmWave cellular systems, with emphasis on scalability with increasing array size, user number, and cell density. We review signal processing techniques for beam management that exploit sparsity, approaches using low-resolution fully digital arrays, and recent advances in QoS-aware cell/beam association and handover.

TUT16: "Quantum Internet: From Communication to Teleportation" (CANCELLED)
Room: Kohala 4
Presenters: Marcello Caleffi (University of Naples "Federico II", Italy); Angela Sara Cacciapuoti (University of Naples Federico II, Italy)

Abstract coming soon.

TUT17: "Machine Learning for Wireless Networks: Basics, Applications, and Trends"
Room: Kona 1
Presenters: Ekram Hossain (University of Manitoba, Canada)

This tutorial will provide a friendly introduction to different machine learning (ML) techniques and their applications to the design and optimization of wireless communication networks. After motivating the potential applications of machine learning for evolving future cellular networks (e.g., 5G and beyond-5G [B5G] cellular networks), it will introduce the basics of ML tools and the related mathematical preliminaries. In particular, the basics of supervised, unsupervised, and reinforcement learning techniques as well as artificial neural networks will be discussed. The basics of deep learning and deep reinforcement learning will also be provided. Then, applications of ML techniques to different "wireless" problems, including resource allocation, mobility prediction, channel estimation, as well as coverage and capacity optimization, will be discussed and the current state of the art will be reviewed. This will be followed by three case studies on using (i) supervised and unsupervised learning for cooperative spectrum sensing, (ii) a deep supervised learning technique for resource allocation, and (iii) reinforcement learning techniques for mobile computation offloading in cellular networks. Finally, the current trends, open research challenges, and future research directions on using ML techniques in wireless networks will be discussed.

TUT18: "Artificial Intelligence for Wireless Signal Processing: from Compressive Sensing to Deep Learning"
Room: Kona 2
Presenters: Yue Gao (Queen Mary University of London, United Kingdom (Great Britain))

Sparse representation can efficiently model signals with a reduced number of parameters to facilitate processing. It has been widely used in different applications, such as image processing, audio signal processing, and wireless signal processing. In this tutorial, we will discuss sparse signal processing in wireless communications, with a focus on the most recent compressive sensing (CS) and deep learning (DL) enabled sparse representation. The tutorial starts from the general framework of sparse representation, including the CS and the DL-based approaches. Then we will present the recent research progress on applying CS to address major issues and challenges in wireless communications, where wideband spectrum sensing is provided as a sub-Nyquist example. In particular, both the latest theoretical contributions and practical implementation platforms will be discussed through a GHz-bandwidth sensing system as an embedded artificial intelligence (AI) approach. The third part of this tutorial will discuss DL-enabled sparse representation, with particular emphasis on DL-enabled wireless signal detection and channel estimation. This tutorial will benefit researchers looking for cross-pollination of their experience with other areas, such as data-driven channel estimation and wideband spectrum sensing, and will provide the audience with a clear picture of how to exploit sparse properties, DL, and embedded AI to process wireless signals in different scenarios.
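As a small illustration of the CS framework (a generic sketch, assuming a random Gaussian measurement matrix and orthogonal matching pursuit as the recovery algorithm, not the specific sub-Nyquist hardware discussed in the tutorial), the following recovers a sparse "occupancy" vector from far fewer measurements than its ambient dimension:

    # Generic compressive-sensing sketch: random Gaussian sensing matrix plus
    # orthogonal matching pursuit (OMP), recovering a sparse vector from
    # m << n measurements. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 256, 64, 5                       # ambient dim, measurements, sparsity

    x = np.zeros(n)                            # sparse ground truth (occupied bins)
    support = rng.choice(n, size=k, replace=False)
    x[support] = rng.normal(size=k)

    A = rng.normal(size=(m, n)) / np.sqrt(m)   # sub-Nyquist measurement matrix
    y = A @ x                                  # compressed measurements

    def omp(A, y, k):
        """Greedy OMP: pick k atoms by correlation, least-squares refit each step."""
        residual, idx = y.copy(), []
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))
            if j not in idx:
                idx.append(j)
            coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
            residual = y - A[:, idx] @ coef
        x_hat = np.zeros(A.shape[1])
        x_hat[idx] = coef
        return x_hat

    x_hat = omp(A, y, k)
    print("recovered support:", sorted(np.flatnonzero(np.abs(x_hat) > 1e-6).tolist()))
    print("true support:     ", sorted(support.tolist()))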

 

Friday, 13 December 2019, 14:00-17:30

TUT02: "Softwarization Concepts (SDN, NFV, ICN) and Practice in 5G Communication Systems and Beyond" 
Room: Kona 1
Presenters: Fabrizio Granelli (University of Trento, Italy); Frank H.P. Fitzek (Technische Universität Dresden & ComNets - Communication Networks Group, Germany)

IMPORTANT NOTICE FOR ATTENDEES: The speakers will use a specific mininet/docker distribution for hands-on demos, called ComNetsEmu, which is being prepared for the upcoming book by the authors "Computing in Communication Networks", published by Elsevier.
The ComNetsEmu environment is now publicly accessible: https://git.comnets.net/public-repo/comnetsemu
The above website contains detailed documentation on how it can be installed and used. Participants are encouraged to download and install ComNetsEmu on their machines beforehand (using Option 1).

A big step lies ahead in moving from today's 4G cellular networks to tomorrow's 5G network. Today, the network is used for content delivery, e.g., voice, video, and data. Tomorrow, the 5G network (and maybe beyond-5G) will be fully softwarized and programmable, with new degrees of freedom. The aim of the tutorial is to illustrate how the emerging paradigms of Software Defined Networking, Network Function Virtualization, and Information Centric Networking will impact the development of "5G and beyond" systems and networks, from both the theoretical/formal and the practical perspective. The tutorial is split into two major sections: (i) the first, where attendees will learn the basics of SDN, NFV, and ICN and their application in the framework of 5G (network slicing, mobile edge computing); and (ii) the second, where attendees will deploy simple yet complete solutions for network slicing and mobile edge computing using their own laptops.

TUT19: "Communications and Networking in Droplet-based Microfluidic Systems" (CANCELLED)
Room: Kona 1
Presenters: Werner Haselmayr (Johannes Kepler University Linz, Austria); Andrea Zanella (University of Padova, Italy)

This tutorial introduces the emerging field of communications and networking in droplet-based microfluidic systems, where tiny volumes of fluids, so-called droplets, are used for communication and/or addressing purposes in microfluidic chips. With this research, an important step is made towards the next generation of Lab-on-Chip devices. In order to lower the entry barrier to this exciting area, the tutorial starts with an accessible introduction to the fundamentals of droplet-based microfluidics. Then, we describe various communication aspects, including information encoding and noise models. We present microfluidic switches as the key building block for microfluidic networks and discuss different network topologies. Moreover, we present two promising healthcare applications for microfluidic networks and show the latest experimental results, for example, the world's first text transmission on a microfluidic chip using droplets. The tutorial concludes with a discussion of the most important open problems in this new field and the opportunities for communications researchers to contribute to this area.

TUT20: "Recent Advances in NOMA Techniques: Signal Processing Solutions and Emerging Applications"
Room: Kohala 1
Presenters: Yuanwei Liu (Queen Mary University of London, United Kingdom (Great Britain)); Lajos Hanzo (University of Southampton, United Kingdom (Great Britain)); Zhiguo Ding (University of Manchester, United Kingdom (Great Britain))

Mobile data traffic, especially mobile video traffic and small but numerous IoT packets, has dramatically increased in recent years with the emergence of smartphones, tablets, and various new applications. It is hence crucial to increase the network capacity to accommodate these bandwidth-thirsty applications and services. Non-orthogonal multiple access (NOMA), which has recently been proposed for the 3rd Generation Partnership Project's Long-Term Evolution Advanced (3GPP LTE-A), constitutes a promising technique for enhancing spectral efficiency and supporting massive connectivity in the emerging 5G networks by accommodating several users within the same orthogonal resource block, via multiplexing at different power levels. By doing so, significant spectral efficiency improvements can be attained over conventional orthogonal multiple access (OMA) techniques. The main objective of this tutorial is to present the basic concepts as well as to address the associated research challenges of next-generation mobile communication systems.
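The core power-domain mechanism can be illustrated with a deliberately simple sketch (assumptions: two users, BPSK signalling, an AWGN downlink with fixed channel gains, and ideal successive interference cancellation at the stronger user; the parameters are invented for illustration and are not from the tutorial):

    # Toy two-user power-domain NOMA over AWGN: superpose the users' BPSK
    # symbols at different power levels, then apply successive interference
    # cancellation (SIC) at the stronger (near) user. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    p_far, p_near = 0.8, 0.2                 # more power to the far (weak) user
    h_near, h_far = 1.0, 0.6                 # channel gains
    noise_std = 0.15

    b_far = rng.integers(0, 2, n)            # information bits
    b_near = rng.integers(0, 2, n)
    s_far, s_near = 2 * b_far - 1, 2 * b_near - 1           # BPSK symbols
    x = np.sqrt(p_far) * s_far + np.sqrt(p_near) * s_near   # power-domain superposition

    # Near user: decode the far user's high-power symbol, cancel it, then
    # decode its own low-power symbol (SIC).
    y_near = h_near * x + noise_std * rng.normal(size=n)
    s_far_hat = np.sign(y_near)
    residual = y_near - h_near * np.sqrt(p_far) * s_far_hat
    b_near_hat = (np.sign(residual) + 1) // 2

    # Far user: treat the low-power signal as noise and decode directly.
    y_far = h_far * x + noise_std * rng.normal(size=n)
    b_far_hat = (np.sign(y_far) + 1) // 2

    print("near-user BER:", np.mean(b_near_hat != b_near))
    print("far-user BER: ", np.mean(b_far_hat != b_far))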

TUT21: "SDN and NFV for Wireless Networks - Emerging Research and Standardization Trends"
Room: Kona 2
Presenters: Pranav Jha (Indian Institute of Technology Bombay, India); Abhay Karandikar (Indian Institute of Technology Bombay, India)

Software Defined Networking (SDN) and Network Function Virtualization (NFV) have emerged as key technologies for an efficient and flexible design of communication networks. The upcoming 5G wireless communication network is also being designed to enable extensive usage of SDN and NFV technologies. The tutorial highlights how the ongoing 5G standardization initiatives under various Standards Development Organizations (SDOs), e.g., 3GPP, IEEE, ETSI, and IETF, are incorporating SDN and NFV technologies into the basic architectural framework of the emerging wireless networks. The details are provided through elaboration of the relevant standardization efforts. The tutorial also discusses how such efforts may lead to a significant transformation of wireless networks, especially multi-RAT wireless networks. Further, it details the latest research trends in this area, especially research proposals for SDN- and NFV-based radio access networks. The tutorial also explains how SDN/NFV enables new use cases of wireless networks, such as network slicing and in-network content caching and delivery. Additionally, it explains how efficient solutions to other requirements, such as dual/multi-connectivity support, wireless backhaul, and radio resource management (e.g., RAT selection in a multi-RAT network, load balancing in the RAN, and interference management), are made possible with the help of SDN/NFV. The tutorial also covers several interesting algorithms that have been proposed recently to tackle some of these requirements.

TUT22: "Artificial Intelligence for Revolutionizing Communications: A Multi-disciplinary Perspective"
Room: Kohala 2
Presenters: Erik Mannens and Lieven De Marez (Ghent University, Belgium); Steven Latré (UAntwerpen, Belgium)

Whether it is network information, user data, or their mobile DNA, the sheer volume and frequency of the data are huge at all layers. Making sense of this huge amount of mobile data is an important challenge to tackle for future networks, as it generates insights which can be used for future optimization. In this tutorial, we aim at giving an overview of the state of the art in machine learning and provide a hands-on view of how AI can help in the domain of communications. We identify this through three diverse but very challenging and concrete use cases:

  • Using machine learning for wireless spectrum management. Starting from the experience of one of the finalists of the DARPA Spectrum Collaboration Challenge (https://www.spectrumcollaborationchallenge.com), we explain how deep learning can help in optimizing spectrum usage in future wireless networks.
  • Using Linked Data for personal data management. Starting from the work with Tim Berners-Lee's team on SOLID, jointly at MIT and Ghent University, we explain how the internet will (as was foreseen in the first place) again be an enormous pool of decentralised user data, where applications will act as views on these federated personal data pods.
  • Analyzing your mobile DNA to manage screen time. In a hands-on session we will explain how machine learning can help to unravel one's personal mobile DNA or generic screen-time insights into recognizable patterns that make 'controlling screen time' more actionable, fueling more efficient behavioral (smartphone) change by making latent patterns more apparent.

TUT23: "Network Slicing in the Era of 5G and Beyond"
Room: Kohala 3
Presenters: Adlen Ksentini (Eurecom, France)

5G is expected not only to increase the physical data rate, but also to support a diverse set of new services coming from the vertical industries (e.g., automotive, eHealth, and IoT). These services are known to have different needs in terms of network performance, such as low-latency access, high communication reliability, and support for massive numbers of devices. According to their type, 5G classifies services into: enhanced Mobile BroadBand (eMBB), ultra-Reliable Low-Latency Communication (uRLLC), and massive Machine-Type Communication (mMTC). Therefore, the "one size fits all" architecture currently in use by 4G is no longer sufficient, leading to a rethinking of the network architecture at all system levels. In this context, network slicing is envisioned as the key solution to create virtual network instances tailored to vertical services on top of a shared network infrastructure. Network slicing relies on the advances made in network softwarization, i.e., Network Functions Virtualization (NFV), Software Defined Networking (SDN), and cloud computing, to provide flexible and dynamic virtual networks. This tutorial aims to give insight into the concept of network slicing and its usage in 5G and beyond, by presenting recent advances in terms of algorithms and mechanisms, architecture, and technology enablers. The tutorial starts by introducing concepts related to network softwarization, such as SDN, NFV, and Mobile Edge Computing (MEC). Afterwards, the tutorial introduces the new 5G architecture based on network slicing, as defined by 3GPP and ETSI, covering: (i) the Radio Access Network (RAN); (ii) the Core Network (CN); and (iii) Life Cycle Management (LCM) of network slices. An implementation of an end-to-end network slice using OpenAirInterface (OAI) will be used as an example to show the enabling technologies. The tutorial will then discuss the relation between MEC and 5G, particularly solutions to ensure MEC slicing. Finally, conclusions and research perspectives will be discussed towards 5G long-term evolution.

TUT24: "Coding for Distributed Computing"
Room: Kohala 4
Presenters: Alexandre Graell i Amat (Chalmers University of Technology, Sweden); Eirik Rosnes (Simula UiB, Norway); Albin Severinson (University of Bergen & Simula UiB, Norway)

Distributed computing systems have emerged as one of the most effective ways of solving increasingly complex computational problems in a wide range of domains, e.g., large-scale machine learning and data analytics. For example, Google routinely performs computations over several thousands of servers. Furthermore, applications such as the Internet of Things (IoT), intelligent transportation systems, and robotics may benefit from offloading computationally intensive tasks to the cloud. Distributed computing faces significant challenges, among them the problems of straggling servers and data shuffling between servers, which severely penalize the computational latency, and the unreliability of nodes and communication links in cloud and fog computing, which not only impacts the latency but also impairs the accuracy of the computation. Furthermore, performing computations over possibly untrusted (and even potentially Byzantine) remote servers poses serious concerns about security and privacy. This tutorial will show how channel coding is a powerful tool to overcome these challenges, bringing significant improvements in terms of latency and accuracy, as well as providing security and privacy. Besides theoretical components, the tutorial will include practical hands-on exercises designed to build intuition and reinforce understanding. The tutorial assumes a basic understanding of linear algebra and probability theory.
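As a flavour of how coding mitigates stragglers (a minimal sketch using a toy (3,2) MDS-style code over a split matrix; the tutorial covers much more general schemes), the distributed matrix-vector product below can be recovered from any two of three workers, so one slow or failed worker never delays the result:

    # Minimal coded distributed matrix-vector multiplication sketch:
    # a toy (3,2) MDS-style code lets the master recover A @ x from any
    # two of three workers, tolerating one straggler. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 4))
    x = rng.normal(size=4)

    A1, A2 = A[:3], A[3:]                          # split the job into two halves
    tasks = {"w1": A1, "w2": A2, "w3": A1 + A2}    # third worker gets the parity block

    # Pretend worker "w2" straggles: only collect results from w1 and w3.
    results = {w: M @ x for w, M in tasks.items() if w != "w2"}

    y1 = results["w1"]                             # A1 @ x directly
    y2 = results["w3"] - results["w1"]             # recover A2 @ x from the parity
    y = np.concatenate([y1, y2])

    print(np.allclose(y, A @ x))                   # True: full product without w2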
