Welcome to Edge Networks Group!
Our research focuses on the design of edge networked systems that support emerging applications in Cyber-Physical Systems (CPS) and the Internet of Things (IoT). We investigate novel performance metrics and algorithms that cater to the sensing/data-collection, communication/offloading, and actuation/inference requirements of these applications. Our primary goal is to achieve low delay and low energy consumption for edge devices (operating over wireless links) by collecting only the data that is useful to the application interacting with the environment, thereby also reducing bandwidth requirements over both the wireless and core networks. Topics of current interest include (but are not limited to): Learning at the Edge, Age of Information (AoI), Computation Offloading, and Wireless Communications.
Distributed ML Inference at the Edge
In the era of Edge Intelligence, i.e., the confluence of edge computing and artificial intelligence, an increasing number of monitoring applications at the network edge rely on Machine Learning (ML) inference, in particular Deep Neural Network (DNN) inference. On the one hand, resource-constrained edge devices, such as IoT sensors, can only support small ML models, e.g., TinyML models, which provide lower inference accuracy albeit at lower energy consumption. On the other hand, offloading inference jobs to a computationally powerful edge server yields higher inference accuracy. However, several non-trivial aspects must be considered carefully: 1) the transmission energy consumed by the edge device, 2) the energy consumed at the edge server (important because we aim for green solutions), and 3) the delay incurred in processing the data on the edge device versus offloading it for processing at the edge server. We therefore study the novel three-way trade-off between inference accuracy, delay, and total system energy consumption in such systems.
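As a minimal sketch of how such a trade-off could be scored (not our actual formulation), consider a weighted cost over the three metrics; the options, weights, and numbers below are purely illustrative:

    from dataclasses import dataclass

    @dataclass
    class Option:
        name: str
        accuracy: float   # expected inference accuracy in [0, 1]
        delay: float      # end-to-end delay in seconds
        energy: float     # total system energy in joules

    def best_option(options, w_acc=1.0, w_delay=0.5, w_energy=0.5):
        """Pick the option minimizing an illustrative weighted cost."""
        cost = lambda o: -w_acc * o.accuracy + w_delay * o.delay + w_energy * o.energy
        return min(options, key=cost)

    # Hypothetical numbers for a single inference job.
    local = Option("TinyML on device", accuracy=0.80, delay=0.02, energy=0.05)
    offload = Option("DNN on edge server", accuracy=0.95, delay=0.10, energy=0.30)
    print(best_option([local, offload]).name)

In practice, the weights encode application preferences, and the per-option numbers depend on the model, the wireless channel, and the server load.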
AoI Analysis and Optimization
AoI is a freshness metric that measures the time elapsed since the generation of the freshest packet available at the receiver. In contrast to system delay, AoI increases linearly between packet receptions, thereby accounting for how frequently the information source is sampled. We analyze AoI for fundamental queueing systems and study optimal sampling and transmission strategies that minimize AoI in these systems.
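A minimal simulation sketch of this sawtooth behaviour for the FCFS M/M/1 queue (the names lam and mu and all values are illustrative); the last line compares against the closed-form average AoI known from the AoI literature, (1/mu)(1 + 1/rho + rho^2/(1 - rho)) with rho = lam/mu:

    import random

    def average_aoi_mm1(lam=0.5, mu=1.0, n=200_000, seed=1):
        """Monte-Carlo estimate of the time-average AoI in an FCFS M/M/1 queue."""
        rng = random.Random(seed)
        a = d = 0.0             # arrival and departure time of the current packet
        prev_a = prev_d = None  # same for the previously delivered packet
        area, t0 = 0.0, 0.0     # accumulated sawtooth area, start of measurement
        for _ in range(n):
            a += rng.expovariate(lam)            # Poisson arrivals
            d = max(a, d) + rng.expovariate(mu)  # FCFS departure
            if prev_d is None:
                t0 = d
            else:
                # Between deliveries, AoI(t) = t - prev_a; integrate the sawtooth.
                area += ((d - prev_a) ** 2 - (prev_d - prev_a) ** 2) / 2
            prev_a, prev_d = a, d
        return area / (d - t0)

    rho = 0.5
    print(average_aoi_mm1(lam=rho, mu=1.0))    # simulated, roughly 3.5
    print(1 + 1 / rho + rho ** 2 / (1 - rho))  # closed form for mu = 1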
Edge Computing Offloading Algorithms
Edge computing, or fog computing, where computational resources are placed close to (e.g., one hop away from) the entities that offload computational tasks or data for processing, is a key architectural component of 5G and future wireless networks. Offloading computational tasks from mobile devices to edge servers rather than to the cloud saves internet bandwidth and circumvents the long delays involved in shipping the data of the offloaded tasks to a cloud data centre residing somewhere on the internet. Above all, edge computing compensates for the limited compute and memory resources of edge devices, as the back-of-envelope comparison below illustrates.
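All link rates, server speeds, and round-trip times in this sketch are hypothetical:

    def offload_latency(data_bits, link_bps, cycles, server_hz, rtt_s):
        """Transmission time + remote processing time + round-trip network latency."""
        return data_bits / link_bps + cycles / server_hz + rtt_s

    task_bits, task_cycles = 8e6, 2e9  # hypothetical task: 1 MB of data, 2 Gcycles
    # Edge server: fast local link, short RTT, moderate compute.
    print(offload_latency(task_bits, 50e6, task_cycles, 10e9, rtt_s=0.002))
    # Cloud: the internet bottleneck throttles the effective rate, RTT is much larger.
    print(offload_latency(task_bits, 20e6, task_cycles, 20e9, rtt_s=0.060))

With these illustrative numbers the edge option finishes in about 0.36 s versus about 0.56 s for the cloud, the gap being driven by the transmission and propagation terms rather than by compute speed.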
Transient Delay Analysis and Optimization
Most research on network performance analysis using queueing theory considers systems in steady state. For example, for the M/M/1 queue and more general Markovian queueing systems, the steady state is governed by the (conceptually simple) flow balance equations. In contrast, transient analysis of these systems leads to intractable differential equations. Using Stochastic Network Calculus, we derived the end-to-end delay violation probability for a sequence of time-critical packets, given the transient network state (queue backlogs) at the moment the time-critical packets enter the network. Leveraging this analysis, we compute good resource allocation strategies for wireless protocols such as WirelessHART to support the QoS requirements of time-critical industrial applications.
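Our analysis itself uses Stochastic Network Calculus; purely as a toy illustration of the quantity involved, the sketch below Monte-Carlo-estimates the delay violation probability for a single M/M/1 FCFS node whose backlog at the tagged packet's arrival is known (all parameters hypothetical):

    import random

    def delay_violation_prob(backlog, mu, deadline, n=100_000, seed=1):
        """Estimate P(delay > deadline) for a packet arriving at an M/M/1 FCFS
        queue that already holds `backlog` packets; with exponential service,
        the packet's delay is the sum of backlog + 1 service times."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n):
            delay = sum(rng.expovariate(mu) for _ in range(backlog + 1))
            hits += delay > deadline
        return hits / n

    print(delay_violation_prob(backlog=5, mu=10.0, deadline=1.0))

Conditioning on the observed initial backlog is what distinguishes this transient view from the steady-state one, where the backlog would instead be drawn from its stationary distribution.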