31 July 2020

SDN / NFV

Edge computing and virtualization—living on the edge

8 min read


The cloud offers considerable benefits to businesses, yet in cases where low latency is critical it has serious drawbacks. The emerging paradigm of edge computing addresses these problems.

In 2016, Cisco announced the beginning of the Zettabyte Era. Since then, the amount of data produced and processed has increased zettafold, if you will. This is especially true of machine-to-machine (M2M) applications, more commonly grouped under the Internet of Things (IoT), such as video surveillance, healthcare monitoring, smart homes, smart meters and the like. According to Cisco’s Annual Internet Report (2018–2023), by 2023 M2M connections will account for 50 percent of all devices and connections globally, reaching almost 14.7 billion connections and making this the fastest-growing category of global devices and connections. Additionally, applications based on artificial intelligence, autonomous vehicles and virtual reality are steadily gaining ground in our everyday lives. They all require huge amounts of data to be sent and processed extremely rapidly, with minimal response times.

Given these increased data flows, traditional centralized infrastructure may become suboptimal and eventually obsolete, especially when low latency and real-time data processing are critical business requirements. Edge computing is going to transform data centers as we know them, mostly because of the sharp increase in the amount of data they process. Spreading computing power across an ecosystem of devices can improve data center performance by building local autonomy instead of transferring all data to and from central facilities. That is why edge computing is on the rise.

What is edge computing?

At the center of edge computing is the notion of processing data as close to its source as possible, rather than in a cloud or other centralized location that may be far away. The main reason for choosing edge computing is to avoid latency issues that can hurt an application’s performance. Additionally, in some cases processing data locally instead of sending it to the cloud can help companies save money. As the volumes of data to process rise, the trend takes off right alongside them. Processing or preprocessing data locally requires devices equipped with enough computing capability to make data transfer unnecessary, or at least to reduce it. Heavy data processing is moved out to the edges of the system.
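To make the idea concrete, here is a minimal Python sketch of edge-side preprocessing: raw sensor readings are aggregated locally, and only a compact summary plus any urgent outliers is shipped upstream. Every name in it (collect_samples, preprocess_at_edge, send_to_cloud, the alert threshold) is hypothetical and merely stands in for whatever sensors, gateway logic and upload path a real deployment would use.

```python
import random
import statistics

# Hypothetical threshold above which a reading is urgent enough to report individually.
ALERT_THRESHOLD = 90.0


def collect_samples(n=1000):
    """Simulate raw readings from a local sensor (stand-in for a real device)."""
    return [random.gauss(70.0, 10.0) for _ in range(n)]


def preprocess_at_edge(samples):
    """Reduce raw samples to a compact summary plus any urgent outliers."""
    summary = {
        "count": len(samples),
        "mean": round(statistics.mean(samples), 2),
        "max": round(max(samples), 2),
    }
    alerts = [round(s, 2) for s in samples if s > ALERT_THRESHOLD]
    return summary, alerts


def send_to_cloud(payload):
    """Placeholder for an upload; in practice this would be an HTTPS or MQTT call."""
    print("uploading:", payload)


if __name__ == "__main__":
    raw = collect_samples()
    summary, alerts = preprocess_at_edge(raw)
    # Only the summary and alerts leave the site, not the 1,000 raw samples.
    send_to_cloud({"summary": summary, "alerts": alerts})
```

In a real deployment the upload would go to some cloud endpoint over the network, but the shape of the trade-off is the same: a thousand readings stay on site and only a few bytes travel over the WAN.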

Of course, the definition of edge computing varies by industry. A telco will look at the problem from the network perspective and seek to build its network infrastructure as close to the client as possible. Tech behemoths like Google, Amazon and Microsoft, on the other hand, offer their own edge computing solutions: they let you use their cloud computing and storage capabilities locally by installing their dedicated software and hardware (AWS Outposts, Azure Stack Edge or Cloud IoT Edge) directly on your premises. Finally, there are open-source initiatives such as LF Edge, the wing of the Linux Foundation establishing an interoperable framework for edge computing, and ONF’s CORD, which provides a complete, integrated platform for creating an operational edge data center, mainly for network-related use cases.


Fig. 1 How edge computing works

Edge computing use case—micro data center

Micro data centers are used wherever access to a full-scale data center is limited but there is a great need for computing power. They can be installed in remote facilities, in factories to support IoT devices, or in any location where putting up a traditional data center would be impossible and moving computation to the cloud would be costly.

The CORD project, a reference NFV implementation built from commodity hardware and open-source software, is a good example of such a solution. CORD is a general-purpose service delivery platform that can be configured to provide services for residential, enterprise or mobile customers, or indeed any combination of the three.

SDN/NFV are more than suitable for the edge

In a less common take on edge computing, SDN and NFV techniques can be employed inside the data center to optimize its performance. Cloud computing tends to be seen as infinite computing power sitting a mere arm’s length away, and from the client’s point of view it is indeed that. The provider or operator, however, sees the other side and must deliver that computing power while utilizing hardware to the greatest extent possible. Empowering the cloud with SDN and NFV at the hardware level leads to better asset management and reduces the computing power consumed by operations.

Micro data centers are even more reliant on SDN and NFV technologies than traditional data centers, as they need to be space-efficient. Given this, virtualization is one of the best ways (and clearly the most convenient one) to run all the necessary network functions (router, load balancer, firewall, etc.) inside the data center, as the sketch below illustrates.
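The following toy Python sketch illustrates the idea behind NFV service chaining: network functions such as a firewall and a load balancer implemented as ordinary software and chained together, so they can run on the commodity servers of a micro data center instead of on dedicated appliances. All names and policies here (Packet, firewall, load_balancer, the blocked port, the backend pool) are made up for illustration and are not taken from any particular NFV platform.

```python
from dataclasses import dataclass
from itertools import cycle


# A toy packet model; real virtual network functions operate on actual traffic.
@dataclass
class Packet:
    src: str
    dst_port: int
    payload: str
    backend: str = ""


BLOCKED_PORTS = {23}                           # illustrative firewall policy
BACKENDS = cycle(["10.0.0.11", "10.0.0.12"])   # illustrative round-robin pool


def firewall(packet):
    """Drop packets addressed to blocked ports, pass everything else."""
    return None if packet.dst_port in BLOCKED_PORTS else packet


def load_balancer(packet):
    """Assign the packet to the next backend in the round-robin pool."""
    packet.backend = next(BACKENDS)
    return packet


def service_chain(packet, functions):
    """Pass a packet through an ordered chain of virtual network functions."""
    for fn in functions:
        packet = fn(packet)
        if packet is None:  # a function in the chain dropped the packet
            return None
    return packet


if __name__ == "__main__":
    chain = [firewall, load_balancer]
    for pkt in (Packet("198.51.100.7", 443, "hello"),
                Packet("203.0.113.9", 23, "telnet attempt")):
        result = service_chain(pkt, chain)
        if result is None:
            print("dropped")
        else:
            print("forwarded to", result.backend)
```

In a real micro data center these functions would be deployed as virtual machines or containers and wired together by an SDN controller, but the principle is the same: the “devices” become software that can be placed wherever capacity is available.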

Leading hardware providers thus deliver SDN-enabled edge devices, such as routers that can be reprogrammed, while their internal operations are redesigned with SDN and NFV techniques. The ultimate goal is to extend the cloud beyond the data center and to stop thinking of the network of devices as a sum of separate machines, treating it instead as a single, coherent system. With seamless and secure connectivity to the cloud, workloads can be migrated and managed easily, and the edge can be treated as an extension of the cloud.

Towards decentralized data processing

The edge computing paradigm is a response to the growing amounts of data generated today. As the need to process and store data locally rises, traditional centralized infrastructure such as the cloud or corporate data centers will become suboptimal. IoT devices, machine learning, autonomous cars and AI-based applications are all becoming more and more grounded in our everyday lives. Expect edge computing to grow right alongside them.



Jarosław Ganczarenko

Content writer