With deployments of IoT devices and the arrival of fast 5G wireless, placing compute and analytics close to where data is created is making the case for edge computing.

Original article by Network World

Edge computing is transforming the way data is being handled, processed, and delivered from millions of devices around the world. The explosive growth of internet-connected devices – the IoT – along with new applications that require real-time computing power, continues to drive edge-computing systems.

Faster networking technologies, such as 5G wireless, are allowing edge-computing systems to accelerate the creation or support of real-time applications, such as video processing and analytics, self-driving cars, artificial intelligence and robotics, to name a few.

While the early goal of edge computing was to cut the bandwidth costs of moving IoT-generated data over long distances, the rise of real-time applications that need processing at the edge will drive the technology forward.

What is edge computing?

Gartner defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge – where things and people produce or consume that information.”

Edge computing was developed due to the exponential growth of IoT devices, which connect to the internet either to receive information from the cloud or to deliver data back to it. And many IoT devices generate enormous amounts of data during the course of their operations.

[Diagram: How Edge Computing Works. Source: Network World / IDG]

Think about devices that monitor manufacturing equipment on a factory floor, or an internet-connected video camera that sends live footage from a remote office. While a single device producing data can transmit it across a network quite easily, problems arise when the number of devices transmitting data at the same time grows. Instead of one video camera transmitting live footage, multiply that by hundreds or thousands of devices. Not only will quality suffer due to latency, but the costs in bandwidth can be tremendous.
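To put rough numbers on that, a back-of-the-envelope calculation helps; the camera count and per-stream bitrate below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope estimate of aggregate uplink bandwidth when many
# cameras stream raw footage to the cloud. All figures are assumptions.
CAMERAS = 1_000        # hypothetical fleet size
BITRATE_MBPS = 4       # assumed per-camera 1080p H.264 stream

aggregate_mbps = CAMERAS * BITRATE_MBPS
tb_per_month = aggregate_mbps / 8 * 60 * 60 * 24 * 30 / 1e6  # MB/s -> TB

print(f"Aggregate uplink: {aggregate_mbps:,} Mbps")   # 4,000 Mbps
print(f"Volume: ~{tb_per_month:,.0f} TB per month")   # ~1,296 TB
```

Filtering or summarizing that footage close to the cameras, rather than shipping all of it, is exactly the saving described next.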

Edge-computing hardware and services help solve this problem by serving as a local source of processing and storage for many of these systems. An edge gateway, for example, can process data from an edge device and then send only the relevant data back through the cloud, reducing bandwidth needs. Or it can send data back to the edge device when a real-time application needs it.
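A minimal sketch of that gateway pattern; the threshold, field names and the upload step are hypothetical stand-ins, not a real edge API:

```python
import json

TEMP_ALERT_C = 80.0   # assumed threshold for "relevant" data

def handle_reading(reading: dict) -> str | None:
    """Process one sensor reading at the gateway; forward only anomalies."""
    if reading["temperature_c"] < TEMP_ALERT_C:
        return None                      # normal reading: stays local
    payload = json.dumps(reading)
    # A real gateway would POST this to a cloud ingest endpoint here.
    print(f"uploading to cloud: {payload}")
    return payload

handle_reading({"device_id": "press-07", "temperature_c": 71.2})  # dropped
handle_reading({"device_id": "press-07", "temperature_c": 92.5})  # forwarded
```

The design point is simply that the cheap decision (is this reading interesting?) happens next to the device, so only the interesting fraction crosses the network.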

These edge devices can include many different things, such as an IoT sensor, an employee’s notebook computer, their latest smartphone, the security camera or even the internet-connected microwave oven in the office break room. Edge gateways themselves are considered edge devices within an edge-computing infrastructure.

Why does edge computing matter?

For many companies, the cost savings alone can be a driver towards deploying an edge-computing architecture. Companies that embraced the cloud for many of their applications may have discovered that the costs in bandwidth were higher than they expected.

Increasingly, though, the biggest benefit of edge computing is the ability to process and store data faster, enabling more efficient real-time applications that are critical to companies. Before edge computing, a smartphone scanning a person’s face for facial recognition would need to run the facial-recognition algorithm through a cloud-based service, which would take significant time to process. With an edge-computing model, the algorithm could run locally on an edge server or gateway, or even on the smartphone itself, given the increasing power of smartphones. Applications such as virtual and augmented reality, self-driving cars, smart cities and even building-automation systems require fast processing and response.
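The control flow behind that shift can be sketched in a few lines: try an on-device model first and fall back to a cloud call only when needed. Both matcher functions below are hypothetical stubs standing in for a real model and a real service:

```python
import time

def local_model_match(image: bytes):
    """Stand-in for an on-device model; None means low confidence."""
    return "alice" if image else None

def cloud_service_match(image: bytes) -> str:
    """Stand-in for a cloud recognition API (adds a network round trip)."""
    return "alice"

def recognize(image: bytes) -> str:
    start = time.perf_counter()
    name = local_model_match(image)        # fast path: no network hop
    if name is None:
        name = cloud_service_match(image)  # slow path, used only when needed
    print(f"matched {name!r} in {time.perf_counter() - start:.4f}s")
    return name

recognize(b"fake-image-bytes")
```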

“Edge computing has evolved significantly from the days of isolated IT at ROBO [Remote Office Branch Office] locations,” says Kuba Stolarski, a research director at IDC, in the “Worldwide Edge Infrastructure (Compute and Storage) Forecast, 2019-2023” report. “With enhanced interconnectivity enabling improved edge access to more core applications, and with new IoT and industry-specific business use cases, edge infrastructure is poised to be one of the main growth engines in the server and storage market for the next decade and beyond.”

Companies such as NVIDIA have recognized the need for more processing at the edge, which is why we’re seeing new system modules with artificial-intelligence functionality built in. The company’s latest Jetson Xavier NX module, for example, is smaller than a credit card and can be built into smaller devices such as drones, robots and medical devices. AI algorithms require large amounts of processing power, which is why most of them run via cloud services. The growth of AI chipsets that can handle processing at the edge will allow for better real-time responses within applications that need instant computing.

Privacy and security

However, as is the case with many new technologies, solving one problem can create others. From a security standpoint, data at the edge can be troublesome, especially when it’s being handled by different devices that might not be as secure as a centralized or cloud-based system. As the number of IoT devices grows, it’s imperative that IT understand the potential security issues around these devices and make sure those systems can be secured. This includes making sure that data is encrypted and that the correct access-control methods, and even VPN tunneling, are used.
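As one concrete illustration of the encryption point, here is a sketch that encrypts a sensor payload before it leaves the device, using the third-party cryptography package; the payload and key handling are simplified assumptions, and a real deployment would need per-device key provisioning and rotation:

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, provisioned and stored securely
cipher = Fernet(key)

payload = b'{"device_id": "cam-12", "motion": true}'
token = cipher.encrypt(payload)  # authenticated encryption (AES-CBC + HMAC)

# Only a gateway or cloud service holding the same key can read it back.
assert cipher.decrypt(token) == payload
```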

Furthermore, differing device requirements for processing power, electricity and network connectivity can have an impact on the reliability of an edge device. This makes redundancy and failover management crucial for devices that process data at the edge to ensure that the data is delivered and processed correctly when a single node goes down.
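A minimal sketch of that failover idea: try each edge node in turn and buffer locally when all are down, so no reading is lost. The node names and the send() stub are illustrative assumptions:

```python
from collections import deque

EDGE_NODES = ["edge-a.local", "edge-b.local"]  # hypothetical peers
local_buffer: deque = deque(maxlen=10_000)     # bounded on-device queue

def send(node: str, message: dict) -> bool:
    """Stand-in for a real network send; returns False on failure."""
    return node == "edge-b.local"              # simulate edge-a being down

def deliver(message: dict) -> None:
    for node in EDGE_NODES:                    # first healthy node wins
        if send(node, message):
            return
    local_buffer.append(message)               # all nodes down: replay later

deliver({"device_id": "sensor-3", "vibration": 0.42})
```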

What about 5G?

Around the world, carriers are deploying 5G wireless technologies, which promise the benefits of high bandwidth and low latency for applications, enabling companies to go from a garden hose to a firehose with their data bandwidth. Instead of just offering the faster speeds and telling companies to continue processing data in the cloud, many carriers are working edge-computing strategies into their 5G deployments in order to offer faster real-time processing, especially for mobile devices, connected cars and self-driving cars.

In its recent report “5G, IoT and Edge Compute Trends,” Futuriom writes that 5G will be a catalyst for edge-compute technology. “Applications using 5G technology will change traffic demand patterns, providing the biggest driver for edge computing in mobile cellular networks,” the firm writes. It cites low-latency applications including IoT analytics, machine learning, virtual reality and autonomous vehicles as those that “have new bandwidth and latency characteristics that will require support from edge-compute infrastructure.”

In its predictions for 2020, Forrester also said that the need for on-demand compute and real-time application engagements would play a role in driving the growth of edge computing.

It’s clear that while the initial goal for edge computing was to reduce bandwidth costs for IoT devices over long distances, the growth of real-time applications that require local processing and storage capabilities will drive the technology forward over the coming years.
