What is edge computing?

Edge computing is, as the name suggests, computing at the edge of corporate networks. The “edge” is defined as the place where endpoint devices connect to the rest of the network—things like phones, laptops, industrial robots, and sensors.

Previously, the edge was a place where these devices connected to deliver data to a central data center or the cloud, receiving instructions from it and downloading software updates.

With the rapid development of the Internet of Things (IoT), this model suffers from some drawbacks. IoT devices collect massive amounts of data, requiring larger and more expensive connections to data centers and the cloud.

The nature of the work these IoT devices perform also demands much lower latency between the devices and the systems that act on their data. For example, if sensors in an oil refinery’s valves detect dangerously high pressure in the pipes, the valves must be shut off as quickly as possible. If the pressure data has to travel to a remote processing center for analysis, the automatic shutdown instruction may arrive too late. By placing processing power locally on edge devices, the round trip is shortened or eliminated entirely, potentially preventing downtime, property damage, and even loss of life.

Even edge devices that provide local computing and storage still need connections to data centers, whether on-site or in the cloud. For example, temperature and humidity sensors in agricultural fields collect valuable data, but that data doesn’t need to be analyzed or stored in real time. Edge devices can collect, sort, and perform initial analysis of the data, then send only what’s needed to centralized applications or some form of long-term storage, on-site or in the cloud. Because this traffic is not time-sensitive, it can travel over slower, less expensive connections, perhaps over the public internet. And because the data is pre-sorted, the volume that must be transmitted is reduced.

The upside of edge computing, then, is faster response times for the applications that require them, along with slower growth in demand for expensive long-haul connections to processing and storage centers.
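As a minimal sketch of the two roles described above, an edge node might act on a safety threshold locally, with no round trip to the cloud, while forwarding only a small summary of raw sensor samples upstream. All names here (the pressure limit, the actuator interface) are hypothetical, not taken from any specific platform:

```python
from statistics import mean

PRESSURE_LIMIT_KPA = 800.0  # hypothetical safety threshold for this sketch


def process_reading(pressure_kpa, actuator):
    """Safety-critical decision made locally at the edge.

    Returns True if the actuator was shut down, False otherwise.
    The actuator object is assumed to expose a shut_down() method.
    """
    if pressure_kpa > PRESSURE_LIMIT_KPA:
        actuator.shut_down()  # immediate local response, no network round trip
        return True
    return False


def summarize(readings):
    """Pre-aggregate raw samples so only a compact summary is transmitted.

    Instead of uploading every sample, the edge node sends four numbers,
    cutting the bandwidth needed on the long-haul link.
    """
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": mean(readings),
    }
```

A real deployment would add retry logic, buffering for link outages, and secure transport for the summaries, but the division of labor is the same: decide fast things locally, batch slow things upstream.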

The downside can be security. Because data is collected and analyzed at the edge, security must cover both the IoT devices connected to the edge and the edge devices themselves. These devices hold valuable data, but they are also network elements: if exploited, they could become a path to compromising other devices that store valuable assets.

As the importance of edge computing grows, it’s also important to ensure that the edge devices themselves don’t become single points of failure. Network engineers need to build in redundancy and prepare disaster recovery plans so that the failure of a major node doesn’t cause crippling downtime. The industry has already come a long way toward meeting the demands of edge computing, and it’s becoming mainstream. Its importance is likely to grow as the use of real-time applications increases.
