Edge Computing vs. Cloud Computing

May 9

Of all the IT buzzwords to come around, none have been filled with more hype and promise than “the cloud”. Cloud computing took the world by storm, and in the process, proved that it was far more than hoopla. Though it’s still a big deal, there is talk that the cloud may soon be “edged” out in the race for Internet of Things (IoT) supremacy. Enter edge computing.

Edge computing gets its name from processing data at the edge of the network. What exactly does that mean? Instead of housing processing power in the cloud or a centralized data center, data processing occurs in multiple smaller data centers located at or near the source. The goal of edge computing is to push processing as close to the actual device as possible, typically within a range of 100 square feet or less.
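As a rough sketch of the difference (the function and field names here are invented for illustration, not tied to any particular platform), an edge node might reduce a batch of raw sensor readings to a compact summary and send only that upstream, whereas a cloud-only model would ship every raw reading across the network:

```python
# Hypothetical sketch: an edge node summarizes raw readings locally
# and forwards only the summary, instead of every data point.

def summarize_at_edge(readings):
    """Reduce a batch of raw sensor readings to a small summary payload."""
    if not readings:
        return {"count": 0}
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# In a cloud-only model, all five readings would cross the network;
# here, only the four-field summary does.
raw = [21.0, 21.4, 22.1, 35.9, 21.2]
payload = summarize_at_edge(raw)
```

The design choice this illustrates is simply where the reduction happens: the closer to the device, the less raw data has to travel.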

If you’re familiar with cloud computing, you’ll recognize that this is an entirely different model. The cloud allows you to remotely access your data, applications, and infrastructure from anywhere. However, because those resources sit on offsite servers, latency and performance often become an issue. By processing data locally, edge computing can overcome those shortcomings and fit ideally into a multitude of business applications.

Edge Computing at Work

Edge computing is already paying dividends in oil and gas infrastructures. These operations rely on sophisticated IoT devices that monitor the health of critical aspects such as pressure, temperature, and humidity. Thanks to edge computing, the massive amount of data those devices collect can be analyzed and delivered directly to control centers instead of traveling across a network. From there, end-users have real-time access to insights that can help detect malfunctions and avert disaster before incidents occur.
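A minimal sketch of that monitoring pattern (the metric names and safe limits below are invented for illustration): each reading is checked against operating limits right where it is collected, so an alert can reach the control center without the raw data stream ever leaving the site.

```python
# Hypothetical edge-side check: flag readings outside safe limits
# before any data leaves the site. Limits are illustrative only.
SAFE_LIMITS = {
    "pressure_psi": (0, 5000),
    "temperature_c": (-20, 150),
    "humidity_pct": (0, 95),
}

def check_reading(metric, value):
    """Return an alert dict if the value is out of range, else None."""
    low, high = SAFE_LIMITS[metric]
    if not (low <= value <= high):
        return {"metric": metric, "value": value, "limits": (low, high)}
    return None

alert = check_reading("pressure_psi", 5600)   # out of range, returns an alert
ok = check_reading("temperature_c", 80)       # in range, returns None
```

Only the alert, not the full stream of readings, would need to travel to the control center.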

An ‘Edge’ for Business

Below we have identified three ways organizations can potentially benefit from edge computing.

1. Faster performance: While the cloud is capable of processing data quickly, the aforementioned latency issues can result in slower response times for devices connected to the internet. In contrast, edge computing increases the speed of data delivery, allowing those same devices to enjoy a noticeably better performance.

2. Cost savings: The upfront expenses associated with storage, bandwidth, and processing power can make IoT a cost-prohibitive investment for any organization. With edge computing, far less data travels between data centers. Since that data can be processed at the local level, businesses can choose what they want to run locally and what they want to run in the cloud. The result is a more cost-effective IoT solution.

3. Improved security: To this day, security remains one of the biggest barriers to cloud adoption. Not only does edge computing reduce the amount of cloud traffic, it lets you filter and store sensitive data locally. As a result, organizations can build an infrastructure around their individual security and compliance requirements.
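One way to picture point 3 (a sketch with made-up field names, not a production security pattern): the edge node strips sensitive fields from each record before anything is forwarded, so only the fields an organization is willing to share ever leave the premises.

```python
# Hypothetical sketch: keep sensitive fields local, forward the rest.
SENSITIVE_FIELDS = {"operator_id", "site_location"}

def redact_for_cloud(record):
    """Return a copy of the record with sensitive fields removed."""
    return {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}

record = {
    "operator_id": "emp-4471",
    "site_location": "well-7",
    "pressure_psi": 4200,
    "temperature_c": 88,
}
cloud_payload = redact_for_cloud(record)  # sensitive fields stay at the edge
```

In a real deployment the filtering rules would come from the organization's compliance requirements rather than a hard-coded set.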

Disaster Recovery Implications

Researchers from Georgia Tech suggest that edge computing can also prove useful in disaster response efforts. Various internet services were knocked offline in the wake of hurricanes Harvey and Irma. The research team believes edge computing would allow routers, smartphones, and other devices to continue gathering data amid disasters, even without the internet. Emergency managers and responders could then access that data and use it to assist those affected. In this scenario, edge computing could help businesses continue to operate and potentially save lives as well.

Out with the Cloud?

A report by MarketsandMarkets projects that the edge computing market will reach $6.72 billion by 2022, growing at a compound annual growth rate of 35 percent over a forecast period beginning in 2017. Does that mean it’s ready to supplant the cloud as the leading technology for IoT and beyond? Not so fast. While the shortcomings we spoke of earlier are driving adoption toward the edge, there is plenty of research to suggest that the cloud will continue to see steady growth.

At the end of the day, this one may not be a case of the best technology, but the best mix. If you can effectively determine what to funnel through the cloud and what to run locally at the edge, you may be able to enjoy the best of both worlds.