Centralized Cloud to Decentralized Edge Computing
Cloud computing has matured rapidly in the past decade, and even those outside the data center industry know its many benefits. However, as the cloud has become ubiquitous, its limitations have also come into sharp focus. From latency and bandwidth issues to security problems to the threat of being stuck without Internet access, the problems inherent in cloud computing need solutions more urgently than ever.
Image Source Credit: IBM Prefabricated Modular Data Center
Enter edge computing, a broad set of technologies built on a common principle: creating speed, security and performance through localized computing power. The name comes from its key difference from the cloud model. Where cloud computing is centralized, edge computing pushes its functions out to the “edge” of a network, closer to the end user. In this article, we’ll dive into the applications of edge computing that are already here or in the pipeline, and look at how they could revolutionize how data centers operate.
Video Source Credit: AT&T
1. Combating Latency Issues
Data latency is still frustratingly prevalent in many of the cloud apps we use every day, and it’s much more costly than you might realize. Amazon’s study on latency found that every 100 milliseconds of latency shaved a percentage point off its sales, and Google found that just an extra half-second of latency cut its traffic by 20 percent.
Edge computing is well-positioned to combat latency because it positions the computing power much closer to the data and to the user. Rather than relaying large amounts of data through a centralized network, it brings key elements together to create a more efficient process though localization of computing resources. That speed improvement might not seem critical on a small scale, but when applied to the daily millions or billions of computations across a data center, it adds up quickly.
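The cost of latency compounds in exactly this way. A minimal back-of-the-envelope sketch, using the roughly linear ~1% of sales per 100 ms figure cited above (the linearity is an assumption for illustration, not part of Amazon’s study):

```python
# Hedged sketch: estimate revenue impact of added latency, assuming a
# linear ~1% sales loss per 100 ms (the figure cited from Amazon's study).
def estimated_sales_loss(added_latency_ms: float, annual_sales: float) -> float:
    """Rough annual sales impact under the linear-loss assumption."""
    loss_fraction = (added_latency_ms / 100.0) * 0.01
    return annual_sales * loss_fraction

# Example: 200 ms of extra latency against $1B in annual sales
print(estimated_sales_loss(200, 1_000_000_000))  # 20000000.0 -> a $20M hit
```

Even a few hundred milliseconds, multiplied across every transaction, is why shaving round trips by moving compute closer to the user matters.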
2. Creating a More Localized Approach
The new millennium has brought us a renewed focus on local food and goods—now, it seems that local computing is the next frontier. The cloud is highly centralized and requires a great deal of upstream and downstream activity to perform its operations. Edge computing is decentralized, mobile and modular, deploying computing power where and when it’s needed through services such as modular data centers. That ultimately translates to improved flexibility and a greater ability to allocate computing resources where they’re most effective.
3. Improving Security Practices
With cloud computing concentrating more sensitive data among a smaller number of providers, cybercriminals have seen a golden opportunity. Even a small hole in a data center’s network security can expose millions of users’ most sensitive data, and large-scale breaches have been common since cloud computing began. Edge computing presents an opportunity to mitigate these flaws. When sensitive data is stored away from the cloud on local data center networks, inside secure air-gapped server electronics enclosures, it’s considerably harder for bad actors to reach and much faster for authorized users to access.
In addition to secure server enclosures, computers deployed in an edge network also feature a hardware-based security module known as the Trusted Platform Module (TPM, versions 1.2 and 2.0) that provides a layer of encryption at the hardware level. This standard is set by the Trusted Computing Group, an organization formed by computing leaders such as AMD, Hewlett-Packard, IBM, Intel and Microsoft to implement Trusted Computing concepts across personal computers. A major goal of the Trusted Computing Group is to “secure cloud computing and virtualized environments for technologies deployed in: enterprise systems, storage systems, networks, embedded systems, and mobile devices.”

As thousands of devices shift to mobile and remote locations away from centralized data center security infrastructure, it’s imperative that hardware-encrypted security layers are implemented where data is generated and collected. Embedded and rugged edge compute systems that collect and process sensitive data become targets for cyberattacks as they connect to the “Internet of Things” to communicate. Intellectual property, personal data and machine commands all become more susceptible to exfiltration or manipulation with increased connectivity.
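The core idea behind a TPM is that a device-bound key never leaves the hardware, so the device can prove its data is authentic without exposing the secret. A conceptual sketch only, with a software HMAC standing in for the hardware-held key (a real deployment would use the TPM 2.0 API, not this simplification):

```python
import hashlib
import hmac
import os

# Conceptual stand-in: in a real TPM this key is generated and held
# inside the chip and is never readable by software.
device_key = os.urandom(32)

def sign_telemetry(payload: bytes) -> bytes:
    """Produce an authentication tag bound to the device key."""
    return hmac.new(device_key, payload, hashlib.sha256).digest()

def verify_telemetry(payload: bytes, tag: bytes) -> bool:
    """Check that payload was signed by this device and not altered."""
    return hmac.compare_digest(sign_telemetry(payload), tag)

tag = sign_telemetry(b"temp=71.3")
print(verify_telemetry(b"temp=71.3", tag))  # True: authentic reading
print(verify_telemetry(b"temp=99.9", tag))  # False: tampered payload rejected
```

This is what makes machine commands and telemetry harder to manipulate in transit: an attacker who alters the payload cannot forge a valid tag without the hardware-held key.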
4. Allowing a Greater Role for Machine Learning and Eventually, Artificial Intelligence
Artificial intelligence (AI) is one of the most important use cases for edge computing. The data processing needs of a complex AI, such as the ones that power self-driving cars, demand not only improved capacity but exceptional reliability and consistency. A driverless car’s AI can’t stop working simply because it goes offline for a minute, and it must keep up the pace of processing hundreds of gigabytes of data per minute. To meet those needs, computing power needs to be available on board and on demand—which is, of course, where edge computing comes in. From vehicle-to-vehicle communication to integration with “smart city” features, the necessary functions of advanced vehicle AI will require distributed computing power on a level never seen before—which is why engineers are already hard at work on making these advanced solutions possible.
5. Providing Space for the Internet of Things
AI isn’t the only thing eating through massive volumes of data. Internet of Things technologies, from home voice assistants to smart medical devices in handheld enclosures to edge IoT gateways, generate huge amounts of data that need to be processed close to their sources, something edge computing excels at. Edge computing could help ensure, for example, that a smart heart monitor processes its data quickly enough to detect a pattern of dangerous inconsistencies and alert medical professionals.
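The heart monitor scenario can be sketched in a few lines: the device keeps a rolling local baseline and only raises an alert (or forwards data upstream) when a reading deviates sharply. This is an illustrative sketch of the edge-processing pattern, not any particular device’s algorithm; the class name and thresholds are invented for the example:

```python
from collections import deque
from statistics import mean, stdev

# Hedged sketch: edge-side anomaly screening for a hypothetical heart-rate
# sensor. Readings are processed locally; only anomalies trigger an alert.
class EdgeAnomalyDetector:
    def __init__(self, window: int = 30, threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # rolling local baseline
        self.threshold = threshold            # alert at N standard deviations

    def process(self, bpm: float) -> bool:
        """Return True if this reading should trigger an alert."""
        alert = False
        if len(self.readings) >= 5:  # need a minimal baseline first
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(bpm - mu) > self.threshold * sigma:
                alert = True
        self.readings.append(bpm)
        return alert

detector = EdgeAnomalyDetector()
for bpm in [72, 74, 71, 73, 72, 75, 73, 140]:  # final spike is anomalous
    if detector.process(bpm):
        print(f"alert: {bpm} bpm deviates from local baseline")
```

Because the decision happens on the device, an alert doesn’t depend on a round trip to a distant data center, which is exactly the latency and reliability argument made above.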
6. Addressing Needs Offline
A predictable consequence of the cloud’s ubiquity is that the effects of an Internet outage on a data center have evolved from irritating to potentially catastrophic. Technologies like wireless hotspots can provide some relief, but they’re ultimately a Band-Aid rather than a real solution, and they don’t offer much help for the kind of computing that goes on in data centers. Edge computing has the potential to greatly reduce how much industry and commerce rely on inconsistent WiFi and cellular networks, creating “best of both worlds” networks that keep the advantages of the cloud while reducing its risks and inconsistencies. Our business and tech worlds demand constant connection; when that connection goes down, we’ll have the reassurance and consistency of edge computing power to draw on.
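One common way edge nodes deliver that “best of both worlds” behavior is store-and-forward: keep working through an outage by queueing results locally, then flush them to the cloud when connectivity returns. A minimal sketch, where the `upload` callable is a stand-in for a real cloud client:

```python
from collections import deque

# Hedged sketch of the store-and-forward pattern: an edge node degrades
# gracefully during an outage instead of failing outright.
class StoreAndForward:
    def __init__(self, upload):
        self.upload = upload    # stand-in for a real cloud SDK call
        self.backlog = deque()  # records buffered while offline

    def send(self, record, online: bool) -> int:
        """Deliver or buffer a record; return how many records were uploaded."""
        if not online:
            self.backlog.append(record)  # keep working, sync later
            return 0
        sent = 0
        while self.backlog:              # drain the backlog first, in order
            self.upload(self.backlog.popleft())
            sent += 1
        self.upload(record)
        return sent + 1

delivered = []
node = StoreAndForward(delivered.append)
node.send("t1", online=False)  # outage: buffered locally
node.send("t2", online=False)
node.send("t3", online=True)   # reconnected: backlog drains in order
print(delivered)               # ['t1', 't2', 't3']
```

The edge node never stops accepting work; the cloud simply catches up once the link is back, which is the balance the next section describes.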
7. Taking the Workload Off the Cloud to Create Balance
Despite the many virtues of cloud computing, it won’t be enough by itself to provide the performance that our ultra-connected world needs. Big network providers like Verizon are already struggling to provide enough network spectrum in large markets, and the impending rollout of 5G won’t be a permanent or comprehensive solution. Instead, those in the know are looking toward edge computing to provide the computing muscle that will create a sustainable, affordable and secure infrastructure for 21st century data centers.
Edge computing is a great example of how “everything old is new again.” The cloud took us away from on-site computing. Now, edge computing is bringing it back with a vengeance. The key, however, is that edge computing won’t compete with cloud infrastructure. It will work together with it to create the complex and robust computing solutions that our new economy needs.
For example, many leading cloud providers, such as Amazon’s AWS and Microsoft’s Azure, recognize this shift and now offer additional certification programs and services that let OEMs connect to the cloud when end users need it. This hybrid relationship, with IoT devices feeding telemetry data into a central hub, shows how important it is to manage critical communication among billions of IoT devices. This combination of decentralized edge computing that connects to a centralized cloud when needed continues to drive new business opportunities and helps enterprises address complex challenges. Data generated by connected IoT devices now provides enormous value and can be analyzed for better insights and decision-making. As more and more IoT devices come online, several key factors will continue to shape edge computing into a practical reality for emerging markets:
- Lower costs for compute, the Internet of Things and big data
- Increased multi-core processing performance in smaller footprints
- Exponential growth of big data alongside bandwidth limitations
- Wireless LTE connectivity and machine-to-machine interoperability, eventually at 5G bandwidth speeds
- Advances in machine learning toward autonomous decision-making
A controversial and fast-growing market that relies heavily on these factors is the technology behind the autonomous vehicles and fleet management mentioned earlier. Proven business opportunities continue to emerge in the autonomous vehicle market, and they track directly with the factors above. The convergence of lower costs, robust computing, faster connectivity and machine learning holds enormous potential for an autonomous future.
Final Word: A Shift and Balance of Computing Workloads at the “Rugged Edge”
In a previous blog and business insight post, the “Rugged Edge” was defined and explained in relation to edge computing. Read “When IoT Expands its Reach and Intelligence with Rugged Edge Computing” to learn why specialized computers are required in the harshest edge deployments.
As more computing power is localized in edge deployments, industrial and remote markets have specific requirements for delivering real-time reliability. These markets need hardware designed to withstand rugged applications and harsh environments. The benefits of deploying edge computing solutions are clear, but in rugged environments that are harmful to electronic components, specialized equipment such as embedded, industrial and rugged edge computers is required to bring mobile edge computing capabilities to new frontiers.
Rugged edge computers are developed specifically to withstand the rigors of harsh usage conditions and are validated for extreme durability, with ruggedized features incorporated throughout the entire product design. From the external enclosure to carefully selected industrial-grade components, every piece of a rugged edge computer is purpose-built through a combination of mechanical and thermal engineering to handle strong shock and vibration (MIL-STD-810G), extreme temperatures and wet or dusty conditions. Embedded computing engineers focus on the following requirements for deployments at the “rugged edge”:
Engineering Checklist for Rugged Embedded Computers:
- Fanless and Silent Designs for Better Reliability
- One-Piece, Cableless, and Validated to endure Shock and Vibration
- Extended Operating Temperature Range for Harsh Environments
- Wide-Voltage Power Protection
- Modular and Expandable I/O options designed for workload consolidation
5 Must Have Hardware Requirements for Scalable Rugged Edge Computing:
- Wireless Connectivity
- Mobility and Remote Deployment
- Performance Accelerators
- Variety of I/O Ports and PCIe/PCI Expansion Slots
- Ruggedness and Security