Cloud Computing Paradigm
The cloud computing paradigm has evolved from the on-premises deployment model. With cloud computing, you can transmit voluminous industrial data from plants, equipment, machines, vehicles, controllers, and other sources to IT/OT applications via an internet connection, while the infrastructure details behind the scenes remain hidden from the end user.
Because of this approach, cloud computing provides many advantages, such as scalability, cost effectiveness, and simplicity. You can ramp up cloud services as needed and gain operational flexibility without capital investments. Since the responsibility for managing the software and back-end infrastructure rests with the cloud services provider, you save on personnel and infrastructure costs. It's also easier to abstract away the complexity of the hardware and request additional computing resources as needed.
At the same time, cloud services need an always-on internet connection, which makes them a poor fit for industrial use cases with no or only intermittent network connectivity. Cloud computing is also bandwidth intensive, because a lot of data must be transmitted to the servers where computation and storage happen. This can be expensive in scenarios that generate vast amounts of data, such as an industrial setting. Due to the round-trip network delay, application response times can range from a few seconds to several minutes, which is a problem for use cases that need near real-time responses or decision-making. So, cloud computing cannot be the only answer for all IX use cases.
Edge Computing
Enter the edge computing paradigm. The shift here is to place computing resources closer to the user or the device, at the “edge” of the network, rather than in a hyperscale cloud datacenter that might be many miles away in the “core” of the network. The edge approach emphasizes reduced latency and processes more data close to the source, eliminating much of the round-trip data movement.
Thus, the edge computing model is useful for time-sensitive and data-intensive applications. These applications can deliver near real-time performance because the computing resources sit closer to the source of data generation. Moreover, they help prevent overloading of the network backhaul by processing more data locally and being selective about the amount and frequency of data sent to the cloud. By keeping data local, you also achieve better security, privacy, and data sovereignty.
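To make the local-processing pattern concrete, here is a minimal, hypothetical Python sketch: raw sensor readings are aggregated at the edge, and only a compact summary plus any out-of-tolerance events are forwarded to the cloud. The simulated sensor, the threshold values, and the publish_to_cloud stub are illustrative assumptions, not a specific product or API.

```python
import random
import statistics
import time

ANOMALY_THRESHOLD = 85.0   # hypothetical temperature limit (degrees C)
BATCH_SIZE = 60            # readings aggregated per upload window

def read_sensor():
    """Stand-in for a real device read; returns a simulated temperature."""
    return random.gauss(mu=72.0, sigma=6.0)

def publish_to_cloud(payload):
    """Stub for a cloud upload (e.g. over MQTT or HTTPS); here it just prints."""
    print("to cloud:", payload)

def run_edge_loop(cycles=3):
    for _ in range(cycles):
        readings = [read_sensor() for _ in range(BATCH_SIZE)]
        anomalies = [r for r in readings if r > ANOMALY_THRESHOLD]
        # Only a compact summary (and any anomalies) leaves the plant,
        # keeping the backhaul link lightly loaded.
        publish_to_cloud({
            "timestamp": time.time(),
            "count": len(readings),
            "mean": round(statistics.mean(readings), 2),
            "max": round(max(readings), 2),
            "anomalies": [round(a, 2) for a in anomalies],
        })

if __name__ == "__main__":
    run_edge_loop()
```

In a real deployment the upload stub would be replaced by whatever transport the plant already uses; the point of the sketch is simply that aggregation and filtering happen before any data crosses the network.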
IDC, a renowned analyst firm, sees a strong outlook for edge solutions as they attract the attention of C-suite executives. According to an IDC survey, 73% of senior IT and line-of-business decision makers view edge as a strategic investment.[1] These organizations are looking to edge as a way of increasing productivity and improving security, leading to faster, more informed decision-making. IDC also predicts that by 2023, over 50% of new enterprise IT infrastructure will be deployed at the edge rather than in corporate datacenters, and that by 2024 there will be an 800% increase in the number of applications at the edge.[2]
As the edge computing paradigm evolves and gains interest, it is impacting the digital ecosystem in both discrete and continuous process applications and empowering manufacturing organizations to focus on production-centric outcomes. Companies are leveraging edge computing on assets, machines, and production lines to improve plant reliability and overall equipment effectiveness through applications like HMI/SCADA, machine analytics, and asset performance management.
Cloud Versus Edge Computing
So where do you go from here? Which computing paradigm is best for your IX initiatives?
The most likely industrial scenario is that an OT application will not only live at the edge but will also need to communicate and interact with other cloud or on-premises workloads. This is borne out by an earlier Automation World survey, which found that manufacturers are taking the middle path and generally not choosing between computing paradigms.[3] Instead, they are deploying a range of cloud and edge technologies depending on their specific business use cases and are ultimately treating the paradigms as complementary. The key, per the experience of practitioners, is mapping out an architecture and strategy designed to encompass both paradigms.
System architects who apply both paradigms to the best advantage of the overall system will create value for their organizations. They will build flexibility into the architecture so that data that goes to the cloud can someday be leveraged on-premises too. The overall architecture will need to encompass both edge and cloud so that they play well together as business needs evolve.
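One way to build in that flexibility, shown here as an illustrative Python sketch rather than a prescribed design, is to code applications against a neutral data-sink interface so the same workload can publish to an edge historian, a cloud service, or both without change. The class and method names are assumptions made for the example.

```python
from abc import ABC, abstractmethod

class DataSink(ABC):
    """Neutral interface the application codes against."""
    @abstractmethod
    def write(self, tag: str, value: float) -> None: ...

class EdgeHistorianSink(DataSink):
    """Keeps data local, e.g. for low-latency dashboards on the plant floor."""
    def write(self, tag: str, value: float) -> None:
        print(f"[edge] {tag} = {value}")

class CloudSink(DataSink):
    """Forwards data to a cloud service for fleet-wide analytics."""
    def write(self, tag: str, value: float) -> None:
        print(f"[cloud] {tag} = {value}")

def record_measurement(sinks: list[DataSink], tag: str, value: float) -> None:
    # The application does not care where the data lands; routing is a
    # deployment decision, so the same value can serve both paradigms.
    for sink in sinks:
        sink.write(tag, value)

if __name__ == "__main__":
    record_measurement([EdgeHistorianSink(), CloudSink()], "line1.temperature", 71.4)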
Conclusion
With edge-to-cloud deployments increasingly becoming the norm, industrial organizations need to stop thinking first about where to deploy data and applications and instead focus on the underlying business need. IX leaders should weigh the requirements relating to cost, security, latency, and connectivity reliability, and then choose edge, cloud, or a combination of the two.
To sum it up, edge and cloud computing are not competing technologies; they simply address different needs. Cloud computing is apt for on-demand, scalable applications that need to be ramped up or wound down. Edge computing is great for applications that require real-time responses and generate a lot of data. In short, both cloud and edge computing have their use cases and must be chosen according to the application in question.
[1] Source: “Edge Computing Solutions Powering the Fourth Industrial Revolution,” by IDC, sponsored by Lumen and Intel, based on a survey of 802 business decision-makers worldwide
[2] Source: “The Impact of the Edge on the Future of Enterprises,” by IDC, sponsored by Akamai Technologies
[3] Source: Automation World, https://www.automationworld.com/process/iiot/article/21952832/pandemic-accelerates-edgetocloud-digital-transformation