Processing intelligent data for real-time decision-making is the next frontier for infrastructure evolution, says Andy Rowland, Head of Digital Manufacturing, BT.
Adopting edge computing is the next important step in future-proofing your infrastructure.
By moving data processing towards the ‘edge’, you bring real-time decision-making to where it’s needed. This supports whatever capabilities will be critical tomorrow, from Internet of Things (IoT) technologies to Artificial Intelligence (AI) powered applications.
Edge computing will be bigger than cloud computing
I’ve been working at the heart of edge computing for several years now, tracking the evolution of the technology and developing ways for industry to harness its potential. Edge computing is the new growth area, and I believe it will ultimately eclipse the uptake we’ve seen for cloud.
I’ve noticed a change in how organisations are approaching data; they’re starting to think about how many versions of data they keep, as well as how they store and manage it. This ties in with growing concerns about the amount of energy data centres use, from both a cost and a sustainability point of view. Organisations are finding it makes sense to move processing close to where they’re creating and using the data.
Based on my experience, here are the top eight future-proofing benefits of adopting edge computing:
#1 Ensuring business-critical applications are always available
Hosting business-critical applications in the cloud is a high-risk strategy because connectivity is vulnerable to interruption, for example when a network cable is accidentally severed. An edge computing solution supports smoother operations without disruption, even in remote areas. Reliability increases because the solution is less exposed to external interruptions, so its risk of failure falls.
This reliability, combined with the real-time processing that can support so many technologies that improve the end-user experience, can be transformative. Edge computing is an enabler for IoT technologies and AI-powered applications that unlock new, more efficient ways of operating that improve productivity.
#2 Facilitating real-time decision-making
Bringing processing to the edge means data isn’t making a roundtrip to central data centres or clouds to be processed, so latency improves to the levels needed to support real-time analysis and decision-making.
This near-instant decision-making is critical to addressing emerging and future needs across industry - from optimising manufacturing processes and production scheduling, to running closed-loop applications that optimise energy usage and reduce the carbon footprint.
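As a rough sketch of what a closed-loop edge application can look like, the Python below reads a local power measurement and adjusts a setpoint on every cycle, with no round trip to a central cloud. The sensor and actuator functions and the target values are hypothetical placeholders for whatever a real line controller would expose, not a specific product’s API.

```python
import time
import random  # stands in for a real sensor driver in this sketch

TARGET_KW = 250.0    # hypothetical energy target for a production line
ADJUST_STEP = 0.05   # fractional change applied per control cycle

def read_power_kw() -> float:
    """Placeholder for a local OT sensor read; real code would poll a meter or PLC."""
    return random.uniform(230.0, 270.0)

def apply_setpoint(scale: float) -> None:
    """Placeholder actuator call; real code would write back to the line controller."""
    print(f"setpoint scaled to {scale:.2f}")

def control_loop(cycles: int = 10) -> None:
    scale = 1.0
    for _ in range(cycles):
        reading = read_power_kw()
        # The decision is made locally, so there is no latency from a cloud round trip
        if reading > TARGET_KW:
            scale = max(0.5, scale - ADJUST_STEP)
        else:
            scale = min(1.0, scale + ADJUST_STEP)
        apply_setpoint(scale)
        time.sleep(0.1)  # sub-second cycle times are only practical with local processing

if __name__ == "__main__":
    control_loop()
```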
#3 Improving sustainability
Edge computing shifts the organisation towards more effective ways of operating that optimise energy use and reduce carbon emissions. It reduces the amount of data centre capacity needed by cutting the volumes of data sent to the core.
In many cases, running some IT processing alongside Operational Technology (OT) processing at the edge drives efficiencies such as consolidating cooling requirements and combining maintenance visits.
#4 Reducing data and operational costs
Data is the lifeblood of global organisations and the volumes involved are increasing all the time. As data traffic grows, the costs of the bandwidth to support it are spiralling upwards, with no sign of stopping.
Continuing to send vast quantities of data to core data centres or clouds for analysis isn’t sustainable, and the costs of managing and storing this data are growing, too. Edge computing breaks these patterns, so that only intelligent, processed data needs to make the journey to the core.
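To illustrate what sending only intelligent, processed data can mean in practice, here is a minimal Python sketch, assuming a batch of raw sensor samples collected at the edge: the raw samples stay local, and only a compact summary is forwarded to the core. The payload fields and the send_to_core stub are illustrative assumptions rather than a defined interface.

```python
import json
import statistics
from datetime import datetime, timezone

def summarise(readings: list[float]) -> dict:
    """Reduce a batch of raw sensor samples to the handful of numbers the core needs."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "max": max(readings),
        "min": min(readings),
    }

def send_to_core(payload: dict) -> None:
    """Placeholder for an upload over MQTT, HTTPS or similar; printed here for illustration."""
    print(json.dumps(payload))

if __name__ == "__main__":
    # Thousands of raw samples stay at the edge; only the summary travels to the core.
    raw_samples = [20.0 + (i % 7) * 0.1 for i in range(10_000)]
    send_to_core(summarise(raw_samples))
```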
#5 Meeting data sovereignty regulations
Data sovereignty legislation is already rigorous, and it will continue to affect organisations’ ability to extract value from data. Edge computing is a flexible way to stay compliant, keeping data storage and processing in-country rather than sending it abroad to a central data centre or public cloud.
#6 Supporting innovative applications
Talking to our edge computing partners, the biggest use cases they’re meeting at the moment involve private 5G networks and remote ways of bringing expertise into operating environments with Augmented Reality (AR) and Virtual Reality (VR).
It makes sense that, having tasted the possibilities during the pandemic, organisations don’t want to go back to flying experts out to sites for training or maintenance. Instead, they’re using smart glasses and AR apps to guide maintenance remotely, and VR for training. Edge computing is critical to delivering the ultra-low latency these applications need.
#7 Supporting the needs of remote locations
Sometimes edge is the only option. For much of the natural resources sector, cloud connectivity is non-existent, highly limited or very expensive. At remote mining sites and oil fields, edge processing is often the only way to host apps that reduce costly unplanned downtime and to support local engineers with VR training for health and safety.
Recently we’ve been approached by clients keen to improve the energy efficiency of their bulk ore carriers and LNG tankers. In both cases cloud connectivity is very expensive because the only link is via satellite, so edge processing on the vessel, running applications that optimise the use of marine diesel, is the only viable option.
#8 Supporting faster deployment of updates and in-life change requests
Edge computing delivers local processing power with central control, and this can transform the arduous process of updating local information.
Take digital signage in retail, for example. Controlled centrally, it enables consistency across the customer experience and makes it possible to change all store displays at the touch of a button. Plus, centralised, remote configuration reduces the chance of missed software patches.
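As a minimal sketch of that central-control, local-processing pattern, the Python below shows an in-store edge node pulling its display configuration from a central service when connectivity allows, and falling back to a locally cached copy when it can’t. The URL, cache path and payload shape are hypothetical examples, not part of any real signage product.

```python
import json
from pathlib import Path
from urllib import request, error

# Hypothetical central endpoint and local cache path, used for illustration only
CONFIG_URL = "https://signage.example.com/stores/042/display.json"
CACHE_FILE = Path("/var/lib/signage/display.json")

def fetch_central_config() -> dict | None:
    """Try to pull the latest display configuration from the central service."""
    try:
        with request.urlopen(CONFIG_URL, timeout=5) as resp:
            return json.load(resp)
    except (error.URLError, TimeoutError):
        return None  # connectivity lost; fall back to the cached copy

def load_config() -> dict:
    config = fetch_central_config()
    if config is not None:
        CACHE_FILE.parent.mkdir(parents=True, exist_ok=True)
        CACHE_FILE.write_text(json.dumps(config))  # refresh the local cache
    elif CACHE_FILE.exists():
        config = json.loads(CACHE_FILE.read_text())  # keep displays running locally
    else:
        config = {"playlist": []}  # safe default until connectivity returns
    return config

if __name__ == "__main__":
    print(load_config())
```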