Uptime Institute senior vice-president of strategy, Mark Harris, discusses the importance of ensuring infrastructure ‘at the edge’ is as reliable as installations within central data centre sites
The digitalisation of business is reframing almost everything modern information technology professionals have learned about leveraging economies of scale to deliver critical IT services to the corporate world. While there are numerous new enabling technologies available to the IT industry, the biggest change comes from looking at the problem from the opposite direction – focusing on the consumer of information rather than its delivery. This change in perspective is forcing the entire IT function to be recast. Organisations that embrace this transformation will be rewarded, while those that don’t will be left behind.
In the 1950s, IBM rolled out the mainframe. Since that time, most would agree that ‘real computing’ happens in ‘real data centres’. Many believe that concrete and steel, massive power feeds, rows and rows of racks filled with servers, switches and storage are prerequisites for real computing. In fact, when the average IT professional thinks about what actually makes up their world of computing, they still visualise small numbers of these massive structures delivering the required computing.
In reality, the trend is quite different. Most analysts who follow computing trends estimate that the distributed or ‘edge’ computing market will grow at a compound annual growth rate (CAGR) of more than 30% over the next five years, compared with a CAGR of just 3% for the IT segment overall. Clearly, spending on edge computing is something to stop and take notice of.
It is true that the distribution of some IT technology has been happening for years. At any retail or financial institution, there is a significant amount of IT technology in each store or branch. But focus on the function being delivered by those distributed deployments and you will find that the technology mostly aggregates user transactions to be transported to, and processed in, a centralised data centre far away. On paper these deployments look like distributed edge computing environments, but the biggest opportunity in the digital age is to process information completely at the edge itself, not just to extend user access points.
New technologies such as the Cloud, the Internet of Things and carrier 5G deployments are enabling exciting new applications that demand a level of performance that is only possible with processing at the edge. So, distributing computing to the edge is becoming the baseline for digital business. As the world scales its digital footprint, we can no longer afford to rely so heavily on processing in centralised mega data centres. The reason? Physics gets in the way.
Simply put, the time it takes to complete a transaction is directly proportional to the total distance the information needs to travel. The greater the distance, the longer it takes to respond to the user. Longer delays may have been acceptable in years past, when applications were simpler, less mobile and less real-time, but when users expect instant gratification in a digital world, any delay is counter-productive. Even minor delays in certain real-time applications – such as self-driving cars – could be catastrophic.
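To put rough numbers on that proportionality, the sketch below estimates round-trip propagation delay over optical fibre. The distances and the fibre speed (roughly two-thirds of the speed of light in a vacuum) are illustrative assumptions, not figures from the article, and real transactions add queueing, routing and processing time on top.

```python
# Illustrative only: constants below are assumptions, not article data.
C_FIBER_M_PER_S = 2.0e8  # approx. signal speed in optical fibre (~2/3 of c)

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay for one request/response."""
    one_way_s = (distance_km * 1000) / C_FIBER_M_PER_S
    return 2 * one_way_s * 1000  # convert to milliseconds

# A centralised data centre 2,000 km away vs. a hypothetical edge site 20 km away:
print(round_trip_ms(2000))  # ~20 ms of physics alone, before any processing
print(round_trip_ms(20))    # ~0.2 ms
```

The hundred-fold gap between the two figures is the whole case for the edge: no amount of server tuning in the distant data centre can remove delay imposed by distance.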
As a result, there are now literally millions of small computing cabinets in the back of retail stores, remote offices and bank branches, spread across campuses, or even at the base of cellular towers, that are quickly becoming full micro-zones of edge computing. These physically distributed micro-zones are being logically amalgamated into the bigger corporate processing function.
While it has been fairly common for remote sites to go offline from time to time in those historical deployments, the new digital business world must be always on and available, with edge sites exhibiting the same level of reliability as their centralised data centre counterparts. That means IT practitioners must manage these distributed sites with the same mission-critical operational practices they have applied for years in their centralised data centres.
Distributed or ‘edge’ computing is the new digital business platform. The various technologies required to create and aggregate distributed computing have matured, and the deployments of such technologies are well past the pilot stages. This modern edge-focused IT delivery strategy requires that organisations be able to actively manage, maintain and defend hundreds or thousands of these sites with the same prowess they have successfully demonstrated in centralised data centres.
So how should businesses look to ensure a strategy for resilience for their distributed infrastructure? Businesses must take a holistic approach, addressing overall resiliency at the service level rather than at the component level. From a hardware and data centre perspective, a multi-site hybrid data centre environment typically comprises enterprise, colocation and cloud data centres, provided by multiple providers. From a software perspective, applications and data run on whichever platform and location provides the best service to the customer for a given demand and time of day. In fact, it is quite common for workloads to migrate dynamically over the course of the day, delivering the business service transparently to the consumer.
From a strategic point of view, it is the continuous delivery of the business service that matters, and it is the understanding of all of the linkages and sub-components which each need to be optimised to realise that goal.
Uptime Institute has worked closely with 451 Research to develop a protocol that describes the considerations that are critical to delivering business services in a hybrid environment. There are five key components involved in the predictable delivery of business services at the right time and at the right cost:
1) the physical platforms
2) the connectivity between those platforms
3) the resilience of the data centres and colocation sites themselves
4) the application dependencies
5) the surrounding organisation and its ability to support this dynamic work-shifting environment
Since many of these components are no longer under the direct control of the CIO, a protocol is needed to ensure that the delivery of business services on this hybrid infrastructure remains closely aligned with the business needs themselves.
Uptime Institute’s methodology can be applied to any hybrid structure to assess its ability to deliver specific business services. Its Hybrid Resiliency Assessment is a structured review of the five key criteria, producing a metric for each as it relates to the delivery of a given business service. The assessment brings focus to deficiencies and other critical-path items that will affect the ability to deliver that service under various demand and stress conditions.
Ultimately, how well businesses manage the transformational challenge of building and maintaining truly distributed edge computing infrastructure will determine their viability as companies in the digital age.