Open Compute Project’s John Laban says a ‘hidden revolution’ is set to dominate the data centre sector. But does the industry know it is coming? Louise Frampton reports
There is an “unknown world” beyond the walls of the enterprise data centre and experts claim that operators are falling behind the times – missing out on huge savings in cost and energy. John Laban from the Open Compute Project believes the sector is being held back by a lack of awareness of the technologies being used by Facebook and other leaders in the hyperscale space.
Speaking to delegates at Data Centres Ireland, he warned: “The industry you are in belongs to the last century – things are changing. It is all about being collaborative… An unknown world is about to eat you alive.”
The revolution Laban refers to is the Open Source movement. One of the main drivers behind this approach has been social media giant Facebook.
Following rapid growth in its popularity, Facebook realised that it had to rethink its infrastructure and set out to design the world’s most energy efficient data centre. The decision was taken to design and build the company’s infrastructure from the ground up, including software, servers, racks, power supplies and cooling. The result was a facility that was 38% more energy efficient and 24% less expensive to build and run.
In 2011, Facebook shared its designs with the public and, along with Intel and cloud computing company Rackspace, investment bank Goldman Sachs and billionaire Andy Bechtolsheim, launched the Open Compute Project.
Today, the social media company has become a pioneer in technology within the data centre space, looking at areas such as virtual and augmented reality, artificial intelligence and connectivity.
“Our scale is huge – we have to serve billions of people around the world using our platforms; so we have to be reliable,” comments Niall McEntegart, Facebook’s data centre site operations director (EMEA).
“We provide free services, and we intend to keep them that way; so we have to run extremely efficient and cost-effective infrastructure or we would go out of business.”
McEntegart explains that this was a key driver for the Open Compute Project: “We realised our approach in the data centre space just wasn’t going to cut it long term and keep up with our pace of growth.”
The company needed to take control of its own destiny by redesigning its servers, cooling and power. Today, its PUE (power usage effectiveness) is 1.07, compared with an industry average of 1.5.
“A lot of the low-hanging fruit for PUE has now gone,” comments McEntegart. “There is only so much efficiency you can squeeze out of your infrastructure unless you start doing onsite generation. However, there are still opportunities around how you use your assets in the data centre.
“There is a lot of focus around power that is coming off the grid versus what you are using as your IT load. But the really useful work is focused on what the IT load is actually doing.
“Production facilities around the world have talked about ‘sweating their assets’ for years. This is where the next great opportunity is – it is about how we sweat our servers and network to do as much useful work as they can. This is both for the hardware and the compute side, but also in terms of the applications. There are huge opportunities for efficiency gains.”
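The PUE figures quoted above follow from a simple ratio: total power entering the facility divided by power reaching the IT equipment. A minimal sketch, using hypothetical loads chosen only to reproduce the 1.07 and 1.5 values quoted in the article:

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt entering the building reaches the IT load.
# The kW loads below are hypothetical, picked to match the article's figures.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

it_load_kw = 10_000  # hypothetical IT load

facebook_pue = pue(10_700, it_load_kw)  # 1.07, as quoted for Facebook
average_pue = pue(15_000, it_load_kw)   # 1.5, the quoted industry average

# At 1.07, overhead (cooling, power conversion, lighting) is 7% of the IT
# load; at 1.5 it is 50% - the average site draws far more for the same work.
overhead_saving_kw = (average_pue - facebook_pue) * it_load_kw
print(f"Overhead saved at the same IT load: {overhead_saving_kw:,.0f} kW")
```

This also illustrates McEntegart’s point: once PUE approaches 1.0 the overhead term shrinks towards zero, and further gains have to come from the IT load itself.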
In a few short years, Facebook went from breaking ground on its first data centre to becoming a company with an enormous infrastructure footprint around the world. This includes a significant footprint in Ireland.
In November 2018, the company announced plans to expand its investment in the region by acquiring a long-term lease of 5.6ha in Ballsbridge for a new campus development at the Bank Centre. This also followed the construction of Facebook’s €300m data centre park in Clonee, Co Meath, which opened in September 2018.
This rapid expansion has been supported through efficiency gains and Facebook estimates that, since using OCP hardware and designs, it has saved enough electricity to power 287,000 homes for a year.
Renewables and OCP
Facebook is now combining the efficiencies gained through OCP with a strategy around the use of renewable power. Since its first purchase of wind power in 2013, Facebook has signed contracts for more than 3GW of new solar and wind energy, including more than 2,500MW in one year (for the period of August 2017 to August 2018).
In 2015, the company set a goal of supporting 50% of its facilities with renewable energy by 2018. It achieved this goal a year early, reaching 51% clean and renewable energy in 2017.
“By 2020, we plan to use 100% renewables for our data centres and corporate offices. We are not just buying credits, we are actively investing in new renewables,” comments McEntegart.
In May 2018, Facebook announced it had invested in 300MW of new wind energy in the Nordics market. Wind power was chosen to supply 100% of Facebook’s data centre in Odense, Denmark, and a portion of its campus in Luleå, Sweden. Approximately 70 turbines, each rated at 4.2MW, will provide 294MW of new renewable capacity to the latter this year.
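The Luleå capacity figure is straightforward arithmetic on the turbine count and rating; a quick check, using only the numbers reported above:

```python
# Sanity-checking the quoted Luleå figures: roughly 70 turbines at 4.2MW each.
turbines = 70
rating_mw = 4.2
capacity_mw = round(turbines * rating_mw, 1)
print(f"{capacity_mw} MW")  # 294.0 MW, matching the quoted 294MW of new capacity
```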
The company’s Clonee facility is also powered 100% by renewable wind energy and the company is considering other ways of promoting a sustainable future for data centres in the region.
“We are looking at a community heating scheme and the intention is to heat 7,000 homes, for free, in Ireland,” says McEntegart.
While the combination of OCP technology and investment in renewable energy is part of Facebook’s plan for sustainable growth, Laban reported that the data centre sector exhibited poor awareness of OCP strategies. This is despite forecasts that OCP server technology alone will grow from a 20% share of the server market in 2016 to over 80% by 2025.
“The market is growing extremely fast – it dominates the hyperscale space, it is moving into the telco space and now it is moving into the enterprise space. It will have a radical impact on the data centre market,” he said.
OCP approaches are claimed to result in 50% lower capex, compared with traditional Tier III data centres, and a 50% reduction in opex. Energy use is also 50% lower, according to OCP Foundation calculations.
“This is why the hyperscalers are interested – shaving off cost is a no-brainer,” Laban continued.
He went on to explain how colo data centres will be disrupted by CORD (Central Office Re-architected as a Datacentre) and described the impact that this will have on the current status quo.
“Telcos are taking the hardware solutions of the hyperscalers and rolling this out into telephone exchanges. To get an idea of the scale, in the US, AT&T is converting 4,700 telephone exchanges. This is because they are turning everything into a virtualised facility; it will save a fortune.
“AT&T is seeing a 70% reduction in capex by using the technologies that the hyperscalers have been using since 2011. The opportunities are huge and the telcos cannot move fast enough,” said Laban.
The world’s telephone exchanges are now turning into data centres, with 100,000 installations already scheduled over the next three to four years.
Companies involved in Open Source CORD conversion are growing at a rapid pace and there have been some high-profile business deals in Open Source technology in the past six months – Microsoft purchased GitHub, described as “the largest Open Source community in the world”, and has also begun sharing its patents. The intention is to accelerate innovation through collaboration.
According to Laban, Nokia’s new portfolio is based around OCP hardware, while Cisco’s recent acquisitions have all been focused on software businesses.
“This is because the technology is eating them alive,” said Laban. “You need to track what is happening.”
Laban warned the sector that “whatever market open source enters it eventually dominates. If you look at the efficiency of an OCP data centre, it makes all the traditional enterprise data centres look pathetic.
“This is why they are closing at 10% per annum… by the end of this decade, 50% of all the world’s servers will be Open Source.”