Ian Bitterlin examines whether data centres are ready for demand-side response and considers the risks
Demand-side response (DSR) and Short Time Operating Reserve (STOR) are energy schemes that support the utility in times of pressure on capacity. Such schemes are a consequence of the denationalisation of the power system and the reduction in spinning reserve, the aim being to avoid an unplanned peak in demand requiring an idle power station to be brought on line at very short notice. In its simplest form the idea is that local generation capacity, such as standby diesel sets, can provide additional supply or reduce demand.
The fact that carbon emissions are increased locally is a downside but, in theory at least, it avoids (in carbon-dominated grids such as the UK) the greater carbon emissions involved in starting and stopping a large power station. Smaller, local, generation can certainly be activated very quickly compared with a large turbine, typically sub-minute compared with an hour or more – which is valuable if rapid support is required.
To put it into context, reciprocating-engine diesel generation produces about 860gCO2/kWh, compared with the UK grid’s annual average of circa 350gCO2/kWh at the 11kV distribution point – so the penalty is not inconsiderable.
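As a rough sketch of that penalty, using only the two emission figures quoted above (both approximations, and the example load is hypothetical):

```python
# Rough carbon-penalty estimate for serving a load from standby diesel
# rather than the grid. Figures are the approximations quoted in the text.
DIESEL_G_PER_KWH = 860  # reciprocating diesel generation, gCO2/kWh
GRID_G_PER_KWH = 350    # UK grid annual average at 11kV, gCO2/kWh

def extra_tonnes_co2(load_mw: float, hours: float) -> float:
    """Additional tonnes of CO2 from diesel versus grid for this load and duration."""
    kwh = load_mw * 1000 * hours
    return kwh * (DIESEL_G_PER_KWH - GRID_G_PER_KWH) / 1e6  # grams -> tonnes

# Example: a 5MW site running 100 hours in a year on diesel
print(extra_tonnes_co2(5, 100))  # 255.0 extra tonnes of CO2
```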
We are burning less coal than ever before, on many days close to zero, and the switch to natural gas has largely contributed to a very much cleaner utility. Since nearly all data centres in the UK have standby emergency generation, and the UK’s installed data centre power capacity probably represents some 45% of Europe’s total, they have attracted attention for the application of STOR/DSR for several years, albeit so far without widespread adoption.
In this respect it is also worth noting that the UK has no ‘hyperscale’ facilities, such as Google or Facebook, which generally do not have facility-wide standby generation, and so our diesel generation capacity is higher than countries with such hyperscale installations.
So, why have such schemes not found widespread support? It is complicated…
The data centre operator can participate by signing an agreement with the regional utility distribution company for it to be able to call upon the facility to reduce demand. This has, in the historical case of STOR contracts, been compensated by a fixed annual fee per MW (eg £25k/MW) plus an enhanced payment for electrical energy avoided (eg 30p/kWh) and limited to a number of hours per year (eg 100h). So, a medium-sized London data centre of 5MW grid connection would be paid £125K per annum just for the ‘commitment’ and up to an additional £150K in one year for the energy demand avoided.
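The arithmetic behind those example figures (£25k/MW commitment, 30p/kWh, 100h annual cap) is simple enough to sketch:

```python
# Back-of-envelope STOR revenue, using the example tariff figures from the text.
def stor_revenue(connection_mw: float,
                 fee_per_mw: float = 25_000,    # £/MW/year commitment fee
                 pounds_per_kwh: float = 0.30,  # enhanced payment for energy avoided
                 hours_cap: float = 100):       # maximum callable hours per year
    commitment = connection_mw * fee_per_mw
    energy_max = connection_mw * 1000 * hours_cap * pounds_per_kwh
    return commitment, energy_max

commitment, energy_max = stor_revenue(5)
print(f"commitment £{commitment:.0f}, energy (max) £{energy_max:.0f}")
# commitment £125000, energy (max) £150000
```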
It may sound like a lot of revenue but we should look at it in context with the business flowing through and supported by the average 5MW UK data centre, which could be more than £5bn – and much, much more for a finance/banking operation.
This calculation highlights one of the highest barriers to the data centre signing up for STOR/DSR – any risk (which we will explore further) is not rewarded enough when the value of the data centre compute/storage load is considered. Failure of the data centre might involve loss of data, revenue, reputation or clients and one or more of these may negate any financial gain from STOR/DSR.
There are now three kinds of ‘support’ that a data centre can offer the grid.
The simplest, and with perhaps the least operational risk, is ‘islanding’. This is where the data centre disconnects from the utility and runs on its generators for the period the utility requires. This has an advantage for the data centre in that it is good for the gensets to run at load.
However, this is only true if each engine is loaded to more than 30% (to avoid longer-term service problems), and herein lies the main problem with islanding: most enterprise and colocation facilities run at partial load, so the demand reduction provided to the utility may, for example, be only 30-35% of the connection capacity.
The more complex STOR/DSR solution, with higher support provision but increased operational risk, is to run the generators in parallel with the utility and backfeed the surplus energy. In the previous condition of 30-35% partial load, this may produce 65-70% of the utility connection capacity. Any N+1 redundant generation capacity on site cannot be used, as the distribution transformer and cabling to site cannot carry the load. While the attraction might be increased revenue, the risk cannot be ignored.
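The difference between the two modes at partial load can be sketched as follows; this is a hypothetical example that assumes usable genset capacity equal to the utility connection, with any N+1 redundant set excluded as described above:

```python
# Support offered to the grid by a partially loaded site, in the two modes
# described above. Assumes usable genset capacity equals the connection
# capacity, with any N+1 redundant set excluded.
def islanding_support_mw(connection_mw: float, load_fraction: float) -> float:
    # Islanding: the grid is relieved of the site's own demand only
    return connection_mw * load_fraction

def backfeed_support_mw(connection_mw: float, load_fraction: float) -> float:
    # Parallel back-feed: the surplus genset capacity is exported
    return connection_mw * (1 - load_fraction)

# A 5MW connection running at 35% load
print(round(islanding_support_mw(5, 0.35), 2))  # 1.75 MW of demand removed
print(round(backfeed_support_mw(5, 0.35), 2))   # 3.25 MW exported
```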
If the generators are back-feeding the utility and the utility has a fault (eg a cable burst failure or a distribution transformer winding short-circuit), there is a high probability that the generators will attempt to feed the short-circuit current. However, due to the high impedance of most commercial gensets, the protection relays will trip the output circuit breaker – leaving the facility with no utility feed and no generator supply in reserve. Then it is a race between a cooling alarm in the data centre starting to shut down servers and someone being able to reset the generator supply.
If damage has been caused to the generators themselves or their switchgear, then it is ‘good-night Vienna’.
Both islanding and back-generation use existing technology already installed in most UK facilities, such as G59 protection for paralleling with the utility. No new technology is required, although G59 is usually applied for a few seconds rather than several tens of hours.
The latest kid on the block is the UPS feature of discharging the UPS standby batteries through the input rectifier back into the utility. This is limited to the battery autonomy, which has been gradually reducing towards five minutes and, with the latest data-centre-rated ‘short run-time, ultra-thin pure-lead plate VRLA’ batteries, could easily be one minute. This carries the same risk in that a failure in the utility could disconnect, or even damage, the UPS, and the restoration time may be insufficient. The conservative nature of most UK enterprise and colocation facilities probably means widespread adoption is very far off.
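How little energy such short autonomies represent is easy to illustrate. This is a hypothetical calculation; real export would be further limited by the rectifier rating and by keeping a reserve margin for the protected load:

```python
# Energy a UPS could export if its whole battery autonomy were discharged
# at rated load. Hypothetical figures; a real scheme would keep a reserve.
def ups_export_kwh(load_mw: float, autonomy_minutes: float) -> float:
    return load_mw * 1000 * autonomy_minutes / 60

# A 5MW site at the two autonomies mentioned in the text
print(round(ups_export_kwh(5, 5)))  # ~417 kWh at five minutes' autonomy
print(round(ups_export_kwh(5, 1)))  # ~83 kWh at one minute
```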
But first we should try to establish the scale of the support that UK data centres could offer the utility. The UK’s ICT load is estimated to consume about 10% of utility supply, in the order of 3.5GW. Of that, about a third can be attributed to data centres – about 1.2GW.
For reasons of return on investment and practicality with the electrical protection system, STOR/DSR schemes usually apply to individual site loads of 2MW and higher. Of that 1.2GW, perhaps only 50% is above that threshold. So, the maximum opportunity is 600MW, which is only about a third of the capacity of Dinorwig, the UK’s pumped-storage hydroelectric scheme in Wales. Dinorwig can supply a maximum of 1.73GW and its sole purpose is to provide a fast response to short-term rapid changes in power demand.
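The estimate chain above can be reproduced in a few lines; every input is one of the approximate figures already quoted:

```python
# The capacity-estimate chain from the text; all inputs are approximate.
ict_load_gw = 3.5        # ~10% of UK utility demand attributed to ICT
dc_share = 1 / 3         # data centres' share of the ICT load
above_2mw_share = 0.5    # fraction of that load on sites of 2MW and above
dinorwig_gw = 1.73       # Dinorwig's maximum output

opportunity_mw = ict_load_gw * dc_share * above_2mw_share * 1000
print(round(opportunity_mw))  # ~583 MW, i.e. the ~600MW quoted
print(round(opportunity_mw / (dinorwig_gw * 1000), 2))  # ~0.34 of Dinorwig
```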
Of course, not all 600MW of UK data centre capacity is suitable for, or would join, such a scheme. One suitability issue is that most UK facilities are provisioned with standby-rated gensets (ESP, Emergency Standby Power to ISO 8528), which should be run at less than 70% average load over any 24-hour running period and for no more than 200h/year.
Even the ‘best’ facilities are only provisioned with prime-rated gensets (PRP, Prime Rated Power), which should likewise be run at less than 70% average load over any 24-hour running period, albeit with unlimited running hours per year. There are also suitability issues around on-site fuel storage, ie do you have enough to meet the STOR requirements without refuelling, 24/7/365?
Going against the grain, I know of one very large multi-MW facility in the UK that has participated in STOR for nearly 10 years and, I am informed but can’t guarantee, it has never been asked to start its engines. One reason for that could be, like Mark Twain’s quip “reports of my death are greatly exaggerated”, that we have not been at real risk of the ‘rolling blackouts’ forecast for the past decade. We should note that domestic consumption has steadily fallen and heavy industry is a weak shadow of its former self, resulting in more than a 20% reduction in load in the very recent past.
That raises two questions: can UK data centres effectively provide enough STOR/DSR, and would they if they could?
I would argue that the likely capacity that could be provided is not enough to make any meaningful difference to the utility and it is the government’s job to set adequate requirements for supply-side response. Like another Dinorwig?
On the data centre side, it is clear that one sentiment overrides nearly everything else: “Our gensets are there to protect us from, and not support, the vagaries of the utility, and when the utility needs such help, that is precisely the time when we need them to be available only to protect the business.”