Previously on Radlett Wire: two local firms have secured outline planning permission to build Europe’s biggest data centre right here in Hertsmere. We don’t know much more than that. For instance we don’t know who the final client is or who will use the facility once it’s live. Likewise, we have only the most basic information about the facility itself. If you know more than we do about any of this or if you have anything you’d like us to know about the development, leave a comment under this post or email DC01UK@radlettwire.co.uk.

So, in this, our second post about DC01UK (here’s the first, a detailed round-up of what we know so far), we’ll look at what we don’t know and what we should be keeping an eye on as this development proceeds. In the next post we’ll look at the bigger trends – in business, geopolitics and in AI – that might stop it from happening altogether. Later we’ll look at the reaction of the affected communities: the people of Potters Bar and the settlements around it who will have to put up with this huge construction project and its long-term operation.
Seven key measures
The data centre business is a mature one. The first dedicated facilities built in Britain specifically to host computer servers actually predate the Internet – they were built mainly for businesses in the City of London – and are now almost forty years old. Over the years the industry has developed a pretty complete set of standards and measurements (mostly simple ratios) that are used to compare facilities and to judge efficiency. As the development progresses we can use these ratios to work out whether DC01UK is doing a conscientious, thoughtful job or just chucking up a big dumb data barn. As neighbours and Hertsmere electors we think we have a right to hope that this development is clean, efficient and up-to-date – especially as the project appears to be a fait accompli in planning terms (national and local government have signed off on it and expect it to go ahead). At the moment, though, we’re going to have to rely on a fair amount of guesswork because they’ve told us hardly anything (next to each measure we’ve put an indication of how much we know now, in italics – hopefully this will change as the development proceeds).
- Electricity
PUE (power usage effectiveness) *basic information*
This is the biggie, obviously. Data centres draw more electricity than almost any other kind of industrial facility and they’re the fastest-growing power draw on the grid right now, almost everywhere. Recent developments (crypto and now AI) have accelerated this growth. The only useful datapoint we’ve got from DC01UK so far relates to electricity use and it’s a helpful one because we can use it to calculate some of the others. The original announcement from Hertsmere council says that the National Grid has reserved a 400MVA substation for the facility. That translates quite precisely into a peak power draw of 360MW. St Albans, for comparison (population 150,000), can draw about a third of that – 135MW. DC01UK is going to be huge.
- Water
WUE (water usage effectiveness) *unknown*
Data centres get very hot (put your hand round the back of your computer and feel the heat it’s putting out – now multiply that by millions) and they’re cooled these days using a mixture of conventional air-conditioning and water pumped through the servers. Although no figures for water use have been published, we can use the electricity figure to calculate that if it’s built to current cooling standards DC01UK will need about 250 million litres per year. That’s about 685,000 litres per day, a vast multiple of the water needed by the agricultural land it will be built on, to state the obvious. Eastern England is one of the most water-stressed parts of Britain and already experiences shortages in dry spells. ‘Zero-water’ data centre cooling systems exist – they recycle cooling water instead of just allowing it to evaporate. Can we expect a system like this from DC01UK?
- CO2
CUE (carbon usage effectiveness) *unknown*
This is a measure of how much CO2 is emitted per unit of IT energy consumed in the data centre. The lower the CUE the better the environmental performance. This is another factor for which we have no figures at all. The developers will have to calculate this number and they may already have done so. A CUE of 0.0 is possible but only if 100% renewable electricity is used. In a developed economy a figure of 0.2 is more likely. In the UK, where there’s no coal in the energy mix but where there’s still a lot of gas, the number is likely to be between 0.1 and 0.2. It’s going to be fascinating to see DC01UK’s figures once they’re public.
- Energy reuse
ERE (energy reuse effectiveness) *unknown*
This one’s closely related to the CO2 figure above and to the heat figure below. If energy is used efficiently and not wasted – if heat from CPU cooling is reused elsewhere in the building, for instance – the CUE will come down. 1.0 is the worst score here: it means none of the facility’s waste energy is recovered. A number lower than that indicates greater efficiency.
- Renewable energy
REF (renewable energy factor) *unknown*
In Britain in 2025 (and certainly in 2030, when DC01UK is meant to be finished) there’s no reason why electricity for a single facility like this one shouldn’t come from 100% renewable sources. Likewise there’s no reason why electricity, once generated, shouldn’t be stored – either on-site or elsewhere – to improve efficiency and reduce cost. It’s another unknown but perhaps the presence of a green energy business in the DC01UK partnership might help here.
- Heat
HRE (heat reuse effectiveness) *unknown*
In Britain we tend to heat our homes individually, using gas boilers. Elsewhere, especially in Europe, buildings or whole districts are heated from a single source. So it’s probably a bit unrealistic to expect the spare heat produced by DC01UK to be used to heat homes in South Mimms and Potters Bar. It’ll probably just be vented to the air. Heat reuse effectiveness (HRE) is a simple percentage – what proportion of the heat generated is re-used? The most efficient industrial heat sources in the world, including some data centres, achieve HREs above 50% – less than half of their waste heat is thrown away. Unless the DC01UK developers have a surprise in store for us – perhaps a groundbreaking plan to heat local housing or greenhouses – we expect this to be a very low number: somewhere between 1 and 5%.
- Land
LUE (land use efficiency) *basic information*
This is another calculation that’s fairly easy to do, so we can make a decent guess at how efficiently this data centre will use the land allocated to it. Dividing the published figure for the power draw of the DC01UK data centre (360MW) by its area (93,000 square metres) and assuming a 75% usage rate gives us just below 3 kW/m², a measure which puts it in the high-efficiency category for buildings in general but somewhere in the middle for a data centre. It will be fascinating to learn just how ambitious the developers intend to be.
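For readers who want to check our arithmetic, here’s the back-of-envelope calculation behind the figures above in a few lines of Python. The 0.9 power factor and the 75% utilisation rate are our assumptions, not published figures:

```python
# Back-of-envelope checks for the figures quoted above.
# Assumptions (ours, not DC01UK's): a typical 0.9 power factor for the
# MVA-to-MW conversion, and 75% average utilisation against peak draw.

GRID_CONNECTION_MVA = 400      # reserved substation capacity (Hertsmere council)
POWER_FACTOR = 0.9             # assumed
peak_draw_mw = GRID_CONNECTION_MVA * POWER_FACTOR
print(f"Peak power draw: {peak_draw_mw:.0f} MW")                      # 360 MW

WATER_PER_YEAR_LITRES = 250e6  # our estimate, based on current cooling standards
print(f"Daily water use: {WATER_PER_YEAR_LITRES / 365:,.0f} litres")  # ~685,000

FLOOR_AREA_M2 = 93_000         # published area
UTILISATION = 0.75             # assumed
lue_kw_per_m2 = peak_draw_mw * 1000 * UTILISATION / FLOOR_AREA_M2
print(f"Land use efficiency: {lue_kw_per_m2:.2f} kW/m^2")             # ~2.90
```

Change the assumptions and the outputs move accordingly – which is exactly why we’d like the developers to publish their own numbers.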
There’s an irony in the fact that the best possible way to cool a huge computing facility like DC01UK would be, well, to build it somewhere else – somewhere really cold, to be specific – somewhere like Northern Norway or Alaska. Cooling is then effectively free: you just find a way to let the cold air from outside in (and once you’ve warmed it up you make sure you’re not just venting it back to the outside but using it to heat the housing estate or the university next door). And it doesn’t need to be in the Arctic – simply moving your data centre seven or eight degrees North – say from Potters Bar to Stockholm – would reduce the cooling costs of a 360MW data centre by 100,000–150,000 MWh per year.
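To put that saving in context: on our assumptions (360MW peak draw, 75% utilisation – neither is a published figure), 100,000–150,000 MWh works out at roughly 4–6% of the facility’s total annual electricity consumption:

```python
# What the quoted cooling saving means as a share of total annual draw.
# Assumes a 360 MW peak and 75% average utilisation (our assumptions).

PEAK_MW = 360
UTILISATION = 0.75
HOURS_PER_YEAR = 8760

total_mwh = PEAK_MW * UTILISATION * HOURS_PER_YEAR   # 2,365,200 MWh per year
for saving_mwh in (100_000, 150_000):
    share = saving_mwh / total_mwh
    print(f"{saving_mwh:,} MWh saved = {share:.1%} of annual draw")
# 100,000 MWh saved = 4.2% of annual draw
# 150,000 MWh saved = 6.3% of annual draw
```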
A data centre is a very 21st-century beast. Once it’s built, most people won’t know it’s there and it’ll hum away on the horizon, behind its careful landscaping, relatively inoffensively. And since all of its critical inputs and outputs will be travelling over fibre-optic and copper cables buried in the ground, there’ll be no queue of 40-foot trucks, no rail siding, no enormous car park. But we shouldn’t be fooled into thinking that DC01UK is going to have no impact. Its need for electricity alone will be vast – amounting to almost 1% of the whole country’s demand for power (this is our estimate – we’ve calculated an annual draw for DC01UK of 2.37TWh at a relatively conservative 75% utilisation; the UK’s demand in 2023 was 266TWh).
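Here’s how we arrived at that “almost 1%” estimate (again, the 75% utilisation rate is our assumption):

```python
# Our estimate of DC01UK's annual electricity draw as a share of UK demand.
# 360 MW peak is derived from the published 400MVA grid connection;
# the 75% utilisation rate is our (relatively conservative) assumption.

PEAK_MW = 360
UTILISATION = 0.75
HOURS_PER_YEAR = 8760
UK_DEMAND_TWH = 266            # UK electricity demand, 2023

annual_twh = PEAK_MW * UTILISATION * HOURS_PER_YEAR / 1e6
print(f"Estimated annual draw: {annual_twh:.2f} TWh")           # 2.37 TWh
print(f"Share of UK demand: {annual_twh / UK_DEMAND_TWH:.1%}")  # 0.9%
```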
A data centre is the ultimate globalised business. Once it’s switched on billions of transactions will take place every day in that inconspicuous high-tech shed next to the M25, connecting people, businesses and machines in every corner of the world.
- In a difficult period for the world economy, data centres are one of the few growth areas, so a bit of Googling will yield plenty of freely available data and background information from governments, NGOs and consultancies:
Energy trends in the UK (PDF) – Department for Energy Security and Net Zero.
Energy Consumption in Data Centres and Broadband Communication Networks in the EU (PDF) – European Commission.
Data Centres and Data Transmission Networks – International Energy Agency.
UK Government Sustainable ICT – a UK government blog – a few years old but some useful background on the issues.
The energy conundrum – a report from Savills, a property consultancy.
- When placing a data centre, it’s not just the average temperature that matters. Humidity is also important. The drier the climate the easier it is to cool the facility using evaporative cooling. This explains why enormous data centres can still be viable even when built in extremely hot locations like the high desert of Arizona. You won’t need us to tell you that the climate in South Mimms will not permit much evaporative cooling.
- The DC01UK web site is full of data but, understandably, none of it is about the environmental impact or the resource demands. It’s all about the contribution the facility will make to the local economy.
- Bookmark this page to keep up with our updates on DC01UK. If you’re one of those weirdos who still uses an RSS reader, follow our feed or this Google news alert (RSS explained). We’ll almost certainly be tweeting about this of course. Email DC01UK@radlettwire.co.uk.