DC01UK – Radlett Wire
https://radlettwire.co.uk
For over ten years – since way before everything went weird – we’ve been droning on about Radlett and the Hertsmere Parliamentary constituency.

DC01UK: data centre drama
https://radlettwire.co.uk/2025/04/dc01uk-data-hold-ups/ – Fri, 04 Apr 2025

It’s a few months since we learnt that Europe’s biggest data centre might be built here in Hertsmere – what could stop it from happening?

In part three of our DC01UK deep dive we’ll look at the various obstacles that must be overcome before it goes live on the Internet in 2030.

Five ducklings in a row on a white table, walking across the frame right to left.

It might fall at the first hurdle. The scheme has outline planning permission from Hertsmere Borough Council so the developers must get their ducks in a row and submit a final plan. If we’re honest, though, this doesn’t look like a major concern: the council has given the project its enthusiastic backing and the UK government has cleared the way by adding data centres to the list of developments that can be defined as nationally significant infrastructure projects, alongside energy, transport, water, waste water, and waste projects (Technology Secretary Peter Kyle has even mentioned DC01UK in a speech). Unless something else goes wrong, the project looks all but guaranteed to happen. So what else could go wrong?

Hertsmere MP Oliver Dowden stands in a field with a group of Potters Bar residents who are objecting to the construction of a new data centre in the area.

The neighbours might object. As with any big development – especially one planned for land that is 100% in the green belt – local people are upset about DC01UK and have begun a campaign. Sadly for the locals, though, this scheme is going to be very hard to stop. As we said in an earlier post, this is the kind of land Angela Rayner calls ‘grey belt’ and even the green belt lobby seems to have given up. Our MP has met with the local campaigners but it doesn’t sound like he was able to give them much hope: “I encourage residents to submit their own views on the matter directly to the council,” he says. The fact that the developers plan to leave half the land as green open space and have promised substantial enhancements to the local environment will not help the opposition’s cause.

Blue whale logo and logotype for Chinese AI firm Deepseek

The demand might not be there. The trigger for DC01UK – and hundreds of projects like it all over the world – was the massive surge in demand for data centre capacity that we wrote about in our first post, almost entirely the product of the AI and machine learning revolution – apparently an unstoppable and unarguable fact of the modern world. But the launch, only two months ago, of a new large language model (LLM) from a Chinese firm called Deepseek suggests the direction for AI might not be quite as ‘up and to the right’ as investors had hoped. Deepseek wasn’t supposed to be possible. American sanctions have stopped the sale to Chinese firms of the latest versions of the specialist chips needed to train and run serious LLMs. Deepseek was trained on chips from the top manufacturer NVIDIA, but these chips had been deliberately downgraded so as to slow Chinese progress in AI. That a group of brilliant computer scientists was able to coax top-tier AI performance from second-tier hardware suggests that this might not be the brute-force business we thought it was to begin with.

The Deepseek engineers made such resourceful use of the hobbled chips’ capacity that they were able to get around the punitive sanctions regime and keep China in the AI game. And, more to the point, if one Chinese firm can make more efficient use of AI hardware than the American giants, then so can anyone. Suddenly the AI game doesn’t look so one-sided and the soaring demand for newer and faster hardware doesn’t look so nailed-on. If more really can be done with less, then maybe the world doesn’t need the vast additional computing and data centre capacity that’s now being built. So do we think that the spreadsheets that justified DC01UK’s grand plans have been dragged to the trash? No, we don’t. The underlying growth in demand for the kind of cloud services that run in data centres like this one is unabated – and most of it has no need of LLMs – but has the gloss come off the AI data centre business? Just a bit. We’d like to have been a fly on the wall in a post-Deepseek DC01UK planning meeting, that’s for sure.

A vivid, graphical splash of water moving left to right through the air

It’s the water, stupid. In our previous post we wrote about the extraordinary demands that a data centre on this scale makes of resources like electricity (to power the servers) and water (for cooling). The power is, apparently, already sorted. The water, though, may not be so straightforward. Our region, the East of England, is already classified as ‘severely water stressed’ and environmental groups calculate possible daily shortages of up to 800,000 litres by 2050. We’ve calculated that DC01UK alone will need 250 million litres per year (roughly 660,000 litres per day) to keep its servers cool. Where will this vast quantity of water come from? In fairness to the developers, they may be considering an approach to DC01UK that doesn’t need any water at all – at least not after the initial top-up. It’s possible to build a server farm with a ‘closed’ cooling system that recycles the cooling water used – condensing it after it’s evaporated and pumping it back through the system (Microsoft is testing this approach). It’s not easy, though, and you need to engineer your data centre from the ground up to take advantage of this approach, pumping water right through the computers to cool the chips directly. There are even more advanced solutions – like the one from Google’s parent company Alphabet that sites a direct air capture facility next door to your data centre, producing CO2 to be stored forever underground and clean water that can be used to cool the servers. Magic. But very expensive. And a relatively small firm like DC01UK probably doesn’t want to be adding cost to a low-margin business like a data centre if it doesn’t have to. Where will the new data centre get its water? And, as shortages bite, will DC01UK just dry up altogether?
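
The water arithmetic is easy to check from the figures quoted above. A quick sketch (our arithmetic, using the article’s own numbers – nothing here comes from the developers):

```python
# Sanity check on the water figures quoted above.
ANNUAL_LITRES = 250_000_000          # the article's annual estimate for DC01UK
SHORTFALL_2050 = 800_000             # projected regional daily shortfall (litres)

daily_litres = ANNUAL_LITRES / 365   # ~685,000 L/day, in the same ballpark as
                                     # the ~660,000 figure quoted in the text
share_of_shortfall = daily_litres / SHORTFALL_2050

print(f"Daily draw: {daily_litres:,.0f} litres")
print(f"Versus projected 2050 daily shortfall: {share_of_shortfall:.0%}")
```

On these numbers the site’s daily draw amounts to most of the region’s entire projected 2050 shortfall – which is why the water question matters.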

Black dog running across a meadow full of flowers

The money. Obviously. The companies behind DC01UK are not the final operators and won’t be funding the project. Can they guarantee the billions of pounds necessary to get a fitted-out, ready-to-launch DC01UK to market in 2030? Of course not. So this comes down to who is actually providing the money and to the absolute forest of unknowns – global recession, soaring borrowing costs in the UK, a tech crash that crushes demand – that might bring the thing to a grinding halt and leave that nice dog-walking field in South Mimms just as it is now.

US President Donald Trump walks toward the camera holding up his hand in front of him, perhaps in greeting. Alongside him are many flags, including the Stars and Stripes.

Then there’s Donald Trump. This one’s tricky. Will a new worldwide trade framework be a good thing or a bad thing for a UK data centre? We don’t think anyone knows right now and there are many contradictory factors. We think, though, that this global re-arrangement could actually be a good thing for DC01UK – and for firms like it outside the USA. Services – like those provided by DC01UK to big corporations – are not covered by the new American tariffs. That has to be a good thing in itself: the biggest national customer for data centre services is the United States by about a mile and worldwide data centres will be able to continue selling their capacity to American firms on the current terms. Since the physical location of a data centre is important – you want your servers to be close to your customers to reduce latency – local facilities like DC01UK will continue to be important.

The hardware that goes into data centres is, as you’d expect, mostly made in the Far East. NVIDIA’s AI chips, for instance, are made in Taiwan – and US tariffs have been applied there. It is possible that the hike in price for these products in the USA could be beneficial to firms in the rest of the world – if there’s a sudden oversupply of hardware made in China, South Korea, Taiwan and Japan, a glut of unsold kit could cause prices to drop here. Fitting out DC01UK could turn out to be cheaper than planned. This fear of ‘dumping’ by lower-cost countries is putting the fear of God into UK manufacturers but, since almost no computer hardware is made in Britain, it’s not much of a concern here. Although it’ll obviously be years before any computers are purchased for DC01UK, it’s just possible that the geopolitical chaos unleashed by Donald Trump will turn out not to be an obstacle at all but, in fact, an advantage.


DC01UK: a demanding new neighbour
https://radlettwire.co.uk/2025/03/dc01uk-a-demanding-new-neighbour/ – Tue, 18 Mar 2025

Previously on Radlett Wire: two local firms have secured outline planning permission to build Europe’s biggest data centre right here in Hertsmere. We don’t know much more than that. For instance, we don’t know who the final client is or who will use the facility once it’s live. Likewise, we have only the most basic information about the facility itself. If you know more than we do about any of this, or if you have anything you’d like us to know about the development, leave a comment under this post or email DC01UK@radlettwire.co.uk.

Aerial view of the proposed DC01UK data centre at South Mimms in Hertfordshire. Three large, rectangular buildings are shown. Labels read: 'site entrance', 'emergency access' and 'woodland'. A red outline indicates the extent of the site.

So, in this, the second post in our DC01UK deep dive, we’ll look at what we don’t know and what we should be keeping an eye on as this development proceeds. In the next post we’ll look at the bigger trends – in business, geopolitics and in AI – that might stop it from happening altogether. Later we’ll look at the reaction of the affected communities: the people of Potters Bar and the settlements around it who will have to put up with this huge construction project and its long-term operation.

Seven key measures

The data centre business is a mature one. The first dedicated facilities built specially to host computer servers in Britain actually predate the Internet – built mainly for businesses in the City of London – and are now almost forty years old. The industry, over the years, has developed a pretty complete set of standards and measurements (mostly simple ratios) that are used to compare facilities and to judge efficiency. As the development progresses we can use these ratios to work out whether DC01UK is doing a conscientious, thoughtful job or just chucking up a big dumb data barn. As neighbours and Hertsmere electors we think we have a right to hope that this development is clean, efficient and up-to-date – especially as the project appears to be a fait accompli in planning terms (national and local government have signed off on it and expect it to go ahead). At the moment, though, we’re going to have to rely on a fair amount of guesswork because the developers have told us hardly anything (next to each measure we’ve put an indication of how much we know now – hopefully this will change as the development proceeds).

  1. Electricity
    PUE (power usage effectiveness) – basic information
    This is the biggie, obviously. Data centres draw more electricity than almost any other kind of industrial facility and they’re the fastest-growing power draw on the grid right now, almost everywhere. Recent developments (crypto and now AI) have accelerated this growth. The only useful datapoint we’ve got from DC01UK so far relates to electricity use and it’s a helpful one because we can use it to calculate some of the others. The original announcement from Hertsmere council says that the National Grid has reserved a 400MVA substation for the facility. That translates quite precisely into a peak power draw of 360MW (400MVA at a typical power factor of 0.9). St Albans, for comparison (population 150,000), can draw about a third of that – 135MW. DC01UK is going to be huge.
  2. Water
    WUE (water usage effectiveness) – unknown
    Data centres get very hot (put your hand round the back of your computer and feel the heat it’s putting out – now multiply that by millions) and they’re cooled these days using a mixture of conventional air-conditioning and water pumped through the servers. Although no figures for water use have been published, we can use the electricity figure to calculate that, if it’s built to current cooling standards, DC01UK will need about 250 million litres per year. That’s about 660,000 litres per day – a vast multiple of the water needed by the agricultural land it will be built on, to state the obvious. Eastern England is one of the most water-stressed parts of Britain and already experiences shortages in dry spells. ‘Zero-water’ data centre cooling systems exist – they recycle cooling water instead of just allowing it to evaporate. Can we expect a system like this from DC01UK?
  3. CO2
    CUE (carbon usage effectiveness) – unknown
    This is a measure of how much CO2 is emitted per unit of IT energy consumed in the data centre. The lower the CUE the better the environmental performance. This is another factor for which we have no figures at all. The developers will have to calculate this number and they may already have done so. A CUE of 0.0 is possible but only if 100% renewable electricity is used. In a developed economy a figure of 0.2 is more likely. In the UK where there’s no coal in the energy mix but where there’s still a lot of gas, the number is likely to be between 0.1 and 0.2. It’s going to be fascinating to see DC01UK’s figures once they’re public.
  4. Energy reuse
    ERE (energy reuse effectiveness) – unknown
    This one’s closely related to the CO2 figure above and to the heat figure below. If energy is used efficiently and not wasted – if heat from CPU cooling is reused elsewhere in the building, for instance – the CUE will come down. An ERE of 1.0 is the worst score here: it means none of the facility’s waste energy is reused. A lower number indicates greater efficiency.
  5. Renewable energy
    REF (renewable energy factor) – unknown
    In Britain in 2025 (and certainly in 2030 when DC01UK is meant to be finished) there’s no reason why electricity for a single facility like this one shouldn’t be from 100% renewable sources. Likewise there’s no reason why electricity, once generated, shouldn’t be stored – either on-site or elsewhere – to improve efficiency and reduce cost. It’s another unknown but perhaps the presence of a green energy business in the DC01UK partnership might help here.
  6. Heat
    HRE (heat reuse effectiveness) – unknown
    In Britain we tend to heat our homes individually, using gas boilers. Elsewhere, especially in Europe, buildings or whole districts are heated from a single source. So, it’s probably a bit unrealistic to expect that the spare heat produced by DC01UK be used to heat homes in South Mimms and Potters Bar. It’ll probably just be vented to the air. Heat reuse effectiveness (HRE) is a simple percentage – what proportion of the heat generated is re-used? The most efficient industrial heat sources in the world, including some data centres, achieve HREs of above 50% – less than half of waste heat is thrown away. Unless the DC01UK developers have a surprise in store for us – perhaps a groundbreaking plan to heat local housing or greenhouses – then we expect this to be a very low number: somewhere between 1 and 5%.
  7. Land
    LUE (land use efficiency) – basic information
    This is another calculation that’s fairly easy to do, so we can make a decent guess at how efficiently this data centre will use the land allocated to it. Dividing the published figure for the power draw of the DC01UK data centre (360MW) by its area (93,000 square metres) and assuming a 75% usage rate gives us just below 3 kW/m², a measure which puts it in the high efficiency category for all buildings but somewhere in the middle for a data centre. It will be fascinating to learn just how ambitious the developers intend to be.
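
Most of the back-of-envelope numbers above can be reproduced from the two published figures – the 400MVA grid connection and the 93,000 m² site. A quick sketch (the power factor and utilisation rate are our assumptions, not the developer’s):

```python
# Back-of-envelope checks on DC01UK's published figures.
GRID_MVA = 400                 # reserved National Grid substation capacity
POWER_FACTOR = 0.9             # assumed typical MVA -> MW conversion factor
UTILISATION = 0.75             # assumed average load (75%, as in the text)
HOURS_PER_YEAR = 8_760
SITE_AREA_M2 = 93_000          # published site area
UK_DEMAND_TWH_2023 = 266       # UK electricity demand, 2023

peak_mw = GRID_MVA * POWER_FACTOR                              # 360 MW
annual_twh = peak_mw * UTILISATION * HOURS_PER_YEAR / 1e6      # ~2.37 TWh
uk_share = annual_twh / UK_DEMAND_TWH_2023                     # ~0.9%
land_kw_per_m2 = peak_mw * 1_000 * UTILISATION / SITE_AREA_M2  # ~2.9 kW/m2

print(f"Peak draw: {peak_mw:.0f} MW")
print(f"Annual energy: {annual_twh:.2f} TWh ({uk_share:.1%} of UK demand)")
print(f"Land use: {land_kw_per_m2:.1f} kW/m2")
```

These are the same figures quoted elsewhere in this post – useful as a sanity check when the developers finally publish real numbers.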

There’s an irony in the fact that the best possible way to cool a huge computing facility like DC01UK would be, well, to build it somewhere else – somewhere really cold, to be specific – somewhere like northern Norway or Alaska. Cooling is then effectively free – you just find a way to let the cold air in from outside (and once you’ve warmed it up you make sure you’re not just venting it back to the outside but using it to heat the housing estate or the university next door). And it doesn’t need to be in the Arctic – simply moving your data centre seven or eight degrees north – say from Potters Bar to Stockholm – would reduce the cooling costs of a 360MW data centre by 100,000–150,000 MWh per year.
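
One hedged way to arrive at a saving of that size is to treat free-air cooling as a drop in PUE (power usage effectiveness). The PUE figures below are our illustrative assumptions – nothing has been published:

```python
# Rough reproduction of the 100,000-150,000 MWh/year cooling-saving figure.
PEAK_MW = 360
UTILISATION = 0.75
HOURS_PER_YEAR = 8_760

annual_mwh = PEAK_MW * UTILISATION * HOURS_PER_YEAR   # ~2.37 million MWh

# Assume moving north shaves 0.04-0.06 off the facility's PUE
# (roughly: cooling overhead falling from ~10% of load to ~5%).
for pue_drop in (0.04, 0.05, 0.06):
    saving_mwh = annual_mwh * pue_drop
    print(f"PUE drop of {pue_drop:.2f}: ~{saving_mwh:,.0f} MWh/year saved")
```

A PUE drop of 0.04–0.06 yields roughly 95,000–140,000 MWh a year – consistent with the range quoted above.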

A data centre is a very 21st-century beast. Once it’s built, most people won’t know it’s there and it’ll hum away on the horizon, behind its careful landscaping, relatively inoffensively. And since all of its critical inputs and outputs will be travelling over fibre-optic and copper cables buried in the ground there’ll be no queue of 40-foot trucks, no rail siding, no enormous car park. But we shouldn’t be fooled into thinking that DC01UK is going to have no impact. Its need for electricity alone will be vast – amounting to almost 1% of the whole country’s demand for power (this is our estimate – we’ve calculated an annual draw for DC01UK of 2.37TWh at a relatively conservative 75% utilisation, and the UK’s demand in 2023 was 266TWh).

A data centre is the ultimate globalised business. Once it’s switched on billions of transactions will take place every day in that inconspicuous high-tech shed next to the M25, connecting people, businesses and machines in every corner of the world.


DC01UK: Hertsmere steps into history
https://radlettwire.co.uk/2025/02/hertsmere-steps-into-history/ – Tue, 04 Feb 2025

Well, possibly. Data centres are the cotton mills and steel foundries of our day: vast, industrial-scale facilities right at the heart of the fourth industrial revolution. And one might be built right here, next to the motorway in South Mimms.

One of DC01UK's visualisations of their proposed data centre in South Mimms, Hertfordshire.
One of those visualisations that developers churn out. More here.

We’ll admit to a certain childish excitement about this. Mysterious developers want to build Europe’s largest data centre right here in Hertsmere. The developer applied for planning permission in September of last year and last week the council granted outline permission for the development. We suspect they’re as excited as we are, although there’s some doubt as to how involved Hertsmere can now be, given the special status of this development. We’ll be keeping an eye on this project and we’ll provide as much detail as we can as it progresses but, to begin with, here’s what we know so far:

Are you serious? On the green belt? The data centre, if built, will be on green belt land to the east of South Mimms services, right next to the M25. The developers are in luck, though: this is exactly the kind of relatively-unloved green belt that the government has recently been calling ‘grey belt’. You might expect a plan to build on 85 acres of agricultural land in a district that has a history of green belt belligerence to produce an angry reaction (like the one that held up the Radlett Railfreight terminal for years), but even Peter Waine, chair of the Hertfordshire branch of the Campaign to Protect Rural England, speaking to the BBC, couldn’t muster more than: “It may not be the most wonderful and beautiful green belt but it is green belt…” Here at Radlett Wire, at the other end of Hertsmere, we get the strong feeling this particular chunk of green belt may not be long for this world.

Map showing the area to the east of South Mimms services in Hertfordshire, between Potters Bar and the M25 motorway. Marked on the map is the proposed site of a large data centre.
Inside the red outline

It’s going to be huge. The claim, in the news stories and press releases, that DC01UK will be the largest data centre in Europe looks defensible. Europe’s largest running data centre is presently in Portugal. It covers about 800,000 square feet and the proposed South Mimms operation is aiming for two million square feet. But there’s another project, in Newport, Wales, that’s also aiming for two million and, anyway, the critical measurement – especially for the local community – is probably not square feet. Planners and architects use a measure they call gross external area (GEA), which is more useful. It’s the total area of a building, including all floors, measured from the outside. It’s the critical measurement used for rating, council tax and planning. DC01UK’s GEA is 187,000 square metres, slightly more than the 185,000 square metres (2,000,000 square feet) that will be devoted to computers. And as we’ll explain, there are other important numbers – relating to water and electricity use and to pollution – that we should understand and keep an eye on.
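
The unit juggling above is easy to get wrong, so here is the conversion spelled out (all figures come from the text; nothing new is assumed):

```python
# Floor space vs gross external area (GEA) for DC01UK.
SQFT_TO_M2 = 0.09290304            # exact square-feet-to-square-metres factor

server_floor_sqft = 2_000_000      # reported space devoted to computers
server_floor_m2 = server_floor_sqft * SQFT_TO_M2   # ~185,800 m2
gea_m2 = 187_000                   # published gross external area

print(f"{server_floor_sqft:,} sq ft = {server_floor_m2:,.0f} m2")
print(f"Published GEA: {gea_m2:,} m2")
```

As the post says, the two million square feet devoted to computers converts to about 185,800 m² – just under the 187,000 m² GEA.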

But size isn’t everything. If completed – and if not eclipsed by other projects along the way – this would put DC01UK just inside the top ten biggest data centres in the world. But it’s an extremely volatile market – brutal, and connected directly to the financial performance of the companies that use these facilities. Data centre capacity is a commodity – like bauxite or wheat. The biggest firms (Google, Amazon…) build and run their own but most data centre capacity is bought and sold by intermediaries or brokers. A business that needs servers for a new app is concerned only with cost; margins must be hair-thin, efficiency constantly improving. So, honestly, how this plays out between now and the proposed opening date in 2030 is anybody’s guess. Whatever happens, as you’d expect, this enormous facility is already well-and-truly dwarfed by the biggest. A single Chinese data centre, operated by China Telecom, is already over five times bigger than DC01UK and sits on a campus called the Inner Mongolia Information Park in Hohhot, China, alongside several other enormous facilities (truly visible from space). Also, you probably won’t be surprised to learn that the seventh largest data centre in the world is buried in the side of a mountain in Utah and is operated by the United States National Security Agency. It’s safe to say that, although you’ve never heard of it, it knows a fair amount about you.

A visualisation of a data centre proposed for land near South Mimms in Hertsmere, UK. A modern, two-storey industrial building is surrounded by planting.
Another one of those visualisations.

Who will own the data centre? Guess what: we don’t know. We know a bit about the company set up to see it through planning, DC01UK, but nothing about the facility’s final user nor even its ultimate owner. This project is, according to Computer Weekly, a joint venture with local house-builder Griggs and Chiltern Green Energy, neither of which – we hope they won’t mind us saying – is what you’d call a giant of the information age. The Register, a UK tech news web site, says it will ultimately be used by one of the ‘hyperscalers’ (industry jargon for the handful of tech giants in a position to make use of such a huge facility) but that the usual suspects – Amazon, Google, Microsoft and Meta – all declined to comment. This is not surprising. The big firms are typically secretive about their data centres – these are, to state the obvious, business-critical, round-the-clock facilities and ‘uptime’ must be protected at all costs. Amazon, in particular, treats its data centres as if they were national security assets and goes to great lengths to hide them. There’s only one project described on DC01UK’s web site and there’s no ‘about us’ page. We expect it’ll be some time before we know who the final client is.

There’s some history in this area. DC01UK was incorporated in 2022 and was originally called Hilfield Battery Storage so we assume it’s connected with the application to build a battery storage farm on Hilfield Farm in Aldenham, rejected last year after a public inquiry. One thing is clear: the deal-making has begun. Vast facilities like DC01UK don’t just show up in districts like ours – they’re incentivised to do so. Tax breaks, subsidies, promises about transport, housing and infrastructure, cut-throat competition between regions – all will play a part in this project. We don’t know enough about any of this yet – and the actors are secretive, but we suspect there will be people in Hertsmere who already know substantially more than us – we’d love to hear from you in the comments!

What’s so exciting about data centres? Aren’t they just big, dumb warehouses for information? Data centres have been a big deal for a long time, since the first of the really huge online businesses began to build their own all around the world. But the pressure to provide new data centre capacity has recently exploded (last year Goldman Sachs estimated a 160% increase in demand). It’s become a very big story in the business and technology press and now it’s crossed over into the mainstream media and social media – and it’s all because of artificial intelligence (AI). AI needs an enormous amount of computing power – both in training new models and in running queries against them once they’re live. Typing a query into ChatGPT will usually use at least ten times more computing power – and thus electricity – than a Google search. And the industry really didn’t see this coming. It’s really only a couple of years since it dawned on the tech firms and on their investors that they were going to need many times more computing power than previously projected. So the rush is on.

Governments and local authorities have noticed this massive new opportunity and are rushing at it. Three weeks ago Keir Starmer launched his AI action plan, which he says will ‘deliver a decade of national renewal’. We should probably also acknowledge the boldness of Hertsmere here. Council leader (and head of something called the Hertfordshire Growth Board) Jeremy Newmark has obviously moved quickly. Apart from the obvious explosive dynamism of the AI industry he’ll be aware that, here in Britain, the government has committed to a completely new and much more aggressive orientation on economic growth – and to change planning procedures to speed up development. More specifically, data centres have been reclassified as critical national infrastructure and will have special status when it comes to planning inquiries. In fact, developments recognised as nationally significant infrastructure projects have had special status since the then Labour government created the status in 2008. In this regime, the local authority is not the lead planning authority – just a ‘statutory consultee’. DC01UK may already be on the fast track.

But could anything hold it up? Short answer: yes. And it probably won’t be planning. Even in the couple of weeks since we learnt about the project, the terms of the explosion in data centre capacity have changed completely. Less than a week before Hertsmere’s DC01UK announcement a new Chinese-built AI model called Deepseek R1 was launched. To say that it caused an epic freak-out at every level of the AI ecology would not be an overstatement. The freak-out centred on the fact that this new software was almost as efficient as the leading American models although it had been built on simpler technology – and much more cheaply. It was a huge shock, challenging the fundamentals of the emerging industry. As it dawned on the company’s competitors that a rich and useful AI model could be built using a fraction of the resources, that their assumptions about the progress of the technology could be all wrong, share prices tumbled, projects were cancelled or paused, projections and forecasts altered. So, could this global chaos change the trajectory of the South Mimms project? Cause it to be cancelled or scaled back? It really could. At Hertsmere Council everyone will have their fingers crossed.

In the next post, more about the crazy business of powering (and cooling) a modern data centre.


  • For comparison, the largest warehouse in the UK – belonging to Amazon.com, natch – also covers around two million square feet. The O2 covers 1,126,270 square feet.
  • It took the addition of the magical term ‘AI’ to give all this stuff the kind of political salience it needed. When Technology Secretary Peter Kyle announced the change in status for data centres in September, the most exciting application he could come up with for these gigantic buildings was storage: “…from photos taken on smartphones to patients’ NHS records and sensitive financial investment information.” Boring! In the few months since then, the AI brainworm has fully infected everyone involved. When Rachel Reeves announced her big ‘growth reset’ in January it was all AI, all the time.
  • Amazon’s policy of secrecy about the location of their data centres even extends to their own staff. According to Peter Judge in Data Center Dynamics: “Only those within Amazon who have a legitimate business need to have such information know the actual location of these data centers…”
  • There’s another huge data centre under construction in Hertfordshire – Google is building the company’s first UK facility in Waltham Cross. Incidentally, the existence of this project may suggest that Google is unlikely to be DC01UK’s ultimate customer.
  • Everyone involved with this project is obviously moving quickly. If the government’s promises about removing barriers to growth are real, this project could almost be a kind of benchmark. The completion date in the press release is a bit unimpressive, though. 2030? Elon Musk claims to have built his new Colossus AI data centre in 19 days (although he did basically just stuff 100,000 NVIDIA GPUs into an old washing machine factory).
  • Bookmark this page to keep up with our updates on DC01UK. If you’re one of those weirdos who still uses an RSS reader, follow our feed or this Google news alert (RSS explained). We’ll almost certainly be tweeting about this of course.
  • Links, references and background we’re saving as we cover this story.