Honey, I shrunk the data centres: Is small the new big?

Zoe Kleinman, Technology editor

(Image: A worker at a data centre in Sydney, Australia, walking between two rows of red-faced computer servers. AFP via Getty Images)

One day the mighty data centre could be toppled into obsolescence by the humble smartphone, said Perplexity CEO Aravind Srinivas on a recent podcast.

Speaking to host Prakhar Gupta, the AI chief argued that people will eventually use powerful, personalised AI tools that will be able to run on the hardware already inside their devices.

That would replace today's typical approach, in which AI relies on transmitting data to and from enormous data centres, with remote computers doing the processing.

Apple’s AI system, Apple Intelligence, already runs some features on specialised chips inside the firm’s latest range of products. The tech giant says this means that its AI tools can operate more quickly, and also keep private data more secure.

Microsoft’s Copilot+ laptops also include on-device AI processing.

But these are all premium-priced gadgets. Most current devices lack that capability: AI requires processing power beyond the means of standard equipment.

“It’s long term ‘if and when’ powerful and efficient AI can run on local devices,” says Jonathan Evans, director of consultancy company Total Data Centre Solutions.

The data centre industry certainly isn’t shrinking in terms of demand. But is it getting smaller in other ways?

Data centres are traditionally huge buildings, packed full of powerful computers that carry out a wide range of digital tasks, from video streaming and online banking to AI processing and data storage.

It’s likely that anything you have an online login for uses a data centre somewhere in the world. Big companies own them, smaller ones lease capacity inside them.

Yet a few years ago I heard about a tiny data centre, the size of a washing machine, that was being operated in Devon, UK. In addition to its computing power, the heat it was releasing was warming a public swimming pool.

This was the first time I’d encountered a data centre that wasn’t a giant warehouse, and I was initially very sceptical about the whole thing.

Since then I’ve heard of plenty of other examples. In November 2025, a British couple revealed they were heating their home via a small data centre housed in their garden shed.

A month later, I had dinner with a university professor who told me he had a GPU – a powerful computer processor used to drive AI – under his desk. And as it churned away, it was also keeping his office warm.

At the same time, the tech companies are investing billions of dollars in enormous data centre plants around the world. There are around 100 new ones underway in the UK alone. Data centres are energy hungry, and there are significant concerns about their environmental impact.

Nvidia CEO Jensen Huang calls data centres “AI factories”. The argument in favour of them is that we need them to enable rapidly-evolving AI technology.

For a long time, the AI sector insisted on a "scaling" rule: the more computing power you threw at AI, the better it became – although that effect seems to have slowed.

But I'm increasingly hearing voices in the tech sector questioning the rationale that all of this has to be housed in remote, massive data centres.

Evans says there’s a case for “smaller ‘edge’ data centres near large populations”, which would reduce latency and result in faster response times.

"Small is definitely the new big," says Mark Bjornsgaard, founder of DeepGreen – the company behind the swimming pool data centre.

He thinks every public building should house a small data centre instead, with the centres networked together where required and providing heating as a by-product.

“London is just one giant data centre that hasn’t been built yet,” he says.

(Image: A woman with nails painted bright red holds her mobile phone. AFP via Getty Images)

Amanda Brock, the head of business organisation OpenUK, shares this view. "The data centre myth will be a bubble that will burst over time, I think," she tells me – though she didn't want to put a date on it.

She thinks derelict buildings and closed shops should be repurposed into small data centres instead.

Some are looking a little further afield than high streets and cities: space.

“Space offers a unique opportunity to rethink data structure, where small, scalable data centres in orbit can deliver efficiency, performance and flexibility,” says Avi Shabtai, the CEO of Ramon Space, one firm developing the technology.

Back on terra firma, Brock agrees with Perplexity's Srinivas that fewer data centres will be required. Instead, she thinks "processing will move to a handheld device, or a set-top box, or a router in your home".

This might also become more likely if it’s not only the data centres that are shrinking – but also the AI tools themselves.

There’s been huge hype around Large Language Models – massive, powerful AI models trained on vast amounts of data, which run the AI chatbots we use to generate content. But we have also become familiar with their tendency to make mistakes.

It happens in part because of their incredibly broad remit.

As the AI ethics campaigner Ed Newton Rex once put it to me: an AI tool designed to spot signs of cancer does not also need to be able to write song lyrics in the style of Taylor Swift.

(Image: A huge, multi-building data centre in Ohio, US. AFP via Getty Images)

Businesses increasingly agree, and are opting for bespoke enterprise AI tools instead: more expensive, but trained on their own data – which is not then used to train other products – and primed to carry out tasks specific to the company.

These smaller, private tools tend to perform more accurately and can require less computing power. They are also more likely to be hosted on a company's own premises.

“I’ve spoken to multiple people who aren’t seeing the benefits of using generic AI tools,” says Dr Sasha Luccioni, AI and climate lead at machine-learning firm Hugging Face.

“We are already seeing a paradigm switch between large models taking huge resources, to smaller models being more bespoke and running more locally and tailored to business uses.”

But would a plethora of small data centres present a headache for national security?

“The counter argument here is that small targets have less impact if they are penetrated,” says Prof Alan Woodward from Surrey University, a computer security expert.

“Larger centres can be big points of failure, as we’ve seen recently with huge AWS [Amazon Web Services] centres going down.”

There’s also an environmental benefit to a move away from large data centres, adds Luccioni, who says they “are taking more and more resources”. “It makes sense to not use them all of the time.”

Published at Wed, 14 Jan 2026 00:07:50 +0000
