
The AI infrastructure gold rush is leaving everyone else behind

AI-optimised infrastructure risks creating a two-tier data centre economy where traditional workloads are deprioritised, says Jason Walsh

9 January 2026

The most consequential thing happening in AI isn’t the chatbots or the breathless talk of superintelligence. It’s plumbing.

While debate around AI has tended toward catastrophe (predictions of doom are part of the propaganda, functioning to siphon ever more cash, both private and public, into the sector), less-noted developments have the potential for real impact. One of them is the re-tooling of core IT infrastructure.

Indeed, it is not only PC manufacturers rushing to push AI PCs, a product category no-one can really explain, or component makers such as Intel rushing to design AI chips: even things as fundamental, and boring, as data centres are changing.

The recent announcement of a partnership between Nvidia and Lenovo, for instance, demonstrates just how thoroughly AI has captured infrastructure planning. The programme promises to help AI cloud providers deploy gigawatt-scale data centres in weeks rather than months – a feat achieved through liquid cooling, pre-integrated systems, and tight vendor lock-in. Traditional workloads aren’t even part of the conversation.

More broadly, AI workloads require GPU-dense infrastructure with dramatically higher power consumption (up to six times more per rack), advanced cooling systems, and high-bandwidth networking that traditional CPU-based data centres weren't designed for. That is pushing operators toward strategic locations with abundant renewable energy and cooler climates.

The result is a crowding-out effect. Goldman Sachs projects traditional workloads will fall from 32% of the market to just 23% by 2027, while AI grows to 27%. According to Deloitte, 80% of data centres are already experiencing resource competition, with 92% citing power capacity as the “primary locus of competition”.

McKinsey’s figures make the capital imbalance stark: AI-equipped data centres are projected to require $5.2 trillion in investment by 2030, while traditional infrastructure will receive $1.5 trillion – less than a third of that sum.

This needs to be understood alongside the fact that data centre chip roadmaps are increasingly optimised specifically for large language models rather than general workloads, suggesting that AI will dictate server design for years.

The money has made its choice, even if businesses running databases, ERP systems, and the unglamorous applications that actually keep companies functioning haven’t made theirs.
