AI data centres are concentrated in a few countries
40% of global investments in AI are made by a mere 100 companies.

Just 32 nations have AI data centres, says the NYT - and Malaysia and Singapore are among them. Is AI the new digital divide, and what's an AI data centre anyway?
I had the pleasure of briefing NYT correspondent Paul Mozur on AI data centres in the region a few weeks ago, so I was naturally excited to see the story he co-wrote published.
Spoiler: It was a pure backgrounder call. So nope, I wasn't quoted.
The AI divide
Anyway, the story was about how AI data centres are concentrated in a handful of countries, and 40% of global investments in AI are made by a mere 100 companies.
Nations with little or no AI computing capacity are running into limits on:
- Scientific work.
- Growth of startups.
- Talent retention.
To be clear, it is entirely possible to train AI models with remote GPUs.
The point made by the report is that this leaves countries without AI infrastructure beholden to foreign corporations and governments.
Simplistic representation
I need to note that the data was compiled by Oxford University researchers, who looked at the customer websites of the nine largest cloud service providers to trace where AI chips are hosted.
This means the data overlooks:
- GPUaaS providers.
- Tier 2 cloud providers.
- Government-built AI data centres.
- AI data centres set up for in-house use.
The report argues that the trend is unmistakable, however: just 32 nations have AI data centres, while more than 150 countries have nothing.
AI data centres
It's hard to miss the fact that Johor's data centre growth appears to be entering a period of rationalisation. There's also the electricity tariff overhaul that takes effect tomorrow (1 July), which has caught many by surprise.
For now, Malaysia has a sizeable number of data centres, many of which run AI workloads. But what's an AI data centre anyway?
As with all things data centres, I expect this definition to evolve. For now, I'll say it's a data centre with (a quick checklist sketch follows the list):
- Access to at least 25MW of electricity.
- Infrastructure for high-density racks.
- Support for liquid cooling.
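To make the checklist concrete, here's a minimal sketch in Python. The 25MW threshold, high-density racks, and liquid cooling come straight from the list above; the Facility fields, the 40kW-per-rack cut-off for "high density", and the function name are my own assumptions for illustration.

```python
# A minimal, illustrative sketch of the working definition above.
# The three criteria come from the list; the field names and the
# 40 kW/rack "high-density" cut-off are assumptions for this example.
from dataclasses import dataclass

@dataclass
class Facility:
    power_capacity_mw: float     # utility power available to the site
    max_rack_density_kw: float   # highest rack density the design supports
    liquid_cooling: bool         # direct-to-chip or immersion cooling available

def is_ai_data_centre(f: Facility, high_density_kw: float = 40.0) -> bool:
    """Apply the three criteria from the working definition."""
    return (
        f.power_capacity_mw >= 25.0
        and f.max_rack_density_kw >= high_density_kw
        and f.liquid_cooling
    )

# Example: a 60 MW site designed for 80 kW racks with direct-to-chip cooling.
print(is_ai_data_centre(Facility(60.0, 80.0, True)))   # True
print(is_ai_data_centre(Facility(10.0, 15.0, False)))  # False
```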
I've argued before that AI data centres and traditional data centres are diverging.
- While AI data centre racks move towards 130kW and higher, new traditional data centre workloads are still solidly in the low double-digit kW per rack range.
- Even a deployment of two or three H200 GPU servers (8 GPUs per server) requires less than 40kW. By the way, that's over US$1M in GPU hardware alone (rough numbers in the sketch below).
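For a rough sense of the arithmetic behind that last bullet, here's a back-of-envelope sketch. The ~11kW draw per 8x H200 server and the ~US$42,000 per-GPU price are assumptions I've made for illustration, not vendor figures; real numbers vary by configuration and deal.

```python
# Back-of-envelope for the bullet above. The per-server power draw and
# per-GPU price are illustrative assumptions, not vendor figures.
SERVERS = 3                # "two or three" H200 servers; take three
GPUS_PER_SERVER = 8        # 8x H200 per server, as in the bullet
SERVER_POWER_KW = 11.0     # assumed draw of one 8x H200 server under load
GPU_PRICE_USD = 42_000     # assumed price per H200 GPU

total_power_kw = SERVERS * SERVER_POWER_KW
total_gpu_cost = SERVERS * GPUS_PER_SERVER * GPU_PRICE_USD

print(f"Power draw: ~{total_power_kw:.0f} kW")       # ~33 kW, under 40 kW
print(f"GPU hardware: ~US${total_gpu_cost:,.0f}")    # ~US$1,008,000, just over US$1M
```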
I'll address this more another day.
For now, do you think the AI divide is real?
Access the Oxford study here.