GeoCoded Special Report: State of Global AI Compute (2025 Edition)

Executive Summary & Key Insights

Artificial intelligence no longer advances on the strength of clever code alone. The newest language and multimodal models demand tens of thousands of specialised chips, petabytes of memory and gigawatts of electricity. Countries that can marshal these resources gain not only economic advantage but also strategic leverage. This report takes stock of the world's AI computing infrastructure in mid-2025, highlighting who controls the most "digital horsepower," how resources stack up on a per-person and per-dollar basis, where the chokepoints lie, and what trajectories are plausible for the rest of the decade.

Key Insights from the 2025 Edition Report

Each insight below is paired with its supporting evidence and context.
U.S. remains the compute superpower.
The United States accounts for 6.696 exaflops of aggregate Top500 performance, 48.4% of the global total. It hosts the world's three exascale systems—El Capitan (1.742 EFlop/s), Frontier (1.353 EFlop/s) and Aurora (1.012 EFlop/s).
Europe punches above its weight.
Through the EuroHPC initiative, Europe operates multiple capability-class machines. Finland's LUMI (379.7 PFLOPs) and Switzerland's Alps (434.9 PFLOPs) give those countries the highest per-capita compute densities (≈70 PF/M and ≈53 PF/M respectively). Italy's HPC6 (477.9 PFLOPs) and Leonardo drive its national total, while the new JUPITER Booster in Germany delivers 793.4 PFLOPs.
Germany's capacity is substantial but distinct.
Germany's aggregate Top500 performance is 1.201 exaflops. The JUPITER Booster is a EuroHPC system hosted in Germany but counted separately in some summaries. On the 1.201-exaflop aggregate, Germany's per-capita compute is ≈14.3 PF/M, not the ≈24 PF/M that results if JUPITER's 793.4 PFLOPs are added in.
China's reported share hides a much larger reality.
China contributes only 0.281 exaflops to the Top500 list, representing about 2% of global public performance. However, official statements indicate the country runs ≈230 exaflops of national capacity across more than 8 million datacentre racks, with a goal of 300 exaflops by 2025. The difference illustrates the largest known "dark compute" pool in the world.
Private investment dwarfs public budgets.
Hyperscale cloud companies (Amazon, Microsoft, Google and Meta) are projected to spend ≈$315–320 billion on AI-ready datacentres in 2025. Government-led programmes, such as the U.K.'s AI Opportunities Action Plan (≈£39 billion in private commitments) and France's €109 billion AI investment package (entirely privately funded), are significant but smaller.
Supply chains hinge on a few actors.
Taiwan's TSMC controls 64–67% of the global foundry market and produces ≈90% of chips at 7 nm and 5 nm nodes. The U.S. sources ≈92% of its advanced chips from Taiwan and is investing billions to localise 2 nm production. Export controls and embedded chip trackers shape who can buy advanced AI hardware.
Energy is the looming bottleneck.
The International Energy Agency projects global data-centre electricity demand will more than double from 415 TWh in 2024 to 945 TWh by 2030, roughly 3% of worldwide usage. AI-optimised datacentres could quadruple their consumption. Nations with abundant low-carbon power—Finland, France, Norway and the Gulf states—are poised to host growth, while energy-constrained countries will struggle.
Public vs private roles are shifting.
In 2019 the private sector owned less than 40% of AI supercomputer capacity; by 2025 it controls around 80%. Gulf sovereign wealth funds are investing billions in mega-datacentre projects like Stargate, while China and the U.S. maintain distinct "national champion" models.

Methodology & Data Confidence

Data Sources and Triangulation

This report combines quantitative and qualitative data from multiple authoritative sources:

  1. Top500 Supercomputer List (June 2025). Provides High Performance LINPACK (HPL) scores for the world's 500 fastest publicly benchmarked machines. The combined performance of the list now exceeds 13.84 exaflops, with the U.S. contributing 6.696 exaflops across 175 systems. The list records system count and aggregated performance by country.

  2. Epoch AI & Georgetown University datasets. These peer-reviewed datasets cover approximately 10–20% of global AI-oriented compute. They reveal trends such as the private sector's dominance and relative shares of AI supercomputer performance (U.S. 74.5%, China 14.1%, EU 4.8%). We use them for context rather than absolute counts.

  3. Population & GDP. Mid-2025 population estimates come from the UN World Population Prospects 2024 and national statistics, while GDP figures use the IMF's April 2025 World Economic Outlook. These allow per-capita and per-dollar normalisation.

  4. Government roadmaps & policy documents. Official announcements such as the U.K.'s AI Opportunities Action Plan, France's France 2030 investment plan, India's National Supercomputing Mission, and EuroHPC press releases inform our understanding of public investments.

  5. Industry and academic analyses. Reporting from outlets like HPCwire, Data Center Dynamics and The Register provides details on individual systems and emerging trends. Peer-reviewed research from Epoch AI underscores power consumption trajectories.

Data Verification Note

We cross-reference statistics across at least two independent sources where possible. Claims lacking transparent methodology or credible provenance are excluded. For example, widely circulated tables of "H100-equivalent GPUs" were omitted because no verifiable source exists. The Epoch AI dataset captures a subset of global capacity and cannot represent hidden or classified systems. Top500 rankings reflect only publicly benchmarked machines and understate actual capabilities in countries like China and the U.S. national labs.

Limitations and Rapidly Changing Landscape

  • Hidden Capacity. Many national and corporate clusters are not publicly benchmarked. China has not submitted new flagship systems to the Top500 since 2017 and likely operates multiple exascale systems. U.S. national security machines are also classified.

  • Evolving Hardware. Compute deployments shift quickly. Figures in this report reflect mid-2025 data and may be obsolete within months as new systems come online and old ones retire.

  • Energy Uncertainties. Projected power consumption hinges on model size, hardware efficiency and datacentre design. Future breakthroughs could change these trajectories.

  • Population & GDP Variations. Some countries' population estimates vary by source (e.g., Germany's range is 84.1–84.4 million). We use mid-points for consistency.

  • Verification Status. Throughout the report we tag numbers as Verified, Estimated, or Projected. Verified figures come from official or peer-reviewed sources; estimated ones rely on extrapolation; projected figures look ahead based on trends.

Inside the World's Most Powerful Machines

The June 2025 Top500 list offers a snapshot of the most advanced supercomputers. Three U.S. Department of Energy machines occupy the top of the rankings, each surpassing the exaflop barrier:

  • El Capitan (Lawrence Livermore National Laboratory) – 1.742 EFlop/s using AMD EPYC CPUs and MI300A GPUs (Verified).

  • Frontier (Oak Ridge National Laboratory) – 1.353 EFlop/s on AMD EPYC CPUs with MI250X accelerators (Verified).

  • Aurora (Argonne National Laboratory) – 1.012 EFlop/s using an Intel Max Series architecture (Verified).

Europe's flagship system is the JUPITER Booster, located at the Jülich Supercomputing Centre. It debuted at 793.4 PFLOPs (0.7934 EFlop/s), ranking fourth globally. Other leading systems in Europe and Japan include:

  • HPC6 (Italy) – 477.9 PFLOPs with AMD EPYC and MI250X hardware.

  • Fugaku (Japan) – 442 PFLOPs using Arm-based A64FX processors.

  • Alps (Switzerland) – 434.9 PFLOPs featuring Nvidia GH200 Superchips.

  • LUMI (Finland) – 379.7 PFLOPs built on AMD architecture.

  • Leonardo (Italy) – 241.2 PFLOPs with Intel CPUs and Nvidia A100 GPUs.

The U.S. private sector also fields large HPL systems, such as Microsoft's Eagle cluster (561 PFLOPs). However, many hyperscaler installations do not participate in Top500 benchmarking, meaning public listings capture only a fraction of corporate capacity.

Understanding China's Compute Capacity

Critical Context: China's True Computational Power

China's presence on the Top500 list belies its actual capabilities. In June 2025, China reported 0.281 exaflops of Top500 performance across 47 systems. This figure amounts to roughly 2% of total public performance. Yet official Chinese statistics reveal a much larger domestic programme:

  • National Capacity: The Ministry of Industry and Information Technology stated that by July 2024, China operated ≈230 exaflops of aggregate computing power—almost 800× the Top500 figure—with a target of 300 exaflops by 2025 (Verified).

  • Datacentre Footprint: China runs more than 8.1 million datacentre racks, the world's largest installed base (Verified).

  • Secret Exascale Systems: Reports indicate China has at least two exascale machines not submitted to international benchmarking: Sunway OceanLight (~1.3 EFlop/s) and Tianhe-3 (~1.7 EFlop/s). A third exascale system is believed to have been operational since 2023.

China has avoided submitting new flagship systems to the Top500 since 2017, partly to protect national security and partly to sidestep U.S. export controls. It continues to procure domestic accelerators (e.g., Huawei's Ascend) and older imported GPUs through indirect channels. This "dark compute" creates uncertainty for foreign planners, underscoring the need for new international frameworks for compute transparency.
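The scale of the gap between China's self-reported capacity and its public footprint is easy to verify from the figures above; a minimal sketch:

```python
# Quick check of the "dark compute" gap: China's self-reported national
# capacity (MIIT figure, ≈230 EFlops by July 2024) versus its public
# June 2025 Top500 aggregate (0.281 EFlops).
reported_national = 230.0   # EFlops, self-reported
top500_public = 0.281       # EFlops, publicly benchmarked
ratio = reported_national / top500_public
print(f"National capacity is ≈{ratio:.0f}x the public Top500 figure")
```

The ratio lands at roughly 820×, consistent with the "almost 800×" characterisation used in this report.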

Comparative Metrics & Sovereignty

Aggregate Performance by Country

The chart below ranks countries by total public compute capacity (Top500 aggregate). The U.S. remains far ahead, Germany leads Europe (and ranks second overall), and China's reported tally appears small relative to its true capacity.

Aggregated Public Supercomputing Performance by Country (2025)

Top500 aggregate performance in exaflops:

  • United States – 6.696 EFlops (48.4% of global total)
  • Germany – 1.201 EFlops
  • Japan – 0.74 EFlops
  • Italy – 0.72 EFlops
  • Switzerland – ≈0.47 EFlops
  • France – 0.42 EFlops
  • Finland – ≈0.39 EFlops
  • China – 0.281 EFlops
  • South Korea – 0.18 EFlops
  • Spain – 0.12 EFlops

Based on June 2025 Top500 aggregate performance. The U.S. dominates with 48.4% of global public compute capacity, while Germany leads in Europe. China's reported figure (0.281 EFlops) represents only public systems and significantly understates its true national capacity of ≈230 exaflops across classified and commercial systems. Swiss and Finnish aggregates are estimated from their flagship systems (Alps at 434.9 PFLOPs, LUMI at 379.7 PFLOPs) plus smaller installations, consistent with the per-capita figures below (Estimated).

Compute per Person

Normalising performance by population reveals striking disparities. Finland leads with ≈70 PF/M and Switzerland follows at ≈53 PF/M, far ahead of much larger economies; Italy (≈15 PF/M) and Germany (≈14 PF/M) also rank well. The U.S. sits at ≈19 PF/M and China at ≈0.2 PF/M.

Compute per Capita (PFLOPs per million people, PF/M)

  • Finland – ≈70 PF/M
  • Switzerland – ≈53 PF/M
  • United States – ≈19 PF/M
  • Italy – ≈15 PF/M
  • Germany – ≈14 PF/M
  • Japan – ≈8 PF/M
  • Spain – ≈6 PF/M
  • France – ≈4 PF/M
  • South Korea – ≈3 PF/M
  • China – ≈0.2 PF/M

Based on Top500 aggregate performance normalized by population. Finland leads with the highest per-capita compute density, followed by Switzerland. The U.S. sits at ≈19 PF/M despite having the largest absolute capacity, while China shows ≈0.2 PF/M due to its large population and limited public reporting.

Compute per GDP

Compute intensity relative to economic output provides another lens. Finland again tops the list with ≈1,266 PF/$T; Switzerland and Italy score ≈523 and ≈362 PF/$T, respectively. The U.S. operates at ≈219 PF/$T, while China's ratio is ≈14.6 PF/$T.

Compute Intensity Relative to GDP (PFLOPs per $1 trillion GDP, PF/$T)

  • Finland – ≈1,266 PF/$T
  • Switzerland – ≈523 PF/$T
  • Italy – ≈362 PF/$T
  • United States – ≈219 PF/$T
  • Japan – ≈185 PF/$T
  • Germany – ≈148 PF/$T
  • Spain – ≈85 PF/$T
  • France – ≈75 PF/$T
  • South Korea – ≈48 PF/$T
  • China – ≈14.6 PF/$T

Compute intensity normalized by economic output reveals efficiency in converting economic resources to computational capacity. Finland leads with exceptional efficiency at ≈1,266 PF/$T, demonstrating how smaller economies can achieve remarkable compute density. The U.S. operates at ≈219 PF/$T despite its massive absolute capacity, while China's ratio of ≈14.6 PF/$T reflects both its large economy and limited public compute reporting.
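The normalisations behind the two charts above reduce to simple division. The sketch below reproduces representative values; the population and GDP inputs are illustrative mid-2025 estimates (not the report's exact source tables), and Finland's aggregate (≈0.39 EFlops) is inferred from LUMI plus smaller systems.

```python
# Sketch: normalising Top500 aggregate performance (EFlop/s) by population
# (millions of people) and GDP (trillions of USD). Inputs are illustrative
# estimates, hedged as described in the lead-in above.
countries = {
    # name: (EFlop/s, population in millions, GDP in $ trillions)
    "United States": (6.696, 341.0, 30.5),
    "Finland": (0.39, 5.6, 0.31),
    "China": (0.281, 1416.0, 19.2),
}

metrics = {}
for name, (eflops, pop_m, gdp_t) in countries.items():
    pflops = eflops * 1000                  # 1 EFlop/s = 1,000 PFlop/s
    metrics[name] = {
        "pf_per_million": pflops / pop_m,   # PF/M
        "pf_per_trillion": pflops / gdp_t,  # PF/$T
    }
    print(f"{name}: ≈{metrics[name]['pf_per_million']:.1f} PF/M, "
          f"≈{metrics[name]['pf_per_trillion']:.0f} PF/$T")
```

Running this reproduces the orders of magnitude in the charts: the U.S. near ≈19 PF/M and ≈219 PF/$T, Finland near ≈70 PF/M, and China near ≈0.2 PF/M and ≈14.6 PF/$T.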

Compute Sovereignty Index (CSI)

To evaluate autonomy, we developed a Compute Sovereignty Index that weights five factors: installed capacity, foreign reliance, domestic chip manufacturing, sovereign investment and energy resilience. Scores range from 0 to 100.

Compute Sovereignty Index (CSI)

Evaluating national autonomy in AI computing infrastructure:

  • United States – 78 (High): Largest public capacity; strong domestic chip industry; diversified energy sources; moderate reliance on Taiwanese fabs.
  • Germany – 67 (Moderate): Significant capacity via HPC centres; moderate reliance on foreign hardware; robust energy infrastructure; major EuroHPC participant.
  • Finland – 65 (Moderate): Highest per-capita compute and low-carbon energy; limited domestic chip manufacturing; reliant on EU supply chains.
  • Italy – 60 (Moderate): High compute intensity relative to GDP; depends on external chip suppliers; strong sovereign investment.
  • China – 52 (Moderate-Low): Massive hidden capacity but highly dependent on foreign lithography and restricted GPU imports.
  • India – 40 (Low): Limited installed capacity and heavy reliance on imported hardware; expanding through the National Supercomputing Mission but still vulnerable.

*Scores are approximate and based on publicly available information; higher values reflect greater independence.
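The report does not publish the CSI's exact weights or factor scores. The sketch below shows, with hypothetical weights and inputs, how five 0–100 factor scores could combine into a single index; it is an illustration of the mechanics, not the report's actual methodology.

```python
# Illustrative weighted sovereignty index. Weights and factor scores below
# are hypothetical, chosen only to demonstrate the mechanics of combining
# the five factors named in the text.
WEIGHTS = {
    "installed_capacity": 0.30,
    "foreign_reliance": 0.20,     # scored so that higher = less reliant
    "chip_manufacturing": 0.20,
    "sovereign_investment": 0.15,
    "energy_resilience": 0.15,
}

def csi(scores: dict[str, float]) -> float:
    """Weighted average of five 0-100 factor scores, yielding a 0-100 index."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

# Hypothetical factor scores, not the report's actual inputs:
us_score = csi({
    "installed_capacity": 95,
    "foreign_reliance": 60,
    "chip_manufacturing": 75,
    "sovereign_investment": 80,
    "energy_resilience": 80,
})
print(f"Illustrative U.S. CSI: {us_score:.1f}")
```

With these made-up inputs the index lands near the report's published U.S. score of 78, but any resemblance is by construction of the example.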

AI Compute Dependency Matrix

Plotting countries by domestic compute strength versus foreign reliance yields a four-quadrant map. Sovereign leaders (United States) possess both high capacity and low dependence. Rising builders (Germany, Italy, Finland) combine moderate capacity with moderate foreign reliance. Strategic dependents (China, Saudi Arabia) have moderate capacity but heavy reliance on foreign technology. Compute-vulnerable nations (India) have limited capacity and high dependence.

AI Compute Dependency Matrix

Mapping countries by domestic compute strength (low → high) versus foreign reliance (high → low):

  • Sovereign Leaders (high capacity, low dependence): United States
  • Rising Builders (moderate capacity, moderate dependence): Germany, Italy, Finland, France, Japan, South Korea
  • Strategic Dependents (moderate capacity, heavy foreign reliance): China, Saudi Arabia
  • Compute-Vulnerable (limited capacity, high dependence): India

This matrix plots countries based on their domestic compute infrastructure strength versus their reliance on foreign technology and supply chains. The United States stands alone as a sovereign leader, while most European nations cluster as rising builders. China and Gulf states occupy the strategic dependent quadrant due to heavy reliance on foreign semiconductor technology despite significant investments.

Investment Landscape & Growth Dynamics

Public Programmes and National Strategies

United Kingdom. The AI Opportunities Action Plan seeks to transform Britain into a compute powerhouse. The plan envisions up to 20× growth in public compute by 2030 and has attracted ≈£39 billion in private commitments from Vantage Data Centres, nScale and Kyndryl. The U.K. also invests £2 billion in the AI Research Resource (AIRR) to build sovereign clusters.

France. President Emmanuel Macron unveiled a €109 billion investment package, entirely from private entities such as MGX (UAE), Brookfield (Canada), Bpifrance and Iliad. Projects include a 1.4 GW data-centre campus near Paris hosting 18,000 Grace Blackwell GPUs.

India. The National Supercomputing Mission has installed 24.8 PFLOPs of capacity and plans an additional 41.17 PFLOPs using indigenous Rudra servers and Trinetra interconnects. India aims to expand its domestic supply chain but remains reliant on imported accelerators.

EuroHPC. The EU's joint undertaking finances exascale and pre-exascale machines like JUPITER and DAEDALUS, emphasising green computing and shared access. JUPITER's booster module uses Nvidia GH200 Superchips and achieves 793.4 PFLOPs.

Stargate & Gulf Investments. The Stargate initiative—backed by SoftBank, OpenAI, Oracle and the UAE's MGX fund—aims to build up to 20 mega-datacentres in the U.S., with an initial budget of $100 billion and eventual outlay up to $500 billion. MGX has committed several billion dollars; Saudi Arabia's Public Investment Fund hopes AI will contribute 12% to national GDP by 2030. Reports indicate Saudi Arabia's KAUST acquired ≈3,000 Nvidia H100 GPUs and the UAE has bought "thousands".

Private Sector Surge (2019–2025)

The number of AI accelerators has grown exponentially. In 2019, U.S. AI clusters held an estimated 300,000 H100-equivalent GPUs. By 2025, the U.S. fleet has ballooned to ≈850,000, nearly eight times China's ≈110,000 and seventeen times the EU's ≈50,000 (Estimated). This boom reflects capital expenditures of ≈$315–320 billion by hyperscalers in 2025 and underscores the shift from public to corporate stewardship.

AI Compute Capacity Growth and Projections (2019–2030, approximate)

Exponential growth in AI accelerators (thousands of H100-equivalent GPUs) for the United States, China and the European Union, plotted from 2019 through 2030 on a vertical axis running from 0 to 2,000 thousand units.

The U.S. shows exponential growth from ~300,000 H100-equivalent GPUs in 2019 to ≈850,000 by 2025, nearly eight times China's ≈110,000 and seventeen times the EU's ≈50,000. This surge reflects ≈$315–320 billion in capital expenditures by hyperscalers in 2025, underscoring the shift from public to corporate stewardship of AI infrastructure.
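The growth rate implied by those two endpoints can be sketched as a compound annual growth rate (CAGR); the 2030 extrapolation below is purely illustrative and not a figure published in this report.

```python
# Implied CAGR of the U.S. accelerator fleet from the report's estimates:
# ~300k H100-equivalents in 2019 to ~850k in 2025, extrapolated to 2030
# under the (assumed) premise that the same growth rate continues.
start, end = 300_000, 850_000
cagr = (end / start) ** (1 / (2025 - 2019)) - 1
projection_2030 = end * (1 + cagr) ** (2030 - 2025)
print(f"Implied CAGR ≈ {cagr:.1%}; 2030 extrapolation ≈ {projection_2030:,.0f}")
```

The implied rate is roughly 19% per year, which, if sustained, would put the U.S. fleet around 2 million H100-equivalents by 2030, consistent with the upper range of the chart.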

Looking Ahead (2025–2030)

Two forces will shape the next five years:

  1. Sovereign capital meets hyperscalers. Gulf sovereign wealth funds will continue to co-invest with U.S. cloud providers. Projects like Stargate will expand computing capacity in the U.S., but Gulf partners will demand technology transfer and local campuses. This dynamic diffuses power away from a single geography.

  2. Regional AI factories. Europe's EuroHPC will commission multiple exascale systems; France's 1.4 GW campus and the U.K.'s AIRR expansion will anchor regional autonomy. India will add 12 Rudra-based machines, while the U.S. will start up new facilities supported by TSMC's onshore production.

These investments signal that compute is becoming a critical piece of national infrastructure.

Supply-Chain & Energy Dependencies

Hardware Chokepoints

TSMC's domination. Taiwan's TSMC controls around 64–67% of the contract foundry market and produces approximately 90% of chips at 7 nm and 5 nm nodes. The U.S. depends on Taiwan for ≈92% of its advanced logic chips, although TSMC plans to produce about 30% of its 2 nm-and-smaller chips in the United States by 2030. Taiwan's dominance creates geopolitical risk; any disruption could stall global AI development.

Export controls & chip trackers. Since 2022, the U.S. has progressively tightened restrictions on sales of AI accelerators to China and Russia. By 2025 the rules require exporters to embed location trackers in chip shipments to ensure they are not diverted. This adds compliance cost and delays for global supply chains.

Alternative architectures. China is investing in domestic chip designs (e.g., Ascend), while open RISC-V cores and analog compute research are gaining traction. Nevertheless, Nvidia and AMD remain dominant for training frontier models. AMD's MI300 and Nvidia's H100/Blackwell lines have effectively become standard tools for large AI workloads.

Energy as the Limiting Factor

Compute cannot flourish without abundant electricity. An exascale machine can draw tens of megawatts; a single datacentre campus may need hundreds. The International Energy Agency projects data-centre electricity demand will rise from 415 TWh in 2024 to 945 TWh by 2030. AI-optimised facilities could quadruple their consumption, pushing datacentres to nearly 3% of global electricity use. Regional growth will be uneven: the U.S. will add ≈240 TWh, China ≈175 TWh, and Europe ≈45 TWh.
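As a quick sanity check (an illustrative sketch, not part of the IEA analysis), the named regional additions fit comfortably within the projected global increase:

```python
# Do the regional additions cited above fit within the IEA's projected
# global increase (415 TWh in 2024 -> 945 TWh in 2030)?
global_increase = 945 - 415                      # 530 TWh of new demand
regional_additions = {"United States": 240, "China": 175, "Europe": 45}
covered = sum(regional_additions.values())       # TWh accounted for
rest_of_world = global_increase - covered        # TWh left for other regions
share = covered / global_increase
print(f"Named regions cover {share:.0%} of projected growth; "
      f"{rest_of_world} TWh remains for the rest of the world")
```

The three named regions account for roughly 87% of the projected growth, leaving about 70 TWh of new demand for the rest of the world.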

Countries with cheap, low-carbon power will attract AI investment. Finland draws on hydro and nuclear; France leverages nuclear; Norway uses hydropower; Gulf states are scaling solar and gas. By contrast, India and parts of China face grid constraints and may need massive investments to balance AI demand with domestic consumption.

Sovereign Funds & Foreign Investment

Gulf sovereign wealth funds, notably MGX (UAE) and Saudi Arabia's Public Investment Fund, are emerging as pivotal players in the compute race. MGX partners with SoftBank, Oracle and OpenAI to invest in U.S. datacentres. Saudi Arabia's PIF collaborates with Google on local hyperscale datacentres and aims for AI to contribute 12% of the kingdom's GDP. These investments reflect long-term strategies to diversify economies away from hydrocarbons.

Forward Scenarios

We envision four plausible futures for the distribution of AI compute by 2030. These scenarios are not predictions but narrative tools to explore how policy, investment and technology could interact.

  1. Bipolar Duopoly. The U.S. and China remain the only compute superpowers. Export controls persist; China narrows the gap with domestic lithography but cannot fully match U.S. hyperscaler scale. Other nations must align with one camp, fragmenting global standards.

  2. Regional Blocs. Sovereignty concerns drive the formation of blocs: a Euro–India–Africa alliance, a U.S.–Gulf–Latin America bloc, and an East Asia bloc anchored by Japan and South Korea. Each bloc pools resources to build exascale systems, invests in renewable power and jointly develops chips.

  3. Cloud Oligopoly. A handful of hyperscalers monopolise compute. Governments license access rather than build hardware. Regulation focuses on fair pricing and data security. Smaller countries rent compute "as a service."

  4. Distributed Sovereignty. Breakthroughs in modular, energy-efficient chips and open federated training allow thousands of small datacentres to collaborate. Communities, universities and firms contribute resources; no single actor can monopolise compute.

Forward Scenarios for AI Compute by 2030

Four plausible futures for the distribution of global computing power. Descriptions match the numbered scenarios above; key characteristics and a rough probability judgement are summarised for each.

1. Bipolar Duopoly (Moderate Probability)
  • Two dominant compute blocs
  • Persistent export controls
  • Fragmented global standards
  • Forced alignment of smaller nations
  • Technology cold war dynamics

2. Regional Blocs (High Probability)
  • Multi-regional cooperation
  • Shared resource pooling
  • Joint chip development
  • Regional energy partnerships
  • Balanced power distribution

3. Cloud Oligopoly (Moderate Probability)
  • Corporate control of infrastructure
  • Government licensing models
  • Compute-as-a-Service dominance
  • Regulatory focus on pricing
  • Limited national sovereignty

4. Distributed Sovereignty (Lower Probability)
  • Decentralized compute networks
  • Federated training protocols
  • Community-owned infrastructure
  • Energy-efficient modular chips
  • Democratic access to compute

These scenarios are not predictions but narrative tools to explore how policy, investment, and technology could interact by 2030. The Regional Blocs scenario appears most likely given current trends toward sovereignty concerns and international cooperation, while Distributed Sovereignty represents the most democratizing but technically challenging path forward.

Expert Q&A

GeoCoded Analyst: How reliable are Top500 rankings for assessing national AI capabilities?

AI Compute Expert: They're a useful baseline because HPL scores are publicly verifiable, but they capture only publicly benchmarked systems. China, for instance, has stopped submitting new flagship systems and still operates at least two unlisted exascale machines. Likewise, U.S. national security clusters are classified. So Top500 figures likely underestimate true capacity in both countries.

Analyst: How should business leaders plan around the concentration of chip fabrication in Taiwan?

Expert: Diversification is essential. Companies should split workloads across multiple cloud providers and maintain redundancy across geographic regions. Policymakers need to incentivise alternative fabs in the U.S., Europe and the Middle East. The U.S. is subsidising TSMC's Arizona fabs and Samsung's U.S. facilities to reduce reliance.

Analyst: Is energy really the bigger constraint than chips?

Expert: At the margin, yes. Chips can be manufactured if supply chains hold up, but the electricity to power millions of GPUs is harder to scale quickly. AI datacentres will demand nearly 3% of global electricity by 2030. Securing long-term renewable contracts and grid upgrades will be critical.

Analyst: What does China's hidden capacity mean for geopolitical stability?

Expert: It adds opacity. Without transparency, countries misjudge each other's capabilities, increasing the risk of miscalculations. International frameworks for sharing at least aggregate compute metrics—similar to arms control treaties—could help stabilise the landscape.

Conclusion

The global race for AI compute is intensifying. The United States currently leads in publicly disclosed capacity, while Europe leverages collective investment to punch above its weight and China cultivates a vast but opaque "dark compute" reserve. Gulf states are using sovereign wealth to buy their way into the top tier. Taiwan remains the linchpin of chip manufacturing, creating a single point of failure in the global supply chain.

Looking ahead, leadership in AI will hinge on more than just the number of petaflops. Countries must diversify semiconductor supply, secure reliable low-carbon power, invest in public compute infrastructure and cultivate talent. Without transparency about hidden systems, policymakers risk miscalculating rivals' capabilities, raising the stakes in an already high-stakes domain. The next five years will test whether international cooperation can keep pace with the rapid expansion of AI infrastructure—or whether compute will become the new battleground in global geopolitics.

GeoCoded Special Report – Disclaimer

Purpose

This report is for informational use only. It does not constitute investment, policy, or strategic advice and should not be relied on as the sole basis for decisions. It is not intended for regulatory filings or as a substitute for independent professional due diligence. Readers should consult qualified professionals before making financial, policy, or strategic choices based on this analysis.

Sources

Findings draw on government statistics, peer-reviewed research, industry benchmarks (e.g., Top500, IEA, IMF), and verified news reporting. Claims are cross-checked across independent sources, with figures traced to primary materials and dated.

Limitations

  • Temporal: Information current as of August 21, 2025; developments may have occurred since.

  • Data Gaps: Classified or undisclosed capacity not captured in public benchmarks; private estimates rely on industry reporting.

  • AI Assistance: AI models support data parsing and cross-referencing. Despite oversight, errors may remain; critical claims are independently verified.

  • Forward-Looking: Scenarios are analytical tools, not predictions. Outcomes may diverge due to technology, regulation, or market change.

Confidence Levels

  • Verified: Confirmed by multiple independent sources

  • Estimated: Extrapolated from partial data using stated methodology

  • Projected: Forward-looking analysis from identified trends

Corrections & Attribution

Errors are corrected when identified. Content may be cited with attribution to GeoCoded Special Report and publication date. GeoCoded maintains editorial independence and prioritizes accuracy over narrative convenience.

For corrections or inquiries: geocoded@sanchez.vc

Verification, Transparency & Limitations

  • Verified Claims. Data drawn directly from official Top500 lists, government announcements, peer-reviewed research and reputable journalism are tagged as Verified.

  • Estimated Figures. Where only ranges or secondary reports exist (e.g., approximate numbers of GPUs purchased by UAE), we label figures as Estimated. Per-capita and per-GDP metrics rely on population and GDP estimates, introducing modest uncertainty.

  • Projected Data. Future investment flows and energy consumption projections come from forward-looking analyses by the IEA and industry analysts. These are inherently uncertain and should be treated as scenarios, not facts.

  • Data Gaps. The largest unknown is hidden capacity in countries that do not participate in international benchmarking. China's 230 exaflops number is self-reported and lacks external verification. U.S. national security machines are classified. Saudi Arabia and the UAE disclose only broad commitments.

Glossary

  • Exaflop (EFlop/s). A unit of computing performance equal to one quintillion (10¹⁸) floating-point operations per second. Exaflop-scale systems represent the current frontier of supercomputing.

  • PFLOP (PFlop/s). A petaflop equals one thousand trillion (10¹⁵) floating-point operations per second. Many national machines operate in the hundreds of petaflops range.

  • Top500. A twice-yearly ranking of the world's fastest publicly benchmarked supercomputers based on HPL performance.

  • HPL (High Performance LINPACK). A benchmark that solves a dense system of linear equations, used to rate supercomputer performance.

  • EuroHPC. A European joint initiative to coordinate and fund high-performance computing infrastructure across EU member states.

  • Compute Sovereignty. The degree to which a nation controls the resources—hardware, power, supply chains—necessary to run advanced AI workloads without external dependency.

Data Appendix & Citations

The citations below correspond to numbered footnotes in the report. The full source links follow the footnote index; readers can consult them for context.

  1. Top500 combined performance and country breakdown

  2. Epoch AI dataset on global AI supercomputer shares

  3. Mid-2025 population and GDP estimates (UN and IMF)

  4. UK AI Opportunities Action Plan announcements

  5. France 2030 private investment commitments

  6. India National Supercomputing Mission details

  7. EuroHPC press releases on JUPITER and other systems

  8. HPCwire on top machines (El Capitan, Frontier, Aurora, etc.)

  9. Epoch AI research on private vs public compute

  10. TSMC market share and advanced node dominance

  11. U.S. subsidies and onshoring of 2 nm chips

  12. Export control measures and embedded trackers

  13. IEA projections for datacentre electricity growth

  14. Saudi Arabia and UAE GPU acquisitions

  15. Gulf sovereign investment narratives

  16. China's 230 exaflop capacity and 8.1 million racks

  17. Evidence of China's departure from public benchmarking

  18. Additional technical clarifications and energy projections

Highlights - June 2025 | TOP500 https://top500.org/lists/top500/2025/06/highs/

TOP500: JUPITER Joins the List at Number Four, but Top Three Hold Their Position https://www.hpcwire.com/2025/06/10/top500-jupiter-joins-the-list-at-number-four-but-top-three-hold-their-position/

China plans to boost national compute capacity 30% by 2025 • The Register https://www.theregister.com/2024/07/08/china_compute_capacity_boost/

Alphabet, Microsoft can't keep up with cloud demand https://www.fierce-network.com/cloud/has-cloud-growth-hit-another-speed-bump

Prime Minister sets out blueprint to turbocharge AI - GOV.UK https://www.gov.uk/government/news/prime-minister-sets-out-blueprint-to-turbocharge-ai

UK AI sector attracts £200 million a day in private investment since July - GOV.UK https://www.gov.uk/government/news/uk-ai-sector-attracts-200-million-a-day-in-private-investment-since-july

Macron signals investments of 109 billion euros in French AI by private sector | Reuters https://www.reuters.com/technology/artificial-intelligence/france-invest-109-billion-euros-ai-macron-announces-2025-02-09/

Macron unveils $112B AI investment package, France's answer to US' Stargate | TechCrunch https://techcrunch.com/2025/02/10/macron-unveils-a-112b-ai-investment-package-as-frances-answer-to-stargate/

The World's Growing Reliance on Taiwan's Semiconductor Industry https://www.visionofhumanity.org/the-worlds-dependency-on-taiwans-semiconductor-industry-is-increasing/

TSMC market share tops 67% in Q4: Advisory firm - Focus Taiwan https://focustaiwan.tw/business/202503150014

Which Country Will Win the 2025 Microchip Supremacy Race https://www.electropages.com/blog/2025/01/which-country-will-win-2025-microchip-supremacy-race

TSMC to stash 30% of high-end chip production in US • The Register https://www.theregister.com/2025/07/17/tsmc_q2_ramp/

Exclusive: US embeds trackers in AI chip shipments to catch diversions to China, sources say | Reuters https://www.reuters.com/world/china/us-embeds-trackers-ai-chip-shipments-catch-diversions-china-sources-say-2025-08-13/

IEA: Data center energy consumption set to double by 2030 to 945TWh - DCD https://www.datacenterdynamics.com/en/news/iea-data-center-energy-consumption-set-to-double-by-2030-to-945twh/

Trends in AI Supercomputers https://arxiv.org/html/2504.16026v1

The Gulf's $100 billion AI gamble is just getting started | AGBI https://www.agbi.com/opinion/tech/2025/08/the-gulfs-100-billion-ai-gamble-is-just-getting-started/

The US hosts the majority of AI supercomputers, followed by China | Epoch AI https://epoch.ai/data-insights/ai-supercomputers-performance-share-by-country

National Super Computing Mission | Department Of Science & Technology https://dst.gov.in/national-super-computing-mission

El Capitan still the world's fastest supercomputer in Top500 list, Jupiter Booster named Europe's most powerful - DCD https://www.datacenterdynamics.com/en/news/el-capitan-still-the-worlds-fastest-supercomputer-in-top500-list-jupiter-booster-named-europes-most-powerful/

France Bolsters National AI Strategy With NVIDIA Infrastructure | NVIDIA Blog https://blogs.nvidia.com/blog/france-sovereign-ai-infrastructure/

Report: Saudi Arabia acquires 3,000 Nvidia GPUs, UAE buys thousands - DCD https://www.datacenterdynamics.com/en/news/report-saudi-arabia-acquires-3000-nvidia-gpus-uae-buys-thousands/

Christopher Sanchez

Professor Christopher Sanchez is an internationally recognized technologist, entrepreneur, investor, and advisor. He serves as a senior advisor to G20 governments, top academic institutions, institutional investors, startups, and Fortune 500 companies. He is a columnist for Fast Company Mexico, writing on AI, emerging tech, trade, and geopolitics.

He has been featured in WIRED, Forbes, the Wall Street Journal, Business Insider, MIT Sloan, and numerous other publications. In 2024, he was recognized by Forbes as one of the 35 most important people in AI in their annual AI 35 list.

https://www.christophersanchez.ai