GeoCoded Special Report: State of Global AI Compute (2025 Edition)
Executive Summary & Key Insights
Artificial intelligence no longer advances on the strength of clever code alone. The newest language and multimodal models demand tens of thousands of specialised chips, petabytes of memory and gigawatts of electricity. Countries that can marshal these resources gain not only economic advantage but also strategic leverage. This report takes stock of the world's AI computing infrastructure in mid-2025, highlighting who controls the most "digital horsepower," how resources stack up on a per-person and per-dollar basis, where the chokepoints lie, and what trajectories are plausible for the rest of the decade.
Methodology & Data Confidence
Data Sources and Triangulation
This report combines quantitative and qualitative data from multiple authoritative sources:
Top500 Supercomputer List (June 2025). Provides High Performance LINPACK (HPL) scores for the world's 500 fastest publicly benchmarked machines. The combined performance of the list now totals 13.84 exaflops, with the U.S. contributing 6.696 exaflops across 175 systems. The list records system count and aggregated performance by country.
Epoch AI & Georgetown University datasets. These peer-reviewed datasets cover approximately 10–20% of global AI-oriented compute. They reveal trends such as the private sector's dominance and relative shares of AI supercomputer performance (U.S. 74.5%, China 14.1%, EU 4.8%). We use them for context rather than absolute counts.
Population & GDP. Mid-2025 population estimates come from the UN World Population Prospects 2024 and national statistics, while GDP figures use the IMF's April 2025 World Economic Outlook. These allow per-capita and per-dollar normalisation.
Government roadmaps & policy documents. Official announcements such as the U.K.'s AI Opportunities Action Plan, France's France 2030 investment plan, India's National Supercomputing Mission, and EuroHPC press releases inform our understanding of public investments.
Industry and academic analyses. Reporting from outlets like HPCwire, Data Center Dynamics and The Register provides details on individual systems and emerging trends. Peer-reviewed research from Epoch AI underscores power consumption trajectories.
Data Verification Note
We cross-reference statistics across at least two independent sources where possible. Claims lacking transparent methodology or credible provenance are excluded. For example, widely circulated tables of "H100-equivalent GPUs" were omitted because no verifiable source exists. The Epoch AI dataset captures a subset of global capacity and cannot represent hidden or classified systems. Top500 rankings reflect only publicly benchmarked machines and understate actual capabilities in countries like China and the U.S. national labs.
Limitations and Rapidly Changing Landscape
Hidden Capacity. Many national and corporate clusters are not publicly benchmarked. China has withdrawn from the Top500 since 2017 and likely operates multiple exascale systems. U.S. national security machines are also classified.
Evolving Hardware. Compute deployments shift quickly. Figures in this report reflect mid-2025 data and may be obsolete within months as new systems come online and old ones retire.
Energy Uncertainties. Projected power consumption hinges on model size, hardware efficiency and datacentre design. Future breakthroughs could change these trajectories.
Population & GDP Variations. Some countries' population estimates vary by source (e.g., Germany's range is 84.1–84.4 million). We use mid-points for consistency.
Verification Status. Throughout the report we tag numbers as Verified, Estimated, or Projected. Verified figures come from official or peer-reviewed sources; estimated ones rely on extrapolation; projected figures look ahead based on trends.
Inside the World's Most Powerful Machines
The June 2025 Top500 list offers a snapshot of the most advanced supercomputers. Three U.S. Department of Energy machines occupy the top of the rankings, each surpassing the exaflop barrier:
El Capitan (Lawrence Livermore National Laboratory) – 1.742 EFlop/s using AMD EPYC CPUs and MI300A GPUs (Verified).
Frontier (Oak Ridge National Laboratory) – 1.353 EFlop/s on AMD EPYC CPUs with MI250X accelerators (Verified).
Aurora (Argonne National Laboratory) – 1.012 EFlop/s using an Intel Max Series architecture (Verified).
Europe's flagship system is the JUPITER Booster, located at the Jülich Supercomputing Centre. It debuted at 793.4 PFLOPs (0.7934 EFlop/s), ranking fourth globally. Other key European machines include:
HPC6 (Italy) – 477.9 PFLOPs with AMD EPYC and MI250X hardware.
Fugaku (Japan) – 442 PFLOPs using Arm-based A64FX processors.
Alps (Switzerland) – 434.9 PFLOPs featuring Nvidia GH200 Superchips.
LUMI (Finland) – 379.7 PFLOPs built on AMD architecture.
Leonardo (Italy) – 241.2 PFLOPs with Intel CPUs and Nvidia A100 GPUs.
The U.S. private sector also fields large HPL systems, such as Microsoft's Eagle cluster (561 PFLOPs). However, many hyperscaler installations do not participate in Top500 benchmarking, meaning public listings capture only a fraction of corporate capacity.
Understanding China's Compute Capacity
Critical Context: China's True Computational Power
China's presence on the Top500 list belies its actual capabilities. In June 2025, China reported 0.281 exaflops of Top500 performance across 47 systems. This figure amounts to roughly 2% of total public performance. Yet official Chinese statistics reveal a much larger domestic programme:
National Capacity: The Ministry of Industry and Information Technology stated that by July 2024, China operated ≈230 exaflops of aggregate computing power, more than 800 times its Top500 tally (though measured on a different basis than HPL and therefore not directly comparable), with a target of 300 exaflops by 2025 (Verified).
Datacentre Footprint: China runs more than 8.1 million datacentre racks, the world's largest installed base (Verified).
Secret Exascale Systems: Reports indicate China has at least two exascale machines not submitted to international benchmarking: Sunway OceanLight (~1.3 EFlop/s) and Tianhe-3 (~1.7 EFlop/s). A third exascale system is believed to have been operational since 2023.
China has avoided Top500 submissions since 2017, partly to protect national security and partly to sidestep U.S. export controls. It continues to procure domestic accelerators (e.g., Huawei's Ascend) and older imported GPUs through indirect channels. This "dark compute" creates uncertainty for foreign planners, underscoring the need for new international frameworks for compute transparency.
Comparative Metrics & Sovereignty
Aggregate Performance by Country
The chart below ranks countries by total public compute capacity (Top500 aggregate). The U.S. remains far ahead, Germany sits second in Europe, and China's reported tally appears small relative to its true capacity.
Compute per Person
Normalising performance by population reveals striking disparities. Finland leads with ≈70 PF/M, Switzerland follows at ≈53 PF/M, and mid-sized economies such as Italy (≈15 PF/M) and Germany (≈14 PF/M) outperform much larger nations on this measure. The U.S. sits at ≈19 PF/M and China at ≈0.2 PF/M.
Compute per GDP
Compute intensity relative to economic output provides another lens. Finland again tops the list at ≈1,266 PF per trillion dollars of GDP; Switzerland and Italy score 523 and 362 PF/$T, respectively. The U.S. operates at ≈219 PF/$T, while China's ratio is ≈14.6 PF/$T.
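The two normalisations above are simple ratios and easy to reproduce. The sketch below uses the report's approximate mid-2025 figures (Top500 aggregate petaflops, population in millions, GDP in trillions of USD); exact inputs vary by source, so treat the outputs as illustrative.

```python
# Reproduce the per-capita (PF per million people) and per-GDP
# (PF per trillion USD) normalisations from approximate report figures.

countries = {
    # name: (Top500 aggregate PFLOPs, population in millions, GDP in $T)
    "United States": (6696, 347, 30.5),
    "China": (281, 1416, 19.2),
    "Finland": (380, 5.6, 0.30),
}

metrics = {}
for name, (pflops, pop_m, gdp_t) in countries.items():
    metrics[name] = {
        "pf_per_million": pflops / pop_m,   # compute per person (PF/M)
        "pf_per_trillion": pflops / gdp_t,  # compute per GDP (PF/$T)
    }

for name, m in metrics.items():
    print(f"{name}: {m['pf_per_million']:.1f} PF/M, "
          f"{m['pf_per_trillion']:.0f} PF/$T")
```

Running this recovers figures close to those quoted in the text (e.g., ≈19 PF/M and ≈219 PF/$T for the U.S.); small deviations reflect rounding in the inputs.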
Compute Sovereignty Index (CSI)
To evaluate autonomy, we developed a Compute Sovereignty Index that weights five factors: installed capacity, foreign reliance, domestic chip manufacturing, sovereign investment and energy resilience. Scores range from 0 to 100.
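A weighted composite of this kind can be sketched as follows. The factor weights and the example country scores below are hypothetical stand-ins for illustration, not the report's actual parameters; note that foreign reliance is scored inversely so that a higher score always means greater sovereignty.

```python
# Sketch of a Compute Sovereignty Index: a weighted sum of five factor
# scores, each on a 0-100 scale. Weights here are illustrative only.

WEIGHTS = {
    "installed_capacity": 0.30,
    "foreign_reliance": 0.20,   # scored inversely: higher = less reliant
    "chip_manufacturing": 0.20,
    "sovereign_investment": 0.15,
    "energy_resilience": 0.15,
}

def csi(scores: dict) -> float:
    """Return the weighted 0-100 composite for one country."""
    assert set(scores) == set(WEIGHTS), "all five factors required"
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

# Hypothetical per-factor scores for an example country
example = {
    "installed_capacity": 95,
    "foreign_reliance": 80,
    "chip_manufacturing": 60,
    "sovereign_investment": 85,
    "energy_resilience": 70,
}
print(f"CSI: {csi(example):.2f}")
```

Because the weights sum to 1.0, the composite stays on the same 0-100 scale as the inputs.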
AI Compute Dependency Matrix
Plotting countries by domestic compute strength versus foreign reliance yields a four-quadrant map. Sovereign leaders (United States) possess both high capacity and low dependence. Rising builders (Germany, Italy, Finland) have moderate capacity and foreign reliance. Strategic dependents (China, Saudi Arabia) have moderate capacity but heavy reliance on foreign technology. Compute-vulnerable nations (India) have limited capacity and high dependence.
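The quadrant assignment can be expressed as a simple threshold rule. The 50-point cut-offs and the 0-100 axis scales below are assumptions for illustration; the report's actual matrix places countries qualitatively rather than by a fixed formula.

```python
# Four-quadrant dependency classification. Inputs are 0-100 scores for
# domestic compute strength and foreign reliance; 50 is an assumed cut-off.

def quadrant(strength: float, reliance: float) -> str:
    if strength >= 50 and reliance < 50:
        return "Sovereign leader"       # high capacity, low dependence
    if strength >= 50 and reliance >= 50:
        return "Strategic dependent"    # capacity built on foreign tech
    if strength < 50 and reliance < 50:
        return "Rising builder"         # moderate on both axes
    return "Compute-vulnerable"         # limited capacity, high dependence

print(quadrant(90, 20))
print(quadrant(30, 80))
```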
Investment Landscape & Growth Dynamics
Public Programmes and National Strategies
United Kingdom. The AI Opportunities Action Plan seeks to transform Britain into a compute powerhouse. The plan envisions up to 20× growth in public compute by 2030 and has attracted ≈£39 billion in private commitments from Vantage Data Centres, nScale and Kyndryl. The U.K. also invests £2 billion in the AI Research Resource (AIRR) to build sovereign clusters.
France. President Emmanuel Macron unveiled a €109 billion investment package, funded predominantly by private investors such as MGX (UAE), Brookfield (Canada), Bpifrance and Iliad. Projects include a 1.4 GW data-centre campus near Paris hosting 18,000 Grace Blackwell GPUs.
India. The National Supercomputing Mission has installed 24.8 PFLOPs of capacity and plans an additional 41.17 PFLOPs using indigenous Rudra servers and Trinetra interconnects. India aims to expand its domestic supply chain but remains reliant on imported accelerators.
EuroHPC. The EU's joint undertaking finances exascale and pre-exascale machines like JUPITER and DAEDALUS, emphasising green computing and shared access. JUPITER's booster module uses Nvidia GH200 Superchips and achieves 793.4 PFLOPs.
Stargate & Gulf Investments. The Stargate initiative—backed by SoftBank, OpenAI, Oracle and the UAE's MGX fund—aims to build up to 20 mega-datacentres in the U.S., with an initial budget of $100 billion and eventual outlay up to $500 billion. MGX has committed several billion dollars; Saudi Arabia's Public Investment Fund hopes AI will contribute 12% to national GDP by 2030. Reports indicate Saudi Arabia's KAUST acquired ≈3,000 Nvidia H100 GPUs and the UAE has bought "thousands".
Private Sector Surge (2019–2025)
The number of deployed AI accelerators has grown exponentially. In 2019, U.S. AI clusters used around 300,000 H100-equivalent GPUs; by 2025 the fleet had ballooned to ≈850,000, nearly eight times China's 110,000 and seventeen times the EU's 50,000 (Estimated). This boom reflects hyperscaler capital expenditures of ≈$315–320 billion in 2025 and underscores the shift from public to corporate stewardship.
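The growth rate implied by those fleet figures is straightforward to check. The sketch below assumes the 2019 and 2025 endpoints quoted above; both are estimates, so the result is indicative only.

```python
# Implied compound annual growth rate (CAGR) of the U.S. accelerator
# fleet, from ~300,000 H100-equivalents (2019) to ~850,000 (2025).

start, end, years = 300_000, 850_000, 2025 - 2019
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")
```

The result works out to roughly 19% per year, a useful sanity check against vendor and hyperscaler capex trajectories.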
Looking Ahead (2025–2030)
Two forces will shape the next five years:
Sovereign capital meets hyperscalers. Gulf sovereign wealth funds will continue to co-invest with U.S. cloud providers. Projects like Stargate will expand computing capacity in the U.S., but Gulf partners will demand technology transfer and local campuses. This dynamic diffuses power away from a single geography.
Regional AI factories. Europe's EuroHPC will commission multiple exascale systems; France's 1.4 GW campus and the U.K.'s AIRR expansion will anchor regional autonomy. India will add 12 Rudra-based machines, while the U.S. will start up new facilities supported by TSMC's onshore production.
These investments signal that compute is becoming a critical piece of national infrastructure.
Supply-Chain & Energy Dependencies
Hardware Chokepoints
TSMC's dominance. Taiwan's TSMC controls around 64–67% of the contract foundry market and produces approximately 90% of chips at 7 nm and 5 nm nodes. The U.S. depends on Taiwan for ≈92% of its advanced logic chips, although TSMC plans to produce about 30% of its 2 nm-and-smaller chips in the United States by 2030. Taiwan's dominance creates geopolitical risk; any disruption could stall global AI development.
Export controls & chip trackers. Since 2022, the U.S. has progressively tightened restrictions on sales of AI accelerators to China and Russia. By 2025, U.S. authorities had reportedly begun embedding location trackers in select AI chip shipments to catch diversions. This adds compliance cost and delays for global supply chains.
Alternative architectures. China is investing in domestic chip designs (e.g., Ascend), while open RISC-V cores and analog compute research are gaining traction. Nevertheless, Nvidia and AMD remain dominant for training frontier models. AMD's MI300 and Nvidia's H100/Blackwell lines have effectively become standard tools for large AI workloads.
Energy as the Limiting Factor
Compute cannot flourish without abundant electricity. An exascale machine can draw tens of megawatts; a single datacentre campus may need hundreds. The International Energy Agency projects that data-centre electricity demand will rise from 415 TWh in 2024 to 945 TWh by 2030, with consumption by AI-optimised facilities roughly quadrupling over that period and total data-centre demand approaching 3% of global electricity. Regional growth will be uneven: the U.S. is expected to add ≈240 TWh, China ≈175 TWh, and Europe ≈45 TWh.
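The IEA trajectory implies a steep but checkable growth rate. The sketch below assumes the 2024 baseline and 2030 projection quoted above; both are IEA figures, and the derived rate is an arithmetic consequence rather than an independent forecast.

```python
# Implied annual growth of global data-centre electricity demand,
# from 415 TWh (2024) to a projected 945 TWh (2030), per IEA figures.

start_twh, end_twh, years = 415, 945, 2030 - 2024
multiple = end_twh / start_twh              # total growth over the period
growth = multiple ** (1 / years) - 1        # compound annual growth rate
print(f"Demand multiplies {multiple:.2f}x, about {growth:.1%} per year")
```

That works out to roughly 15% compound annual growth, far above historical electricity demand growth in mature grids.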
Countries with cheap, low-carbon power will attract AI investment. Finland draws on hydro and nuclear; France leverages nuclear; Norway uses hydropower; Gulf states are scaling solar and gas. By contrast, India and parts of China face grid constraints and may need massive investments to balance AI demand with domestic consumption.
Sovereign Funds & Foreign Investment
Gulf sovereign wealth funds, notably MGX (UAE) and Saudi Arabia's Public Investment Fund, are emerging as pivotal players in the compute race. MGX partners with SoftBank, Oracle and OpenAI to invest in U.S. datacentres. Saudi Arabia's PIF collaborates with Google on local hyperscale datacentres and aims for AI to contribute 12% of the kingdom's GDP. These investments reflect long-term strategies to diversify economies away from hydrocarbons.
Forward Scenarios
We envision four plausible futures for the distribution of AI compute by 2030. These scenarios are not predictions but narrative tools to explore how policy, investment and technology could interact.
Bipolar Duopoly. The U.S. and China remain the only compute superpowers. Export controls persist; China narrows the gap with domestic lithography but cannot fully match U.S. hyperscaler scale. Other nations must align with one camp, fragmenting global standards.
Regional Blocs. Sovereignty concerns drive the formation of blocs: a Euro–India–Africa alliance, a U.S.–Gulf–Latin America bloc, and an East Asia bloc anchored by Japan and South Korea. Each bloc pools resources to build exascale systems, invest in renewable power and jointly develop chips.
Cloud Oligopoly. A handful of hyperscalers monopolise compute. Governments license access rather than build hardware. Regulation focuses on fair pricing and data security. Smaller countries rent compute "as a service."
Distributed Sovereignty. Breakthroughs in modular, energy-efficient chips and open federated training allow thousands of small datacentres to collaborate. Communities, universities and firms contribute resources; no single actor can monopolise compute.
Expert Q&A
GeoCoded Analyst: How reliable are Top500 rankings for assessing national AI capabilities?
AI Compute Expert: They're a useful baseline because HPL scores are publicly verifiable, but they capture only publicly benchmarked systems. China, for instance, has withdrawn from the list and still operates at least two exascale machines. Likewise, U.S. national security clusters are classified. So Top500 figures likely underestimate true capacity in both countries.
Analyst: How should business leaders plan around the concentration of chip fabrication in Taiwan?
Expert: Diversification is essential. Companies should split workloads across multiple cloud providers and maintain redundancy across geographic regions. Policymakers need to incentivise alternative fabs in the U.S., Europe and the Middle East. The U.S. is subsidising TSMC's Arizona fabs and Samsung's U.S. facilities to reduce reliance.
Analyst: Is energy really a bigger constraint than chips?
Expert: At the margin, yes. Chips can be manufactured if supply chains hold up, but the electricity to power millions of GPUs is harder to scale quickly. AI datacentres will demand nearly 3% of global electricity by 2030. Securing long-term renewable contracts and grid upgrades will be critical.
Analyst: What does China's hidden capacity mean for geopolitical stability?
Expert: It adds opacity. Without transparency, countries misjudge each other's capabilities, increasing the risk of miscalculations. International frameworks for sharing at least aggregate compute metrics—similar to arms control treaties—could help stabilise the landscape.
Conclusion
The global race for AI compute is intensifying. The United States currently leads in publicly disclosed capacity, while Europe leverages collective investment to punch above its weight and China cultivates a vast but opaque "dark compute" reserve. Gulf states are using sovereign wealth to buy their way into the top tier. Taiwan remains the linchpin of chip manufacturing, creating a single point of failure in the global supply chain.
Looking ahead, leadership in AI will hinge on more than just the number of petaflops. Countries must diversify semiconductor supply, secure reliable low-carbon power, invest in public compute infrastructure and cultivate talent. Without transparency about hidden systems, policymakers risk miscalculating rivals' capabilities, raising the stakes in an already high-stakes domain. The next five years will test whether international cooperation can keep pace with the rapid expansion of AI infrastructure—or whether compute will become the new battleground in global geopolitics.
GeoCoded Special Report – Disclaimer
Purpose
This report is for informational use only. It does not constitute investment, policy, or strategic advice and should not be relied on as the sole basis for decisions. It is not intended for regulatory filings or as a substitute for independent professional due diligence. Readers should consult qualified professionals before making financial, policy, or strategic choices based on this analysis.
Sources
Findings draw on government statistics, peer-reviewed research, industry benchmarks (e.g., Top500, IEA, IMF), and verified news reporting. Claims are cross-checked across independent sources, with figures traced to primary materials and dated.
Limitations
Temporal: Information current as of August 21, 2025; developments may have occurred since.
Data Gaps: Classified or undisclosed capacity not captured in public benchmarks; private estimates rely on industry reporting.
AI Assistance: AI models support data parsing and cross-referencing. Despite oversight, errors may remain; critical claims are independently verified.
Forward-Looking: Scenarios are analytical tools, not predictions. Outcomes may diverge due to technology, regulation, or market change.
Confidence Levels
Verified: Confirmed by multiple independent sources
Estimated: Extrapolated from partial data using stated methodology
Projected: Forward-looking analysis from identified trends
Corrections & Attribution
Errors are corrected when identified. Content may be cited with attribution to GeoCoded Special Report and publication date. GeoCoded maintains editorial independence and prioritizes accuracy over narrative convenience.
For corrections or inquiries: geocoded@sanchez.vc
Verification, Transparency & Limitations
Verified Claims. Data drawn directly from official Top500 lists, government announcements, peer-reviewed research and reputable journalism are tagged as Verified.
Estimated Figures. Where only ranges or secondary reports exist (e.g., approximate numbers of GPUs purchased by UAE), we label figures as Estimated. Per-capita and per-GDP metrics rely on population and GDP estimates, introducing modest uncertainty.
Projected Data. Future investment flows and energy consumption projections come from forward-looking analyses by the IEA and industry analysts. These are inherently uncertain and should be treated as scenarios, not facts.
Data Gaps. The largest unknown is hidden capacity in countries that do not participate in international benchmarking. China's 230 exaflops number is self-reported and lacks external verification. U.S. national security machines are classified. Saudi Arabia and the UAE disclose only broad commitments.
Glossary
Exaflop (EFlop/s). A unit of computing performance equal to one quintillion (10¹⁸) floating-point operations per second. Exaflop-scale systems represent the current frontier of supercomputing.
PFLOP (PFlop/s). A petaflop equals one quadrillion (10¹⁵) floating-point operations per second. Many national machines operate in the hundreds of petaflops range.
Top500. A twice-yearly ranking of the world's fastest publicly benchmarked supercomputers based on HPL performance.
HPL (High Performance LINPACK). A benchmark that solves a dense system of linear equations, used to rate supercomputer performance.
EuroHPC. A European joint initiative to coordinate and fund high-performance computing infrastructure across EU member states.
Compute Sovereignty. The degree to which a nation controls the resources—hardware, power, supply chains—necessary to run advanced AI workloads without external dependency.
Data Appendix & Citations
The citations below correspond to numbered footnotes in the report. Each entry identifies the source underpinning the corresponding claim; readers can consult the linked materials for full context.
Top500 combined performance and country breakdown
Epoch AI dataset on global AI supercomputer shares
Mid-2025 population and GDP estimates (UN and IMF)
UK AI Opportunities Action Plan announcements
France 2030 private investment commitments
India National Supercomputing Mission details
EuroHPC press releases on JUPITER and other systems
HPCwire on top machines (El Capitan, Frontier, Aurora, etc.)
Epoch AI research on private vs public compute
TSMC market share and advanced node dominance
U.S. subsidies and onshoring of 2 nm chips
Export control measures and embedded trackers
IEA projections for datacentre electricity growth
Saudi Arabia and UAE GPU acquisitions
Gulf sovereign investment narratives
China's 230 exaflop capacity and 8.1 million racks
Evidence of China's departure from public benchmarking
Additional technical clarifications and energy projections
Highlights - June 2025 | TOP500 https://top500.org/lists/top500/2025/06/highs/
TOP500: JUPITER Joins the List at Number Four, but Top Three Hold Their Position https://www.hpcwire.com/2025/06/10/top500-jupiter-joins-the-list-at-number-four-but-top-three-hold-their-position/
China plans to boost national compute capacity 30% by 2025 • The Register https://www.theregister.com/2024/07/08/china_compute_capacity_boost/
Alphabet, Microsoft can't keep up with cloud demand https://www.fierce-network.com/cloud/has-cloud-growth-hit-another-speed-bump
Prime Minister sets out blueprint to turbocharge AI - GOV.UK https://www.gov.uk/government/news/prime-minister-sets-out-blueprint-to-turbocharge-ai
UK AI sector attracts £200 million a day in private investment since July - GOV.UK https://www.gov.uk/government/news/uk-ai-sector-attracts-200-million-a-day-in-private-investment-since-july
Macron signals investments of 109 billion euros in French AI by private sector | Reuters https://www.reuters.com/technology/artificial-intelligence/france-invest-109-billion-euros-ai-macron-announces-2025-02-09/
Macron unveils $112B AI investment package, France's answer to US' Stargate | TechCrunch https://techcrunch.com/2025/02/10/macron-unveils-a-112b-ai-investment-package-as-frances-answer-to-stargate/
The World's Growing Reliance on Taiwan's Semiconductor Industry https://www.visionofhumanity.org/the-worlds-dependency-on-taiwans-semiconductor-industry-is-increasing/
TSMC market share tops 67% in Q4: Advisory firm - Focus Taiwan https://focustaiwan.tw/business/202503150014
Which Country Will Win the 2025 Microchip Supremacy Race https://www.electropages.com/blog/2025/01/which-country-will-win-2025-microchip-supremacy-race
TSMC to stash 30% of high-end chip production in US • The Register https://www.theregister.com/2025/07/17/tsmc_q2_ramp/
Exclusive: US embeds trackers in AI chip shipments to catch diversions to China, sources say | Reuters https://www.reuters.com/world/china/us-embeds-trackers-ai-chip-shipments-catch-diversions-china-sources-say-2025-08-13/
IEA: Data center energy consumption set to double by 2030 to 945TWh - DCD https://www.datacenterdynamics.com/en/news/iea-data-center-energy-consumption-set-to-double-by-2030-to-945twh/
Trends in AI Supercomputers https://arxiv.org/html/2504.16026v1
The Gulf's $100 billion AI gamble is just getting started | AGBI https://www.agbi.com/opinion/tech/2025/08/the-gulfs-100-billion-ai-gamble-is-just-getting-started/
The US hosts the majority of AI supercomputers, followed by China | Epoch AI https://epoch.ai/data-insights/ai-supercomputers-performance-share-by-country
National Super Computing Mission | Department Of Science & Technology https://dst.gov.in/national-super-computing-mission
El Capitan still the world's fastest supercomputer in Top500 list, Jupiter Booster named Europe's most powerful - DCD https://www.datacenterdynamics.com/en/news/el-capitan-still-the-worlds-fastest-supercomputer-in-top500-list-jupiter-booster-named-europes-most-powerful/
France Bolsters National AI Strategy With NVIDIA Infrastructure | NVIDIA Blog https://blogs.nvidia.com/blog/france-sovereign-ai-infrastructure/
Report: Saudi Arabia acquires 3,000 Nvidia GPUs, UAE buys thousands - DCD https://www.datacenterdynamics.com/en/news/report-saudi-arabia-acquires-3000-nvidia-gpus-uae-buys-thousands/