AI supercomputers are a US first, China second phenomenon. And growing rapidly.

AI supercomputers are scaling at an exponential rate, with performance doubling every nine months while power requirements and costs double annually. These trends, documented in a study of 500 AI systems from 2019 to 2025, accompany a dramatic shift toward private ownership of computing resources: industry now controls 80% of global AI compute. Understanding these trends is crucial as we approach a future where leading AI systems could require power equivalent to multiple cities and hardware investments in the hundreds of billions of dollars.

The big picture: AI supercomputers have experienced explosive growth in computational performance, increasing 2.5x annually through deploying more numerous and powerful specialized chips.

  • Leading systems that once contained fewer than 10,000 chips now regularly feature more than 100,000, exemplified by xAI’s Colossus with its 200,000 AI chips.
  • This growth is driven by a yearly 1.6x increase in chip quantity combined with a 1.6x annual improvement in performance per chip.
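The two per-factor growth rates compose cleanly into the headline figure. A quick sanity check (the 1.6x multipliers are the article's; the arithmetic is illustrative):

```python
import math

# Annual multipliers reported in the article
chip_count_growth = 1.6   # 1.6x more chips per system each year
per_chip_growth = 1.6     # 1.6x more performance per chip each year

# Combined annual growth in system performance
total_growth = chip_count_growth * per_chip_growth   # 2.56x per year

# Implied doubling time in months
doubling_months = 12 * math.log(2) / math.log(total_growth)

print(round(total_growth, 2))    # ~2.56, consistent with "2.5x annually"
print(round(doubling_months, 1)) # ~8.8, i.e. roughly nine months
```

This is why the "2.5x annually" and "doubling every nine months" figures are the same claim stated two ways.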

Behind the numbers: The expansion of AI supercomputers has created massive energy and financial demands, with power requirements and hardware costs doubling every year.

  • xAI’s Colossus, the most powerful AI supercomputer as of March 2025, requires approximately 300 megawatts of power—equivalent to 250,000 households—and cost an estimated $7 billion in hardware alone.
  • Despite growing power demands, computational performance per watt has increased by 1.34x annually, primarily through the adoption of more energy-efficient chips.

Key shift: The landscape of AI supercomputing has transformed from primarily academic and public research to industry dominance in just six years.

  • Industry’s share of global AI compute jumped from 40% in 2019 to 80% in 2025, as private companies rapidly scaled their systems to conduct larger training runs.
  • Leading industry systems grew by 2.7x annually, significantly outpacing the 1.9x annual growth of public sector systems.

Zooming out: The global distribution of AI supercomputing power shows overwhelming American dominance, with significant implications for technological leadership.

  • The United States controls approximately 75% of global AI supercomputer performance in the dataset, with China a distant second at 15%.
  • Traditional supercomputing powers like the UK, Germany, and Japan now play marginal roles in the AI supercomputing landscape.

Implications: If current growth trajectories continue, the scale of future AI systems will test physical and economic boundaries.

  • Projections suggest that by 2030, the largest AI supercomputer could require 9 gigawatts of power and cost hundreds of billions of dollars to build.
  • These unprecedented requirements raise serious questions about the sustainability of AI scaling and who will be able to participate in frontier AI development.
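The 2030 projection follows directly from compounding the annual doubling of power and hardware cost from Colossus's 2025 figures. A minimal sketch of that extrapolation (the 2025 baseline numbers are the article's; the straight-line compounding is an assumption):

```python
# Extrapolate the article's "doubles annually" trend from Colossus (2025) to 2030.
power_mw_2025 = 300     # Colossus power draw, approximate
cost_bn_2025 = 7        # Colossus hardware cost estimate, $ billions
years = 2030 - 2025

# Doubling each year for five years is a 2**5 = 32x multiplier
power_gw_2030 = power_mw_2025 * 2**years / 1000
cost_bn_2030 = cost_bn_2025 * 2**years

print(power_gw_2030)  # 9.6 GW, in line with the ~9 gigawatt projection
print(cost_bn_2030)   # 224, i.e. hundreds of billions of dollars
```

The projection assumes the trend holds unbroken; the article's own caveat is that such requirements will test physical and economic limits well before 2030.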
Trends in AI Supercomputers
