Why reasoning models like DeepSeek-R1 are increasing (not decreasing) demand for GPUs

The rise of advanced AI reasoning models like DeepSeek-R1 has transformed the computational landscape for AI infrastructure providers. Together AI has secured significant funding to meet growing enterprise demands for high-performance AI model deployment and reasoning capabilities.

Recent funding and growth: Together AI has raised $305 million in Series B funding led by General Catalyst and Prosperity7, marking a significant expansion of its AI deployment platform.

  • The company has experienced 6X year-over-year growth with over 450,000 registered developers
  • Notable customers include AI startups like Krea AI, Captions, and Pika Labs
  • The platform now supports multiple AI modalities including language, reasoning, images, audio, and video

Infrastructure demands: Contrary to initial expectations, DeepSeek-R1 and similar reasoning models are driving increased infrastructure requirements.

  • DeepSeek-R1’s 671 billion parameters require distribution across multiple servers
  • A single user request can run for two to three minutes, necessitating dedicated computational resources
  • Together AI has introduced “reasoning clusters” offering 128 to 2,000 chips for optimal performance
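A back-of-envelope calculation shows why a 671-billion-parameter model must be spread across multiple servers. The FP16 precision (2 bytes per parameter) and the 80 GB-per-GPU figure (typical of an Nvidia H100) are illustrative assumptions, not details from the article:

```python
# Rough memory estimate for holding DeepSeek-R1's weights in GPU memory.
# Assumptions (not from the article): FP16 weights (2 bytes/param),
# 80 GB of usable memory per GPU (typical of an Nvidia H100).
PARAMS = 671e9          # 671 billion parameters
BYTES_PER_PARAM = 2     # FP16
GPU_MEMORY_GB = 80

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9   # total weight memory in GB
min_gpus = -(-weights_gb // GPU_MEMORY_GB)    # ceiling division

print(f"Weights alone: ~{weights_gb:.0f} GB")
print(f"Minimum GPUs just to hold them: {min_gpus:.0f}")
```

Under these assumptions the weights alone occupy roughly 1.3 TB, well beyond any single accelerator, before accounting for activations and KV-cache during those multi-minute requests.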

Key applications: Reasoning models are finding practical applications across various use cases.

  • Enhanced coding capabilities through problem decomposition
  • Reduced model hallucinations through verification processes
  • Quality improvements in non-reasoning models
  • Self-improvement capabilities using reinforcement learning

Technical innovations: Together AI has made strategic moves to enhance its platform capabilities.

  • Acquisition of CodeSandbox for lightweight VM deployment
  • Implementation of Nvidia’s Blackwell GPUs, offering 2X performance improvement
  • Optimization of inference speeds, achieving 85 tokens per second compared to Azure’s 7 tokens per second

Competitive landscape: Together AI operates in an increasingly crowded market for AI infrastructure.

  • Major cloud providers Microsoft, AWS, and Google offer competing platforms
  • Specialized AI infrastructure providers like Groq and SambaNova target similar markets
  • Together AI differentiates through full-stack offerings and superior performance metrics

Market implications: Rising demand for AI infrastructure, particularly for reasoning models, points to a continuing shift toward more sophisticated and resource-intensive AI applications. That shift creates new opportunities, and new challenges, for infrastructure providers and enterprises alike.
