Nvidia's strategic expansion with NVLink Fusion represents a significant pivot in the AI chip ecosystem, allowing non-Nvidia processors to interconnect with its technology for the first time. The move positions the company to maintain its central role in AI development even as competitors build custom silicon, effectively acknowledging the hybrid future of AI infrastructure while keeping Nvidia at the gravitational center of the industry.
The big picture: Nvidia CEO Jensen Huang announced “NVLink Fusion,” opening the company’s previously closed interconnect technology to competitors’ CPUs and custom chips for the first time.
- The program allows customers to build “semi-custom AI infrastructure” by combining Nvidia’s technology with third-party processors and application-specific integrated circuits (ASICs).
- This strategic shift enables Nvidia to extend its influence into data centers that aren’t exclusively built with its chips.
Key partnerships: Nvidia has already secured several chipmaking partners for NVLink Fusion, including MediaTek, Marvell, Alchip, Astera Labs, Synopsys and Cadence.
- Major customers like Fujitsu and Qualcomm Technologies will be able to connect their own CPUs to Nvidia's GPUs in AI data centers.
- Notably absent from the NVLink Fusion ecosystem are Nvidia’s primary competitors: Broadcom, AMD, and Intel.
Why this matters: The move represents Nvidia’s strategy to remain central in AI infrastructure development even as major customers like Google, Microsoft and Amazon develop their own custom processors.
- Ray Wang, a semiconductor analyst, explained the program “consolidates NVIDIA as the center of next-generation AI factories—even when those systems aren’t built entirely with NVIDIA chips.”
- While Nvidia dominates in general AI training GPUs, competitors are focusing on chips designed for specialized applications—a market Nvidia now aims to influence.
Additional announcements: Huang revealed several other developments during his Computex 2025 presentation in Taiwan.
- Nvidia’s next-generation Grace Blackwell systems “GB300” will be released in Q3.
- The company introduced a new NVIDIA DGX Cloud Lepton AI platform.
- Plans were shared for a new office in Taiwan and an AI supercomputer project with Foxconn.