Snap’s next-gen Spectacles promise lighter design and AI features for 2026

Snap has unveiled its next-generation Spectacles AR glasses, which will be significantly lighter than previous models and feature advanced AI assistance powered by OpenAI and Google Gemini integrations. The announcement at the Augmented World Expo 2025 positions Snap to compete directly with Meta’s Ray-Ban smart glasses, addressing the major complaint about the weight of earlier Spectacles while adding multimodal AI capabilities for everyday use.

What you should know: The new Specs represent a major hardware and software upgrade designed to make AR glasses more practical for daily wear.

  • The glasses will be powered by Snapdragon processors and run on an upgraded Snap OS with deep AI integrations.
  • Unlike the fifth-generation Spectacles released in 2024 (which were developer-only), these new glasses will launch publicly in 2026.
  • They offer both AI assistance features and entertainment experiences like shared games and portable workstation capabilities.

The big picture: Snap is betting that lightweight, AI-powered AR glasses will become the next major computing platform, following the success of Meta’s Ray-Ban collaboration.

  • The company is addressing the primary weakness of previous Spectacles—their weight—which was a major barrier to adoption.
  • By integrating with established AI platforms like OpenAI and Google Gemini, Snap is positioning itself alongside tech giants in the smart glasses race.

How the AI features work: Developers can build multimodal AI-powered experiences using integrations with major AI platforms.

  • Examples include real-time text translation, currency conversion, recipe suggestions, and other contextual assistance.
  • The glasses will support over 40 languages through an Automated Speech Recognition API with “high accuracy.”
  • Three new APIs enable spatial intelligence, speech recognition, and 3D object generation within AR experiences.

Developer tools and capabilities: Snap introduced several new APIs and management tools to support the Spectacles ecosystem.

  • The Depth Module API translates 2D information from large language models to accurately anchor AR content in three dimensions.
  • Fleet management tools allow monitoring multiple pairs of Specs, while guided mode enables single- or multiplayer experiences.
  • WebXR support will soon allow developers to build web-based AR experiences directly in the browser.

Competitive landscape: The announcement comes as major tech companies race to develop practical smart glasses.

  • Google recently demonstrated its own smart glasses with Gemini integration at Google I/O, featuring similar lightweight design and AI capabilities.
  • Meta’s Ray-Ban smart glasses have proven there’s consumer demand for tech-enhanced eyewear that doesn’t sacrifice style or comfort.
  • The combination of lightweight design, multimodal AI, and AR capabilities puts Snap’s offering in direct competition with both Google’s and Meta’s approaches.