Apple barely talked about AI at its big iPhone 17 event

Apple dramatically scaled back AI messaging at its iPhone 17 event, offering only brief mentions of Apple Intelligence features during the 75-minute presentation. This marked a sharp contrast to last year’s iPhone 16 launch, where AI dominated the conversation but led to public disappointment when promised features failed to materialize on schedule.
The big picture: Apple positioned AI as a behind-the-scenes enabler rather than a consumer-facing headline feature, focusing more on hardware improvements and how machine learning powers core functionality.
What you should know: The company emphasized technical infrastructure over flashy AI capabilities throughout the event.
- Executives highlighted how an updated neural engine (Apple’s specialized AI processing chip) powers Apple Intelligence, and how running local large language models on-device supports better gaming performance at higher frame rates.
- Apple now builds neural accelerators into each GPU core to provide “MacBook Pro-levels of compute in an iPhone,” making intensive AI workloads possible on mobile devices.
- Most consumer-facing AI tools mentioned—like visual intelligence features and live translation—were previously announced at WWDC 2025 in June.
Key hardware integrations: Apple’s AI strategy focused on seamless integration across its ecosystem rather than standalone AI features.
- New AirPods pair on-device machine learning models with Apple Intelligence to power live translation and heart rate monitoring.
- The heart rate sensors use an on-device AI model trained on “more than 50 million hours of training data from more than 250,000 participants in an Apple study.”
- Apple Watch AI capabilities center on health monitoring: machine learning algorithms analyze users’ blood pressure responses over 30-day periods, drawing on study data from more than 100,000 participants.
Why this matters: Apple’s restrained approach reflects the company’s struggle to keep pace in the AI arms race as competitors make bold moves and talent exits mount.
- The company has reportedly lost at least 10 AI researchers recently, including four last week alone.
- Jian Zhang, Apple’s lead for robotics research, departed for Meta, while three other AI researchers left the foundation models team for OpenAI and Anthropic.
- This conservative messaging contrasts sharply with Google’s Pixel 10 unveiling and Samsung’s January event, both of which heavily featured AI assistant capabilities.
The competitive context: The AI investment stakes continue escalating across the industry, making Apple’s cautious stance more notable.
- OpenAI reached a $300 billion valuation this year and reportedly expects to burn $115 billion through 2029.
- Anthropic recently raised $13 billion at a $183 billion post-money valuation.
- Meta has spent billions hiring top industry researchers after investing more than $14 billion in Scale AI.
What they’re saying: Dr. Sumbul Desai, Apple’s VP of health, emphasized the company’s health-focused AI applications, stating Apple hopes to “notify over one million people with undiagnosed hypertension in the first year alone” and expects FDA clearance soon for its blood pressure monitoring features.