
How developers are using Apple's local AI models with iOS 26

Ivan Mehta
October 3, 2025 at 02:00 PM

Key Takeaways

  • Apple introduced the Foundation Models framework at WWDC 2025, enabling developers to use local AI models without inference costs.
  • Developers are integrating these local AI models into apps coinciding with the iOS 26 rollout.
  • Because Apple's local models are smaller than leading models from competitors, developers are using them for quality-of-life improvements rather than overhauling core workflows.
  • Numerous apps, including Lil Artist, MoneyCoach, LookUp, and Day One, have implemented features powered by Apple's on-device AI.
  • Features range from automated content generation (stories, soundscapes) to smart suggestions (tags, spending insights, scheduling).

Apple's Foundation Models framework, unveiled at WWDC 2025, lets developers integrate the company's local AI models into their applications with no inference costs. With the rollout of iOS 26, developers are actively updating apps to take advantage of these on-device capabilities, which include guided generation and tool calling. Apple's models are smaller than those from OpenAI or Google, so rather than transforming core workflows, they are being used to add quality-of-life improvements across a range of apps.

A variety of apps already demonstrate these integrations: Lil Artist uses the AI for story creation and MoneyCoach for automated spending categorization. Other notable examples include LookUp's new learning modes, the Tasks app's scheduling suggestions, and Day One's generated entry highlights and prompts. Crouton, Signeasy, Dark Noise, and Lights Out use the local intelligence for recipe breakdown, contract summarization, custom soundscape generation, and race commentary summarization, respectively.
