Earlier this year, at WWDC 2025, Apple introduced the Foundation Models framework, which lets developers tap into the company's on-device AI models to enhance their apps' capabilities.
Apple touted that with this framework, developers can access AI models without worrying about inference costs. These local models also support features such as guided generation and tool calling.
With iOS 26 rolling out to all users, developers are updating their apps with features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, or Meta, so these local-only features tend to bring quality-of-life improvements rather than major changes to an app's workflow.
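Guided generation is the framework's way of asking the on-device model for structured, typed output rather than free-form text. A minimal sketch of how that looks, based on the FoundationModels API Apple introduced at WWDC 2025 (the `TagSuggestion` type and the prompt here are illustrative, not from any of the apps below):

```swift
import FoundationModels

// A @Generable type tells the model the exact structure to produce;
// @Guide adds per-field instructions. (Illustrative type and prompt.)
@Generable
struct TagSuggestion {
    @Guide(description: "A short, lowercase tag for the note")
    var tag: String
}

func suggestTag(for note: String) async throws -> TagSuggestion {
    // Inference runs entirely on-device: no per-request costs, no network.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a tag for this note: \(note)",
        generating: TagSuggestion.self
    )
    return response.content
}
```

Because the output is a typed Swift value rather than raw text, an app can feed it straight into its UI, which is part of why the tag- and category-suggestion features below are such a common first use case.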
Below are some of the first apps to leverage Apple’s AI framework:
Lil Artist
The Lil Artist app offers a variety of interactive experiences to help kids learn different skills, including creativity, math, and music. Developers Arima Jain and Aman Jain shipped an AI story creator with the iOS 26 update, which lets users select a character and a theme; the app then uses AI to generate a story. The developers said the story text is generated using the local model.

Daylish
The developers of the daily planner app Daylish are working on a prototype feature that automatically suggests emojis for timeline events based on the event's title.
MoneyCoach
The finance tracking app MoneyCoach has two neat features powered by the local model. First, the app surfaces spending insights, such as whether you spent more than average on groceries in a particular week. Second, it automatically suggests categories and subcategories for spending items, enabling quick entry.

LookUp
The word-learning app LookUp has added two new modes that use Apple's AI models. A new learning mode uses the local models to create example sentences for a given word and asks the user to explain the word's usage in a sentence.

The developer is also using the on-device models to generate a map view of a word's origin.

Tasks
Like several other apps, Tasks uses the local model to automatically suggest tags for entries. It also uses the models to detect recurring tasks and schedule them accordingly. Plus, users can speak a few to-dos aloud, and the app will use the local model to break them down into individual tasks, all without touching the internet.

Day One
Day One, the Automattic-owned journaling app, uses Apple's models to surface highlights and suggest titles for entries. The team has also implemented a feature that generates prompts to nudge you to dive deeper and write more, based on what you have already written.

Crouton
The recipe app Crouton uses Apple Intelligence to suggest tags for recipes and to assign names to timers. It also uses AI to break a block of recipe text into simple steps to follow while cooking.
Signeasy
The digital signature app Signeasy uses Apple's local models to extract key insights from contracts and provide users with a summary of the documents they are signing.
Dark Noise
The background-sound app Dark Noise uses Apple's local models to let users describe a soundscape in a few words and generate it from that description. Once a soundscape is created, users can adjust the levels of its various elements.
Lights Out
Lights Out, a new app for tracking the F1 season and its Grands Prix, comes from Shihab Mehboob, the developer behind the Twitter client Aviary and the Mastodon client Mammoth. The app uses an on-device AI model to summarize commentary during a race.
Capture
The note-taking app Capture uses local AI to display category suggestions as users enter notes and tasks.

Lumy
The sun- and weather-tracking app Lumy uses AI to show neat weather-related suggestions in the app.

CardPointers
CardPointers is an app that helps you track your credit card charges and suggests the best ways to earn points from the cards you have. The newer version of the app lets users ask AI questions about their cards and offers.

Guitar With
Guitar With is a guitar-learning app that uses Apple's Foundation Models framework in several ways. Users see descriptions of chords as they learn them, and the app provides deeper insights based on practice intervals. The AI models also help the developers support more than 15 languages.
SmartGym
The SmartGym app uses local AI to convert a written description of a workout into a structured routine, complete with sets, rep counts, rest intervals, and equipment. It also gives users workout summaries covering monthly progress, missed routines, and performance on individual exercises.
Stoic
The journaling app Stoic uses Apple's models to offer personalized prompts based on users' mood logs. The models also help users summarize entries, search past entries, and organize them.

SwingVision
This app helps players of racket sports such as tennis and pickleball improve their form based on video recordings. Its makers are now using Apple's foundation models to provide practical, concrete feedback.
Zoho
Zoho, an India-based productivity suite company, uses the local models to power summarization, translation, and transcription across its apps for documents, spreadsheets, and notes.
TrainFitness
The workout app TrainFitness uses an on-device model to suggest exercise alternatives when specific equipment isn't available.
Things
The to-do app Things has a listening mode that uses Apple's AI models to listen to users and convert their speech into individual tasks.
We will continue to update this list as we discover more apps using Apple's local models.