Apple Announces Foundation Models Framework for Developers to Leverage AI

Apple at WWDC today announced the Foundation Models framework, a new API that allows third-party developers to leverage the large language models at the heart of Apple Intelligence and build them into their apps.

With the Foundation Models Framework, developers can integrate Apple's on-device models directly into apps, allowing them to build on Apple Intelligence.

"Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy. Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems," said Craig Federighi, Apple's senior vice president of Software Engineering. "We're also taking the huge step of giving developers direct access to the on-device foundation model powering Apple Intelligence, allowing them to tap into intelligence that is powerful, fast, built with privacy, and available even when users are offline. We think this will ignite a whole new wave of intelligent experiences in the apps users rely on every day. We can't wait to see what developers create."

The Foundation Models framework lets developers build AI-powered features that work offline, protect privacy, and incur no inference costs. For example, an education app can generate quizzes from user notes on-device, and an outdoors app can offer offline natural language search.

Apple says the framework is available for testing starting today through the Apple Developer Program at developer.apple.com, and a public beta will be available through the Apple Beta Software Program next month at beta.apple.com. It includes built-in features like guided generation and tool calling for easy integration of generative capabilities into existing apps.
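
For a sense of what adoption looks like, here is a minimal sketch of the education-app quiz example above, based on the API Apple demonstrated at WWDC. The exact type and method names shown here are assumptions reconstructed from the demo and may differ in the shipping beta.

import FoundationModels

// Guided generation: the @Generable and @Guide macros constrain the
// model's output to this Swift type, so the app never parses raw text.
// (Sketch only; names are based on Apple's WWDC demo.)
@Generable
struct QuizQuestion {
    @Guide(description: "A short question based on the user's notes")
    var question: String

    @Guide(description: "Four possible answers, one of them correct")
    var choices: [String]
}

func makeQuizQuestion(from notes: String) async throws -> QuizQuestion {
    // The session runs Apple's on-device foundation model, so it
    // works offline and incurs no per-request inference cost.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one multiple-choice quiz question from these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content
}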


Top Rated Comments

heretiq
19 weeks ago

Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
While Apple’s OpenELM LLM is no ChatGPT, it is a very capable resource for incorporating a conversational interface, on-device RAG, and LoRA fine-tuning into an app.

We used it to incorporate a fine-tuned model into one of our apps and were very pleased with the results. We held off on shipping the updated app because, at the time, it would have required the app to download a 3.8GB fine-tuned OpenELM model from Hugging Face, which we didn’t want to require users to do.

We also tried implementing the same AI feature set by incorporating ChatGPT and Perplexity via API. It worked functionally but the latency and API costs were prohibitive for our use case.

We need to see the details, but this announcement could eliminate all of these problems, assuming the foundation models include OpenELM equivalents and the API supports fine-tuning via adapters, as was announced at last year’s WWDC. I can’t wait to see the details and start working with the beta!

Follow-up: After watching the WWDC Platforms State of the Union keynote ... the AFM framework is a bona fide game changer!

The AFM framework and APIs exceed my expectations by delivering utility that goes well beyond the functionality I wanted, which for me was simply (a) a capable on-device LLM that eliminates the need for costly, high-latency, off-board third-party LLMs, (b) adapters to allow model fine-tuning, (c) user data privacy, and (d) offline operation.

The unexpected benefits include tool calling, response streaming, and model macros that eliminate complex and error-prone LLM response parsing and mapping to app data structures.

I took the last item (complex, imprecise, and time-consuming parsing and mapping) as a given, something developers should just expect to do when incorporating LLMs into an app with structured data, and was surprised to hear that Apple has completely eliminated this issue for Apple platform developers. This is a really big deal, because this single issue is a limiting factor for app development use cases. Before Apple’s specialized macro utility, the solution was either complex and brittle regular expressions that were guaranteed to fail (because LLM output is non-deterministic), or ballooning LLM API cost and latency to add guardrails that constrain the LLM output to behave more deterministically.
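
To make that concrete, here is a rough sketch of the tool-calling setup as I understood it from the session. This is my own reconstruction from memory, so treat the exact protocol requirements and signatures as assumptions:

import FoundationModels

// A hypothetical app-defined tool. The model can decide to call it
// mid-generation and fold the result into its response, with the
// arguments delivered as a typed @Generable struct instead of raw text.
struct NotesSearchTool: Tool {
    let name = "searchNotes"
    let description = "Searches the user's saved notes for a topic"

    @Generable
    struct Arguments {
        @Guide(description: "The topic to search the notes for")
        var topic: String
    }

    func call(arguments: Arguments) async throws -> ToolOutput {
        // Hypothetical lookup; a real app would query its own store.
        let hits = ["Photosynthesis converts light into chemical energy."]
        return ToolOutput(hits.joined(separator: "\n"))
    }
}

func askAboutNotes() async throws -> String {
    // Register the tool with the session; no response parsing needed.
    let session = LanguageModelSession(tools: [NotesSearchTool()])
    let response = try await session.respond(to: "Quiz me on this week's notes.")
    return response.content
}

Response streaming looked similarly simple in the demo, if I remember correctly: the session exposes partial responses as an async sequence you can iterate over as they arrive.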

The final word will depend on the stability of the AFM implementation and how well it aligns with what was demonstrated, but this developer is very pleased. The AFM API is a year late, but definitely way better than what was expected.

Bravo Apple. Thank you!
Score: 6 Votes
MacTwick
19 weeks ago
This is huge. I have so many ideas for my app now! I can't wait.
Score: 4 Votes
macduke
19 weeks ago
I think this could open up a lot of new and exciting apps for developers to build, but probably at the expense of battery life. It will be interesting to see how this evolves. I think this will be one of those things that is a much bigger deal a few years down the road, so better to see it now than later.
Score: 3 Votes
heretiq
19 weeks ago

Finally!! Been waiting a year for this. Need to see the details but this could be a game changer for devs.
Score: 1 Vote
name99
19 weeks ago

Aren't the on-device models quite limited in capabilities? What can they do? In any case, even access to a limited model could be huge.
They have two main capabilities:
- "understanding" language, and
- "understanding" images.

The obvious thing you can do is take baby steps toward a language-driven UI. Imagine telling UberEats something like "What was that Asian food I ordered last week? Can you order me that again?"
I think at least part of why Apple is doing this is research, to see how this plays out in the real world.

There are also some less obvious capabilities this allows. For example, imagine a note-taking app that creates quizzes from your last week of notes, so you can see what you remember versus what you don't. (I used the word "remember" here deliberately. A better app would create quizzes to see what you UNDERSTAND, but that's probably still too much to expect, even from a leading-edge LLM, let alone a small edge model.)

Similarly, we will presumably see things like photo-editing apps where you can just tell the app "remove Jenna's face" and see what happens.
Again, this is research. Ultimately the goal is a system-wide language UI, not dedicated per-app code handling this stuff. But at least this gets Apple some of the way there for a year or two, while they figure out the bigger solution.
Score: 1 Vote