Hello Apple Developer Community,
I'm investigating Core ML model loading behavior and noticed that even when the compiled model path remains unchanged after an app update, the first run still triggers an "uncached load". This adds an unnecessary delay that hurts the user experience.
Question: Does Core ML provide any public API to check whether a compiled model (at a specific .mlmodelc path) is already cached by the system?
If such an API exists, we'd like to use it in our pre-load decision logic: only perform a background pre-load when the model isn't cached.
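For context, here's the kind of gated pre-load we'd like to implement. This is only a sketch: MLModel.load(contentsOf:configuration:) is the documented async loader, but isModelCached(at:) is the hypothetical check this post is asking about.

import CoreML

// Hypothetical stub: the cache-status check we're asking about.
// As far as we can tell, no public Core ML API exposes this today.
func isModelCached(at url: URL) -> Bool {
    false // pessimistically assume uncached
}

// Pre-load the compiled model in the background only when it isn't cached.
func preloadIfNeeded(compiledModelURL: URL) async throws {
    guard !isModelCached(at: compiledModelURL) else { return }
    // Warm Core ML's cache so the first user-facing load is fast.
    _ = try await MLModel.load(contentsOf: compiledModelURL,
                               configuration: MLModelConfiguration())
}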
Has anyone encountered similar scenarios or found official solutions? Any insights would be greatly appreciated!
The specific context is that I would like to build an agent that monitors my phone call (with customer support, for example), simply identifies whether or not I'm still on hold, and notifies me when I'm not.
Currently, after reading the docs, I don't think it's possible yet, but I'm so annoyed by customer support calls that I'm willing to go the distance and see if there's any way.
Hello,
I was successfully able to compile TKDKid1000/TinyLlama-1.1B-Chat-v0.3-CoreML using Core ML, and it's working well. However, I’m now trying to compile the same model using Swift Transformers.
With the limited documentation available on the swift-chat and Hugging Face repositories, I’m finding it difficult to understand the correct process for compiling a model via Swift Transformers. I attempted the following approach, but I’m fairly certain it’s not the recommended or correct method.
Could someone guide me on the proper way to compile and use models like TinyLlama with Swift Transformers? Any official workflow, example, or best practice would be very helpful.
Thanks in advance!
This is the approach I have used:
import Foundation
import CoreML
import Tokenizers

@main
struct HopeApp {
    static func main() async {
        print("Running custom decoder loop...")
        do {
            // Tokenize the prompt.
            let tokenizer = try await AutoTokenizer.from(pretrained: "PY007/TinyLlama-1.1B-Chat-v0.3")
            var inputIds = tokenizer("this is the test of the prompt")
            print("🧠 Prompt token IDs:", inputIds)

            // Load the Core ML model (class auto-generated by Xcode).
            let model = try float16_model(configuration: .init())
            let maxTokens = 30

            for _ in 0..<maxTokens {
                // The model expects a fixed sequence length of 128;
                // stop before the prompt plus generated tokens overflow it.
                guard inputIds.count < 128 else { break }

                // Build the padded input_ids and attention_mask tensors.
                let input = try MLMultiArray(shape: [1, 128], dataType: .int32)
                let mask = try MLMultiArray(shape: [1, 128], dataType: .int32)
                for i in 0..<inputIds.count {
                    input[i] = NSNumber(value: inputIds[i])
                    mask[i] = 1
                }
                for i in inputIds.count..<128 {
                    input[i] = 0
                    mask[i] = 0
                }

                // Run one decoding step.
                let output = try model.prediction(input_ids: input, attention_mask: mask)
                let logits = output.logits // shape: [1, seqLen, vocabSize]

                // Greedy argmax over the logits at the last non-padding position.
                let lastIndex = inputIds.count - 1
                let lastLogitsStart = lastIndex * 32003 // vocab size = 32003
                var nextToken = 0
                var maxLogit: Float32 = -Float.greatestFiniteMagnitude
                for i in 0..<32003 {
                    let logit = logits[lastLogitsStart + i].floatValue
                    if logit > maxLogit {
                        maxLogit = logit
                        nextToken = i
                    }
                }

                inputIds.append(nextToken)
                if nextToken == 32002 { break } // stop at the end-of-sequence token

                let partialText = try await tokenizer.decode(tokens: inputIds)
                print(partialText)
            }
        } catch {
            print("❌ Error: \(error)")
        }
    }
}
Topic:
Machine Learning & AI
SubTopic:
Core ML
Hello folks! Looking at https://developer.apple.com/documentation/foundationmodels, it's not clear how to use other models there.
Does anyone know if it's possible to use a model trained outside (imported) in the Foundation Models framework?
Thanks!
I'm attempting to run a basic Foundation Models prototype in Xcode 26, but I'm getting the error below using the iPhone 16 simulator with iOS 26. Should these models be working yet? Do I need to be running macOS 26 for them to work? (I hope that's not it.)
Error:
Passing along Model Catalog error: Error Domain=com.apple.UnifiedAssetFramework Code=5000 "There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides" UserInfo={NSLocalizedFailureReason=There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides} in response to ExecuteRequest
Playground to reproduce:
import FoundationModels
import Playgrounds

#Playground {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "What's happening?")
        print(response.content)
    } catch {
        print("Error: \(error)") // set a breakpoint here to inspect the failure
    }
}
I installed Xcode 26.0 beta and downloaded the generative models sample from here:
https://developer.apple.com/documentation/foundationmodels/adding-intelligent-app-features-with-generative-models
But when I run it in the iOS 26.0 simulator, I get the error shown here. What's going wrong?
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
Introduced in the Keynote was the 3D Lock Screen images with the kangaroo:
https://9to5mac.com/wp-content/uploads/sites/6/2025/06/3d-lock-screen-2.gif
I can't see any mention of whether this effect is available to developers through an API that converts flat 2D photos into the same 3D-feeling image.
Does anyone know if there is an API?
Topic:
Machine Learning & AI
SubTopic:
General
I couldn't find information about this in the documentation. Could someone clarify if this API is available and how to access it?
I'm the creator of an app that helps users learn Arabic. Inside the app, users can save words, engage in lessons on specific grammar concepts, and so on. I'm looking for a way for Siri to 'suggest' my app when the user asks to define an Arabic word. There are other questions I would like Siri to suggest my app for, but I figure that's a good start. Which framework am I looking for here? I think App Intents? I remember playing with it a bit last year, but I didn't get far. Any suggestions would be great.
Would the new Foundation Models be any help here?
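In case it helps, here's the rough shape I'd expect with App Intents; a minimal sketch where the intent, phrases, and lookup are hypothetical placeholders I made up for illustration:

import AppIntents

// Hypothetical intent: names, phrases, and the lookup are placeholders.
struct DefineArabicWordIntent: AppIntent {
    static var title: LocalizedStringResource = "Define Arabic Word"

    @Parameter(title: "Word")
    var word: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Look up the word in the app's saved dictionary (lookup omitted here).
        .result(dialog: "Definition of \(word): ...")
    }
}

struct LearnArabicShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: DefineArabicWordIntent(),
            phrases: ["Define \(\.$word) in \(.applicationName)"],
            shortTitle: "Define Word",
            systemImageName: "character.book.closed"
        )
    }
}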
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I am watching a few WWDC sessions on the Foundation Models framework and its usage, and it looks pretty cool.
I was wondering if it is possible to perform RAG on the user's documents on device, and eventually on iCloud...
Let's say I have a lot of Pages documents about me, and I want the Foundation model to access the information in those documents to answer questions about me.
How can this be done?
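To make it concrete, this is the pattern I imagine; a minimal sketch assuming there is no built-in document index, so the retrieval step (searchChunks here) is a placeholder the app would have to implement itself:

import FoundationModels

// Hypothetical retrieval step: replace with real keyword or embedding search.
func searchChunks(query: String, in documents: [String], topK: Int) -> [String] {
    Array(documents.prefix(topK))
}

func answer(question: String, documents: [String]) async throws -> String {
    // 1. Retrieve the passages most relevant to the question.
    let relevant = searchChunks(query: question, in: documents, topK: 3)

    // 2. Put the retrieved text into the session's instructions.
    let session = LanguageModelSession(instructions: """
        Answer using only the context below.
        Context:
        \(relevant.joined(separator: "\n---\n"))
        """)

    // 3. Ask the question against that context.
    return try await session.respond(to: question).content
}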
Thanks
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
Greetings! I was trying to get a response from the LanguageModelSession but I just keep getting the following:
Error getting response: Model Catalog error: Error Domain=com.apple.UnifiedAssetFramework Code=5000 "There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides" UserInfo={NSLocalizedFailureReason=There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides}
This occurs both on macOS 15.5 running the new Xcode beta with an iOS 26 simulator, and on macOS 26 with the Xcode beta. Both simulators are iPhone 16 Pros.
I was wondering if anyone had any advice?
I am excited to try Foundation Models during WWDC, but it doesn't work at all for me. When running on my iPad Pro M4 with iPadOS 26 seed 1, I get the following error even when running the simplest query:
let prompt = "How are you?"
let stream = session.streamResponse(to: prompt)
for try await partial in stream {
self.answer = partial
self.resultString = partial
}
In the Xcode console, I see the following error:
assetsUnavailable(FoundationModels.LanguageModelSession.GenerationError.Context(debugDescription: "Model is unavailable", underlyingErrors: []))
I have verified that Apple Intelligence is enabled on my iPad. Any tips on how can I get it working? I have also submitted this feedback: FB17896752
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
How do I test the new RecognizeDocumentRequest API? Reference: https://www.youtube.com/watch?v=H-GCNsXdKzM
I am running the Xcode beta; however, I only have one primary device, and I cannot install beta software on it.
Please suggest a testing strategy. Will the simulator work?
This new capability is critical to my application; it's just what I need for structuring document scans and extraction.
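For reference, this is the kind of call I plan to test; an untested sketch assuming the new request follows the same async perform(on:) pattern as the other Swift Vision requests such as RecognizeTextRequest:

import Vision

// Untested sketch: run document recognition on an image file.
func scanDocument(at url: URL) async throws {
    let request = RecognizeDocumentRequest()
    let observations = try await request.perform(on: url)
    for observation in observations {
        // Each observation should expose the document's structure
        // (text, tables, lists) per the session.
        print(observation)
    }
}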
Thank you.
I attempted to download the Adapter Toolkit linked from https://developer.apple.com/apple-intelligence/foundation-models-adapter/, but every attempt failed with a "403 Forbidden" error. I had accepted the agreement on the first attempt. How can we get access, please?
Hi,
I have an app that uses Core Data to store user information and display it in various views. I want to know if it's possible to integrate this setup with Foundation Models so the user can more easily query and manipulate the information, and if so, how I would go about it. Can the model be pointed at the database schema file and the SQLite file sitting in the user's app group container to parse out the information needed? And/or should the NSManagedObjects be made @Generable for better output? Any guidance about this would be useful.
Hello. I am looking to hire a game developer for a card game called Baloot. My question is: can the developer implement an AI opponent that plays as the computer and, at the same time, improves its skill level on its own, without any interaction?
🌹
Topic:
Machine Learning & AI
SubTopic:
General
Hi Apple team,
When using AppShortcutsProvider, I hit the hard limit:
Each app may have at most 10 App Shortcuts.
This feels limiting for apps that offer multiple workflows and would benefit from deeper Siri integration.
Could this cap be raised — ideally to 30 — to support broader use of AppIntents, enhance Siri automation, and unlock more system-level capabilities?
AppShortcuts are a fantastic tool. Increasing the limit would make them even more powerful.
Thanks!
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Shortcuts
App Intents
Apple Intelligence
I downloaded the new developer beta and then installed Xcode. The other downloads completed, but I couldn't download the Predictive Code Completion model; when I try, I get the error "The operation couldn’t be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)". I am using an M3 Pro.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I've downloaded the Xcode beta and run the sample project "FoundationModelsTripPlanner", but I got this error when trying to generate the response:
InferenceError::inferenceFailed::Error Domain=com.apple.UnifiedAssetFramework Code=5000 "There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.modelcatalog" UserInfo={NSLocalizedFailureReason=There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.modelcatalog}
Device: M1 Pro
Question: Is it because the M1 doesn't support this feature?
Topic:
Machine Learning & AI
SubTopic:
Foundation Models
I'm experimenting with the Foundation Models framework to do news summarization in an RSS app, but I'm finding that a lot of articles get kicked back with a vague message about guardrails.
This seems especially common with political news, even from mainstream outlets like Politico.
If the models are this restrictive, this framework will be tough to use. Is this intended?
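For reference, here's roughly how I'm detecting the kickback; a minimal sketch assuming the rejection surfaces as LanguageModelSession.GenerationError.guardrailViolation:

import FoundationModels

// Summarize one article, treating a guardrail rejection as a skip.
func summarize(article: String) async -> String? {
    let session = LanguageModelSession(instructions: "Summarize news articles in two sentences.")
    do {
        return try await session.respond(to: article).content
    } catch LanguageModelSession.GenerationError.guardrailViolation {
        // Political articles frequently end up here.
        return nil
    } catch {
        print("Other error: \(error)")
        return nil
    }
}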
FB17904424
Topic:
Machine Learning & AI
SubTopic:
Foundation Models