After updating to iOS 18.2 beta 1, Apple Intelligence is still downloading!
“Apple Intelligence is downloading. Some Apple Intelligence and Siri features may be unavailable until the download is complete”
How can I solve this problem?
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
The simulator should support the new Siri UI and Apple Intelligence features; at the very least, they should work if Apple Intelligence is enabled on the Mac itself.
Feedback ID: FB15699827
I tried this:
import AppIntents

struct CarShortcutsProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LockCarIntent(),
            phrases: ["Lock my car with \(.applicationName)", "Lock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Lock Car"),
            systemImageName: "lock.fill"
        )
        AppShortcut(
            intent: UnlockCarIntent(),
            phrases: ["Unlock my car with \(.applicationName)", "Unlock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Unlock Car"),
            systemImageName: "lock.open.fill"
        )
    }
}
but Siri only understands "Unlock my car", not the phrase with the placeholder. Siri then asks me for the car, and it understands the answer, but not in one sentence. Is there something wrong with my code?
I also tried it without applicationName first, and then it didn't work with Siri at all. Is this a general limitation of App Intents? I thought the goal was to reduce friction; if the user has to mention the app name every time, that adds friction.
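For reference, here is the shape of the entity and query involved (a simplified sketch; the names and the in-memory store are hypothetical stand-ins for my real types). My understanding, which may be wrong, is that a parameterized phrase like \(\.$car) only matches in a single sentence when Siri already knows the possible values, which come from the query's suggestedEntities():

import AppIntents
import Foundation

// Hypothetical entity and query for illustration.
struct CarEntity: AppEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Car")
    static let defaultQuery = CarQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct CarQuery: EntityQuery {
    // Hypothetical in-memory store for illustration.
    static let allCars = [CarEntity(id: UUID(), name: "Family SUV")]

    func entities(for identifiers: [CarEntity.ID]) async throws -> [CarEntity] {
        Self.allCars.filter { identifiers.contains($0.id) }
    }

    // Siri draws the values for \(\.$car) in App Shortcut phrases from
    // here; without them, the parameter is resolved in a follow-up turn.
    func suggestedEntities() async throws -> [CarEntity] {
        Self.allCars
    }
}

If I understand correctly, calling CarShortcutsProvider.updateAppShortcutParameters() after the list of cars changes is what hands the updated values to Siri.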
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I just got my new iPhone 16 Pro and upgraded to the 18.2 developer beta 4. I've set both Siri and the device language to English (United States), but the Apple Intelligence feature still doesn’t appear in my settings.
I'm currently trying to add support for Image Playground in our apps. It doesn't seem to work in an app that is "Designed for iPad" and runs on a Mac. The modal just shows a spinner, and the following is logged to the console:
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none> (logged four times)
GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]
ImagePlaygroundViewController.isAvailable returns true, however.
In a "real" Mac Catalyst app, it's working. Just not when the app is actually an iPad app.
Is this a bug?
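For reference, the presentation path is essentially the following (a simplified sketch of our code, assuming the standard ImagePlaygroundViewController flow):

import UIKit
import ImagePlayground

@available(iOS 18.1, *)
final class ComposeViewController: UIViewController, ImagePlaygroundViewController.Delegate {
    func showPlayground() {
        // Returns true even in the failing Designed-for-iPad-on-Mac case.
        guard ImagePlaygroundViewController.isAvailable else { return }

        let playground = ImagePlaygroundViewController()
        playground.delegate = self
        present(playground, animated: true) // modal shows only a spinner on Mac
    }

    func imagePlaygroundViewController(_ controller: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        dismiss(animated: true)
    }
}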
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Photos and Imaging
Apple Intelligence
Hi everyone,
On the "Apple Intelligence & Siri" settings there's a section titled "Extensions" that specifically mentions ChatGPT.
This got me curious—does Apple provide an API or SDK for developers to create custom integrations or use Apple Intelligence Extensions? Or is this currently limited to the Apple/OpenAI partnership?
I appreciate any insights or links to relevant documentation.
Here's a screenshot of what I mean: https://imgur.com/a/4MuQkIJ
I have an image-based app with albums, except in my app, albums are known as galleries.
When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to change my existing gallery parameter to be called target in order to fit the predefined shape of this domain.
Previously, my intent was configured to display as “Open Gallery” with the description “Opens the selected Gallery” in the Shortcuts app. After conforming to the photos domain, it displays as “Open Album” with a description “Opens the Provided Album”.
Shortcuts is ignoring my configured title and description now. My code builds, but with the following build warnings:
Parameter argument title of a required Assistant schema intent parameter target should not be overridden
Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user.
FB16283840
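For context, the conformance looks roughly like this (a simplified sketch; GalleryEntity stands in for my real entity type and is assumed to conform to @AssistantEntity(schema: .photos.album)):

import AppIntents

@AssistantIntent(schema: .photos.openAlbum)
struct OpenGalleryIntent: OpenIntent {
    // The schema dictates that this parameter is named `target`, and
    // the schema-provided title/description replace my custom ones.
    var target: GalleryEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // Navigate to the selected gallery here.
        .result()
    }
}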
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Siri and Voice
Shortcuts
App Intents
Apple Intelligence
Hello,
Are there any plans to compile a Python 3.13 version of tensorflow-metal?
I just got my new Mac mini, and the version of Python automatically installed by brew is Python 3.13. If I were in a hurry, I could manage to get Python 3.12 installed and use the corresponding tensorflow-metal version, but I'm not in a hurry.
Many thanks,
Alan
Is anyone else seeing their apps crash on iOS/macOS 17.4/14.4 and newer when building a project that simply includes the iOS 18 @AssistantIntent macro?
The beta 4 releases still have this problem, and I have seen no notes about it in the beta release notes. This is the crash message shown in the console when trying to run on 17.4, 17.5, 17.5.1, etc.:
dyld[21935]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <F7A1FEF0-F3B0-379C-A914-D1FB0BA7C693> /Users/jonathan/Library/Developer/CoreSimulator/Devices/CA308F47-BCA8-4429-8599-1BB1CCEAB5B6/data/Containers/Bundle/Application/D7DC8E16-90DB-406A-A521-20F18326E4A7/IntentDemo.app/IntentDemo.debug.dylib Expected in: <88E18E38-24EC-364E-94A1-E7922AD247AF> /Library/Developer/CoreSimulator/Volumes/iOS_21F79/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.5.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/AppIntents.framework/AppIntents
Obviously, the new Apple Intelligence AssistantIntents only work on the 2024 OS releases. However, even when these new App Intents are marked with @available(iOS 18, macOS 15, *), the app crashes on any earlier OS version, while running just fine on iOS 18 and macOS 15...
I would love it to turn out that I've simply done something wrong, but I don't think I have… Here is the sample project: https://github.com/JTostitos/FB14323923
Maybe it's a compiler issue that fails to strip out the macro when building for older OSes, or an Xcode issue; I have no idea. I would just like to know why it's not working and how to resolve it.
Thanks in advance for anyone's help...
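For reference, the failing pattern is just this (a simplified stand-in for the sample project's intent; the schema shown here is an example, not necessarily the one in the repo):

import AppIntents

@available(iOS 18.0, macOS 15.0, *)
@AssistantIntent(schema: .system.search)
struct SearchIntent: AppIntent {
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Despite the availability gate, the app crashes at launch on 17.x.
        .result()
    }
}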
Hiya,
I have downloaded the update and requested early access for Playground etc., yet almost a week in I'm still waiting for early access. Is anyone else having this problem?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hello, we're investigating an option to disable Writing Tools for some customers in our app. I'm aware of the writingToolsBehavior property on UITextView etc., but we would like a way to set this globally without having to update all UITextView instances (or future instances). Is there any API to do this?
We tried using UITextView.appearance.writingToolsBehavior = .none, and it seemed promising on the 18.2 beta; however, it introduced crashes on devices running 18.1.
The crashes look like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UITextView: 0x14462c000; frame = (0 0; 0 0); text = ''; userInteractionEnabled = NO; gestureRecognizers = <NSArray: 0x30067cb40>; backgroundColor = UIExtendedGrayColorSpace 0 0; layer = <CALayer: 0x3009b1ba0>; contentOffset: {0, 0}; contentSize: {0, 0}; adjustedContentInset: {0, 0, 0, 0}> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Similarly, even on the 18.2 beta, if we used UITextField.appearance.writingToolsBehavior = .none, we would see crashes for any search fields, like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UISearchBarTextField: 0x141c04a00; frame = (0 0; 0 0); text = ''; opaque = NO; gestureRecognizers = <NSArray: 0x301fe15c0>; placeholder = Search Leads; borderStyle = RoundedRect; background = <_UITextFieldSystemBackgroundProvider: 0x3015de960: backgroundView=<_UISearchBarSearchFieldBackgroundView: 0x141c60200; frame = (0 0; 0 0); opaque = NO; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x3015de8e0>>, fillColor=(null), textfield=<UISearchBarTextField: 0x141c04a00>>; layer = <CALayer: 0x3015de240>> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Is it possible to set this globally?
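For now we are considering gating the call by OS version (a sketch of the workaround, not a documented global API; UITextField's proxy is left alone because search fields crashed even on the 18.2 beta):

import UIKit

func disableWritingToolsGlobally() {
    // Only touch the appearance proxy on the version where it seemed to work.
    if #available(iOS 18.2, *) {
        UITextView.appearance().writingToolsBehavior = .none
    }
}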
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hi everyone,
Could someone confirm if it's currently possible, or if there are any plans, to restrict users from enabling Apple Intelligence altogether?
I understand that we can block individual features using MDM, but I'm interested in knowing if we can prevent users from toggling Apple Intelligence on and off in System Settings entirely.
Thanks!
Kind Regards,
Filipe Nogueira
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I'm attempting to run a basic Foundation Models prototype in Xcode 26, but I'm getting the error below when using the iPhone 16 simulator with iOS 26. Should these models be working yet? Do I need to be running macOS 26 for them to work? (I hope that's not it.)
Error:
Passing along Model Catalog error: Error Domain=com.apple.UnifiedAssetFramework Code=5000 "There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides" UserInfo={NSLocalizedFailureReason=There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides} in response to ExecuteRequest
Playground to reproduce:
import FoundationModels
import Playgrounds

#Playground {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "What's happening?")
        print(response.content)
    } catch {
        print(error)
    }
}
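A diagnostic that might narrow this down (an assumption on my part that the availability API reflects the missing model assets):

import FoundationModels

func checkModelAvailability() {
    // Ask the system model why it is unavailable before creating a session.
    switch SystemLanguageModel.default.availability {
    case .available:
        print("Model ready")
    case .unavailable(let reason):
        print("Model unavailable:", reason)
    }
}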
After more than a year since the announcement, I'm still unable to get semantic search working properly, and I'm wondering if there are known issues or missing implementation details.
Current Setup:
Device: iPhone 16 Pro Max
iOS: 26 beta 3
Development: Tested on both Xcode 16 and Xcode 26
Implementation: Following the official documentation examples
The Problem:
Semantic search simply doesn't work. Lexical search functions normally, but enabling semantic search produces results identical to having it disabled. It's as if the feature isn't actually processing anything.
Error Output (Xcode 26):
[QPNLU][qid=5] Error Domain=com.apple.SpotlightEmbedding.EmbeddingModelError Code=-8007 "Text embedding generation timeout (timeout=100ms)"
[CSUserQuery][qid=5] got a nil / empty embedding data dictionary
[CSUserQuery][qid=5] semanticQuery failed to generate, using "(false)"
In Xcode 16, there are no error messages at all; semantic search just silently fails.
Missing Resources:
The sample application mentioned during the WWDC24 presentation doesn't appear to have been released, which makes it difficult to verify if my implementation is correct.
Would really appreciate any guidance or clarification on the current status of this feature. Has anyone in the community successfully implemented this?
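For reference, my query setup follows the WWDC24 session and looks roughly like this (simplified; the fetch attributes are placeholders, and the response-iteration shape is my best reading of the API):

import CoreSpotlight

func runSearch(_ text: String) async throws {
    // Warm up the query engine and embedding model ahead of time.
    CSUserQuery.prepare()

    let context = CSUserQueryContext()
    context.enableRankedResults = true
    context.fetchAttributes = ["title", "contentDescription"]

    let query = CSUserQuery(userQueryString: text, userQueryContext: context)
    for try await element in query.responses {
        if case .item(let item) = element {
            print("Ranked item:", item.item.uniqueIdentifier)
        }
    }
}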
I found what might be a bug with enabling Apple Intelligence when switching languages. When my iPhone's language is set to Catalan, Apple Intelligence is disabled because it is not available for that language. Switching to Spanish doesn't activate it, and it still shows the same unavailability message, this time saying it's not available in Spanish (which is not true). However, it is enabled after the phone is rebooted.
At this point, the bug becomes even weirder. With the iPhone language set to Spanish and Apple Intelligence on, I switch the language to Catalan, and the feature remains enabled. After I ask a query in Catalan, it surprisingly understands it and works, but then it gets disabled.
Apart from that, as user feedback, I would love to be able to activate Apple Intelligence in an available language other than my device's language. That's how I've always used Siri (iPhone in Catalan, Siri in Spanish).
Thanks!
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Siri and Voice
Internationalization
Localization
Apple Intelligence
I'm using an iPhone 15 Pro in Australia on iOS 18.2 beta 1, but there's no mention of Apple Intelligence anywhere on the phone. Settings looks identical, with no Apple Intelligence menu, and there's no Playground image generation. It's as if my phone doesn't know it exists.
I requested early access for Image Playground 24 hours ago and haven't been approved. Are they rolling it out slowly, or is it a problem with my phone?
With Apple Intelligence enabled, I am seeing a 10-15 second delay in push notifications.
This is actually causing issues with my alarm system, doorbells, and 2FA requests.
I'm using an iPhone 16 Pro.
I first noticed this after installing the 18.1 dev beta 5 and enabling Apple Intelligence.
It continued when beta 6 was released.
I filed FB15389900.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
User Notifications
Notification Center
I can't seem to log in to my paid ChatGPT account in the Apple Intelligence settings. Is this a bug?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I wanted to join Apple Intelligence after I updated my iPhone 15 Pro to the iOS 18.1 beta, but it still shows that I'm on the waitlist. It has been almost one day! Why? Is this normal?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence