Hi
I got access to developer beta 18.2. All the features are working fine, but Image Playground is stuck at "Early Access requested." Can someone share how long it takes before you are able to access Image Playground? I have been stuck for 3 hours.
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
My Playground app has been stuck at "Downloading support for Image Playground" for about 4-5 days now. I've reinstalled the app, restarted the phone multiple times, and reset network settings 2-3 times. I've tried LTE/5G and different Wi-Fi networks, and it still sits showing "Downloading Support."
Anyone else having this issue?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hello,
I'm unable to develop for Apple Intelligence on my Mac Studio, M1 Max running macOS 26 beta 1.
The models get downloaded, and I can verify that they exist in /System/Library/AssetsV2/; however, the download progress remains stuck at 100%.
Checking console logs shows the process generativeexperiencesd reporting the following:
My device region and language are set to English (India).
Things I've already tried:
Changing language and region to English (US)
Reinstalling macOS
Trying with a different ISP via hotspot.
I wanted to join Apple Intelligence after I updated my iPhone 15 Pro to iOS 18.1 beta, but it is still showing that I'm on the waitlist. It has been almost one day! Why? Is this normal?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
After installing the iOS 18.2 public beta, Apple Intelligence has been stuck on downloading, and now I'm back to the old Siri, which kinda sucks. lol
Any help or tips to get my Apple Intelligence back?
With Apple Intelligence enabled, I am seeing a 10-15 second delay in push notifications.
This is actually causing issues with my alarm system, doorbells, and 2FA requests.
I'm using iPhone 16 Pro.
I first installed 18.1 dev beta 5 with Apple Intelligence and noticed this.
When beta 6 was released, it continued.
I filed FB15389900.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
User Notifications
Notification Center
I can't seem to log in to my paid ChatGPT account in the Apple Intelligence settings. Is this a bug?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I'm using an iPhone 15 Pro in Australia on iOS 18.2 beta 1, but there's no mention of Apple Intelligence anywhere on the phone. Settings looks identical, without the menu for Apple Intelligence, and there's no Image Playground generation. It's as if my phone doesn't know it exists.
I requested early access for Image Playground 24 hours ago and haven't gotten it approved. Are they rolling it out slowly, or is it a problem with my phone?
Is anyone else seeing their apps crash on iOS/macOS 17.4/14.4 and newer when building a project that simply includes the iOS 18 @AssistantIntent macro?
The beta 4 releases still have this problem, and I have seen no mention of it in the beta release notes. This is the crash message shown in the console when trying to run on 17.4, 17.5, 17.5.1, etc.:
dyld[21935]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <F7A1FEF0-F3B0-379C-A914-D1FB0BA7C693> /Users/jonathan/Library/Developer/CoreSimulator/Devices/CA308F47-BCA8-4429-8599-1BB1CCEAB5B6/data/Containers/Bundle/Application/D7DC8E16-90DB-406A-A521-20F18326E4A7/IntentDemo.app/IntentDemo.debug.dylib Expected in: <88E18E38-24EC-364E-94A1-E7922AD247AF> /Library/Developer/CoreSimulator/Volumes/iOS_21F79/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS 17.5.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/AppIntents.framework/AppIntents
Obviously, the new Apple Intelligence AssistantIntents only work on the 2024 OS releases. However, even when these new App Intents are marked with @available(iOS 18, macOS 15, *), the app crashes on any earlier OS version. But it runs just fine on iOS 18 and macOS 15...
I would love for this to just be something I did wrong, but I don't think it is… Here is the sample project: https://github.com/JTostitos/FB14323923
Maybe it's a compiler issue that's failing to strip out the macro when building for older OS versions, or an Xcode issue - I have no idea. I would just like to know why it's not working and how to resolve it.
Thanks in advance for anyone's help...
Hiya,
I have downloaded the update and requested early access for Playground, etc., yet almost a week in I'm still waiting for early access. Is anyone else having this problem?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hello, we're investigating an option to disable writing tools for some customers in our app. I'm aware of the writingToolsBehavior property on UITextView, etc., but we would like a way to set this globally without having to update all UITextView instances (or future instances). Is there any API to do this?
We tried using UITextView.appearance.writingToolsBehavior = .none and it seemed promising on the 18.2 beta; however, it introduced crashes on devices running 18.1.
The crashes look like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UITextView: 0x14462c000; frame = (0 0; 0 0); text = ''; userInteractionEnabled = NO; gestureRecognizers = <NSArray: 0x30067cb40>; backgroundColor = UIExtendedGrayColorSpace 0 0; layer = <CALayer: 0x3009b1ba0>; contentOffset: {0, 0}; contentSize: {0, 0}; adjustedContentInset: {0, 0, 0, 0}> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Similarly, even on the 18.2 beta, if we used UITextField.appearance.writingToolsBehavior = .none we would see crashes for any search fields, like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UISearchBarTextField: 0x141c04a00; frame = (0 0; 0 0); text = ''; opaque = NO; gestureRecognizers = <NSArray: 0x301fe15c0>; placeholder = Search Leads; borderStyle = RoundedRect; background = <_UITextFieldSystemBackgroundProvider: 0x3015de960: backgroundView=<_UISearchBarSearchFieldBackgroundView: 0x141c60200; frame = (0 0; 0 0); opaque = NO; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x3015de8e0>>, fillColor=(null), textfield=<UISearchBarTextField: 0x141c04a00>>; layer = <CALayer: 0x3015de240>> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Is it possible to set this globally?
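For concreteness, here is a minimal sketch of one possible workaround we're considering (untested beyond the observations above; the iOS 18.2 cutoff is our assumption from where the crashes did and didn't appear, not a documented fix), gating the appearance-proxy call behind an OS check and leaving UITextField alone since search fields crashed even on 18.2:

import UIKit

func applyGlobalWritingToolsOptOutIfPossible() {
    if #available(iOS 18.2, *) {
        // Appearance proxies affect instances created after this call; this
        // only seemed safe for UITextView on the 18.2 beta in our testing.
        UITextView.appearance().writingToolsBehavior = .none
    } else {
        // On 18.1, fall back to setting writingToolsBehavior per instance,
        // e.g. in a UITextView subclass, to avoid the crash above.
    }
}

That still leaves search fields and individual UITextField instances to handle one by one, which is exactly what we were hoping to avoid, hence the question.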
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Hi everyone,
Could someone confirm if it's currently possible, or if there are any plans, to restrict users from enabling Apple Intelligence altogether?
I understand that we can block individual features using MDM, but I'm interested in knowing if we can prevent users from toggling Apple Intelligence on and off in System Settings entirely.
Thanks!
Kind Regards,
Filipe Nogueira
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I'm attempting to run a basic Foundation Model prototype in Xcode 26, but I'm getting the error below, using the iPhone 16 simulator with iOS 26. Should these models be working yet? Do I need to be running macOS 26 for these to work? (I hope that's not it)
Error:
Passing along Model Catalog error: Error Domain=com.apple.UnifiedAssetFramework Code=5000 "There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides" UserInfo={NSLocalizedFailureReason=There are no underlying assets (neither atomic instance nor asset roots) for consistency token for asset set com.apple.MobileAsset.UAF.FM.Overrides} in response to ExecuteRequest
Playground to reproduce:
import FoundationModels
import Playgrounds

#Playground {
    // Create a session backed by the on-device system language model.
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "What's happening?")
        print(response.content)
    } catch {
        // The Model Catalog error above is thrown here.
        print(error)
    }
}
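One check that might help narrow this down (assuming I'm reading the FoundationModels documentation correctly) is to ask the system model for its availability before opening a session; if the model assets haven't been downloaded, or Apple Intelligence isn't enabled on the Mac running the simulator, it should report an unavailable reason. This is a diagnostic sketch, not a confirmed explanation of the Model Catalog error:

import FoundationModels

// Sketch: query the on-device system model's availability before using it.
func checkModelAvailability() {
    switch SystemLanguageModel.default.availability {
    case .available:
        print("On-device model is available")
    case .unavailable(let reason):
        // e.g. Apple Intelligence not enabled, device not eligible, model not ready
        print("On-device model unavailable: \(reason)")
    }
}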
More than a year after the announcement, I'm still unable to get this feature working properly, and I'm wondering if there are known issues or missing implementation details.
Current Setup:
Device: iPhone 16 Pro Max
iOS: 26 beta 3
Development: Tested on both Xcode 16 and Xcode 26
Implementation: Following the official documentation examples
The Problem:
Semantic search simply doesn't work. Lexical search functions normally, but enabling semantic search produces identical results to having it disabled. It's as if the feature isn't actually processing.
Error Output (Xcode 26):
[QPNLU][qid=5] Error Domain=com.apple.SpotlightEmbedding.EmbeddingModelError Code=-8007 "Text embedding generation timeout (timeout=100ms)"
[CSUserQuery][qid=5] got a nil / empty embedding data dictionary
[CSUserQuery][qid=5] semanticQuery failed to generate, using "(false)"
In Xcode 16, there are no error messages at all - the semantic search just silently fails.
Missing Resources:
The sample application mentioned during the WWDC24 presentation doesn't appear to have been released, which makes it difficult to verify if my implementation is correct.
Would really appreciate any guidance or clarification on the current status of this feature. Has anyone in the community successfully implemented this?
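To make the question concrete, here is a stripped-down sketch of the kind of setup I'm describing (the attribute names are placeholders, and the prepare() timing is my reading of the WWDC24 session, so please correct me if that part is wrong). My understanding is that warming up the embedding models early should prevent exactly the 100 ms embedding timeout shown above:

import CoreSpotlight

// Called once, early, so Spotlight can warm up its on-device embedding models
// before the first semantic query (per my reading of the WWDC24 session).
func warmUpSemanticSearch() {
    CSUserQuery.prepare()
}

// Build a user query; on iOS 18 semantic ranking should apply on top of the
// usual lexical matching. The results are then consumed from query.responses
// as in the documentation examples.
func makeSearchQuery(for text: String) -> CSUserQuery {
    let context = CSUserQueryContext()
    context.fetchAttributes = ["title", "contentDescription"]
    return CSUserQuery(userQueryString: text, userQueryContext: context)
}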
I found what might be a bug with enabling Apple Intelligence when switching languages. When my iPhone's language is set to Catalan, Apple Intelligence is disabled because it is not available in that language. Switching to Spanish doesn't activate it; it still shows the same unavailability message, this time saying it's not available in Spanish (which is not true). However, it is enabled after the phone is rebooted.
At this point, the bug becomes even weirder. With the iPhone language set to Spanish and Apple Intelligence on, I switch the language to Catalan, and the feature remains enabled. After I ask a query in Catalan, it surprisingly understands it and works, but then it gets disabled.
Apart from that, as user feedback, I would love to activate Apple Intelligence in an available language other than my device's language. That's how I always used Siri (iPhone in Catalan, Siri in Spanish).
Thanks!
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Tags:
Siri and Voice
Internationalization
Localization
Apple Intelligence
I'm using an iPhone 15 Pro Max running developer beta 18.2, released today. I've already been an Apple Intelligence user and have now been able to link it with my paid ChatGPT account.
HOWEVER:
I'm searching for these image features everyone seems to be posting about and cannot find them anywhere.
I'm apparently supposed to sign up for beta access to the image features through some new native Apple app that was supposedly included in this build, which I cannot find.
What gives??!!
I have updated to iOS 18.2. Image Playground is stuck at "access requested," and it's been a day and a half. When will this be available on my phone? I am using a 16 Pro Max.
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
Just wanted to reach out to see if this is the norm. I see several posts saying people are still waiting for the early-access Playground app; what's going on? It's been almost 48 hours and I've received nothing. If this is the norm, then so be it… but even when I had to wait for Apple Intelligence early access, that took only a few hours. Hopefully this will be resolved quickly. I mean, what's the point of being developer beta testers if we can't test the beta?
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence
I just watched the October 30 MacBook Pro Announcement where they talked about on-device local LLMs for the M4s.
What developer training resources are available where we can learn how to use custom LLMs and build our Swift apps to use both Apple Intelligence and other on-device LLMs?
Is the guidance to follow the MLX GitHub repos, or were those experimental, and is there now an approved workflow and tooling?
https://www.youtube.com/watch?v=G0cmfY7qdmY