With iOS 18, Writing Tools are enabled for text fields all over the system. Under the hood, this uses Apple's on-device LLM to summarize a piece of text. Is there any kind of Swift API to access this LLM summarization feature for text that I provide to the API myself, instead of requiring the user to select the text first?
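As far as I can tell from the iOS 18.0 SDK, there is no public API that takes an arbitrary string and returns a Writing Tools summary; what you can control from code is how Writing Tools behaves in your own text views, via the `writingToolsBehavior` trait. A minimal UIKit sketch (the view controller and its layout are my own illustration, not Apple sample code):

```swift
import UIKit

final class NoteViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        if #available(iOS 18.0, *) {
            // Opt this view into the full Writing Tools experience.
            // Other options: .limited for a reduced experience, .none to opt out.
            textView.writingToolsBehavior = .complete
        }
        view.addSubview(textView)
    }
}
```

Note this only configures the user-driven UI; summarizing text the user never selected still doesn't appear to have a supported entry point.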
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Dear Apple Development Team,
I’m writing to express my concerns and request a feature enhancement regarding the ChatGPT app for iOS. Currently, the app's audio functionality does not work when the app is in the background. This limitation significantly affects the user experience, particularly for those of us who rely on the app for ongoing, interactive voice conversations.
Given that many apps, particularly media and streaming services, are allowed to continue audio playback when minimized, it’s frustrating that the ChatGPT app cannot do the same. This restriction interrupts the flow of conversation, forcing users to stay within the app to maintain an audio connection.
For users who multitask on their iPhones, being able to switch between apps while continuing to listen or interact with ChatGPT is essential. The ability to reference notes, browse the web, or even respond to messages while maintaining an ongoing conversation with ChatGPT would greatly enhance the app’s usability and align it with other background-capable apps.
I understand that Apple prioritizes resource management and device performance, but I believe there’s a strong case for allowing apps like ChatGPT to operate with background audio. Given its growing importance as a tool for productivity, learning, and communication, adding this capability would provide significant value to users.
I hope you will consider this feedback for future updates to iOS, or provide guidance on any existing APIs that could be leveraged to enable such functionality.
Thank you for your time and consideration.
Best regards,
luke
Yes, I used GPT to write this.
Dear Apple Team,
I have a suggestion to enhance the Apple Watch user experience.
A new feature could provide personalized recommendations based on weather conditions and the user's mood. For example, during hot weather it could suggest drinking something cold, or if the user is feeling down, it could offer ways to boost their mood. This kind of feature could make the Apple Watch not just a health and fitness tracker but also a more functional personal assistant.
“Improve communication with Apple Watch”
Feature #1: Noise detection and location suggestions.
Imagine having your Apple Watch detect ambient noise levels and suggest a quieter location for your call.
Feature #2: Context-aware call response options.
If you can't answer a call, your Apple Watch could offer pre-set responses to communicate your status and reduce missed-call anxiety.
For example, if you're in a busy restaurant, your Apple Watch could suggest moving to a quieter spot nearby for a better conversation.
Or if you're in a movie theater, your Apple Watch could send an automatic “I'm at the movies” text to the caller.
“Improve user experience and app management”
Automated Sleep Notifications:
The ability for the Apple Watch to automatically turn off notifications or change the watch face when the user is sleeping would provide a more seamless experience. For instance, when the watch detects that the user is in sleep mode, it could enable Do Not Disturb to silence calls and alerts.
Caller Notification:
In addition, it would be great if the Apple Watch could inform callers that the user is currently sleeping. This could help manage expectations for those attempting to reach the user at night.
App Management to Conserve Battery:
Implementing a feature that detects battery-draining apps while the user is asleep could further enhance battery life. The watch and the iPhone could close or pause apps that are using significant power while the user is not active.
I believe these features could provide valuable advancements in enhancing the
Apple Watch's usability for those who prioritize a restful night's sleep.
Thank you for considering my suggestions.
Best regards,
Mahmut Ötgen
Istanbul, Turkey
Well, hello there.
Correct me if I’m wrong: Apple Intelligence will be available only to US developers, and if you want to use it in your app you have to move to the US? Because right now not only is its usage limited to the US, but using any of the new APIs is prohibited as well.
It's been over 2 hours and my connection is fine. I've already tried restarting a couple of times.
Hello,
I still have problems activating Apple Intelligence, despite Beta 2. I had recently written a post here but had to make a new one, because I accidentally marked the old post as solved. I have tried restarting everything and reinstalling macOS; it always gets stuck at "Preparing", and no download is visible in storage. I even tried it with a VPN. I constantly use a mobile hotspot; could it be because of that? Unfortunately, I don't have Wi-Fi at home, only a contract with unlimited data volume.
MacBook Air M2, 170 GB free storage.
I've been running Sequoia 15.1 since it was released. I soon thereafter was taken off the waitlist and had been using Apple Intelligence until this morning.
My first hint something was wrong was that Writing Tools, which I'd been using extensively, disappeared. I tried in another app, and it wasn't there, either.
I then looked at the Siri icon in my menu bar - which looks different under Apple Intelligence - and it had been reverted to the old icon.
I then checked my Apple Intelligence settings and, sure enough, not only was it off, but I'd been returned to the waitlist.
My iOS and iPadOS devices continue working just fine with Apple Intelligence. Only my MacBook Pro is experiencing this issue.
Has anyone else seen this?
Apple's sample code 'Trails' supports multiple scenes; however, everything uses shared state across the scenes. Put the app in Split View with two windows of the app running and navigate in one, and you can see both mirror each other. This works as designed: it uses a shared 'navigation model' across all scenes.
https://developer.apple.com/documentation/appintents/acceleratingappinteractionswithappintents
I would like to know if there is a supported or recommended way to modify individual scene storage from within the perform body of an AppIntent. The objective is to have App Shortcuts that launch different tabs in a TabView or different selections in a List.
In short, I want to deep link to features, but account for more than one scene being open on iPad and have programmatic navigation happen only on the scene that is 'foremost' or 'activated' in Split View.
I have it working with either a @Dependency or posting a Notification with my main ContentView listening to the other end, but it changes all scenes.
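One approach I've been sketching, rather than posting a notification that every scene observes, is to resolve the foreground-active scene inside perform() and hand the navigation request only to it. Names like MySceneDelegate, navigationModel, and the tab enum below are my own placeholders, not API:

```swift
import AppIntents
import UIKit

struct OpenTrailsTabIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Trails Tab"
    // The app must come to the foreground for scene state to be meaningful.
    static var openAppWhenRun = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // Pick the scene the user is actually interacting with.
        let activeScene = UIApplication.shared.connectedScenes
            .compactMap { $0 as? UIWindowScene }
            .first { $0.activationState == .foregroundActive }

        // Route the deep link to that scene's own navigation model only,
        // so other open scenes on iPad are left untouched.
        if let delegate = activeScene?.delegate as? MySceneDelegate {
            delegate.navigationModel.selectedTab = .trails
        }
        return .result()
    }
}
```

The caveat is that with a pure SwiftUI scene lifecycle there may be no custom scene delegate to cast to, so each scene would need to register its own navigation model keyed by scene somewhere reachable from the intent.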
I get the error "The action “bla bla” could not run because of an internal error," but only on one specific phone.
I have tried updating the Shortcuts app and restarting my phone, but nothing works.
I have updated to macOS Sequoia, but I do not see "Apple Intelligence & Siri" in the settings; I can just see "Siri".
Apple Intelligence is here, but I have some problems. First of all, the settings page often shows that something is being downloaded. Is this normal? Also, the Predictive Code Completion model in Xcode seems to have been suddenly deleted and needs to be re-downloaded, and the error "The operation couldn't be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)" occurs. Detailed log:
The operation couldn’t be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
User Info: {
DVTErrorCreationDateKey = "2024-08-27 14:42:54 +0000";
}
--
Failed to find asset: com.apple.fm.code.generate_small_v1.base - no asset
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
--
System Information
macOS Version 15.1 (Build 24B5024e)
Xcode 16.0 (23049) (Build 16A5230g)
Timestamp: 2024-08-27T22:42:54+08:00
When building and running a fresh sample app with an @AssistantIntent on iOS 18.1 Beta 3, the app immediately crashes. Using a sample assistant intent from the developer documentation site causes this. Removing the @AssistantIntent macro allows the app to run. This is with Xcode 16.1 beta.
dyld[1278]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <DC018008-EC0E-3251-AAFC-5DEB51863F17> /private/var/containers/Bundle/Application/2726C2CE-0255-4692-A7CA-B343146D4A83/Runner.app/Runner.debug.dylib Expected in: <E9AF073B-B6E0-31B8-88AA-092774CEEE3D> /System/Library/Frameworks/AppIntents.framework/AppIntents
(FB14949135)
Hi, Apple Intelligence has been stuck on the "Preparing" step for 3 days now and I don't know what I can do. Can you help me, please?
(macOS Sequoia 15.1 Beta 3, Mac mini M1)
I just installed iOS 18.1 Beta 3 on my iPad M4 (I was previously on 18.0 betas).
I did the same thing on my iPhone 15 Pro Max, which works perfectly.
However on the iPad, it seems to be stuck on 99% and won't complete downloading.
The status message near the top keeps switching between "downloading" and "will continue later on WiFi".
Note: I'm connected to my home Wi-Fi, which is very fast, and the iPhone was on the same network and downloaded quickly without issue.
Is there a way to reset and start again since it's stuck? This is really frustrating.
This has been going on for several hours at this point.
I am working on an app which would refine text the user wrote, without the user having to select the text and then interact with the options.
For example, one use case is where the user talks into the microphone and dictates text, which is refined immediately.
Is this something where Apple Intelligence or Writing Tools can assist?
Hello. Mac mini M1 2020, macOS 15.1 (24B5035e), Apple Intelligence enabled, but no Clean Up function in the Photos app; is that normal in your opinion? Thank you.
I am checking actual behavior on iOS 18.1 beta 3 devices, but the following items are not functioning:
Image Playground
Image Wand
Genmoji
Please let me know the following:
Are the above 3 items available on iOS 18.1 beta 3?
If they are available, are any operations other than enabling Apple Intelligence required to use these features?
Hi,
I have an existing app with AppEntities defined that works on iOS 16 and iOS 17. The AppEntities also have an EntityPropertyQuery defined, so they work as 'find intents'. I want to use the new @AssistantEntity on iOS 18 while supporting the previous versions. What's the best way to do this?
For example, I have a 'person' AppEntity:
@available(iOS 16.0, macOS 13.0, watchOS 9.0, tvOS 16.0, *)
struct CJLogAppEntity: AppEntity {
    static var defaultQuery = CJLogAppEntityQuery()
    ....
}

struct CJLogAppEntityQuery: EntityPropertyQuery {
    ...
}
How do I adopt this with @AssistantEntity(schema: .journal.entry) for iOS 18, while maintaining compatibility with iOS 16 and 17?
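One pattern that seems workable: leave the existing iOS 16/17 entity and its EntityPropertyQuery untouched, and add a separate, iOS 18-gated type that adopts the assistant schema, so the macro never applies on older OS builds. This is a sketch only; the type names and the exact property set the .journal.entry schema requires are assumptions, so treat it as a starting point rather than compiling code:

```swift
import AppIntents

@available(iOS 18.0, *)
@AssistantEntity(schema: .journal.entry)
struct CJLogAssistantEntity {
    static var defaultQuery = Query()

    var id: UUID
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "Log entry")
    }

    // ...plus whatever properties the .journal.entry schema requires...

    struct Query: EntityQuery {
        func entities(for identifiers: [UUID]) async throws -> [CJLogAssistantEntity] {
            // Bridge to the same storage the existing CJLogAppEntity uses.
            []
        }
    }
}
```

Both entity types can then map to the same underlying model objects, with the 'find intents' continuing to go through the original entity everywhere.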
I've got Apple Intelligence working on my iPhone 15 Pro Max, and Siri 2.0 is working as expected. However, I don't seem to have the options below working / appearing:
AI in Mail
AI in Notes
Clean Up (just stuck on downloading in Photos)
Not sure if my setup is wrong or it's just not available for me yet.
As a user, when viewing a photo or image, I want to be able to tell Siri, “add this to ”, similar to the example from the WWDC presentation where a photo is added to a note in the Notes app.
Is this... possible with app domains as they are documented?
I see domains like open-file and open-photo, but I don't know if those are appropriate for this kind of functionality?