Trying to experiment with Genmoji per the WWDC documentation and samples, but I don't seem to get the Genmoji keyboard.
I see this error in my log:
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state,
NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port
Is anything presently supported for developers? All I have done here is build a simple app with a UITextView and this line of code:
textView.supportsAdaptiveImageGlyph = true
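For reference, the full setup around that line is just this minimal sketch (NotesViewController and the layout code are only illustrative; the one Genmoji-related line is supportsAdaptiveImageGlyph):

import UIKit

final class NotesViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(textView)

        // Genmoji arrive as NSAdaptiveImageGlyph attachments in attributed text,
        // so I also enabled rich-text editing alongside the new flag.
        textView.allowsEditingTextAttributes = true
        if #available(iOS 18.0, *) {
            textView.supportsAdaptiveImageGlyph = true
        }
    }
}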
Any thoughts?
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
As a user, when viewing a photo or image, I want to be able to tell Siri, “add this to ”, similar to the example from the WWDC presentation where a photo is added to a note in the Notes app.
Is this possible with app intent domains as they are currently documented?
I see domains like open-file and open-photo, but I don't know whether those are appropriate for this kind of functionality.
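For context, the rough shape I have in mind is an ordinary app intent that accepts a photo, something like the sketch below. The intent itself is easy to write; what I can't tell from the docs is whether any domain lets Siri hand over the photo that is currently on screen. (AddPhotoToCollectionIntent and its parameters are hypothetical names of mine, not part of an existing schema.)

import AppIntents

struct AddPhotoToCollectionIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Photo to Collection"

    // The photo Siri is currently showing would somehow need to arrive here.
    @Parameter(title: "Photo")
    var photo: IntentFile

    @Parameter(title: "Collection Name")
    var collectionName: String

    @MainActor
    func perform() async throws -> some IntentResult {
        // App-specific code to add `photo` to the named collection goes here.
        return .result()
    }
}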
I’m currently developing an app that features a main view with a UITableView. When users select a row, they are navigated to a detail view that contains a UITextField. This UITextField already supports Writing Tools.
My question is: when a user long-presses a UITableView cell, is it possible to add a Writing Tools option to the context menu, so users can interact with Writing Tools more conveniently, for example to summarize the cell's detail text?
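As far as I can tell there is no public API to invoke Writing Tools directly from a menu item, so the closest thing I can sketch is a custom context-menu action in the table view's delegate that jumps straight to the detail view, where the UITextField already supports Writing Tools (showDetail(for:) is a hypothetical navigation helper of mine):

func tableView(_ tableView: UITableView,
               contextMenuConfigurationForRowAt indexPath: IndexPath,
               point: CGPoint) -> UIContextMenuConfiguration? {
    UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
        // Custom action that opens the detail text field, where the user can
        // then run Writing Tools (e.g. Summary) on its contents.
        let writingTools = UIAction(title: "Writing Tools",
                                    image: UIImage(systemName: "sparkles")) { _ in
            self.showDetail(for: indexPath) // hypothetical helper
        }
        return UIMenu(children: [writingTools])
    }
}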
I need to add the AI Image Playground to my iOS app, which uses UIKit. WWDC 2024 introduced a new Image Playground API, but I haven't found any official documentation yet. How can I add it?
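The only lead I have so far is the ImagePlayground framework that appears in the newer betas; the sketch below is what I would expect the UIKit flow to look like, but please treat the exact type, property, and delegate names as my assumptions, since I couldn't find official documentation:

import UIKit
import ImagePlayground

@available(iOS 18.1, *)
final class PlaygroundHostViewController: UIViewController, ImagePlaygroundViewController.Delegate {

    func presentImagePlayground() {
        let playground = ImagePlaygroundViewController()
        playground.delegate = self
        // Seed the sheet with a text concept; the user can still edit it in the UI.
        playground.concepts = [.text("smiling robot wearing a party hat")]
        present(playground, animated: true)
    }

    func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        // The generated image comes back as a file URL.
        let image = UIImage(contentsOfFile: imageURL.path)
        // ... use `image` in the app
        dismiss(animated: true)
    }

    func imagePlaygroundViewControllerDidCancel(_ imagePlaygroundViewController: ImagePlaygroundViewController) {
        dismiss(animated: true)
    }
}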
I signed up for Apple Intelligence on my iPad Air M1 and then updated my phone today. It tells me that I've already been put on a waitlist I didn't even join, and it's been stuck on that for two days now.
I have an iPhone 15 Pro Max on iOS 18.1. I tried to join Apple Intelligence about three weeks ago, but I'm still stuck on the waitlist. I've already tried everything recommended by Apple, including changing my region and Siri's language to English (US). Can anyone help me figure out how to solve this issue?
Yesterday, after updating to iOS 18.1, I joined the Apple Intelligence waitlist on my iPhone 15 Pro. About an hour later I noticed the message "Support for processing Apple Intelligence on device is downloading." A day later it is still displaying the same message. I have strong Wi-Fi, I'm plugged in to power with a full battery, and there are 750 GB available in storage. From what I have been able to find online, this isn't the typical user experience, and it probably isn't going to complete the process at this point. Any advice on how to proceed and get Apple Intelligence installed and working would be greatly appreciated.
I just updated to 18.1 beta 4 and the only Apple Intelligence features I can use are the new Siri and Clean Up in Photos. The rest are unavailable. Apple Intelligence takes up 589 MB on my iPhone.
Hey, just curious whether Apple Intelligence will be available on the iPhone 15 Plus as well in October, or is there a way for iPhone 15 Plus owners to join the Apple Intelligence waitlist? Please let me know! 😫
I've got Apple Intelligence working on my iPhone 15 Pro Max, and Siri 2.0 is working as expected. However, the following options don't seem to be working or appearing for me:
AI in Mail
AI in Notes
Clean Up in Photos, which is stuck on "downloading"
Not sure if my setup is wrong or it's just not available for me yet.
With iOS 18, Writing Tools are enabled for text fields all over the system. Under the hood, this uses Apple's on-device LLM to summarize a piece of text. Is there any kind of Swift API to access this LLM summarization feature for text that I provide to the API, instead of forcing the user to select the text?
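For what it's worth, the only related knobs I have found are per-view rather than per-string; this is a small sketch of what UITextView exposes in iOS 18 (behavior and allowed result options), with no text-in/summary-out call that I can see:

import UIKit

@available(iOS 18.0, *)
func configureWritingTools(for textView: UITextView) {
    // Opt the view into the full Writing Tools experience (or .limited / .none).
    textView.writingToolsBehavior = .complete
    // Constrain the kinds of results Writing Tools may insert back into the view.
    textView.allowedWritingToolsResultOptions = [.plainText, .list]
}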
Hi,
I have an existing app with AppEntities defined that works on iOS 16 and iOS 17. The AppEntities also have an EntityPropertyQuery defined, so they work as 'find' intents. I want to use the new @AssistantEntity on iOS 18 while supporting the previous versions. What's the best way to do this?
For example, I have a 'person' AppEntity:
@available(iOS 16.0, macOS 13.0, watchOS 9.0, tvOS 16.0, *)
struct CJLogAppEntity: AppEntity {
    static var defaultQuery = CJLogAppEntityQuery()
    ....
}

struct CJLogAppEntityQuery: EntityPropertyQuery {
    ...
}
How do I adopt this with @AssistantEntity(schema: .journal.entry) for iOS 18, while maintaining compatibility with iOS 16 and 17?
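For context, the direction I'm considering is keeping CJLogAppEntity as-is for iOS 16/17 and adding a separate, iOS 18-only entity that adopts the schema, roughly like the sketch below (CJLogAssistantEntity and its query are hypothetical names of mine, and the schema-required properties are elided):

import AppIntents

@available(iOS 18.0, *)
@AssistantEntity(schema: .journal.entry)
struct CJLogAssistantEntity {
    // The macro checks for the properties and query that the .journal.entry
    // schema requires and reports any that are missing at compile time.
    static var defaultQuery = CJLogAssistantEntityQuery()

    var id: UUID
    ....
}

@available(iOS 18.0, *)
struct CJLogAssistantEntityQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [CJLogAssistantEntity] {
        // Map the existing CJLogAppEntity lookups onto the new type here.
        []
    }
}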
I am testing the actual behavior on iOS 18.1 beta 3 devices, but the following items are not functioning:
Image Playground
Image Wand
Genmoji
Please let me know the following:
Are the above three items available on iOS 18.1 beta 3?
If they are available, are there any steps other than enabling Apple Intelligence that are required to use these features?
I just installed iOS 18.1 Beta 3 on my iPad M4 (I was previously on 18.0 betas).
I did the same thing on my iPhone 15 Pro Max, which works perfectly.
However, on the iPad, it seems to be stuck at 99% and won't complete downloading.
The status message near the top keeps switching between "downloading" and "will continue later on WiFi".
Note, I'm connected to my home WiFi, very fast and iPhone was on the same network and downloaded quickly without issue.
Is there a way to reset and start again since it's stuck? This is really frustrating.
This has been going on for several hours at this point.
I am working on an app that would refine text the user wrote, without the user having to select the text and then interact with the options.
For example, one use case is where the user talks into the microphone and dictates the text, which is refined immediately.
Is this something where Apple Intelligence or Writing Tools can assist?
Hello. Mac mini M1 2020, macOS 15.1 (24B5035e), Apple Intelligence enabled, but there is no Clean Up function in the Photos app. Is that normal, in your opinion? Thank you.
I'm not comfortable yet installing the iOS beta on my iPhone 15 - is it possible to play with Apple Intelligence in the simulator, or is it on-device only?
Hi, Apple Intelligence has been stuck on the preparing step for three days now and I don't know what I can do. Can you help me, please?
(macOS Sequoia 15.1 Beta 3, Mac mini M1)
When building and running a fresh sample app with an @AssistantIntent on iOS 18.1 Beta 3, the app crashes immediately. Using a sample Assistant Intent from the developer documentation site triggers this. Removing the @AssistantIntent macro allows the app to run. I'm using Xcode 16.1 beta.
dyld[1278]: Symbol not found: _$s10AppIntents15AssistantSchemaV06IntentD0VAC0E0AAWP Referenced from: <DC018008-EC0E-3251-AAFC-5DEB51863F17> /private/var/containers/Bundle/Application/2726C2CE-0255-4692-A7CA-B343146D4A83/Runner.app/Runner.debug.dylib Expected in: <E9AF073B-B6E0-31B8-88AA-092774CEEE3D> /System/Library/Frameworks/AppIntents.framework/AppIntents
(FB14949135)
Apple Intelligence is here, but I have some problems. First of all, the Settings page often shows that something is being downloaded. Is this normal? Also, the Predictive Code Completion model in Xcode seems to have been suddenly deleted and needs to be re-downloaded, and the error "The operation couldn't be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)" has occurred. Detailed log:
The operation couldn’t be completed. (ModelCatalog.CatalogErrors.AssetErrors error 1.)
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
User Info: {
DVTErrorCreationDateKey = "2024-08-27 14:42:54 +0000";
}
--
Failed to find asset: com.apple.fm.code.generate_small_v1.base - no asset
Domain: ModelCatalog.CatalogErrors.AssetErrors
Code: 1
--
System Information
macOS Version 15.1 (Build 24B5024e)
Xcode 16.0 (23049) (Build 16A5230g)
Timestamp: 2024-08-27T22:42:54+08:00