I am revising my app to support NSWritingToolsCoordinator/NSWritingToolsCoordinatorDelegate.
Proofreading a few paragraphs works well, but proofreading many paragraphs (for example, 56 paragraphs) never completes.
I am not sure whether the problem is in my app or in the macOS API; I believe I have implemented all of the NSWritingToolsCoordinatorDelegate methods.
Is there any information about this kind of issue?
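For reference, my setup follows roughly this shape (a simplified sketch, not my exact code; the initializer, the NSView property, and the Context initializer reflect my reading of the macOS 15.2 SDK):

import AppKit

final class CustomTextView: NSView, NSWritingToolsCoordinatorDelegate {
    let storage = NSTextStorage()
    private var coordinator: NSWritingToolsCoordinator!

    func enableWritingTools() {
        coordinator = NSWritingToolsCoordinator(delegate: self)
        writingToolsCoordinator = coordinator  // NSView property, macOS 15.2+
    }

    // Hand the whole document back as a single context.
    func writingToolsCoordinator(_ writingToolsCoordinator: NSWritingToolsCoordinator,
                                 requestsContextsFor scope: NSWritingToolsCoordinator.ContextScope,
                                 completion: @escaping ([NSWritingToolsCoordinator.Context]) -> Void) {
        let fullRange = NSRange(location: 0, length: storage.length)
        let context = NSWritingToolsCoordinator.Context(attributedString: storage.attributedSubstring(from: fullRange),
                                                        range: fullRange)
        completion([context])
    }

    // ...plus the remaining required delegate methods (replace/select/animation/preview/path),
    // whose calls appear in the log below.
}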
Symptoms
For paragraphs 1-9, the text animation completes. But for paragraphs 10-56, the text animation never finishes.
The Proofread window shows 5 corrected items, but clicking the ">" button cannot jump to items 3, 4, or 5.
Items 3, 4, and 5 were never actually corrected.
Log
For each NSWritingToolsCoordinatorDelegate method, the method name and its main arguments are logged via NSLog.
requestsContextsForScope
willChangeToState newState:2
requestsPreviewForTextAnimation range:(0, 18233)
prepareForTextAnimation range:(0, 18233)
willChangeToState newState:3
requestsPreviewForTextAnimation range:(0, 18233)
finishTextAnimation range:(0, 18233)
requestsPreviewForRect
requestsPreviewForTextAnimation range:(0, 1837)
replaceRange proposedText:an range:(208, 2)
replaceRange proposedText:you range:(443, 4)
prepareForTextAnimation range:(1836, 16396)
requestsPreviewForTextAnimation range:(0, 1836)
requestsBoundingBezierPathsForRange range:(208, 2)
requestsBoundingBezierPathsForRange range:(443, 3)
requestsPreviewForRect
prepareForTextAnimation range:(0, 1836)
prepareForTextAnimation range:(1836, 0)
finishTextAnimation range:(1836, 16396)
requestsPreviewForTextAnimation range:(1836, 16396)
requestsBoundingBezierPathsForRange range:(208, 2)
requestsBoundingBezierPathsForRange range:(443, 3)
prepareForTextAnimation range:(1836, 16396)
finishTextAnimation range:(0, 1836)
finishTextAnimation range:(0, 1836)
replaceRange proposedText:an range:(208, 2)
requestsUnderlinePathsForRange range:(208, 2)
requestsUnderlinePathsForRange range:(443, 3)
selectRanges ranges.count:1
requestsBoundingBezierPathsForRange range:(208, 2)
replaceRange proposedText:an range:(208, 2)
requestsUnderlinePathsForRange range:(208, 2)
requestsUnderlinePathsForRange range:(443, 3)
selectRanges ranges.count:1
replaceRange proposedText:you range:(443, 3)
requestsUnderlinePathsForRange range:(208, 2)
requestsUnderlinePathsForRange range:(443, 3)
selectRanges ranges.count:1
requestsBoundingBezierPathsForRange range:(443, 3)
replaceRange proposedText:you range:(443, 3)
requestsUnderlinePathsForRange range:(208, 2)
requestsUnderlinePathsForRange range:(443, 3)
selectRanges ranges.count:1
macOS version: 15.3.1 (24D70)
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Posts under the Apple Intelligence tag: 92 posts
On the October 28 release day of Apple Intelligence I opted in. My iPhone and iPad immediately went to "waitlist" and within 2 to 3 hours were ready to initialize Apple Intelligence.
My MacBook Pro 14" with M3 Pro processor and 18 GB of RAM has been stuck on "preparing" since release day (6 days now).
I've tried numerous workarounds that I found on forums as well as talking to Apple support, who basically had me repeat the workarounds that I found on forums.
I've tried changing my region to one that does not have Apple Intelligence and then back to the US, changed the Siri language to an unsupported one and back to a supported one, disabled background/startup apps, and disabled and re-enabled Siri. I've also restarted a bunch of times and left the Mac alone for hours at a time.
I've noticed that my selected Siri voice seems to not download.
Finally, after several chats and calls with Apple support, I was told that it's beta software, they can't help me, and I should try the developer forums... so here I am. Any advice?
I have an image-based app with albums, except in my app, albums are known as galleries.
When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to change my existing gallery parameter to be called target in order to fit the predefined shape of this domain.
Previously, my intent was configured to display as “Open Gallery” with the description “Opens the selected Gallery” in the Shortcuts app. After conforming to the photos domain, it displays as “Open Album” with a description “Opens the Provided Album”.
Shortcuts is ignoring my configured title and description now. My code builds, but with the following build warnings:
Parameter argument title of a required Assistant schema intent parameter target should not be overridden
Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user.
FB16283840
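For context, the conformance looks roughly like this (a simplified sketch; GalleryEntity and the navigation call are stand-ins for my real types):

import AppIntents

@AssistantIntent(schema: .photos.openAlbum)
struct OpenGalleryIntent: OpenIntent {
    // The schema fixes this parameter's name to `target` and, apparently,
    // its user-facing title as well.
    var target: GalleryEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        Navigator.shared.open(gallery: target)  // hypothetical navigation helper
        return .result()
    }
}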
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Siri and Voice, Shortcuts, App Intents, Apple Intelligence
I'm currently trying to add support for Image Playground to our apps. It seems it's not working in an app that is "Designed for iPad" running on a Mac. The modal just shows a spinner, and the following is logged to the console:
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]
ImagePlaygroundViewController.isAvailable returns true, however.
In a "real" Mac Catalyst app, it's working. Just not when the app is actually an iPad app.
Is this a bug?
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Photos and Imaging, Apple Intelligence
Hello,
I would like to inquire about the release date of Swift Assist’s beta version. Apple has stated that it will be released later this year, but they have not provided a specific date or time.
Could you please provide information on the beta version’s release date? Additionally, is there a trial version available? If so, when was it released?
Thank you for your assistance.
Can someone from Apple officially confirm either a delay or cancellation of this feature? Nine days before 2024 ends, and Apple said "later this year" in June 2024. Apple even mentioned Swift Assist in the MacBook Pro announcement on October 30, 2024, as if it already existed.
I've been exploring the Trails Sample App from this session at WWDC24.
The app has a TrailEntity of type AppEntity which is leveraged in multiple places throughout the app, including:
The GetTrailInfo App Intent with a trail parameter of type TrailEntity.
A parameterized App Shortcut which calls the GetTrailInfo intent.
The TrailDataManager's init calls updateSpotlightIndex(), which creates a CSSearchableItem for each Trail in the app, along with an associateAppEntity call linking the corresponding TrailEntity to each item that gets added to the CSSearchableIndex.
If you build the app and search "trails" in Spotlight, the Trails Sample App section includes instances of TrailEntity as search results. But if you comment out the App Shortcut that takes a TrailEntity as a parameter and rebuild, there are no instances of TrailEntity in the search results. In both cases, the console prints [Spotlight] Trails indexed by Spotlight.
Is this expected behavior? Why are the TrailEntity instances only appearing in Spotlight via the App Shortcut? Shouldn't the CSSearchableItem instances show up in Spotlight on their own regardless? If not, then what is the purpose of adopting Core Spotlight with App Entities? Does this add the app entities to the semantic index for "new Siri", even though they're not user facing in the Spotlight UI?
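For reference, the indexing pattern in question looks roughly like this (a sketch; the attribute set and identifier details are assumptions, and TrailEntity(trail:) is a hypothetical initializer):

import CoreSpotlight
import UniformTypeIdentifiers

func updateSpotlightIndex(for trails: [Trail]) {
    let items = trails.map { trail in
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.title = trail.name
        let item = CSSearchableItem(uniqueIdentifier: "\(trail.id)",
                                    domainIdentifier: "trails",
                                    attributeSet: attributes)
        // Link the corresponding App Entity to the searchable item.
        item.associateAppEntity(TrailEntity(trail: trail), priority: 0)
        return item
    }
    CSSearchableIndex.default().indexSearchableItems(items) { error in
        if let error { print("[Spotlight] indexing failed: \(error)") }
    }
}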
Topic: App & System Services
SubTopic: General
Tags: Shortcuts, Core Spotlight, App Intents, Apple Intelligence
In this thread, I asked about adding parameters to App Shortcuts. The conclusion that I've drawn so far is that for App Shortcuts, there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt. Here is the scenario:
My app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected, and is able to change the page based on the user selection within Shortcuts (or prompted if using the App Shortcut). What I'd like to do is allow the user to be able to say "Siri, open with ."
So far, the closest I've come to understanding how this works is through the .intentsdefinition file you can create (and SiriKit in general). The part that really confused me there is a button in the file editor that says "Convert to App Intent." To me, this means I should be able to use the App Intent I've already authored and hook that into Siri, rather than writing an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do.
What's the right way to define this behavior?
p.s. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable so in the long run, that would be what I'd do.
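For what it's worth, the plain App Intents route (no .intentsdefinition) is to reference the AppEnum parameter directly in the shortcut phrase; whether that trips the provider-discovery issue from the thread above, I haven't verified. A sketch with hypothetical OpenPageIntent/page types:

import AppIntents

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenPageIntent(),
            phrases: [
                // \(\.$page) references OpenPageIntent's @Parameter of AppEnum type,
                // so Siri can match e.g. "Open the settings page in MyApp".
                "Open \(\.$page) in \(.applicationName)",
                "Open \(\.$page) with \(.applicationName)"
            ],
            shortTitle: "Open Page",
            systemImageName: "doc.text"
        )
    }
}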
Topic: Machine Learning & AI
SubTopic: Apple Intelligence
Tags: Siri and Voice, SiriKit, App Intents, Apple Intelligence
Given that iOS 18.2 is out, and following the documentation and WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) along with an AppIntent.
Questions:
Has anyone made this work on a real device?
In my case (code below): when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (the app is not foregrounded); changing openAppWhenRun has no effect! Strangely, if my app was backgrounded before invocation and I foreground it afterward, it has navigated to Search, just without foregrounding the app!
Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase; you get a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria.
Put differently: what's the gain in using @AssistantIntent(schema: .system.search) over a regular AppIntent in this case?
Some code:
@available(iOS 18.2, *)
@AssistantIntent(schema: .system.search)
struct MySearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
        return .result()
    }
}
Along with this shortcut in my AppShortcutsProvider:
AppShortcut(
    intent: MySearchIntent(),
    phrases: [
        "Search \(.applicationName)"
    ],
    shortTitle: "Search",
    systemImageName: "magnifyingglass"
)
Topic: App & System Services
SubTopic: General
Tags: Siri and Voice, Shortcuts, App Intents, Apple Intelligence
I can't shake the "I don't think I did this correctly" feeling about a change I'm making for Image Playground support.
When you create an image via an Image Playground sheet it returns a URL pointing to where the image is temporarily stored. Just like the Image Playground app I want the user to be able to decide to edit that image more.
The Image Playground sheet lets you pass in a source URL for an image to start with, which is perfect because I could pass in the URL of that temp image.
But the URL is NOT optional. So what do I populate it with when the user is starting from scratch?
A friendly AI told me to use URL(string: "")!, but that crashes when it gets force-unwrapped.
URL(string: "about:blank")! seems to work in that it is ignored (and doesn't crash) when I have the user create the initial image (that shouldn't have a source image).
This feels super clunky to me. Am I overlooking something?
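One less clunky alternative might be to branch in a ViewModifier and only pass the source URL when there is one (a sketch; I'm assuming the sheet modifier also has an overload without a source parameter, and handleCreatedImage is a hypothetical handler):

import SwiftUI
import ImagePlayground

struct CreateImageButton: View {
    @State private var showPlayground = false
    let sourceURL: URL?  // nil when the user starts from scratch

    var body: some View {
        Button("Create Image") { showPlayground = true }
            .modifier(PlaygroundSheet(isPresented: $showPlayground, sourceURL: sourceURL))
    }
}

private struct PlaygroundSheet: ViewModifier {
    @Binding var isPresented: Bool
    let sourceURL: URL?

    func body(content: Content) -> some View {
        if let sourceURL {
            // Editing an existing image: pass its URL as the starting point.
            content.imagePlaygroundSheet(isPresented: $isPresented,
                                         sourceImageURL: sourceURL) { url in
                handleCreatedImage(at: url)
            }
        } else {
            // Starting from scratch: no source parameter, no placeholder URL needed.
            content.imagePlaygroundSheet(isPresented: $isPresented) { url in
                handleCreatedImage(at: url)
            }
        }
    }
}

func handleCreatedImage(at url: URL) {
    // Persist or display the newly created image.
}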
iOS 18.2 includes a new feature called Visual Intelligence. If I hold down the Camera Control on my iPhone, I can take a photo of an object and use Google to look up items similar to what I've photographed.
Is there a way to programmatically open this interface within my app? If so, can I see which result the user selects?
I have an iPhone 16 Pro and I've updated to iOS 18.2.
However, after I tapped "Get Apple Intelligence" and then the set up button, nothing happened. I've tried this process many times and it's still not working.