I and many of my users have observed that the Shortcuts app seems to confuse and swap actions between different apps from the same developer. This started happening after updating to iOS 26.
This breaks many Shortcuts. Deleting one of the apps or re-adding the actions sometimes seems to help.
Does anybody else observe this problem, or know how to handle this?
App Intents
Extend your app's custom functionality to support system-level services, like Siri and the Shortcuts app.
Posts under App Intents tag
I want to offer the user the opportunity to add more items to a list in an App Intent, but nothing I've tried "loops back" to the first Siri query. I checked several LLMs; they suggest using "requestDialog", which doesn't exist, or calling my AppIntent recursively.
Is this even possible?
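One pattern that may achieve this, sketched below under the assumption that re-prompting inside perform() is acceptable: keep requesting the parameter's value again with requestValue(_:) until the user declines. The intent and parameter names here are hypothetical, not a verified solution.

import AppIntents

// A minimal sketch: loop back to Siri by re-requesting parameter values inside perform().
struct AddItemsIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Items"

    @Parameter(title: "Item", requestValueDialog: "What would you like to add?")
    var item: String

    @Parameter(title: "Add another?", default: false)
    var addAnother: Bool

    func perform() async throws -> some IntentResult & ProvidesDialog {
        var items = [item]
        // Ask the Bool parameter again each time; keep collecting items while the user says yes.
        while try await $addAnother.requestValue("Add another item?") {
            let next = try await $item.requestValue("What else should I add?")
            items.append(next)
        }
        return .result(dialog: "Added \(items.count) item(s).")
    }
}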
So I'm developing an iOS application which should be exposing shortcuts, but it's not, and I'm not sure how to debug why the functionality isn't working.
I believe I'm correctly calling AppShortcutsProvider's updateAppShortcutParameters, but I don't see any errors in the console pointing to a problem.
In fact, I made a simplified Swift-only version that works, before I tried to integrate it into a more complex project.
But now I'm at a loss as to what is going wrong or what debug tools I can use to figure it out. Any help would be appreciated.
When building my project I see:
2025-08-18 14:07:49.371 appintentsmetadataprocessor[57506:35387547] Starting appintentsmetadataprocessor export
2025-08-18 14:07:49.414 appintentsmetadataprocessor[57506:35387547] Writing Metadata.appintents
2025-08-18 14:07:49.414 appintentsmetadataprocessor[57506:35387547] Metadata root: /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents
AppIntentsSSUTraining (in target 'UnityFramework' from project 'Unity-iPhone')
cd /Users/jpetersen/no_doc_repos/payments_ios_investigation/SpotlightSearch/client/Build
/Applications/Xcode_16.1.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/appintentsnltrainingprocessor --infoplist-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Info.plist --temp-dir-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/UnityFramework.build/ssu --bundle-id com.unity3d.framework --product-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework --extracted-metadata-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents --archive-ssu-assets
2025-08-18 14:07:49.436 appintentsnltrainingprocessor[57507:35387550] Parsing options for appintentsnltrainingprocessor
2025-08-18 14:07:49.437 appintentsnltrainingprocessor[57507:35387550] Starting AppIntents SSU YAML Generation
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training 'Start ${+applicationName}' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training 'Play ${ShortcutEntity}|Play ${ShortcutEntity} on ${+applicationName}' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training Negative Phrases '' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Application name 'UnityFramework' for English
2025-08-18 14:07:49.449 appintentsnltrainingprocessor[57507:35387550] Generated AppIntents SSU YAML files in file:///Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/UnityFramework.build/ssu/
2025-08-18 14:07:49.449 appintentsnltrainingprocessor[57507:35387550] Copied AppIntents SSU YAML files to file:///Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents/
So I think it should be making the required app intent data :shrug:
New features in watchOS 26 with configurable widgets make it more important than ever that apps adopt IntentConfiguration options where applicable.
I develop an app with an Apple Watch complication/widget on many, many users' Watch faces around the world. I've finished updating my code to support WidgetKit and remove ClockKit.
However, I face huge issues adding support for users to configure their widget/complications.
If I update a widget to go from StaticConfiguration to IntentConfiguration, even when keeping the "kind" string the same, the widget disappears from the Watch face.
This is an unacceptable user experience, which means I can't proceed with the migration. The problem is that users will soon expect configuration of their widget/complication on the Watch face; currently this is handled in a sub-optimal way in the app itself.
A similar issue exists on iOS, where the widget will just "freeze" indefinitely when migrated.
This issue still occurs on the iOS 26 and watchOS 26 betas.
So how do I move this forward?
This has been discussed previously here: https://developer.apple.com/forums/thread/661247
I've mentioned it at WidgetKit labs
I've filed feedback last year: FB13880020
I've filed feedback this year: FB18180368
It seems really important that this gets fixed so developers can adopt these new features. Is there any other migration route I'm missing, or a workaround that would mitigate this seemingly big problem?
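For reference, the shape of the migration described above, sketched with the App Intents-based AppIntentConfiguration (the modern counterpart to IntentConfiguration); every type and string here is hypothetical, not the poster's actual code:

import WidgetKit
import SwiftUI
import AppIntents

// Hypothetical configuration intent backing the widget's options.
struct ComplicationConfigIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Configure Complication"

    @Parameter(title: "Show Label", default: true)
    var showLabel: Bool
}

struct ComplicationEntry: TimelineEntry {
    let date: Date
    let showLabel: Bool
}

struct ComplicationProvider: AppIntentTimelineProvider {
    func placeholder(in context: Context) -> ComplicationEntry {
        ComplicationEntry(date: .now, showLabel: true)
    }

    func snapshot(for configuration: ComplicationConfigIntent, in context: Context) async -> ComplicationEntry {
        ComplicationEntry(date: .now, showLabel: configuration.showLabel)
    }

    func timeline(for configuration: ComplicationConfigIntent, in context: Context) async -> Timeline<ComplicationEntry> {
        Timeline(entries: [ComplicationEntry(date: .now, showLabel: configuration.showLabel)], policy: .never)
    }
}

struct ComplicationWidget: Widget {
    // The kind string is kept identical to the old StaticConfiguration widget.
    let kind = "MyComplication"

    var body: some WidgetConfiguration {
        // Previously: StaticConfiguration(kind: kind, provider: OldProvider()) { entry in ... }
        AppIntentConfiguration(kind: kind, intent: ComplicationConfigIntent.self, provider: ComplicationProvider()) { entry in
            Text(entry.showLabel ? "Water: 12 oz" : "12")
        }
        .configurationDisplayName("My Complication")
    }
}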
I would like to have an AppEntity with a Property that is a Date holding only the date, not the time - i.e., the equivalent of 09/14/2025, not 09/14/2025 09:00 UTC.
How would I model this? How would I create an EntityPropertyQuery for this? If I add QueryProperties, the Shortcuts UI makes the user pick a time too.
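For concreteness, here is a sketch of the kind of entity in question, with hypothetical names; the @Property is a plain Date, which always carries a time component and is why Shortcuts offers a time picker:

import AppIntents
import Foundation

// A sketch, not the poster's code: an entity whose "date" conceptually means a calendar day.
struct EventEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Event")
    static var defaultQuery = EventQuery()

    var id: UUID

    @Property(title: "Date")
    var date: Date

    var displayRepresentation: DisplayRepresentation {
        // Display only the calendar day, even though the stored value includes a time.
        DisplayRepresentation(title: "\(date.formatted(date: .abbreviated, time: .omitted))")
    }
}

// Minimal query so the entity compiles; the open question is how an
// EntityPropertyQuery over the date property avoids the time picker.
struct EventQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [EventEntity] { [] }
    func suggestedEntities() async throws -> [EventEntity] { [] }
}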
Thanks!
I'm a bit confused as to what we're supposed to be doing to support starting a workout using Siri in iOS/watchOS 26. On one hand, I see a big push to move towards App Intents and shortcuts rather than SiriKit. On the other hand, I see that some of the things I would expect to work with App Intents well... don't work. BUT - I'm also not sure it isn't just developer error on my part.
Here are some assertions that I'm hoping someone more skilled and knowledgeable can correct me on:
Currently the StartWorkoutIntent only serves the Action button on the Watch Ultra. It cannot be used to register Shortcuts, nor does Siri respond to it.
I can use objects conforming to AppIntent to create shortcuts, but this requires an additional permission to run a shortcut if a user starts a workout with Siri.
AppIntent shortcuts require the user to say "Start a workout in [app name]" - if the user leaves out the "in [app name]" part, Siri will not prompt the user to select my app.
If I want to allow the user to simply say "Start a Workout" and have Siri prompt the user for which app to use, I must currently use the older SiriKit to do so.
Are these assertions correct - or am I just implementing something incorrectly?
Using the latest Xcode 26 beta for what it is worth.
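For context, the App Intents-based approach the assertions above refer to looks roughly like this (a sketch; the intent, phrases, and provider are hypothetical, not a confirmed way to have Siri start the workout without the app name):

import AppIntents

struct StartRunIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Run"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hand off to the app's workout manager here.
        return .result()
    }
}

struct WorkoutShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartRunIntent(),
            phrases: [
                // Siri matches these phrases only when the app name is included.
                "Start a workout in \(.applicationName)",
                "Start a run in \(.applicationName)"
            ],
            shortTitle: "Start Run",
            systemImageName: "figure.run"
        )
    }
}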
Due to our min iOS version, this is my first time using .xcstrings instead of .strings for AppShortcuts.
When using the migrate .strings to .xcstrings Xcode context menu option, an .xcstrings catalog is produced that, as expected, has each invocation phrase as a separate string key.
However, after compilation, the catalog changes to group all invocation phrases under the first phrase listed for each intent (see attached screenshot). It is possible to hover in blank space on the right and add more translations, but there is no 1:1 key matching requirement to the phrases on the left nor a requirement that there are the same number of keys in one language vs. another. (The lines just happen to align due to my window size.)
What does that mean, practically?
Do all sub-phrases in each language in AppShortcuts.xcstrings get processed during compilation, even if there isn't an equivalent phrase key declared in the AppShortcut (e.g., the ja translation has more phrases than the English)? (That makes some logical sense, as these phrases need not be 1:1 across languages.)
In the AppShortcut declaration, if I delete all but the top invocation phrase, does nothing change with Siri?
Is there something I'm doing incorrectly?
struct WatchShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: QuickAddWaterIntent(),
            phrases: [
                "\(.applicationName) log water",
                "\(.applicationName) log my water",
                "Log water in \(.applicationName)",
                "Log my water in \(.applicationName)",
                "Log a bottle of water in \(.applicationName)",
            ],
            shortTitle: "Log Water",
            systemImageName: "drop.fill"
        )
    }
}
I'm trying to run the sample Trails app from the documentation, unaltered.
When I do the build, I get a
Command ValidateAppShortcutStringsMetadata failed with a nonzero exit code
error. How do I debug this?
I'm trying this on Xcode 16.4.
After reinstalling the app, the ControlWidget gallery doesn't show custom SF Symbols.
Consider the following AppIntent:
struct TestIntent: AppIntent {
    static let title: LocalizedStringResource = "Test Intent"

    static var parameterSummary: some ParameterSummary {
        Summary("Test") {
            \.$options
        }
    }

    enum Option: Int, CaseIterable, AppEnum {
        case one
        case two
        case three
        case four
        case five
        case six

        static let typeDisplayRepresentation: TypeDisplayRepresentation =
            TypeDisplayRepresentation(name: "Options")

        static let caseDisplayRepresentations: [Option: DisplayRepresentation] = [
            .one: DisplayRepresentation(title: "One"),
            .two: DisplayRepresentation(title: "Two"),
            .three: DisplayRepresentation(title: "Three"),
            .four: DisplayRepresentation(title: "Four"),
            .five: DisplayRepresentation(title: "Five"),
            .six: DisplayRepresentation(title: "Six"),
        ]
    }

    @Parameter(title: "Options", default: [])
    var options: [Option]

    @MainActor
    func perform() async throws -> some IntentResult {
        print(options)
        return .result()
    }
}
In Shortcuts, this turns into a dropdown where you can check multiple Option values. However, when perform() is called, options is an empty array regardless of what the user selects. This is observed on both iOS 18.5 and macOS 15.5. On the iOS 26.0 and macOS 26.0 betas the issue seems to be resolved and options contains all the checked options, but we back-deploy to current/previous iOS/macOS versions. How can we provide a multiple-choice selection of fixed values on these older versions?
I developed a shortcut feature for my app using the AppIntents framework, which can display a maximum of 10 shortcuts in the Shortcuts app. However, I've noticed that apps like Tesla and Porsche have a significantly larger number of shortcuts, far exceeding 10. After searching online, I found that they might be using Intents extensions and the SiriKit framework. I customized an intent through SiriKit and checked the options "Intent is user-configurable in the Shortcuts app" and "Add to Siri." I can find this shortcut when I search for it, but it does not appear on the home page or under the app category. Is there any way to resolve this?
I currently use AppShortcutsProvider successfully to register AppShortcut items that appear in the Shortcuts app and in Spotlight on iOS 18 (built with Xcode 16.4). However, I'm unable to set a branded image asset (SVG or PNG) as the tile icon — only SF Symbols are allowed via systemImageName: String, and specifying an asset name results in a blank icon in the UI.
Interestingly, other apps (e.g. Notes, Talabat, Photos) seem to display a custom image as the shortcut’s icon in Shortcuts, despite current documentation stating otherwise. I'm kindly requesting Apple formalize and support this capability.
My question is similar to https://developer.apple.com/forums/thread/757298?answerId=791343022#791343022 but the solution from there did not help me.
My app sends messages. I need it to do so when a user says to Siri: "Send message with [app name]". When a user says this, Siri shows an "Open [app name]" button and says "[app name] hasn't added support for that with Siri".
The code is pretty short and should work, but it doesn't. Could you please help and explain how to add the support mentioned above? How else can I use AppIntent and register the app as capable of sending messages when asked by Siri?
import SwiftUI
import AppIntents

@main
struct MyAppNameApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }

    init() {
        MyAppNameShortcuts.updateAppShortcutParameters()
        Task {
            await MyAppNameShortcuts.updateAppShortcutParameters()
        }
    }
}

struct SendMessageWithMyAppName: AppIntent {
    static var title: LocalizedStringResource = "Send message"
    static let description = IntentDescription(
        "Dictate a message and have MyAppName print it to the Xcode console.")

    @Parameter(title: "Message", requestValueDialog: "What should I send?")
    var content: String

    static var openAppWhenRun = false

    func perform() async throws -> some IntentResult {
        print("MyAppName message: \(content)")
        await MainActor.run {
            NotificationCenter.default.post(name: .newMessageReceived, object: content)
        }
        return .result(dialog: "Message sent: \(content)")
    }
}

struct MyAppNameShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SendMessageWithMyAppName(),
            phrases: [
                "Send message with \(.applicationName)"
            ],
            shortTitle: "Send Message",
            systemImageName: "message"
        )
    }
}
We're trying to add an AppIntent to our very complex and large app that works with Siri.
What we've found through numerous different configuration attempts is that when the Siri toggle is turned off in the Shortcuts app for our app, when we speak our phrase to Siri we get this prompt: "[App Name] hasn't added support for that with Siri." with an "Open [App Name]" button.
When we speak the phrase while the app is open, we get a prompt that says "Turn on "[App Name]" shortcuts with Siri?" with a "Turn On" button.
When the Siri toggle is turned on for our app in the Shortcuts app, all of our phrases work as expected.
Is there some kind of configuration we're missing? There doesn't seem to be a lot of documentation. We've tried to match an example app's approach (AcceleratingAppInteractionsWithAppIntents) to no avail.
Our app intents live in an xcframework if that makes a difference, but we've also tried moving it out of xcframework without success.
This is a blocker to us shipping this feature. Thanks!
When we use the "Find All Reminders" shortcut, there are these two filters: "Is Completed" and "Is Not Completed".
When I implement this in my app, the best I can get is just "Completed" and "Not Completed"; I can't figure out how to add the "Is" in front.
In my entity:
@Property(title: "Completed")
var completed : Bool
In the EntityPropertyQuery:
static var properties = QueryProperties {
    Property(\GTDItemAppEntity.$list) {
        EqualToComparator { NSPredicate(format: "list.uuid = %@", $0.id as NSUUID) }
    }
    Property(\GTDItemAppEntity.$text) {
        ContainsComparator { NSPredicate(format: "text CONTAINS[cd] %@", $0) }
        EqualToComparator { NSPredicate(format: "text = %@", $0) }
    }
    Property(\GTDItemAppEntity.$completed) {
        EqualToComparator { NSPredicate(format: $0 ? "completed = YES" : "completed = NO") }
    }
}
If I change the property to
@Property(title: "Is Completed")
var completed : Bool
Then it will show as "Is Completed" and "Not Is Completed" in the filter!
Reminder: (screenshot)
My App: (screenshot)
In this WWDC25 session, it is explicitly mentioned that apps should support AttributedString for text parameters to their App Intents.
However, I have not gotten this to work. Whenever I pass rich text (either generated by the new "Use Model" intent or generated manually, for example using "Make Rich Text from Markdown"), my intent receives an AttributedString with the correct characters but with all attributes stripped (so in effect just plain text).
struct TestIntent: AppIntent {
    static var title = LocalizedStringResource(stringLiteral: "Test Intent")
    static var description = IntentDescription("Tests Attributed Strings in Intent Parameters.")

    @Parameter
    var text: AttributedString

    func perform() async throws -> some IntentResult & ReturnsValue<AttributedString> {
        return .result(value: text)
    }
}
Is there anything else I am missing?
Hello,
I have an App Intent that conforms to AudioPlaybackIntent and triggers my app to open and start an AVPlayer that plays back a media stream I control. When the phone is unlocked, everything works as I expect: the app opens and plays the audio.
However, when the phone is locked, any attempt to invoke the intent causes a "Request Code" dialog to be displayed. This seems counter to what I would expect with the AudioPlaybackIntent usage. Am I able to accomplish what I'm after here with AppIntents? Does the fact that I'm using openAppWhenRun require me to have the phone unlocked somehow?
import AppIntents
import Foundation

struct PlayStationAppIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play radio station"
    static var description: IntentDescription = .init("Play radio station")
    static var notification: Notification.Name = .init("playStation")
    static var openAppWhenRun: Bool = true

    init() {}

    func perform() async throws -> some IntentResult {
        AudioPlayerService.shared.play()
        return .result()
    }
}
I'm implementing an App Intent for my iOS app that helps users plan trip activities. It only works when run as a shortcut, not by voice through Siri. There are two issues:
The ShortcutsTripEntity will only accept voice input for one specific trip, but not others.
I'm stuck with an error thrown when trying to use requestDisambiguation() on the activity day @Parameter property.
How do I rectify these issues?
This is blocking me from completing a critical feature that lets users quickly plan activities through Siri and Shortcuts.
Expected behavior for trip input: The intent should make Siri accept the spoken trip input from any of the options.
Actual behavior for trip input: Siri only accepts the same trip when spoken but accepts any when selected by click/touch.
Expected behavior for day input: Siri should accept the spoken selected option.
Actual behavior for day input: Siri only accepts an input by click/touch, yet throws an error at runtime. I'm happy to provide more code, but here's the relevant part:
struct PlanActivityTestIntent: AppIntent {
    @Parameter(
        title: "Trip",
        description: "The trip to plan an activity for",
        default: ShortcutsTripEntity(id: UUID().uuidString, title: "Untitled trip"),
        requestValueDialog: "Which trip would you like to add an activity to?"
    )
    var tripEntity: ShortcutsTripEntity

    @Parameter(title: "Activity Title", description: "The title of the activity", requestValueDialog: "What do you want to do or see?")
    var title: String

    @Parameter(title: "Activity Day", description: "Activity Day", default: ShortcutsItineraryDayEntity(itineraryDay: .init(itineraryId: UUID(), date: .now), timeZoneIdentifier: "UTC"))
    var activityDay: ShortcutsItineraryDayEntity

    func perform() async throws -> some ProvidesDialog {
        // ...other code...
        let tripsStore = TripsStore()

        // Load trips and map them to entities.
        try? await tripsStore.getTrips()
        let tripsAsEntities = tripsStore.trips.map { trip in
            let id = trip.id ?? UUID()
            let title = trip.title
            return ShortcutsTripEntity(id: id.uuidString, title: title, trip: trip)
        }

        // Ask the user to select a trip. This line doesn't accept a voice answer. Why?
        let selectedTrip = try await $tripEntity.requestDisambiguation(
            among: tripsAsEntities,
            dialog: .init(
                full: "Which of the \(tripsAsEntities.count) trips would you like to add an activity to?",
                supporting: "Select a trip",
                systemImageName: "safari.fill"
            )
        )

        // This line throws an error. (daysAsEntities comes from code not shown in this snippet.)
        let selectedDay = try await $activityDay.requestDisambiguation(
            among: daysAsEntities,
            dialog: "Which day would you like to plan an activity for?"
        )
    }
}
Here are some related images that might help:
Hello,
I'm evaluating whether it's worth exposing shortcuts from our app. It seems to work fine on my machine - Apple Silicon, latest Tahoe beta, Xcode 26 beta.
But if I compile the same code on our Intel build agents, which are running the latest macOS 15 and Xcode 26 beta, then once I install the bundle to /Applications on Tahoe I don't see any shortcuts.
The only other difference is that the CI build is signed with a Developer ID distribution certificate - I re-signed the build with my dev certificate and it had no effect.
I found out that linkd is somehow involved in the discovery process, and the most relevant logs look like this:
default (...) linkd Registry com.**** is not link enabled com.apple.appintents
debug (...) linkd ApplicationService Created AppShortcutClient with bundleId: com.**** com.apple.appintents
error (...) linkd AppService Unable to find AppShortcutProvider for com.**** com.apple.appintents
Could you please advise where to look for the problem?
I’ll preface this by saying I’ve submitted a DTS ticket with essentially this exact text, but I thought I’d also post about it here to get some additional input. Apple engineers: the case ID is 14698374, for your reference.
I have an observable class, called NavigationModel, that powers navigation in my SwiftUI app. It has one important property, navigationSelection, that stores the currently selected view. This property is passed to a List in the sidebar column of a NavigationSplitView with two columns. The list has NavigationLinks that accept that selection as a value parameter. When a NavigationLink is tapped, the detail column shows the appropriate detail view per the navigationSelection property’s current value via a switch statement. (This navigationSelection stores an enum value.)
This setup allows for complete programmatic navigation as that selection is effectively global. From anywhere in the app — any button, command, or app intent — the selection can be modified since the NavigationModel class uses the @Observable Swift macro. In the app’s root file, an instance of the NavigationModel is created, added as an app intent dependency, and assigned (might be the wrong verb here, but you get it) to ContentView, which houses the NavigationSplitView code.
The problem lies when more than one window is opened. Because this is all just one instance of NavigationModel — initialized in the app’s root file — the navigation selection is shared across windows. That is, there is no way for one window to show a view and another to show another view — they’re bound to the same instance of NavigationModel. Again, this was done so that app intents and menu bar commands can modify the navigation selection, but this causes unintended behavior.
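For concreteness, a minimal sketch of the shared-instance setup described above, with hypothetical names; this is the arrangement that ends up binding every window to the same selection:

import SwiftUI
import AppIntents
import Observation

// Hypothetical navigation destinations.
enum Destination: String, CaseIterable {
    case inbox, archive, settings
}

@Observable
final class NavigationModel {
    var navigationSelection: Destination? = .inbox
}

@main
struct ExampleApp: App {
    private let navigationModel = NavigationModel()

    init() {
        // The single shared instance is registered as an intent dependency,
        // which is why every window ends up bound to the same selection.
        AppDependencyManager.shared.add(dependency: navigationModel)
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(navigationModel)
        }
    }
}

struct ContentView: View {
    @Environment(NavigationModel.self) private var navigationModel

    var body: some View {
        @Bindable var navigationModel = navigationModel
        NavigationSplitView {
            List(Destination.allCases, id: \.self, selection: $navigationModel.navigationSelection) { destination in
                NavigationLink(destination.rawValue.capitalized, value: destination)
            }
        } detail: {
            Text(navigationModel.navigationSelection?.rawValue.capitalized ?? "Select an item")
        }
    }
}

// An intent that mutates the shared selection from anywhere, including Siri/Shortcuts.
struct OpenArchiveIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Archive"
    static var openAppWhenRun: Bool = true

    @Dependency private var navigationModel: NavigationModel

    @MainActor
    func perform() async throws -> some IntentResult {
        navigationModel.navigationSelection = .archive
        return .result()
    }
}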
I checked Apple’s sample projects, namely the “Accelerating app interactions with App Intents” (https://developer.apple.com/documentation/appintents/acceleratingappinteractionswithappintents) and “Adopting App Intents to support system experiences” (https://developer.apple.com/documentation/appintents/adopting-app-intents-to-support-system-experiences) projects, to see how Apple recommends handling this case. Both of these projects have intents to display a view by modifying the navigation selection. They also have the same unintended behavior I’m experiencing in my app. If two windows are open, they share the navigation selection.
I feel pretty stupid asking for help with this, but I’ve tried a lot to get it to work the way I want it to. My first instinct was to create a new instance of NavigationModel for each window, and that’s about 90% of the way there, but app intents fail when there are no open windows because there are no instances of NavigationModel to modify. I also tried playing with FocusedValue and SceneStorage, but those solutions also didn’t play well with app intents and added too much convoluted code for what should be a simple issue.
In total, what I want is:
A window/scene-specific navigation selection property that works across TabViews and NavigationSplitViews
A way to reliably modify that property’s value across the app for the currently focused window
A way to set a value as a default, so when the app launches with a window, it automatically selects a value in the detail column
The navigation selection to reset across app and window launches, restoring the default selection
Does anyone know how to do this? I've scoured the internet, but again, no dice. Usually Apple's sample projects are great with this sort of thing, but all of their projects that have scene-specific navigation selection with NavigationSplitView don't have app intents. 🤷‍♂️
If anyone needs additional code samples, I’d be happy to provide them, but it’s basically a close copy of Apple’s own sample code found in those two links.