Hello,
I’m working on integrating SiriKit with my music app using INPlayMediaIntent. My app is live on TestFlight, and the Siri command is being recognized, but mediaItems is always empty in my intent handler.
Demo Project
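For context, my media-resolution handler looks roughly like this (simplified from the demo project; Song and findSongs(matching:) are stand-ins for my real catalog lookup):

import Intents

// Stand-ins for the real catalog types and lookup.
struct Song { let id: String; let title: String }
func findSongs(matching term: String) -> [Song] { [] }

class IntentHandler: INExtension, INPlayMediaIntentHandling {

    // Siri populates intent.mediaSearch with what it heard; the handler is
    // expected to turn that into concrete INMediaItems here.
    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        guard let term = intent.mediaSearch?.mediaName else {
            completion([INPlayMediaMediaItemResolutionResult.unsupported()])
            return
        }
        let items = findSongs(matching: String(term)).map {
            INMediaItem(identifier: $0.id, title: $0.title, type: .song, artwork: nil)
        }
        completion(INPlayMediaMediaItemResolutionResult.successes(with: items))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the app.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}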
Automation & Scripting
RSS for tagLearn about scripting languages and automation frameworks available on the platform to automate repetitive tasks.
Selecting any option will automatically load the page
Post
Replies
Boosts
Views
Activity
I'm reaching out because I'm having a problem with 3D Touch (home screen quick action) shortcuts in my UIKit iOS app, handled in the AppDelegate file: I can't redirect to the right route when the user has completely killed the application. It works when the app is in the background. I'd like the app to navigate to the search page when it is fully closed and the user taps Search via 3D Touch.
final class AppDelegate: UIResponder, UIApplicationDelegate {

    lazy var window: UIWindow? = {
        return UIWindow(frame: UIScreen.main.bounds)
    }()

    private let appDependencyContainer = Container()
    private let disposeBag = DisposeBag()
    var pendingDeeplink: String?

    private lazy var onboardingNavigationController: UINavigationController = {
        let navigationController = UINavigationController(nibName: nil, bundle: nil)
        navigationController.setNavigationBarHidden(true, animated: false)
        return navigationController
    }()

    private func handleShortcutItem(_ shortcutItem: UIApplicationShortcutItem) {
        guard let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
              let window = windowScene.windows.first(where: { $0.isKeyWindow }),
              let rootVC = window.rootViewController else {
            // Window not ready yet (cold launch); retry after a delay.
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
                self?.handleShortcutItem(shortcutItem)
            }
            return
        }
        if let presentedVC = rootVC.presentedViewController {
            presentedVC.dismiss(animated: !UIAccessibility.isReduceMotionEnabled) { [weak self] in
                self?.executeShortcutNavigation(shortcutItem)
            }
        } else {
            executeShortcutNavigation(shortcutItem)
        }
    }

    private func executeShortcutNavigation(_ shortcutItem: UIApplicationShortcutItem) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [weak self] in
            guard let self = self else { return }
            switch shortcutItem.type {
            case ShortcutType.searchAction.rawValue:
                self.mainRouter.drive(to: .searchPage(.show), origin: AppRoutingOrigin())
            case ShortcutType.playAction.rawValue:
                self.mainRouter.drive(to: .live(channel: Channel(), appTabOrigin: AppTabOrigin.navigation.rawValue), origin: AppRoutingOrigin())
            case ShortcutType.myListHistoryAction.rawValue:
                self.mainRouter.drive(to: .myList(.history), origin: AppRoutingOrigin())
            default:
                break
            }
        }
    }
}
What I've tried:
Adding delays with DispatchQueue.main.asyncAfter
Checking for window availability and rootViewController
Dismissing presented view controllers before navigation
Environment:
iOS 15+
Swift 6
Using custom router system (mainRouter)
App supports both SwiftUI and UIKit
Questions:
What's the best practice for handling shortcuts on cold launch vs warm launch?
How can I ensure the router is properly initialized before navigation?
My question is similar to https://developer.apple.com/forums/thread/757298?answerId=791343022#791343022 but the solution from there did not help me.
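For completeness, here is the cold-launch path I'm now experimenting with, following the documented UIKit flow; pendingShortcutItem is a property I added to hold the item until the router exists:

// Cold launch (app delegate lifecycle; with scenes, the equivalent is
// connectionOptions.shortcutItem in scene(_:willConnectTo:options:)).
func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // ... set up window, dependency container, router ...
    if let shortcutItem = launchOptions?[.shortcutItem] as? UIApplicationShortcutItem {
        // Too early to navigate; stash the item until setup finishes.
        pendingShortcutItem = shortcutItem
        // Returning false tells UIKit not to also call performActionFor for this item.
        return false
    }
    return true
}

// Warm launch (app was in the background): UIKit calls this directly.
func application(_ application: UIApplication,
                 performActionFor shortcutItem: UIApplicationShortcutItem,
                 completionHandler: @escaping (Bool) -> Void) {
    handleShortcutItem(shortcutItem)
    completionHandler(true)
}

// Then, once startup is done and mainRouter is ready:
// if let item = pendingShortcutItem { pendingShortcutItem = nil; handleShortcutItem(item) }

Draining pendingShortcutItem only after the router is built would answer my second question in principle, but I'd still like to know whether this is the intended pattern.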
My app sends messages. I need it to do so when a user says to Siri: "Send message with MyAppName". When the user says this, Siri shows an Open button and says "MyAppName hasn't added support for that with Siri".
The code is pretty short and should work, but it doesn't. Could you please help and explain how to add the support mentioned above? How else can I use App Intents and register the app as one capable of sending messages when asked by Siri?
import AppIntents
import SwiftUI

@main
struct MyAppNameApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }

    init() {
        // Register the App Shortcuts with the system once at launch.
        MyAppNameShortcuts.updateAppShortcutParameters()
    }
}

// Custom notification used to hand the dictated message to the UI.
extension Notification.Name {
    static let newMessageReceived = Notification.Name("newMessageReceived")
}

struct SendMessageWithMyAppName: AppIntent {
    static var title: LocalizedStringResource = "Send message"
    static let description = IntentDescription(
        "Dictate a message and have MyAppName print it to the Xcode console.")

    @Parameter(title: "Message", requestValueDialog: "What should I send?")
    var content: String

    static var openAppWhenRun = false

    func perform() async throws -> some IntentResult & ProvidesDialog {
        print("MyAppName message: \(content)")
        await MainActor.run {
            NotificationCenter.default.post(name: .newMessageReceived, object: content)
        }
        return .result(dialog: "Message sent: \(content)")
    }
}

struct MyAppNameShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SendMessageWithMyAppName(),
            phrases: [
                "Send message with \(.applicationName)"
            ],
            shortTitle: "Send Message",
            systemImageName: "message"
        )
    }
}
Consider the following AppIntent:
struct TestIntent: AppIntent {
    static let title: LocalizedStringResource = "Test Intent"

    static var parameterSummary: some ParameterSummary {
        Summary("Test") {
            \.$options
        }
    }

    enum Option: Int, CaseIterable, AppEnum {
        case one
        case two
        case three
        case four
        case five
        case six

        static let typeDisplayRepresentation: TypeDisplayRepresentation =
            TypeDisplayRepresentation(
                name: "Options"
            )

        static let caseDisplayRepresentations: [Option: DisplayRepresentation] = [
            .one: DisplayRepresentation(title: "One"),
            .two: DisplayRepresentation(title: "Two"),
            .three: DisplayRepresentation(title: "Three"),
            .four: DisplayRepresentation(title: "Four"),
            .five: DisplayRepresentation(title: "Five"),
            .six: DisplayRepresentation(title: "Six"),
        ]
    }

    @Parameter(title: "Options", default: [])
    var options: [Option]

    @MainActor
    func perform() async throws -> some IntentResult {
        print(options)
        return .result()
    }
}
In Shortcuts, this turns into a dropdown where you can check multiple Option values. However, when perform() is called, options is an empty array regardless of what the user selects. This is observed on both iOS 18.5 and macOS 15.5. On the iOS 26.0 and macOS 26.0 betas the issue appears to be resolved and options contains all the checked options, but we back-deploy to current and previous iOS/macOS versions. How can we provide a multiple-choice selection of fixed values on these older versions?
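The workaround we're considering for older versions (a sketch only; it loses the single multi-select dropdown) is to expose the fixed choices as individual Bool toggles and reassemble the array in perform():

import AppIntents

struct TestIntentFallback: AppIntent {
    static let title: LocalizedStringResource = "Test Intent (Fallback)"

    // One toggle per fixed value; verbose, but the values survive to perform().
    @Parameter(title: "One", default: false) var one: Bool
    @Parameter(title: "Two", default: false) var two: Bool
    @Parameter(title: "Three", default: false) var three: Bool
    @Parameter(title: "Four", default: false) var four: Bool
    @Parameter(title: "Five", default: false) var five: Bool
    @Parameter(title: "Six", default: false) var six: Bool

    @MainActor
    func perform() async throws -> some IntentResult {
        var options: [TestIntent.Option] = []
        if one { options.append(.one) }
        if two { options.append(.two) }
        if three { options.append(.three) }
        if four { options.append(.four) }
        if five { options.append(.five) }
        if six { options.append(.six) }
        print(options)
        return .result()
    }
}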
I use NSUserAppleScriptTask in my app to call an AppleScript that sends Apple events to Finder and System Events.
The script is deployed in the folder ~/Library/Application Scripts/{app bundle id}/.
I have configured com.apple.security.automation.apple-events in the .entitlements file, but how do I configure com.apple.security.scripting-targets to meet the App Store review requirements?
The existing official documentation is too incomplete to be of much use. If anyone has had similar experience, could you please share?
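For reference, the general shape I understand the entitlement to take is below. The keys are the bundle identifiers of the target apps, and the values are scripting access groups, which the target app has to declare in its sdef; the access-group strings here are placeholders, not values I've confirmed Finder or System Events actually publish:

<!-- Sketch of the entitlement's shape only; the access-group strings are
     placeholders, not values I've verified for Finder or System Events. -->
<key>com.apple.security.scripting-targets</key>
<dict>
    <key>com.apple.finder</key>
    <array>
        <string>com.apple.finder.some-access-group</string>
    </array>
    <key>com.apple.systemevents</key>
    <array>
        <string>com.apple.systemevents.some-access-group</string>
    </array>
</dict>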
I'm trying to run the sample Trails app from the documentation, unaltered.
When I build, I get a
Command ValidateAppShortcutStringsMetadata failed with a nonzero exit code
error. How do I debug this?
I'm trying this on Xcode 16.4.
AppleScript for the Music app no longer reliably supports the current track property. Before macOS Tahoe, running the following script in Script Editor would return the current track information:
tell application "Music"
    return name of current track
end tell
However, when I run this script on a device with macOS 26 Tahoe, I receive this error:
Result: error "Music got an error: Can't get name of current track." number -1728 from name of current track
I've tested this extensively, and here are my findings:
Going to the "Songs" tab and playing something from there makes everything work.
Playing a song directly makes current track work, unless the song is not in your Music library (library meaning added through Apple Music or uploaded).
If you play a song that's not in your library, current track is not updated even if you clicked on it specifically.
Playing an album (in your library, obviously) makes all the tracks within it appear in current track until autoplay takes over.
Any autoplayed track won't appear in current track even if it's in your library (unless: see the last bullet point).
Music played through the "Songs" tab always appears in current track, even after autoplay kicks in. I assume this is because this tab is an iTunes legacy (visually and under the hood) and doesn't use the modern autoplay. This tab also won't play non-library songs, unlike the "Albums" tab, which seems to use the newer autoplay and shows the same symptoms as the "Recently Added", "Home", "Radio", etc. tabs.
Is this a bug, or has Apple simply deprecated this functionality?
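In the meantime I've wrapped the access in a guard so the failure is at least catchable; a sketch:

tell application "Music"
    if player state is playing then
        try
            return name of current track
        on error errMsg number errNum
            -- Fails with -1728 for non-library or autoplayed tracks, per the findings above.
            return "current track unavailable: " & errNum
        end try
    else
        return "nothing playing"
    end if
end tell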
When my Intents extension resolves an INStartCallIntent and returns .continueInApp while the device is locked, the call does not proceed unless the user unlocks the device. After unlocking, the app receives the NSUserActivity and CallKit proceeds normally.
My expectation is that the native CallKit outgoing UI should appear and the call should start without requiring unlock — especially when using AirPods, where attention is not available.
Steps to Reproduce
Pair and connect AirPods.
Lock the iPhone.
Start music playback (e.g. Apple Music).
Place the phone face down (or cover Face ID sensors so attention isn’t available).
Say: "Hey Siri, call Tommy with DiscoMonday" (DiscoMonday is my app's name).
Observed Behavior
Music mutes briefly.
Siri says “Calling Tommy with DiscoMonday.”
Lock screen shows “Require Face ID / passcode.”
After several seconds, music resumes.
The app is not launched, no NSUserActivity is delivered, and no CXStartCallAction occurs.
With the phone face up, the same phrase launches the app, triggers CXStartCallAction, and the call proceeds via CallKit after Face ID.
Expected Behavior
From the lock screen, Siri should hand off INStartCallIntent to the app, which immediately requests CXStartCallAction and drives the CallKit UI (reportOutgoingCall(...startedConnectingAt:) → ...connectedAt:), without requiring device unlock, regardless of orientation or attention availability when AirPods are connected.
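For reference, the handoff path in my app is essentially the standard one (simplified; callController is my CXCallController instance):

import CallKit
import Intents
import UIKit

// Simplified handoff: Siri delivers the INStartCallIntent via NSUserActivity,
// and the app immediately requests a CXStartCallAction.
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    guard let intent = userActivity.interaction?.intent as? INStartCallIntent,
          let handle = intent.contacts?.first?.personHandle?.value else {
        return false
    }
    let action = CXStartCallAction(call: UUID(),
                                   handle: CXHandle(type: .generic, value: handle))
    callController.request(CXTransaction(action: action)) { error in
        if let error { print("CXStartCallAction failed: \(error)") }
    }
    return true
}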
I have a food logging app. I want my users to be able to say something like:
"Hey siri, log chicken and rice for lunch"
But AppShortcutsProvider forces me to add the name of the app to the phrase, so it becomes:
"Hey Siri, log chicken and rice for lunch in FoodLogApp."
After running a quick survey, I've found that many users dislike having to say the name of the app; it makes the phrase too cumbersome.
My question is:
Is there a plan from Apple for 2026 to let users converse with Siri and apps more naturally, without having to say the name of the app? If so, is it already in beta, and can you point me towards it?
@available(iOS 17.0, *)
struct LogMealIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Meal"
    static var description: LocalizedStringResource = "Log a meal"

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

@available(iOS 17.0, *)
struct LogMealShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMealIntent(),
            phrases: [
                "Log chicken and rice for lunch in \(.applicationName)",
            ],
            shortTitle: "Log meal",
            systemImageName: "mic.fill"
        )
    }
}
Hello, I made myself an app to track my expenses.
The most important event is when I make a purchase via Apple Wallet.
What happens is that sometimes the values for Merchant and Amount are:
Merchant = " "
Amount = 0.0
Has anyone experienced this, and is there something I can do about it? I was thinking that connection speed and service quality might sometimes have an impact.
Does anyone here know something about the topic?
Looking for any method to quickly flatten a PDF without opening Preview and without installing third-party software. Any ideas?
Save as PDF in Preview works, but I don't want to have to open Preview each time I need to do this.
The Create PDF action, which appears in Finder when you select 2 or more PDFs, flattens PDFs, but it requires me to select 2 or more files, and I generally don't want to combine PDFs; I simply wish to flatten a single PDF.
Most Automator and Shortcuts options I am aware of do not flatten PDFs, and in some cases they strip form field data out of PDFs.
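One route I've been sketching (unverified against form-field edge cases) is a small Swift command-line tool that redraws each page into a new PDF context via PDFKit, which bakes annotations into the page content; the paths are placeholders:

import Foundation
import PDFKit

// Placeholder paths; pass real ones in for actual use.
let inputURL = URL(fileURLWithPath: "input.pdf")
let outputURL = URL(fileURLWithPath: "flattened.pdf")

guard let document = PDFDocument(url: inputURL) else {
    fatalError("Could not open \(inputURL.path)")
}
guard let context = CGContext(outputURL as CFURL, mediaBox: nil, nil) else {
    fatalError("Could not create PDF context")
}
for index in 0..<document.pageCount {
    guard let page = document.page(at: index) else { continue }
    var box = page.bounds(for: .mediaBox)
    context.beginPage(mediaBox: &box)
    // PDFPage.draw(with:to:) renders annotations (including form fields) into the page.
    page.draw(with: .mediaBox, to: context)
    context.endPage()
}
context.closePDF()

Saved as flatten.swift, it runs with `swift flatten.swift`, which would at least avoid opening Preview each time.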
I would like to have an AppEntity with a Property that is a date only, not a date and time; i.e. the equivalent of 09/14/2025, not 09/14/2025 09:00 UTC.
How would I model this? How would I create an EntityPropertyQuery for this? If I add QueryProperties, the Shortcuts UI makes the user pick a time too.
Thanks!
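The best interim modelling I've come up with (a sketch, not an official date-only type) is to keep the stored value a Date pinned to local midnight, and to normalize any comparator dates from the query the same way, so whatever time Shortcuts attaches is ignored:

import AppIntents
import Foundation

// Normalize any Date to local midnight so the time component is irrelevant.
func dayOnly(_ date: Date) -> Date {
    Calendar.current.startOfDay(for: date)
}

struct TaskEntity: AppEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Task")
    static var defaultQuery = TaskQuery()

    var id: UUID

    @Property(title: "Due Date")
    var dueDate: Date  // always stored as dayOnly(...)

    init(id: UUID, dueDate: Date) {
        self.id = id
        self.dueDate = dayOnly(dueDate)  // drop the time at the door
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(dueDate.formatted(date: .abbreviated, time: .omitted))")
    }
}

struct TaskQuery: EntityQuery {
    // An EntityPropertyQuery comparator on dueDate would apply dayOnly(...)
    // to the incoming Date before comparing, for the same reason.
    func entities(for identifiers: [UUID]) async throws -> [TaskEntity] { [] }
}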
So I'm developing an iOS application which should be showing App Shortcuts, but it's not, and I'm not sure how to debug why the functionality isn't working.
I believe I'm correctly calling AppShortcutsProvider's updateAppShortcutParameters, but I don't see any errors in the console pointing to a problem.
In fact, I made a simplified Swift-only version that works, before I tried to integrate it into a more complex project.
But now I'm at a loss as to what is going wrong or what debug tools I can use to figure it out. Any help would be appreciated.
When building my project I see:
2025-08-18 14:07:49.371 appintentsmetadataprocessor[57506:35387547] Starting appintentsmetadataprocessor export
2025-08-18 14:07:49.414 appintentsmetadataprocessor[57506:35387547] Writing Metadata.appintents
2025-08-18 14:07:49.414 appintentsmetadataprocessor[57506:35387547] Metadata root: /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents
AppIntentsSSUTraining (in target 'UnityFramework' from project 'Unity-iPhone')
cd /Users/jpetersen/no_doc_repos/payments_ios_investigation/SpotlightSearch/client/Build
/Applications/Xcode_16.1.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/appintentsnltrainingprocessor --infoplist-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Info.plist --temp-dir-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/UnityFramework.build/ssu --bundle-id com.unity3d.framework --product-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework --extracted-metadata-path /Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents --archive-ssu-assets
2025-08-18 14:07:49.436 appintentsnltrainingprocessor[57507:35387550] Parsing options for appintentsnltrainingprocessor
2025-08-18 14:07:49.437 appintentsnltrainingprocessor[57507:35387550] Starting AppIntents SSU YAML Generation
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training 'Start ${+applicationName}' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training 'Play ${ShortcutEntity}|Play ${ShortcutEntity} on ${+applicationName}' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Training Negative Phrases '' for English
2025-08-18 14:07:49.444 appintentsnltrainingprocessor[57507:35387550] Application name 'UnityFramework' for English
2025-08-18 14:07:49.449 appintentsnltrainingprocessor[57507:35387550] Generated AppIntents SSU YAML files in file:///Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/UnityFramework.build/ssu/
2025-08-18 14:07:49.449 appintentsnltrainingprocessor[57507:35387550] Copied AppIntents SSU YAML files to file:///Users/jpetersen/Library/Developer/Xcode/DerivedData/Unity-iPhone-dtnhxevagfkzsjdavesziaqrwisr/Build/Products/ReleaseForRunning-iphoneos/UnityFramework.framework/Metadata.appintents/
So I think it should be making the required app intent data :shrug:
I want to offer the user the opportunity to add more items to a list in an AppIntent, but nothing I've tried "loops back" to the first Siri query. I checked several LLMs, and they suggest using "requestDialog", which doesn't exist, and calling my AppIntent recursively.
Is this even possible?
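What I'm currently experimenting with (a sketch; I haven't confirmed Siri re-prompts reliably in a loop) uses the parameter resolution API that does exist, IntentParameter's requestValue, plus a Bool parameter as the "add another?" gate:

import AppIntents

struct AddItemsIntent: AppIntent {
    static let title: LocalizedStringResource = "Add Items"

    @Parameter(title: "Item", requestValueDialog: "What should I add?")
    var item: String

    // Default prevents an automatic up-front prompt; we ask explicitly below.
    @Parameter(title: "Add another?", default: false)
    var addMore: Bool

    func perform() async throws -> some IntentResult & ProvidesDialog {
        var items = [item]
        // Keep asking until the user declines; each requestValue re-prompts Siri.
        while try await $addMore.requestValue(IntentDialog("Add another item?")) {
            let next = try await $item.requestValue(IntentDialog("What should I add?"))
            items.append(next)
        }
        return .result(dialog: "Added \(items.count) item(s).")
    }
}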
In my app, when invoking a Shortcut via Siri, the
application(_:continueUserActivity:restorationHandler:)
method in AppDelegate is called twice.
When I debug, both NSUserActivity objects are identical.
However, when I run the same Shortcut by tapping it manually, the method is only called once as expected.
Has anyone experienced this issue? How can I prevent Siri Shortcuts from delivering the same NSUserActivity twice?
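The stopgap I'm using (a sketch; it dedupes on activity type within a short window, since the duplicate activities are otherwise identical, and handleShortcutActivity is my existing handler):

import UIKit

// In AppDelegate: drop a second delivery of the same activity type
// if it arrives within a short window of the first.
private var lastActivityType: String?
private var lastActivityDate = Date.distantPast

func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    let now = Date()
    if userActivity.activityType == lastActivityType,
       now.timeIntervalSince(lastActivityDate) < 2.0 {
        return true  // already handled; swallow the duplicate
    }
    lastActivityType = userActivity.activityType
    lastActivityDate = now
    handleShortcutActivity(userActivity)
    return true
}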
I'm trying to automate Apple Music on macOS Tahoe 26 using ScriptingBridge. Scripts that previously worked for controlling playback, fetching track info, or manipulating playlists no longer function.
For example, code like this used to work:
import ScriptingBridge

let music = SBApplication(bundleIdentifier: "com.apple.Music") as! MusicApplication
print(music.currentTrack?.name ?? "No track playing")
But now it fails, returning nil for track info and failing to send playback commands.
Questions:
Has ScriptingBridge been deprecated or broken in Tahoe 26 for Apple Music?
Any guidance or example code would be appreciated.
After installing my notarized third-party app in a Tahoe VM, its embedded Automator actions cannot be configured in Automator while defining a workflow: after adding the actions (and enabling third-party extensions), their views / UI elements do not respond to any mouse event.
When "Show this action when running" is enabled, the options can be changed during execution of the workflow. Needless to say, adjusting these action settings in Automator had worked for years, on macOS 12 - 15 and before.
Reported via Feedback Assistant (FB19015185).
Can anybody confirm this issue with Automator actions?