I would like to have an AppEntity with a Property that is a Date, but only the date, not the time; i.e. the equivalent of 09/14/2025, not 09/14/2025 09:00 UTC.
How would I model this, and how would I create an EntityPropertyQuery for it? If I add QueryProperties, the Shortcuts UI has the user pick a time as well.
Thanks!
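For context, a minimal sketch of the kind of setup in question (entity, property, and query names are hypothetical, and the query is stubbed). With a plain Date property, the Shortcuts filter UI for a property query offers a combined date-and-time picker, which is exactly the behavior being asked about:
import AppIntents
import Foundation

// Hypothetical entity for illustration only.
struct LogEntryEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Log Entry")
    static var defaultQuery = LogEntryQuery()

    var id: UUID

    // A plain Date property carries a time component, so Shortcuts
    // presents a date *and* time picker when filtering on it.
    @Property(title: "Date")
    var date: Date

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(date.formatted(date: .abbreviated, time: .omitted))")
    }
}

// Stubbed query; exposing `date` through an EntityPropertyQuery's
// QueryProperties (as in the completed/not-completed example further
// down this page) is where the unwanted time picker appears.
struct LogEntryQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [LogEntryEntity] { [] }
}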
                    
                  
                Automation & Scripting
Learn about scripting languages and automation frameworks available on the platform to automate repetitive tasks.
  
    
  
  
  
  
    
  
  
                    
                      I have a food logging app. I want my users to be able to say something like:
"Hey siri, log chicken and rice for lunch"
But AppShortcutsProvider forces me to add the name of the app to the phrase, so it becomes:
"Hey siri, log chicken and rice for lunch in FoodLogApp".
After running a quick survey, I've found that many users dislike having to say the name of the app; it makes the phrase too cumbersome.
My question is:
Is there a plan from Apple for 2026 to let users converse with Siri and apps more naturally, without having to say the name of the app? If so, is it already in beta, and can you point me towards it?
@available(iOS 17.0, *)
struct LogMealIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Meal"
    static var description: LocalizedStringResource = "Log a meal"

    func perform() async throws -> some IntentResult {
        return .result()
    }
}
@available(iOS 17.0, *)
struct LogMealShortcutsProvider: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LogMealIntent(),
            phrases: [
                "Log chicken and rice for lunch in \(.applicationName)",
            ],
            shortTitle: "Log meal",
            systemImageName: "mic.fill"
        )
    }
}
                    
                  
                
                    
                      When my Intents extension resolves an INStartCallIntent and returns .continueInApp while the device is locked, the call does not proceed unless the user unlocks the device. After unlocking, the app receives the NSUserActivity and CallKit proceeds normally.
My expectation is that the native CallKit outgoing UI should appear and the call should start without requiring unlock — especially when using AirPods, where attention is not available.
Steps to Reproduce
Pair and connect AirPods.
Lock the iPhone.
Start music playback (e.g. Apple Music).
Place the phone face down (or cover Face ID sensors so attention isn’t available).
Say: “Hey Siri, call Tommy with DiscoMonday” (DiscoMonday is my app's name).
Observed Behavior
Music mutes briefly.
Siri says “Calling Tommy with DiscoMonday.”
Lock screen shows “Require Face ID / passcode.”
After several seconds, music resumes.
The app is not launched, no NSUserActivity is delivered, and no CXStartCallAction occurs.
With the phone face up, the same phrase launches the app, triggers CXStartCallAction, and the call proceeds via CallKit after Face ID.
Expected Behavior
From the lock screen, Siri should hand off INStartCallIntent to the app, which immediately requests CXStartCallAction and drives the CallKit UI (reportOutgoingCall(...startedConnectingAt:) → ...connectedAt:), without requiring device unlock, regardless of orientation or attention availability when AirPods are connected.
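For reference, a rough sketch of the app-side handoff described above (type names are illustrative and error handling is minimal): the NSUserActivity delivered after .continueInApp carries the INStartCallIntent, and the app turns it into a CXStartCallAction.
import CallKit
import Intents
import UIKit

final class CallAppDelegate: UIResponder, UIApplicationDelegate {
    // Kept alive so CallKit requests aren't deallocated mid-flight.
    private let callController = CXCallController()

    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        // Siri hands the resolved INStartCallIntent to the app via NSUserActivity.
        guard let intent = userActivity.interaction?.intent as? INStartCallIntent,
              let handleValue = intent.contacts?.first?.personHandle?.value else {
            return false
        }

        // Ask CallKit to start the outgoing call immediately.
        let handle = CXHandle(type: .generic, value: handleValue)
        let startCall = CXStartCallAction(call: UUID(), handle: handle)
        callController.request(CXTransaction(action: startCall)) { error in
            if let error {
                print("CXStartCallAction failed: \(error)")
            }
            // On success, the CXProviderDelegate performs the action and the app
            // reports progress with reportOutgoingCall(with:startedConnectingAt:)
            // and reportOutgoingCall(with:connectedAt:).
        }
        return true
    }
}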
                    
                  
                
              
                
              
              
                
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: iOS, Siri and Voice, Intents, CallKit
                    
I'm trying to run the sample Trails app from the documentation, unaltered.
When I do the build, I get a
Command ValidateAppShortcutStringsMetadata failed with a nonzero exit code
error. How do I debug this?
I'm trying this on Xcode 16.4.
                    
                  
                
                    
I use NSUserAppleScriptTask in my app to call an AppleScript method that sends Apple events to Finder and System Events.
The script has been deployed to the folder ~/Library/Application Scripts/{app bundle id}/
I have configured com.apple.security.automation.apple-events in the .entitlements file, but how do I configure com.apple.security.scripting-targets to meet App Store review requirements?
The existing official documentation is too incomplete to be of much use. If anyone has had a similar experience, could you please share?
                    
                  
                
                    
                      Consider the following in an AppIntent:
struct TestIntent: AppIntent {
    static let title: LocalizedStringResource = "Test Intent"
    static var parameterSummary: some ParameterSummary {
        Summary("Test") {
            \.$options
        }
    }
    enum Option: Int, CaseIterable, AppEnum {
        case one
        case two
        case three
        case four
        case five
        case six
        static let typeDisplayRepresentation: TypeDisplayRepresentation =
            TypeDisplayRepresentation(
                name: "Options"
            )
        static let caseDisplayRepresentations: [Option: DisplayRepresentation] = [
            .one: DisplayRepresentation(title: "One"),
            .two: DisplayRepresentation(title: "Two"),
            .three: DisplayRepresentation(title: "Three"),
            .four: DisplayRepresentation(title: "Four"),
            .five: DisplayRepresentation(title: "Five"),
            .six: DisplayRepresentation(title: "Six"),
        ]
    }
    @Parameter(title: "Options", default: [])
    var options: [Option]
    @MainActor
    func perform() async throws -> some IntentResult {
        print(options)
        return .result()
    }
}
In Shortcuts, this turns into a dropdown where you can check multiple Option values. However, when perform() is called, options is an empty array regardless of what the user selected. This happens on both iOS 18.5 and macOS 15.5. On the iOS 26.0 and macOS 26.0 betas the issue appears to be resolved and options contains all the checked values, but we back-deploy to current and previous iOS/macOS versions. How can we provide a multiple-choice selection of fixed values on these older versions?
                    
                  
                
                    
                      My question is similar to https://developer.apple.com/forums/thread/757298?answerId=791343022#791343022 but the solution from there did not help me.
My app sends messages. I need it to do so when a user says to Siri: "Send message with [app name]". When a user says that, Siri shows an "Open [app name]" button and says "[app name] hasn't added support for that with Siri".
The code is pretty short and should work, but it doesn't. Could you please help and explain how to add the support mentioned above? How else can I use an AppIntent and register the app as capable of sending messages when asked by Siri?
import AppIntents
@main
struct MyAppNameApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
    init() {
        MyAppNameShortcuts.updateAppShortcutParameters()
        Task {
            await MyAppNameShortcuts.updateAppShortcutParameters()
        }
    }
}
struct SendMessageWithMyAppName: AppIntent {
    static var title: LocalizedStringResource = "Send message"
    static let description = IntentDescription(
        "Dictate a message and have MyAppName print it to the Xcode console.")
    @Parameter(title: "Message", requestValueDialog: "What should I send?")
    var content: String
    static var openAppWhenRun = false
    func perform() async throws -> some IntentResult {
        print("MyAppName message: \(content)")
        await MainActor.run {
            NotificationCenter.default.post(name: .newMessageReceived, object: content)
        }
        return .result(dialog: "Message sent: \(content)")
    }
}
struct MyAppNameShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
              intent: SendMessageWithMyAppName(),
              phrases: [
                  "Send message with \(.applicationName)"
              ],
              shortTitle: "Send Message",
              systemImageName: "message"
          )
    }
}
                    
                  
                
                    
I'm reaching out because I'm having a problem with Home Screen quick actions (3D Touch shortcuts) in my iOS UIKit application, handled in the AppDelegate file: it's impossible to redirect the route when the user has completely killed the application. It works when the app is running in the background. I'd like it to redirect to the searchPage route when the application is fully closed and the user taps Search from the quick action.
final class AppDelegate: UIResponder, UIApplicationDelegate {
    lazy var window: UIWindow? = {
        return UIWindow(frame: UIScreen.main.bounds)
    }()
 private let appDependencyContainer = Container()
    private let disposeBag = DisposeBag()
    var pendingDeeplink: String?
    private lazy var onboardingNavigationController: UINavigationController = {
        let navigationController = UINavigationController(nibName: nil, bundle: nil)
        navigationController.setNavigationBarHidden(true, animated: false)
       return navigationController
    }()
private func handleShortcutItem(_ shortcutItem: UIApplicationShortcutItem) {
            guard let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
                  let window = windowScene.windows.first(where: { $0.isKeyWindow }),
                  let rootVC = window.rootViewController else {
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
                    self?.handleShortcutItem(shortcutItem)
                }
                return
            }
            if let presentedVC = rootVC.presentedViewController {
                presentedVC.dismiss(animated: !UIAccessibility.isReduceMotionEnabled) { [weak self] in
                    self?.executeShortcutNavigation(shortcutItem)
                }
            } else {
                executeShortcutNavigation(shortcutItem)
            }
        }
    private func executeShortcutNavigation(_ shortcutItem: UIApplicationShortcutItem) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [weak self] in
            guard let self = self else { return }
            switch shortcutItem.type {
            case ShortcutType.searchAction.rawValue:
                self.mainRouter.drive(to: .searchPage(.show), origin: AppRoutingOrigin())
            case ShortcutType.playAction.rawValue:
                self.mainRouter.drive(to: .live(channel: Channel(), appTabOrigin: AppTabOrigin.navigation.rawValue), origin: AppRoutingOrigin())
            case ShortcutType.myListHistoryAction.rawValue:
                self.mainRouter.drive(to: .myList(.history), origin: AppRoutingOrigin())
            default:
                break
            }
        }
    }
}
What I've tried:
Adding delays with DispatchQueue.main.asyncAfter
Checking for window availability and rootViewController
Dismissing presented view controllers before navigation
Environment:
iOS 15+
Swift 6
Using custom router system (mainRouter)
App supports both SwiftUI and UIKit
Questions:
What's the best practice for handling shortcuts on cold launch vs warm launch?
How can I ensure the router is properly initialized before navigation?
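For what it's worth, one common pattern for the cold-launch case looks roughly like the sketch below (assuming an AppDelegate-only lifecycle with no scene delegate; routerIsReady, routerDidBecomeReady, and handleShortcutItem stand in for the app's own readiness signal and navigation code): capture the shortcut item from launchOptions and act on it only once the router is ready.
import UIKit

final class QuickActionAppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    private var pendingShortcutItem: UIApplicationShortcutItem?
    private var routerIsReady = false   // hypothetical readiness flag

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Cold launch: the quick action arrives in launchOptions before any UI exists.
        if let item = launchOptions?[.shortcutItem] as? UIApplicationShortcutItem {
            pendingShortcutItem = item
            // ... set up window, root view controller, and router here ...
            // Returning false tells UIKit not to also call
            // application(_:performActionFor:completionHandler:) for this item.
            return false
        }
        // ... set up window, root view controller, and router here ...
        return true
    }

    // Warm launch: the app is already running or suspended.
    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        if routerIsReady {
            handleShortcutItem(shortcutItem)
            completionHandler(true)
        } else {
            pendingShortcutItem = shortcutItem
            completionHandler(false)
        }
    }

    // Called by the app once the router has finished setting up.
    func routerDidBecomeReady() {
        routerIsReady = true
        if let item = pendingShortcutItem {
            pendingShortcutItem = nil
            handleShortcutItem(item)
        }
    }

    private func handleShortcutItem(_ item: UIApplicationShortcutItem) {
        // Navigate with the app's router here.
    }
}
Whether this maps cleanly onto the custom mainRouter depends on when that router becomes available, which is what the second question is really about.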
                    
                  
                
                    
                      Hello,
I’m working on integrating SiriKit with my music app using INPlayMediaIntent. My app is live on TestFlight, and the Siri command is being recognized, but mediaItems is always empty in my intent handler.
Demo Project
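For context, a sketch of the resolution step that is supposed to populate mediaItems (assuming an Intents extension conforming to INPlayMediaIntentHandling; the catalog lookup here is hypothetical). Siri typically delivers the spoken search terms in mediaSearch, and the handler resolves them into concrete INMediaItem values:
import Intents

class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {

    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // Siri puts the spoken search terms in mediaSearch, not in mediaItems;
        // the handler is expected to turn them into concrete INMediaItems.
        let searchTerm = intent.mediaSearch?.mediaName ?? ""
        let item = INMediaItem(identifier: "catalog-item-1",   // hypothetical catalog hit
                               title: searchTerm.isEmpty ? "Default Item" : searchTerm,
                               type: .song,
                               artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand off to the app to start playback.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}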
                    
                  
                
                    
                      We're trying to add an AppIntent to our very complex and large app that works with Siri.
What we've found through numerous configuration attempts is that when the Siri toggle for our app is turned off in the Shortcuts app and we speak our phrase to Siri, we get this prompt: "[app name] hasn't added support for that with Siri", with an "Open [app name]" button.
When we speak the phrase while the app is open, we get a prompt that says "Turn on "[app name]" shortcuts with Siri?" with a "Turn On" button.
When the Siri toggle is turned on for our app in the Shortcuts app, all of our phrases work as expected.
Is there some kind of configuration we're missing? There doesn't seem to be a lot of documentation. We've tried to match an example app's approach (AcceleratingAppInteractionsWithAppIntents) to no avail.
Our app intents live in an xcframework, if that makes a difference, but we've also tried moving them out of the xcframework without success.
This is a blocker to us shipping this feature. Thanks!
                    
                  
                
                    
                      Issue: CSV Headings Not Appearing in Shortcut-Generated File
I'm using an iPhone 16 Pro with iOS 18.5 and the latest Shortcuts app to log expenses into a CSV file. The shortcut works fine, except the resulting file doesn't include the column headings.
Here’s what I’ve done:
Created a file called Expenses.csv with this single header line:
Date,Price,Category,Store,Notes,Location
Saved it to both /iCloud Drive and /iCloud Drive/Shortcuts (via iCloud on my Windows PC).
My Shortcut builds the CSV line from inputs (date, price, category, etc.) and appends it to the file.
I renamed the variables only in the final “Text” block, since renaming in earlier blocks seems no longer possible in this Shortcuts version.
Despite this setup, the file doesn’t preserve the header row—it either doesn’t show up, or gets overwritten.
Goal:
Have a persistent CSV file with the correct headers once, and each new entry appended below the correct columns.
Can anyone help me figure out what I’m doing wrong?
                    
                  
                
                    
When we use the "Find All Reminders" action in Shortcuts, there are two filters, "Is Completed" and "Is Not Completed".
When I implement this in my app, the best I can get is just "Completed" and "Not Completed"; I can't figure out how to add the "Is" in front.
In my entity:
    @Property(title: "Completed")
    var completed : Bool
In the EntityPropertyQuery:
    static var properties = QueryProperties {
        Property(\GTDItemAppEntity.$list) {
            EqualToComparator { NSPredicate(format: "list.uuid = %@", $0.id as NSUUID) }
        }
        Property(\GTDItemAppEntity.$text) {
            ContainsComparator { NSPredicate(format: "text CONTAINS[cd] %@", $0) }
            EqualToComparator { NSPredicate(format: "text = %@", $0) }
        }
        Property(\GTDItemAppEntity.$completed) {
            EqualToComparator { NSPredicate(format: $0 ? "completed = YES" : "completed = NO") }
        }
    }
If I change the property to
    @Property(title: "Is Completed")
    var completed : Bool
Then it will show as "Is Completed" and "Not Is Completed" in the filter!
Reminder: (screenshot)
My App: (screenshot)
                    
                  
                
                    
                      Hi everyone,
I'm looking for a way to programmatically set the left/right audio balance to perfect center (50/50) using either a Terminal command or AppleScript.
Background:
The audio balance slider in System Settings > Sound > Output & Input works functionally, but I have difficulty determining when it's positioned at the exact center point. The visual nature of the slider makes it challenging for me to achieve the precision I need, and I end up adjusting it repeatedly trying to get it perfectly centered.
What I'm looking for:
A Terminal command that can set the audio balance to exact center
An AppleScript that accomplishes the same thing
Any other programmatic method to ensure perfect 50/50 balance
I've tried searching through the defaults command documentation and Core Audio frameworks but haven't found the right approach yet. Has anyone successfully automated this setting before?
Any help would be greatly appreciated!
Thanks in advance,
Dylan
                    
                  
                
                    
I'm trying to set a boolean value on myVariable using the "Folder" property, but Script Editor keeps interpreting it as a class.
Here is a shortened version of the code. It is part of a bigger script that identifies files dropped into a folder and creates a new folder, which it renames based on the date of the dropped file. Unfortunately, the folder action triggers again every time it creates a new folder, resulting in a continuous loop of folders being created and renamed to "2025".
The plan is to use an if condition to prevent the creation of folders when a folder is dropped into the folder that has my Folder Action attached.
property directory : "Catalina:Users:Username:Desktop:Folder:File.pdf"
tell application "Finder"
 set pathname to POSIX path of directory
 set item_info to the info for directory
 set myVariable to Folder of item_info
 return myVariable
end tell
I noticed the following when I compile the script:
The word "Folder" is colored blue, which I believe means it is being treated as a class. Normally when I reference a property, it turns pink; it does so correctly when I use "set the file_name to the name of this_file". I also tried writing the "Folder" property in brackets, but that did not help.
I noticed the following when I run the script:
It returns error number -10004 ("A privilege violation occurred") when it runs the "info for" command.
I gave Script Editor Full Disk Access and Accessibility access, and FolderActionsDispatcher has full Finder access.
Can anyone point me in the right direction?
What is the cause of the privilege violation or how would I find what the cause is?
How do I force the Script Editor to get the "Folder" property of a folder?
                    
                  
                
                    
                      Hello,
I have an AppIntent that conforms to AudioPlaybackIntent to trigger my app to open and start an AVPlayer that plays back a media stream I control. When the phone is unlocked, everything works as I expect. The app opens and plays the audio.
However, when the phone is locked, any attempt to invoke the intent causes a "Request Code" dialog to be displayed. This seems counter to what I would expect with the AudioPlaybackIntent usage. Am I able to accomplish what I'm after here with AppIntents? Does the fact that I'm using openAppWhenRun require me to have the phone unlocked somehow?
import AppIntents
import Foundation
struct PlayStationAppIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play radio station"
    static var description: IntentDescription = .init("Play  radio station")
    static var notification: Notification.Name = .init("playStation")
    static var openAppWhenRun: Bool = true
    init() {}
    func perform() async throws -> some IntentResult {
        AudioPlayerService.shared.play()
        return .result()
    }
}
                    
                  
                
                    
                      Hello,
I'm evaluating whether it's worth exposing shortcuts from our app. It seems to be working fine on my machine (Apple Silicon, latest Tahoe beta, Xcode 26 beta).
But if I compile the same code on our Intel build agents, which run the latest macOS 15 and Xcode 26 beta, then once I install the bundle to /Applications on Tahoe I don't see any shortcuts.
The only other difference is that the CI build is signed with a Developer ID distribution certificate; I re-signed the build with my development certificate and it had no effect.
I found out that linkd is somehow involved in the discovery process, and the most relevant logs look like this:
default	(...)	linkd	Registry	com.**** is not link enabled	com.apple.appintents
debug	(...)	linkd	ApplicationService	Created AppShortcutClient with bundleId: com.****	com.apple.appintents
error	(...)	linkd	AppService	Unable to find AppShortcutProvider for com.****	com.apple.appintents
Could you please advise where to look for the problem?
                    
                  
                
                    
                      Hello,
I have two related questions:
In this AppIntent:
https://github.com/poml88/FLwatch/blob/moresimple/SharedPhoneWatch/AppIntents/AddInsulin.swift#L2
I am trying to work with a Double returned as the parameter, but it does not fully work, because:
There is a locale issue. In some languages the decimal separator is a comma. If that is so, Siri returns 3,5 but the system does not treat it as a Double. How do I solve that?
Or Siri returns "five" rather than 5, and again the system does not recognize the Double.
It seems Apple has some resolvers for this, for example DoubleFromStringResolver:
https://developer.apple.com/documentation/appintents/resolvers
But I cannot figure out how to use them or how to call that resolver.
Can somebody help, please?
Thanks.
                    
                  
                
                    
                      Hi,
I'm developing an app which, just like the Clock app, uses multiple counters.
I want to speak Siri commands such as "Siri, count for one hour" ("count" is the alternative app name).
My AppIntent has a parameter, and Siri understands if I say "Siri, count" and asks for the duration in a separate step. That runs fine, but I can't figure out how to run the command with the duration specified upfront, without any follow-up questions from Siri.
The Clock app has this functionality, so it can be done.
    //title
    //perform()
    
    @Parameter(title: "Duration")
    var minutes: Measurement<UnitDuration>
}
I have a struct ShortcutsProvider: AppShortcutsProvider, but its phrases accept only parameters of type AppEnum or AppEntity.
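Since App Shortcut phrases can embed AppEnum parameters, one possible sketch (assuming a fixed menu of durations is acceptable; all names below are made up) looks like this:
import AppIntents

// A fixed set of durations that can appear directly in the spoken phrase.
enum CountDuration: String, CaseIterable, AppEnum {
    case fifteenMinutes, thirtyMinutes, oneHour

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Duration")
    static var caseDisplayRepresentations: [CountDuration: DisplayRepresentation] = [
        .fifteenMinutes: DisplayRepresentation(title: "15 minutes"),
        .thirtyMinutes: DisplayRepresentation(title: "30 minutes"),
        .oneHour: DisplayRepresentation(title: "one hour")
    ]

    var minutes: Int {
        switch self {
        case .fifteenMinutes: return 15
        case .thirtyMinutes: return 30
        case .oneHour: return 60
        }
    }
}

struct CountIntent: AppIntent {
    static var title: LocalizedStringResource = "Count"

    @Parameter(title: "Duration")
    var duration: CountDuration

    func perform() async throws -> some IntentResult {
        // Start the counter for duration.minutes here.
        return .result()
    }
}

struct CountShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: CountIntent(),
            phrases: [
                // With "count" as the alternative app name, this reads
                // "Count for one hour", "Count for 30 minutes", and so on.
                "\(.applicationName) for \(\.$duration)"
            ],
            shortTitle: "Count",
            systemImageName: "timer"
        )
    }
}
This trades the free-form Measurement<UnitDuration> for a fixed set of choices, which matches the constraint noted above that phrases only accept AppEnum or AppEntity parameters.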
                    
                  
                
              
                
              
              
                
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Siri and Voice, SiriKit, App Intents
                    
                      Hi,
we're having trouble implementing search through Siri voice commands.
We already did it successfully for audio playback using INPlayMediaIntentHandling.
For search, none of the available ways works.
Both INSearchForMediaIntentHandling and ShowInAppSearchResultsIntent never open the app in the first place. We tried various commands, but, for example, "Search for …" sometimes opens the Apple Music app and sometimes shows a Google search widget. Our app is never taken into consideration for providing any results.
We implemented all steps mentioned in WWDC videos and documentation (e.g. https://developer.apple.com/documentation/appintents/making-in-app-search-actions-available-to-siri-and-apple-intelligence), but nothing seems to work.
We're mainly testing on iOS 18 currently. Any idea why this is not working?
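For reference, a minimal sketch along the lines of the linked article's ShowInAppSearchResultsIntent approach (the type name is made up, and it would typically be paired with an App Shortcut phrase containing \(.applicationName) so Siri can route the utterance):
import AppIntents

// Sketch only; the real intent would navigate the app's own search UI.
struct ShowSearchResultsIntent: ShowInAppSearchResultsIntent {
    static var title: LocalizedStringResource = "Search"
    static var searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Search term")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        let term = criteria.term
        // Route the app to its search results screen for `term` here.
        print("Searching for \(term)")
        return .result()
    }
}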
                    
                  
                
              
                
              
              
                
Topic: App & System Services
SubTopic: Automation & Scripting
Tags: Siri and Voice, SiriKit, Intents, App Intents
                    
Is there a way, using a shell script or AppleScript, to add a custom icon to a desktop shortcut? I can create the shortcut in a script, but I have to change the icon manually.
thx much
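The question asks for shell or AppleScript; as one hedged alternative, here is a sketch in Swift (runnable as a script, for example with "swift set-icon.swift icon.png /path/to/shortcut") that uses AppKit's NSWorkspace.setIcon(_:forFile:options:) to stamp a custom icon on a file:
import AppKit

// A sketch: read an icon image and a target path from the command line and
// apply the icon with NSWorkspace. File names and usage are illustrative.
let arguments = CommandLine.arguments
guard arguments.count == 3, let icon = NSImage(contentsOfFile: arguments[1]) else {
    print("usage: set-icon <icon-image> <target-file>")
    exit(1)
}

let applied = NSWorkspace.shared.setIcon(icon, forFile: arguments[2], options: [])
print(applied ? "Icon applied." : "Failed to apply icon.")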
                    
                  
                
              
                
              
              
                
Topic: App & System Services
SubTopic: Automation & Scripting