Hi, I'm trying to create a simple app with two pressable buttons that turn a spa mode on and off within another app.
So far I have the UI figured out and the app works. The only issue is that when I press "Spa On", my app opens Siri Shortcuts and enables the shortcut there. Is there a way to make it return to my app after the action is done? Or is there another way to open the other app, navigate to the button within it, and enable it behind my app?
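In case it helps: I'm launching the shortcut via the Shortcuts URL scheme, and one idea I'm testing is its x-callback-url support, which should hand control back to my app when the shortcut finishes. A rough sketch (assuming a hypothetical myapp:// scheme registered in my app's Info.plist and a shortcut named "Spa On"):

import UIKit

// Runs a named shortcut and asks Shortcuts to call back into this app
// (via the hypothetical myapp:// scheme) when it succeeds or fails.
func runShortcutAndReturn(named name: String) {
    var components = URLComponents(string: "shortcuts://x-callback-url/run-shortcut")!
    components.queryItems = [
        URLQueryItem(name: "name", value: name),            // shortcut to run
        URLQueryItem(name: "x-success", value: "myapp://"), // return here on success
        URLQueryItem(name: "x-error", value: "myapp://")    // return here on failure
    ]
    if let url = components.url {
        UIApplication.shared.open(url)
    }
}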
Recently I've been making shortcuts, and I noticed that I can prompt ChatGPT from a shortcut to add extra functionality. I like this idea, but there is one flaw: there is no way to remove chats automatically. Every time the shortcut runs, it creates a new chat. Normally I would have the shortcut remove the chat after it finishes, but since I can't, the chats pile up, and it would be tedious to delete them all manually.
Also, sorry if I chose the wrong topic and subtopic; I couldn't find a topic about Shortcuts.
Hello everyone,
I’m currently developing a Playground App for the Swift Student Challenge, and its core functionality relies heavily on Shortcuts Automation, App Shortcuts, and interactions with Focus Mode status (e.g., reading Focus Status or executing Focus Filters).
Before finalizing my submission, I’d like to clarify whether these features will function as expected during the review process. Specifically:
Shortcuts Automation: My app uses custom shortcuts to trigger actions within the Playground. Will reviewers be able to test these shortcuts seamlessly, or do I need to provide explicit instructions for enabling/setting them up?
App Shortcuts: The app integrates system-level App Shortcuts (via App Intents). Are these supported in the test environment, and will reviewers see them during testing?
Focus Status Interaction: The app dynamically responds to changes in the device’s Focus Mode (e.g., adjusting UI and functionality based on FocusStatus). Does the evaluation environment allow access to Focus Status data, and are there restrictions on simulating Focus Mode changes?
I want to ensure these features are testable and don’t lead to unexpected issues during review. Any insights or advice from past participants, mentors, or Apple experts would be greatly appreciated!
Thank you in advance for your guidance!
(Public dupe of FB16477656)
The Shortcuts app allows you to parameterise the input for an action using variables or by allowing "Ask every time". This option DOES NOT show when conforming my AppEntity.defaultQuery struct to EntityStringQuery:
But it DOES show when conforming to EntityQuery:
As discussed in this forum post (or FB13253161), my AppEntity.defaultQuery HAS TO conform to EntityStringQuery to allow searching by String from Siri voice input.
To summarise:
With EntityQuery:
My intent looks like it supports variables via the Shortcuts app, but it will end up in an endless loop because there is no entities(matching string: String) function.
This will allow me to choose an item via the Shortcuts app UI.
With EntityStringQuery:
My intent does not support variables via the Shortcuts app.
I am not allowed to choose an item via the Shortcuts app UI.
Even weirder: if I set up the shortcut using a build with EntityQuery and then install another build with EntityStringQuery, it works as expected.
Code:
import AppIntents

/*
 Works with Siri to find a match, doesn't show "Ask every time"
 */
public struct WidgetStationQuery: EntityStringQuery {
    public init() { }

    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}

/*
 DOES NOT work with Siri to find a match, but Shortcuts shows "Ask every time"
 */
public struct WidgetBrokenStationQuery: EntityQuery {
    public init() { }

    // Note: entities(matching:) is not a requirement of EntityQuery, so the
    // system never calls it here; it only exists on EntityStringQuery.
    public func entities(matching string: String) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { $0.id.lowercased() == string.lowercased() }
    }

    public func entities(for identifiers: [Station.ID]) async throws -> [Station] {
        let stations = [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
        return stations.filter { identifiers.contains($0.id.lowercased()) }
    }

    public func suggestedEntities() async throws -> [Station] {
        return [Station(id: "car", name: "car"), Station(id: "bike", name: "bike")]
    }

    public func defaultResult() async -> Station? {
        try? await suggestedEntities().first
    }
}
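For completeness, a minimal Station entity that the queries above would compile against might look like this (the display-representation details are simplified and assumed, since the original post omits the entity definition):

import AppIntents

public struct Station: AppEntity, Identifiable {
    public static let typeDisplayRepresentation: TypeDisplayRepresentation = "Station"
    public static let defaultQuery = WidgetStationQuery()

    public var id: String
    public var name: String

    public var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    public init(id: String, name: String) {
        self.id = id
        self.name = name
    }
}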
I'm currently working on an AppIntent in my app to import Apple Pay transactions via Transaction triggers in Shortcuts. While I can access the transaction name with the following code:
@Parameter(title: "Transaction")
var transaction: String
I'm not sure how to retrieve the full details of the transaction, including:
Card or Pass
Merchant
Amount
Name
At the moment, transaction only provides the name as a string, but I need access to the complete transaction data. I know that by selecting specific fields like Amount, Merchant, etc., I can retrieve each piece of data individually, but it would be much easier and more user-friendly to simply retrieve the entire transaction object at once.
Has anyone successfully retrieved all details of an Apple Pay transaction in this context, and if so, could you share how to do so?
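I haven't found a way to receive the whole transaction as one object either; the per-field approach mentioned above can at least be wrapped in a single intent. A sketch (the parameter names and the print stand-in are placeholders of my own):

import AppIntents

struct ImportTransactionIntent: AppIntent {
    static var title: LocalizedStringResource = "Import Transaction"

    // One parameter per field; in the Shortcuts editor, bind each of these
    // to the matching variable from the Transaction trigger.
    @Parameter(title: "Name")
    var name: String

    @Parameter(title: "Merchant")
    var merchant: String

    @Parameter(title: "Amount")
    var amount: String

    @Parameter(title: "Card or Pass")
    var card: String

    func perform() async throws -> some IntentResult {
        // Replace with your own persistence; printing is a stand-in.
        print(name, merchant, amount, card)
        return .result()
    }
}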
I need to add/delete shortcuts in our app without using the Add to Siri button, but I have been stymied at every turn. The most recent example is some code that uses an "AppShortcutCenter" call, supposedly added in iOS 17, but I can't find any such API. What gives, and what's needed to create custom shortcuts on demand programmatically?
I am currently facing an issue when trying to enable Shortcuts support and would greatly appreciate your assistance in resolving it.
I have enabled Shortcuts support, and I can find my app and its actions within the Shortcuts app. However, my app does not appear when I attempt to create an automation in Shortcuts. I would appreciate any guidance or solutions you may offer.
Hi Devs,
I’ve created an App Intent shortcut for our Best Buy app. This shortcut is visible on iOS 17.2 and later. However, I’ve marked it as supporting iOS 16+, as shown below:
import AppIntents
@available(iOS 16.0, *)
struct LaunchIntent: OpenIntent {
Why are we not able to see the shortcut on iOS 16?
I've been exploring the Trails Sample App from this session at WWDC24.
The app has a TrailEntity of type AppEntity which is leveraged in multiple places throughout the app, including:
The GetTrailInfo App Intent with a trail parameter of type TrailEntity.
A parameterized App Shortcut which calls the GetTrailInfo intent.
The TrailDataManager's init calls updateSpotlightIndex(), which creates a CSSearchableItem for each Trail in the app, along with an associateAppEntity call linking the corresponding TrailEntity to each item that gets added to the CSSearchableIndex.
If you build the app and search "trails" in Spotlight, the Trails Sample App section includes instances of TrailEntity as search results. But if you comment out the App Shortcut that takes a TrailEntity as a parameter and rebuild, there are no instances of TrailEntity in the search results. In both cases, the console prints [Spotlight] Trails indexed by Spotlight.
Is this expected behavior? Why are the TrailEntity instances only appearing in Spotlight via the App Shortcut? Shouldn't the CSSearchableItem instances show up in Spotlight on their own regardless? If not, then what is the purpose of adopting Core Spotlight with App Entities? Does this add the app entities to the semantic index for "new Siri", even though they're not user facing in the Spotlight UI?
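For reference, the indexing pattern in question looks roughly like this (a sketch from memory of the session; the Trail property names, the priority value, and the requirement that TrailEntity conform to IndexedEntity for the iOS 18 associateAppEntity call are assumptions worth verifying):

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

func updateSpotlightIndex(trails: [Trail]) {
    let items = trails.map { trail in
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.title = trail.name
        let item = CSSearchableItem(
            uniqueIdentifier: trail.id,
            domainIdentifier: "trails",
            attributeSet: attributes
        )
        // Links the App Entity to the Spotlight item so the system
        // can surface it (and feed the semantic index).
        item.associateAppEntity(TrailEntity(trail: trail), priority: 0)
        return item
    }
    CSSearchableIndex.default().indexSearchableItems(items) { error in
        if let error { print("[Spotlight] indexing failed: \(error)") }
    }
}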
I have an App Intent that returns a MyEntity value with the following properties:
struct MyEntity: AppEntity {
    @Property(title: "Title")
    var title: String?

    @Property(title: "Image")
    var image: IntentFile?
}
I created a Shortcut that takes the output value of this intent and passes it as the input to the Send Message action. When I tap the MyEntity parameter in the message action, it shows to be of Type MyEntity. Below that, I can select 1 of 3 options: MyEntity, Title, or Image.
When I run the shortcut, a new message compose window appears with the following behavior depending on the selected option:
MyEntity - the message draft is empty
Title - the message draft shows the title string
Image - the message draft shows the image
My expected and desired result when MyEntity is selected would be a message draft populated with the image and the title string as text. How would I achieve this? Is it possible?
I've experimented with conforming MyEntity to Transferable. That has enabled use cases such as passing the MyEntity input as type Image, for example.
Do I need to create a custom UTType to represent MyEntity, or is that unrelated to my issue? I haven't explored this yet but seems potentially related!
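For what it's worth, the Transferable conformance I experimented with looks roughly like this (a sketch; note that receivers still pick a single representation, which may be exactly why the combined draft never appears):

import CoreTransferable
import UniformTypeIdentifiers

enum MyEntityTransferError: Error { case missingImage }

extension MyEntity: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        // Offer the image file first; receivers that want text fall back to the title.
        FileRepresentation(exportedContentType: .image) { entity in
            guard let url = entity.image?.fileURL else {
                throw MyEntityTransferError.missingImage
            }
            return SentTransferredFile(url)
        }
        ProxyRepresentation { entity in
            entity.title ?? ""
        }
    }
}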
I made a macOS document-based app that has a second scene that's a Window. Its name appears in the single-window list of the Window menu, but it has no assigned shortcut. I've tried the following to assign one, but it doesn't add the "⌘L" I want:
Window("Logs", id: "logs") {
    LogsView()
}
.keyboardShortcut("l")
I can brute-force this using .commands to replace the menu item but that seems crude and unnecessary. Is it the only way?
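The brute-force version I had in mind, in case it helps anyone compare (a sketch, assuming macOS 13+ for the openWindow environment action):

import SwiftUI

struct LogsCommands: Commands {
    @Environment(\.openWindow) private var openWindow

    var body: some Commands {
        CommandGroup(after: .windowList) {
            Button("Logs") { openWindow(id: "logs") }
                .keyboardShortcut("l") // ⌘L
        }
    }
}

This gets attached to the app with .commands { LogsCommands() }, though it adds a second "Logs" item rather than decorating the one the Window scene already contributes.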
Every time I open the Shortcuts app and go to Text to Speech, when I press Play the output is an evil laugh. What's happening? Is it a bug? I'm on iOS 18.3.
I have an image based app with albums, except in my app, albums are known as galleries.
When I annotated my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to rename my existing gallery parameter to target in order to fit the predefined shape of this domain.
Previously, my intent was configured to display as “Open Gallery” with the description “Opens the selected Gallery” in the Shortcuts app. After conforming to the photos domain, it displays as “Open Album” with a description “Opens the Provided Album”.
Shortcuts is ignoring my configured title and description now. My code builds, but with the following build warnings:
Parameter argument title of a required Assistant schema intent parameter target should not be overridden
Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden
Is my only option to change the concept of a Gallery inside of my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user.
FB16283840
Given that iOS 18.2 is out, and following the documentation and WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) on an AppIntent.
Questions:
Has anyone made this work on a real device?!
In my case (code below): when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (and the app is not foregrounded) -- changing openAppWhenRun has no effect! Strangely, if my app was backgrounded before invocation and I foreground it afterwards, it has navigated to Search; it just never foregrounded the app!
Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase -- it's a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria.
Said otherwise: what's the gain in using @AssistantIntent(schema: .system.search) over a regular AppIntent in this case?!
Some code:
@available(iOS 18.2, *)
@AssistantIntent(schema: .system.search)
struct MySearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
        return .result()
    }
}
Along with this shortcut in AppShortcutsProvider:
AppShortcut(
    intent: MySearchIntent(),
    phrases: [
        "Search \(.applicationName)"
    ],
    shortTitle: "Search",
    systemImageName: "magnifyingglass"
)
Creating my first iOS App Intents.
I created two simple App Intents: one creates a random number and the other stores it (actually it just prints it).
Yet, when I run a shortcut that connects the two, the one that stores it does not receive the entity.
It receives nil instead of the entity created in the first step.
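In case others hit this: the pattern that usually makes the hand-off work is for the first intent to declare its output via ReturnsValue, so Shortcuts knows to pipe the value into the next action. A minimal sketch (NumberEntity is a stand-in for your own AppEntity):

import AppIntents

struct MakeRandomNumberIntent: AppIntent {
    static var title: LocalizedStringResource = "Make Random Number"

    // Declaring ReturnsValue is what lets Shortcuts pass the entity along.
    func perform() async throws -> some IntentResult & ReturnsValue<NumberEntity> {
        let entity = NumberEntity(id: UUID(), value: Int.random(in: 1...100))
        return .result(value: entity)
    }
}

struct StoreNumberIntent: AppIntent {
    static var title: LocalizedStringResource = "Store Number"

    @Parameter(title: "Number")
    var number: NumberEntity

    func perform() async throws -> some IntentResult {
        print(number.value) // "stores" it
        return .result()
    }
}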
I've been following along with "App Shortcuts" development but cannot get Siri to run my intent. The intent on its own works in Shortcuts, along with a couple of others that aren't in the AppShortcutsProvider structure.
I keep getting the following two errors, but cannot figure out from the documentation or other forum posts why this is occurring.
No ConnectionContext found for 12909953344
Attempted to fetch App Shortcuts, but couldn't find the AppShortcutsProvider.
Here are the relevant snippets of code -
(1) The AppIntent definition
struct SetBrightnessIntent: AppIntent {
    static var title = LocalizedStringResource("Set Brightness")
    static var description = IntentDescription("Set Glass Display Brightness")

    @Parameter(title: "Level")
    var level: Int?

    static var parameterSummary: some ParameterSummary {
        Summary("Set Brightness to \(\.$level)%")
    }

    func perform() async throws -> some IntentResult {
        guard let level = level else {
            throw $level.needsValueError("Please provide a brightness value")
        }
        if level > 100 || level <= 0 {
            throw $level.needsValueError("Brightness must be between 1 and 100")
        }
        // do stuff with level
        return .result()
    }
}
(2) The AppShortcutsProvider (defined in my iOS app target, there are no other targets)
struct MyAppShortcuts: AppShortcutsProvider {
    static var shortcutTileColor: ShortcutTileColor = .grayBlue

    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SetBrightnessIntent(),
            phrases: [
                "set \(.applicationName) brightness to \(\.$level)",
                "set \(.applicationName) brightness to \(\.$level) percent"
            ],
            shortTitle: LocalizedStringResource("Set Glass Brightness"),
            systemImageName: "sun.max"
        )
    }
}
Does anything here look wrong? Is there some magical key that I need to specify in Info.plist to get Siri to recognize the AppShortcutsProvider?
On Xcode 16.2 and iOS 18.2 (non-beta).
Hello everyone,
I would like to dictate a text with Shortcuts and then send it to one of two e-mail addresses (private or business).
I would like to be able to select one of the two email addresses.
Unfortunately, I am not able to pass an email address as a parameter to the Send Email action. Is it possible to do this?
I'm new to Apple development and not much of a programmer, so take pity on me :)
Best, Niko
Exploring App Intents and Shortcuts. No matter what I try, Siri won't understand parameters in the initial spoken phrase.
For example, if I ask: "Start my planning for School in TestApp", Siri responds: "What's the plan?" I say "School", and Siri responds "OK, starting School plan".
What am I missing so it picks up parameters right away?
Logs inside func entities(matching string: String) only appear after the "What's the plan?" question and my "School" answer; there are no logs after the initial phrase.
I tried to use Apple's Trails example as a reference, but with no luck.
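One thing worth double-checking (a sketch, assuming the plan parameter is an AppEntity; PlanEntity and PlanStore are stand-ins): for Siri to match parameter values in the initial phrase, it needs those values ahead of time, so the query should supply suggestedEntities() and the app should call updateAppShortcutParameters() whenever the values change:

import AppIntents

struct PlanQuery: EntityStringQuery {
    // Suggested values feed Siri's phrase recognizer.
    func suggestedEntities() async throws -> [PlanEntity] {
        try await PlanStore.shared.allPlans()
    }
    func entities(matching string: String) async throws -> [PlanEntity] {
        try await suggestedEntities().filter { $0.name.localizedCaseInsensitiveContains(string) }
    }
    func entities(for identifiers: [PlanEntity.ID]) async throws -> [PlanEntity] {
        try await suggestedEntities().filter { identifiers.contains($0.id) }
    }
}

// Call once at launch and whenever the set of plans changes:
TestAppShortcuts.updateAppShortcutParameters()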
I’m trying to run a Shortcut from my iPhone’s Action button and immediately dictate input by voice, without tapping or switching modes.
Currently, it always defaults to a text field. Voice dictation works seamlessly if I trigger the shortcut via a voice command, but not when I use the Action button.
Is there a way, API, or future plan from Apple to allow voice dictation after the Action button triggers a Shortcut? Any guidance or ideas would be really helpful!
I have a very basic App Intent extension in my macOS app that does nothing more than accept two parameters, but running it in Shortcuts always produces the error "The action “Compare” could not run because an internal error occurred."
What am I doing wrong?
import AppIntents

struct CompareIntent: AppIntent {
    static let title = LocalizedStringResource("intent.compare.title")
    static let description = IntentDescription("intent.compare.description")
    static let openAppWhenRun = true

    @Parameter(title: "intent.compare.parameter.original")
    var original: String

    @Parameter(title: "intent.compare.parameter.modified")
    var modified: String

    func perform() async throws -> some IntentResult {
        return .result()
    }
}