Handle requests for your app’s services from users using Siri or Maps.

Posts under SiriKit tag

38 Posts
Post | Replies | Boosts | Views | Activity

How to support Siri in a watchOS app embedded in an iOS app (play media intent)
Project info: a music player iOS app with an embedded watchOS app. Project structure: app target (music), Intents extension (iOS platform), watchOS app target (WatchKit app), watchOS extension (WatchKit Extension). The iOS app uses the Intents extension to support SiriKit with the play media intent, and it works very well. Now I want to support Siri in the watchOS app, but I don't know how. I tried adding a new watch extension target, but it doesn't work; Siri keeps saying "my app doesn't support *** instruction". Please share any documentation or reference I may have missed that solves this problem (a sketch of one possible setup follows this post).
0
0
554
Nov ’23
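
On the watchOS question above, a minimal sketch of what an INPlayMediaIntent handler might look like if it lived in an Intents extension attached to the watch app; the target setup, the catalog lookup, and the assumption that this addresses the "doesn't support" response are all unconfirmed:

import Intents

// Hypothetical handler class for a watchOS Intents extension target.
// Assumes INPlayMediaIntent is listed in the extension's IntentsSupported;
// none of this is confirmed to resolve the error described above.
class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // Look up the requested item in the app's own catalog (placeholder logic).
        guard let search = intent.mediaSearch, let title = search.mediaName else {
            completion([.unsupported()])
            return
        }
        let item = INMediaItem(identifier: "local-id",
                               title: title,
                               type: .song,
                               artwork: nil)
        completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // Hand playback off to the watch app rather than playing inside the extension.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
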
Siri shortcut 10 s timeout
When I use the wake word to invoke my Siri shortcut extension and the request takes more than 10 s, the intent handler is automatically released by the system. Is it true that a Siri shortcut extension cannot run longer than 10 s? Is there any way to extend the timeout? (One common workaround is sketched after this post.)
0
1
432
Dec ’23
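
On the question above: Intents extensions run under a tight time budget, and one commonly used pattern is to respond quickly and hand long-running work to the host app via a .continueInApp response. A minimal sketch, assuming a custom intent generated from an .intentdefinition file (all type names are placeholders), and not a way to actually raise the extension's time limit:

import Intents

// Hypothetical handler for a custom intent generated from an .intentdefinition file;
// MyTaskIntent, MyTaskIntentHandling, and MyTaskIntentResponse stand in for the
// names Xcode generates.
class MyTaskIntentHandler: NSObject, MyTaskIntentHandling {

    func handle(intent: MyTaskIntent, completion: @escaping (MyTaskIntentResponse) -> Void) {
        let willExceedExtensionBudget = true  // placeholder for your own estimate

        if willExceedExtensionBudget {
            // Return quickly and let the host app finish the long-running work.
            let activity = NSUserActivity(activityType: "com.example.mytask.longrunning")
            completion(MyTaskIntentResponse(code: .continueInApp, userActivity: activity))
        } else {
            // Short requests can be completed inline.
            completion(MyTaskIntentResponse(code: .success, userActivity: nil))
        }
    }
}
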
Can't get audio data from INSendMessageIntent
guard let fileURL = intent.attachments?.first?.audioMessageFile?.fileURL else {
    print("Couldn't get fileNameWithExtension from intent.attachments?.first?.audioMessageFile?.fileURL?.lastPathComponent")
    return failureResponse
}
defer {
    fileURL.stopAccessingSecurityScopedResource()
}
let fileURLAccess = fileURL.startAccessingSecurityScopedResource()
print("FileURL: \(fileURLAccess)")

let tempDirectory = FileManager.default.temporaryDirectory
let tempFileURL = tempDirectory.appendingPathComponent(UUID().uuidString + "_" + fileURL.lastPathComponent)

do {
    // Check if the file exists at the provided URL
    guard FileManager.default.fileExists(atPath: fileURL.path) else {
        print("Audio file does not exist at \(fileURL)")
        return failureResponse
    }
    fileURL.stopAccessingSecurityScopedResource()

    // Check if the temporary file already exists and remove it if necessary
    if FileManager.default.fileExists(atPath: tempFileURL.path) {
        try FileManager.default.removeItem(at: tempFileURL)
        print("Removed existing temporary file at \(tempFileURL)")
    }

    // Copy the audio file to the temporary directory
    try FileManager.default.copyItem(at: fileURL, to: tempFileURL)
    print("Successfully copied audio file from \(fileURL) to \(tempFileURL)")

    // Update your response based on the successful upload
    // ...
} catch {
    // Handle any errors that occur during file operations
    print("Error handling audio file: \(error.localizedDescription)")
    return failureResponse
}

guard let audioData = try? Data(contentsOf: tempFileURL), !audioData.isEmpty else {
    print("Couldn't get audioData from intent.attachments?.first?.audioMessageFile?.data")
    return failureResponse
}

Output:

FileURL: false
Audio file does not exist at file:///var/mobile/tmp/SiriMessages/BD57CB69-1E75-4429-8991-095CB90959A9.caf

Is there something I'm missing? (A note on the security-scoped access pattern follows this post.)
1
0
428
Feb ’24
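
One detail in the code above is that stopAccessingSecurityScopedResource() is called unconditionally (and again inside the do block) even though startAccessingSecurityScopedResource() returned false. The documented pattern is to balance a stop call only against a start call that returned true. A minimal sketch of that pattern with an arbitrary copy destination; whether it resolves the missing-file error is an open question:

import Foundation

// Sketch: copy a possibly security-scoped attachment URL to a temporary file.
func copyAttachmentToTemporaryFile(from fileURL: URL) throws -> URL {
    // Only balance a stop call against a start call that actually succeeded.
    let didStartAccessing = fileURL.startAccessingSecurityScopedResource()
    defer {
        if didStartAccessing {
            fileURL.stopAccessingSecurityScopedResource()
        }
    }

    let tempFileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString + "_" + fileURL.lastPathComponent)

    if FileManager.default.fileExists(atPath: tempFileURL.path) {
        try FileManager.default.removeItem(at: tempFileURL)
    }

    // Copy while access (if granted) is still held; the defer runs only after this returns.
    try FileManager.default.copyItem(at: fileURL, to: tempFileURL)
    return tempFileURL
}
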
Converting SiriKit Custom Intents with non-configurable parameters to AppIntents
I have an intent definition file for a Custom Intent that I want to convert to an AppIntent. The Custom Intent does not have the "Configurable in Shortcuts" checkbox checked, so the "Convert to App Intent" button is greyed out. I can, however, still convert it via the menu item Editor > Convert to App Intent. The intent has a number of parameters that are not configurable but were set in code; this made it possible to donate shortcuts with the parameters (and even the title) set in code. The automatic conversion via the menu item, however, produces a result that does not match the legacy Custom Intent (parameters appear in the Shortcuts app, etc.). I also did not find any way to create AppIntents whose parameters can be set in code before the intent is donated. I would leave the old legacy Custom Intents as they are, but as soon as I use any of the new iOS 16 shortcut features (App Shortcuts), the existing donated Custom Intents disappear from the Shortcuts app. Given the apparent inability to convert them to AppIntents because of the code-set parameters, I would appreciate any advice on potential solutions. (A donation sketch follows this post.)
0
0
504
Feb ’24
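
On the question above about parameters set in code: with App Intents, parameter values can at least be populated programmatically before a donation. A minimal sketch, assuming iOS 16's IntentDonationManager and entirely placeholder type names; whether the donated intent then behaves like the legacy Custom Intent in the Shortcuts app is exactly the open question in the post:

import AppIntents

// Hypothetical intent whose parameter is set in code before donating.
struct LogVisitIntent: AppIntent {
    static var title: LocalizedStringResource = "Log Visit"

    @Parameter(title: "Place Name")
    var placeName: String

    func perform() async throws -> some IntentResult {
        // App-specific work would go here.
        return .result()
    }
}

// Donation with code-populated parameters (all values here are placeholders).
func donateVisit(named name: String) async {
    var intent = LogVisitIntent()
    intent.placeName = name
    do {
        _ = try await IntentDonationManager.shared.donate(intent: intent)
    } catch {
        print("Donation failed: \(error)")
    }
}
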
Siri intents crashing when resolving with needsValue
So I'm working on a logging app that uses Siri to log diaper changes for babies. There are three types of diaper changes: wet, dirty, and both. I created an enum for these values in the intent definition file and made the parameter configurable and resolvable. In the resolve function I added this code:

public func resolveDiaperType(for intent: DiaperIntentIntent, with completion: @escaping (DiaperTypeResolutionResult) -> Void) {
    let needsValue = intent.diaperType == .unknown
    if needsValue {
        completion(.needsValue())
    } else {
        completion(.success(with: intent.diaperType))
    }
}

But as soon as .needsValue() is called, the UI asks the user to select a value and then the app crashes. I tried removing a lot of different parameters and code blocks; needsValue is the only thing that crashes for me. If I give the diaperType parameter a default of .dirty instead of .unknown, it works. Basically it won't let me work with an empty enum parameter; I get a SIGABRT error and the app crashes. I have four intents, and three of them use enums. All three crash on the enum input UI, and all three work correctly when the enum is given a value instead of .unknown. The problem is, I NEED to ask the user for the type. If I give it a default value and still resolve with .needsValue(), it crashes, so I cannot ask the user for a value. I have made Siri intents with enum inputs before, and those intents STILL WORK; they were just made with older Xcode versions. Is this an Xcode bug? Testing on the iOS 17.2 simulator with Xcode 15.2.
0
0
450
Mar ’24
Build error when debugging a Siri Intents extension for Apple Watch
I want to build an Apple Watch app with SiriKit, so I integrated Siri intents into the Xcode project, but I hit the following build error while debugging the intents. Please advise.

Failed to install the app on the device.
Domain: com.apple.dt.CoreDeviceError
Code: 3002
User Info: {
    DVTErrorCreationDateKey = "2024-03-11 02:38:02 +0000";
    IDERunOperationFailingWorker = IDEInstallCoreDeviceWorker;
    NSURL = "file:///Users/cion/Library/Developer/Xcode/DerivedData/Hydrator-dqlkxgxzgyjnzwhgguipzumyovfi/Build/Products/Debug-iphoneos/Hydrator.app/";
}

“Hydrator” cannot be installed
Domain: IXUserPresentableErrorDomain
Code: 6
Failure Reason: This app was not built for this device.
Recovery Suggestion: This app was not built to support this device family; app is compatible with ( 1, 2 ) but this device supports ( 4 )

This app was not built to support this device family; app is compatible with ( 1, 2 ) but this device supports ( 4 )
Domain: MIInstallerErrorDomain
Code: 10
User Info: {
    FunctionName = MIIsApplicableToCurrentDeviceFamilyWithError;
    LegacyErrorString = DeviceFamilyNotSupported;
    SourceFileLine = 86;
}

Event Metadata: com.apple.dt.IDERunOperationWorkerFinished : {
    "device_isCoreDevice" = 1;
    "device_isWireless" = 1;
    "device_model" = "Watch5,2";
    "device_osBuild" = "10.3.1 (21S651)";
    "device_platform" = "com.apple.platform.watchos";
    "dvt_coredevice_version" = "355.7.7";
    "dvt_mobiledevice_version" = "1643.60.2";
    "launchSession_schemeCommand" = Run;
    "launchSession_state" = 1;
    "launchSession_targetArch" = "arm64_32";
    "operation_duration_ms" = 4026;
    "operation_errorCode" = 6;
    "operation_errorDomain" = "com.apple.dt.CoreDeviceError.3002.IXUserPresentableErrorDomain";
    "operation_errorWorker" = IDEInstallCoreDeviceWorker;
    "operation_name" = IDERunOperationWorkerGroup;
    "param_debugger_attachToExtensions" = 1;
    "param_debugger_attachToXPC" = 1;
    "param_debugger_type" = 1;
    "param_destination_isProxy" = 0;
    "param_destination_platform" = "com.apple.platform.watchos";
    "param_diag_MainThreadChecker_stopOnIssue" = 0;
    "param_diag_MallocStackLogging_enableDuringAttach" = 0;
    "param_diag_MallocStackLogging_enableForXPC" = 1;
    "param_diag_allowLocationSimulation" = 1;
    "param_diag_checker_tpc_enable" = 1;
    "param_diag_gpu_frameCapture_enable" = 0;
    "param_diag_gpu_shaderValidation_enable" = 0;
    "param_diag_gpu_validation_enable" = 0;
    "param_diag_memoryGraphOnResourceException" = 0;
    "param_diag_queueDebugging_enable" = 1;
    "param_diag_runtimeProfile_generate" = 0;
    "param_diag_sanitizer_asan_enable" = 0;
    "param_diag_sanitizer_tsan_enable" = 0;
    "param_diag_sanitizer_tsan_stopOnIssue" = 0;
    "param_diag_sanitizer_ubsan_stopOnIssue" = 0;
    "param_diag_showNonLocalizedStrings" = 0;
    "param_diag_viewDebugging_enabled" = 1;
    "param_diag_viewDebugging_insertDylibOnLaunch" = 1;
    "param_install_style" = 0;
    "param_launcher_UID" = 2;
    "param_launcher_allowDeviceSensorReplayData" = 0;
    "param_launcher_kind" = 0;
    "param_launcher_style" = 0;
    "param_launcher_substyle" = 2;
    "param_runnable_appExtensionHostRunMode" = 0;
    "param_runnable_productType" = "com.apple.product-type.app-extension";
    "param_structuredConsoleMode" = 1;
    "param_testing_launchedForTesting" = 0;
    "param_testing_suppressSimulatorApp" = 0;
    "param_testing_usingCLI" = 0;
    "sdk_canonicalName" = "watchos10.2";
    "sdk_osVersion" = "10.2";
    "sdk_variant" = watchos;
}

System Information:
macOS Version 14.3.1 (Build 23D60)
Xcode 15.2 (22503) (Build 15C500b)
Timestamp: 2024-03-11T11:38:02+09:00
0
0
456
Mar ’24
PTT in the background, cannot activate Siri without unlocking
Hello, we're interested in using the PTT framework with our PTT-capable hardware, as the framework intends. The problem is that activating Siri with any of our specified intents doesn't work while the phone is locked; the iPhone always says "You'll have to unlock your iPhone first". Reading up on the problem, it seems quite common: Apple doesn't allow Siri intents to be executed while the phone is locked. That's a sensible precaution by default, but there are countless threads describing real use cases where users want to use Siri without unlocking (with PTT, or without). There appear to be no options in PTT to enable this, no flags on the Siri intent to allow benign app actions or queries, no user-facing setting under Settings > Siri & Search to allow it manually while the phone is locked, and no entitlements (that I'm aware of) that would allow trivial, non-sensitive Siri App Intents. The only advice we have for our users (albeit against the intention of the limitation in the first place) is to disable Auto-Lock, Face ID, and the passcode. It is 2024, and users do expect a better experience than this with Siri, or am I missing something?
1
0
499
Jun ’24
MyApp hasn't added support for that with Siri
I added a custom Intents extension to my project. Everything seems to be correctly implemented:
- Siri appears in the target's Signing & Capabilities
- NSUserActivityTypes in Info.plist specifies the custom intent
- Siri is specified in the project entitlements
- NSExtensionPrincipalClass in the intent extension's Info.plist specifies the intent class

The app asks permission to use Siri, and I confirm it. The app implements an "Add to Siri" button, and I add the shortcut. If I start the shortcut manually, I can perform all the provided actions. If I run the app executable on the simulator, I can perform all the actions using Siri with my voice. But if I run the app executable on the device, when I call Siri it responds "(app name) hasn't added support for that with Siri". If I try to invoke the shortcut directly with Siri on the device, Siri gives the same response. I googled, but I still can't understand what is going wrong. Are there settings I'm forgetting? Any help would be appreciated, thank you! (A donation sketch follows this post.)
2
0
461
Mar ’24
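
One piece that is easy to overlook alongside the checklist above is the donation itself. A minimal sketch of donating an INInteraction for a custom intent; MyCustomIntent stands in for the class generated from the .intentdefinition file, and a missing donation is only one possible cause of the "hasn't added support" response:

import Intents

// Donate an interaction so the system associates the custom intent with the app.
// MyCustomIntent is a placeholder for the generated intent class.
func donateCustomIntent() {
    let intent = MyCustomIntent()
    intent.suggestedInvocationPhrase = "Run my action"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
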
Issue with Siri Intent or App Intent not functioning properly in Speech Framework
Description:

Problem statement: The Siri intent for the "Next", "Previous", and "Repeat" commands is not working as expected with the Speech framework.

Steps to reproduce:
1. Open the app that uses the Speech framework.
2. Tap the Siri button to activate voice input.
3. Say "Next" to trigger the intended action.
4. Observe that the action is not executed correctly.

Steps in our demo application:
1. Open Siri and speak: "Check". In response, a dialog opens: "What does the user want? 1) One 2) Next 3) Yes 4) Goodbye".
2. Speak: "Next". In response, Siri repeats the same dialog from step 1.
3. Speak: "Yes", "One", or "Goodbye". In response, Siri goes to the next dialog.

Expected behavior: The "Next" value should arrive in the SiriKit intent or App Intent.

Actual behavior: The previously spoken keyword is delivered to the SiriKit intent, and the App Intent recursively repeats the dialog.

Device, region, and language: iPhone 11, iOS 17.4.1, region US, language English (US).

Impact: The user cannot use an iterative dialog within one context.

Additional: how the different commands behave with the SiriKit intent and the App Intent on different devices (follow the rows in order by the No column):

No | Device and scenario                             | SiriKit intent | App Intent
1  | ISG iPhone 11 - Next                            | No             | No
2  | ISG iPhone 11 - Yes                             | No             | Yes (but using an enum)
3  | ISG iPhone 11 - GoodBye                         | No             | Yes (but using an enum)
4  | ISG iPhone 11 - One                             | Yes            | Yes
5  | iPad - Next                                     | No             | No
6  | iPad - One                                      | Yes            | Yes
7  | iPad - GoodBye                                  | No             | Yes
8  | iPad - Yes                                      | No             | Yes
9  | Simulator - iPhone 15 - Next, Yes, One, GoodBye | Yes            | Yes

Please help me with this. (An enum-based sketch follows this post.)
0
0
454
Apr ’24
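
Since the table above suggests enum-backed parameters fare better with App Intents, here is a minimal sketch of an App Intents enum for those commands. All type names are placeholders, and this is not a confirmed workaround for "Next" being intercepted:

import AppIntents

// Placeholder enum for the dialog commands discussed above.
enum DialogCommand: String, CaseIterable, AppEnum {
    case one, next, yes, goodbye

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Dialog Command")

    static var caseDisplayRepresentations: [DialogCommand: DisplayRepresentation] = [
        .one: "One",
        .next: "Next",
        .yes: "Yes",
        .goodbye: "Goodbye"
    ]
}

// Hypothetical intent that takes the command as a required parameter.
struct DialogCommandIntent: AppIntent {
    static var title: LocalizedStringResource = "Dialog Command"

    @Parameter(title: "Command")
    var command: DialogCommand

    func perform() async throws -> some IntentResult {
        // Route the command to the app's dialog state machine here.
        return .result()
    }
}
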
Can we add an action to the 'Shortcuts' app in macOS using 'Intents extension'?
I have a macOS application and I want to provide an action for it in the Shortcuts app's quick-action list. I was using in-app handling to present my intent, created in the intent definition file, as an action in the Shortcuts app, and it was working. However, this action performs a very lightweight task, so I intend to implement it as an extension in my Xcode project. Given my minimum deployment target (macOS 11.0), I found that an Intents extension could be used. I added an Intents extension target to my main application and created an intent via the intent definition file. However, my intent does not appear in the Shortcuts app. I have verified everything multiple times to make sure I am not missing anything, but the intent is still not present among the Shortcuts app actions. I wanted to know: is this even possible? The Apple documentation only mentions iOS and watchOS apps, and it does not say whether a custom intent created in an Intents extension can be exposed to the Shortcuts app. For macOS 13.0+, I used an AppIntents extension and was able to achieve this, so I assume the same should be possible using an Intents extension. (The in-app handling path is sketched after this post.)
1
0
293
May ’24
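
For reference, the in-app handling path that the post above says already works looks roughly like this, assuming macOS 12 or later where the Shortcuts app is available; MyQuickActionIntent and its handler are placeholder names, and whether an Intents extension can expose the same action to Shortcuts on macOS remains the unresolved question:

import AppKit
import Intents

// In-app intent handling on macOS: the app itself (not an extension) vends the handler.
class AppDelegate: NSObject, NSApplicationDelegate {

    func application(_ application: NSApplication, handlerFor intent: INIntent) -> Any? {
        // MyQuickActionIntent / MyQuickActionIntentHandler are placeholders for the
        // class generated from the intent definition file and its handler.
        if intent is MyQuickActionIntent {
            return MyQuickActionIntentHandler()
        }
        return nil
    }
}
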
AppIntentVocabulary (INPlayMediaIntent) is unstable.
I am developing an iOS app that supports INPlayMediaIntent. We are trying to increase the recognition rate of content names, which are song titles, using AppIntentVocabulary. As a sample, some extracts are shown below.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>IntentPhrases</key>
    <array>
        <dict>
            <key>IntentName</key>
            <string>INPlayMediaIntent</string>
            <key>IntentExamples</key>
            <array>
                <string>Mezamashi Appで湖畔の朝を再生</string>
                <string>湖畔の朝をMezamashi Appで再生して</string>
            </array>
        </dict>
    </array>
    <key>ParameterVocabularies</key>
    <array>
        <dict>
            <key>ParameterNames</key>
            <array>
                <string>INPlayMediaIntent.playlistTitle</string>
            </array>
            <key>ParameterVocabulary</key>
            <array>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID1</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>aogamagaeru</string>
                            <key>VocabularyItemPhrase</key>
                            <string>青ガマガエル</string>
                        </dict>
                    </array>
                </dict>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID2</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>kohon no asa</string>
                            <key>VocabularyItemPhrase</key>
                            <string>湖畔の朝</string>
                        </dict>
                    </array>
                </dict>
                <dict>
                    <key>VocabularyItemIdentifier</key>
                    <string>ID3</string>
                    <key>VocabularyItemSynonyms</key>
                    <array>
                        <dict>
                            <key>VocabularyItemPronunciation</key>
                            <string>kumageratachi no uta</string>
                            <key>VocabularyItemPhrase</key>
                            <string>クマゲラたちの歌</string>
                        </dict>
                    </array>
                </dict>
            </array>
        </dict>
    </array>
</dict>
</plist>

When running on the iOS 17.5 simulator in Xcode 15.4, the results are as follows (mediaName = VocabularyItemIdentifier, mediaIdentifier = nil):

<INMediaSearch: 0x6000026212c0> {
    reference = 0;
    mediaType = 0;
    sortOrder = 0;
    albumName = <null>;
    mediaName = ID1;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}

However, when running on an iOS 17.5 device, the following applies (mediaName = VocabularyItemPhrase, mediaIdentifier = VocabularyItemIdentifier):

<INMediaSearch: 0x301efd9e0> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = 青ガマガエル;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = ID1;
}

The results are not stable; for example, sometimes everything else returns null. I have tried everything, but it is just taking a long time. Does anyone have any advice on this?
1
0
219
Jun ’24
Framework compilation fails with "Command CodeSign failed with a nonzero exit code"
I am trying to extract a protocol from an app to be able to use it in a Siri intent. Yet when I compile it I get:

note: Injecting stub binary into codeless framework (in target 'Virtual Tags Framework' from project 'Virtual Tags Framework')
/Users/fabriziobartolomucci/Library/Developer/Xcode/DerivedData/Virtual_Tags_Framework-chxutmulwgujeiceazyyzaphwner/Build/Products/Debug-iphonesimulator/Virtual_Tags_Framework.framework/Frameworks/ARKit.framework/Versions/A: bundle format unrecognized, invalid, or unsuitable
Command CodeSign failed with a nonzero exit code
1
0
189
1w
SiriKit Extension still needed with AppIntents?
I have followed the SoupChef example in migrating Custom Intents from SiriKit to AppIntents. However, we only require one iOS release back, so we can require iOS 17. Thus, I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension that required enormous amounts of code to try to coordinate with the real app which worked poorly anyway. I tested for example that an NFC tag Automation created in Shortcuts works to execute an AppIntent while the app is backgrounded. I am now receiving a beta report that indicates someone trying to execute one of our migrated AppIntents from their HomePod is not working, and they say it used to work sometimes (not all the time). I'm sure most such cases used to require the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again when the new (to me) AppIntent API seemed so beautiful without it and seemed to work without it. The AppIntent API documentation seems to indicate that SiriKit Extensions are no longer related or required. What is the truth here? Do I need to re-implement everything twice in the SiriKit Extension like a barbarian, or can we live in the new world with AppIntents? Thank you.
1
0
205
2d
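
On the question above, the background-execution side of a migrated App Intent (without a SiriKit extension) normally comes down to not opening the app when the intent runs. A minimal sketch with placeholder names; it says nothing about the HomePod routing problem reported in the beta feedback:

import AppIntents

// Placeholder intent that runs without foregrounding the app.
struct RefreshDataIntent: AppIntent {
    static var title: LocalizedStringResource = "Refresh Data"
    // false lets Shortcuts/Siri run the intent while the app stays in the background.
    static var openAppWhenRun: Bool = false

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific refresh work would go here.
        return .result(dialog: "Data refreshed.")
    }
}
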
Muting Siri via App Intents
We're using App Intents to launch and control our app via Siri. Siri's responses have been fairly random: some show a "Done" popup, others give a verbal confirmation, and others say "I'm sorry, there's been a problem". The latter is bogus and doesn't look good to potential investors when the app is actually working fine. There appears to be no way in code, that I've found so far, to tell Siri to keep quiet and let us handle our own errors. Otherwise, is there a way to supply Siri with a dictionary of stored messages that could be triggered inside the app? (A response-control sketch follows this post.)
1
0
193
2d
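
On the response question above, two levers that do exist in App Intents are returning a dialog from perform() and throwing an error whose message Siri can present instead of the generic "there's been a problem". A minimal sketch with placeholder names (PlaybackController is assumed app code); it does not silence Siri entirely:

import AppIntents
import Foundation

// Placeholder error type with a user-facing message Siri can speak or show.
enum PlaybackError: Error, CustomLocalizedStringResourceConvertible {
    case deviceUnreachable

    var localizedStringResource: LocalizedStringResource {
        switch self {
        case .deviceUnreachable: return "The speaker is unreachable right now."
        }
    }
}

struct StartPlaybackIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Playback"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // PlaybackController is a placeholder for the app's own playback layer.
        guard PlaybackController.shared.isConnected else {
            // Throwing surfaces our own message instead of a generic failure.
            throw PlaybackError.deviceUnreachable
        }
        // Returning a dialog controls what Siri says on success.
        return .result(dialog: "Playback started.")
    }
}
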