Posts under App & System Services topic

Post

Replies

Boosts

Views

Activity

MacBook M5 Development Kernel Panic
Hi, I'm posting a boot crash here.

Environment
- Hardware: MacBook M5 Pro
- OS Version: macOS 26.3.1 (25D2128), with the matching KDK from the official Apple download page
- Kernel Version: Darwin Kernel Version 25.3.0
- Reproducibility: Consistent

Here is my panic log. I truncated one field, "SOCDNandContainer", because the original log is too long to post and hits the size limit. I followed a blog post to boot the development kernel, since the ReadMe file in the KDK only contains instructions for Intel Macs: https://jaitechwriteups.blogspot.com/2025/10/boot-custom-macos-kernel-on-macos-apple.html

I tried a few 26.2 KDKs before the 26.3.1 public launch, and they all showed the same errors (the 26.1 and 26.0 KDKs don't include any development kernel for the T8142 chip). I also own two fresh M5 Pros, and the panic is consistent across both machines. The highlight is:

panic(cpu 8 caller 0xfffffe0050e18010): [Exclaves] $JgOSLogServerComponent.RedactedLogServer.init(logServerNotific:OSLogServerComponent/OSLogServerComponent_Swift.swift:815: Fatal error: invalid rawValue for TightbeamComponents.RedactedLogSer at PC ...

Is this a genuine bug, or am I following the wrong guide to boot the development kernel? I don't think the blog is wrong, because I can boot the "release" kernel included in the KDK on the same M5 Pro, and the "development" kernel on an M4 Mac mini, using the same routine. To be clear, I'm not compiling XNU myself; I'm using the kernels included in the kit.
0
0
158
3w
Parameter recognition on AppShortcuts invocation not consistent
While playing around with AppShortcuts I've been encountering some problems getting the invocation phrase detected and/or the parameter recognized after the invocation phrase via Siri. I've found some solutions and explanations in other posts here (Siri not recognizing the parameter in the phrase & Inform iOS about AppShortcutsProvider), but I still have one issue, and it's about consistency.

For context, I've defined the parameter to be an AppEntity whose query conforms to the EntityStringQuery protocol, in order to fetch entities with the string given by Siri:

```swift
struct AnIntent: AppIntent {
    // other parts hidden for clarity
    @Parameter
    var entity: ModelEntity
}
```

For an invocation phrase akin to "Do something with [entity] in [app name]", if the user speaks the phrase with an entity previously donated via suggestedEntities(), the AppShortcut executes without problems. If the user speaks a phrase with no parameter, like "do something with [app name]", the user gets asked to input the missing parameter, and after inputting one it may or may not be recognized; the user may be asked to input a parameter again, as if in a loop. This happens even if the parameter given is one that was donated. I've found that when this happens, the entities(matching string: String) function in the EntityQuery doesn't get called. The input can be one word or sometimes two, and it still isn't called. In other words, entities(matching string: String) does not get called on every user parameter input.

Is this behavior correct? Do parameters have restrictions on length or anything else? Does Siri show the user suggested entities when asking for entity input? It doesn't on my end.

An additional question related to AppShortcuts: in an AppShortcut definition, where is the summary inside the parameter presentation used? I see it defined in the AppIntentsSampleApp for the GetTrailInfo intent, but I didn't find where it was used.
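For reference, a minimal sketch of the entity-plus-string-query shape being described; the type names and sample data here are stand-ins, not the poster's actual code:

```swift
import AppIntents

struct ModelEntity: AppEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Model")
    static let defaultQuery = ModelEntityQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct ModelEntityQuery: EntityStringQuery {
    // Called when Siri/Shortcuts hands over a raw string for the parameter.
    func entities(matching string: String) async throws -> [ModelEntity] {
        allModels.filter { $0.name.localizedCaseInsensitiveContains(string) }
    }

    // Called to resolve previously saved identifiers.
    func entities(for identifiers: [String]) async throws -> [ModelEntity] {
        allModels.filter { identifiers.contains($0.id) }
    }

    // Entities Siri can offer proactively.
    func suggestedEntities() async throws -> [ModelEntity] {
        allModels
    }

    // Hypothetical in-memory data source for the sketch.
    private var allModels: [ModelEntity] {
        [ModelEntity(id: "1", name: "Alpha"), ModelEntity(id: "2", name: "Beta")]
    }
}
```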
0
0
114
Apr ’25
Multiple uploads happening for createItem in File Provider extension
I'm new to Swift and iOS development. I'm trying to create a File Provider extension for my app. I have been able to implement the createItem, modifyItem, and fetchContents functions. But when I add a GarageBand file or a large MP3 file (18 MB), I can see multiple uploads happening. I checked the FruitBasket sample project, where they do a chunked upload when the file size is more than 100 MB. How do I fix this so only one upload happens? I'm getting suggestions that I should debounce the uploads, but that doesn't seem like a proper solution.
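One pattern that may help diagnose this (a sketch, not a confirmed fix): in a replicated extension the system reports which fields changed, so a re-upload can be skipped when the contents haven't changed and only metadata did. This assumes an NSFileProviderReplicatedExtension; `uploader` is a hypothetical helper:

```swift
import FileProvider

extension MyFileProviderExtension {
    func modifyItem(_ item: NSFileProviderItem,
                    baseVersion version: NSFileProviderItemVersion,
                    changedFields: NSFileProviderItemFields,
                    contents newContents: URL?,
                    options: NSFileProviderModifyItemOptions = [],
                    request: NSFileProviderRequest,
                    completionHandler: @escaping (NSFileProviderItem?, NSFileProviderItemFields, Bool, Error?) -> Void) -> Progress {
        let progress = Progress(totalUnitCount: 1)
        // Only upload when the system says the file contents actually changed;
        // metadata-only changes (name, dates, flags) don't need a new upload.
        if changedFields.contains(.contents), let url = newContents {
            // `uploader` is a hypothetical network helper, not a framework API.
            uploader.upload(url, for: item.itemIdentifier) { _ in
                completionHandler(item, [], false, nil)
                progress.completedUnitCount = 1
            }
        } else {
            completionHandler(item, [], false, nil)
            progress.completedUnitCount = 1
        }
        return progress
    }
}
```

Large packages like GarageBand documents are written as many items, so several createItem/modifyItem calls are expected; the goal is to avoid re-sending unchanged bytes rather than to force a single call.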
0
0
371
Dec ’25
Custom AppEntity nested dictionary
I am creating an AppIntent to be used with Shortcuts, and I would like to return a flexible dictionary of values with nested structures. As far as I understand, a custom AppEntity only uses the displayRepresentation to store a title and subtitle, which are LocalizedStringResource types. Although I can convert my dictionary into a string, I found no way in Shortcuts to retrieve the original structure and inspect individual elements in subsequent actions. Is there a way to do this? Thank you in advance. Nick Karanatsios
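For reference, one commonly suggested alternative (an assumption here, not from the original post): expose the values as typed @Property members on the entity, which Shortcuts can then access individually in later actions. A sketch with made-up field names:

```swift
import AppIntents

struct ReportEntity: AppEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Report")
    static let defaultQuery = ReportQuery()

    var id: UUID

    // Typed properties are visible to Shortcuts, unlike a flattened dictionary.
    @Property(title: "Author")
    var author: String

    @Property(title: "Score")
    var score: Double

    init(id: UUID, author: String, score: Double) {
        self.id = id
        self.author = author
        self.score = score
    }

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "Report \(id.uuidString)")
    }
}

struct ReportQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [ReportEntity] { [] }
}
```

Truly arbitrary nested dictionaries have no first-class representation in Shortcuts, so flattening the parts you need into declared properties is the usual workaround.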
0
0
242
Dec ’25
How to debug a CoreSpotlight extension?
My CoreSpotlight extension seems to exceed the 6 MB memory limit. What's the best way to debug this? I've tried to attach the debugger in the Simulator, but the extension never seems to be launched when I trigger the reindex from Developer settings. Is this supposed to work? On device, I am able to attach the debugger. However, I can neither transfer the debug session to Instruments nor display the memory graph, so I have no idea how the memory is used. Any recommendations on how to move forward? Is there a way to temporarily disable the memory limit, given that even with LLDB attached, the extension is killed?
0
1
193
Apr ’25
FSKit removeItem Not Being Called
Environment
- macOS Version: 26.1
- Xcode Version: 16.2

Description
I'm developing a custom file system using FSKit and have encountered an issue where the removeItem(_:named:fromDirectory:) method in my FSVolume.Operations implementation is not being invoked when attempting to delete files or directories through Finder or the command line.

Implementation
My volume implements the required FSVolume.Operations protocol with the following removeItem implementation:

```swift
func removeItem(
    _ item: FSItem,
    named name: FSFileName,
    fromDirectory directory: FSItem
) async throws {
    logger.info("remove: \(name)")
    if let item = item as? MyFSItem, let directory = directory as? MyFSItem {
        directory.removeItem(item)
    } else {
        throw fs_errorForPOSIXError(POSIXError.EIO.rawValue)
    }
}
```

Steps to Reproduce
1. Mount the custom FSKit-based file system using: mount -F -t MyFS /dev/diskX /tmp/mountpoint
2. Create files using Finder or Terminal (works correctly - createItem is called)
3. Attempt to delete a file using any of the following methods:
   - Terminal command: rm -rf /path/to/mounted/file
   - Option + Cmd + Delete to remove the file in Finder

Expected Behavior
The removeItem(_:named:fromDirectory:) method should be called, logging "remove: [filename]" and removing the item from the directory's children collection.

Actual Behavior
The removeItem method is never invoked. No logs appear from this method in Console.app. The deletion operation either fails silently or returns an error, but the callback never occurs.

Additional Context
Working operations - the following work correctly:
- createItem - files and directories can be created
- lookupItem - items can be looked up successfully
- enumerateDirectory - directory listing works
- read and write - file I/O operations work correctly

Volume state:
- The volume is properly mounted and accessible
- Files can be created, read, and written successfully

Volume capabilities configured:

```swift
var supportedVolumeCapabilities: FSVolume.SupportedCapabilities {
    let capabilities = FSVolume.SupportedCapabilities()
    capabilities.supportsHardLinks = true
    capabilities.supportsSymbolicLinks = true
    capabilities.supportsPersistentObjectIDs = true
    capabilities.doesNotSupportVolumeSizes = true
    capabilities.supportsHiddenFiles = true
    capabilities.supports64BitObjectIDs = true
    capabilities.caseFormat = .insensitiveCasePreserving
    return capabilities
}
```

Questions
1. Are there specific volume capabilities or entitlements required for removeItem to be invoked?
2. Is there a specific way deletion operations need to be enabled in FSKit?
3. Could this be related to how file permissions or attributes are set during createItem?
4. Are there any known issues with deletion operations in the current FSKit implementation?
5. Do I need to implement additional protocols or set specific flags to support item deletion?

Any guidance would be greatly appreciated. Has anyone successfully implemented deletion operations in FSKit? Thank you!
0
0
248
Nov ’25
KDK for current stable version (26.1) missing
The current stable macOS version, 26.1 (build 25B78), is missing a corresponding Kernel Debug Kit (KDK) on the developer downloads page. This means I can't do any kernel-level development tasks at the moment. For example, if I try to build a new kernel collection with kmutil, I get the message:

Missing Developer Kit: As of macOS 13.0, you will need to install a KDK matching your build 25B78 to rebuild kernel collections.

but there is no build 25B78 KDK available to download. The latest 26.1 KDK on the download page is 25B5062e (from a beta, I believe), and the final stable KDK for build 25B78 (which kernel development tools require) was never published. Is there any workaround to correctly do kernel-level development targeting the latest stable release, or a timeline for when the KDK will be released? Thanks!
0
3
332
Nov ’25
Deliver/bundle entire Shortcut automations with an app
Is there any way to create complete Shortcuts automations and bundle them with my app? Specifically, I would like the user to be able to:
- Take a photo and open it with my app, or
- Take a screenshot and open it with my app.

Of course I could offer a Share extension, but going through the Share menu and selecting my app there is time-consuming for the user. I would like the user to be able to configure their Action button so that it takes a new picture and opens it with my app right away. I can, of course, offer the respective App Shortcuts (a minimal provider sketch follows below) and let the user combine them into a pipeline with the Take Screenshot or Take Photo system actions. However, only power users would do this. Hence, I would like to bundle this complete pipeline with my app, so that the user just has to assign their Action button to this pipeline to use the feature. How should I go about this? I was thinking of exporting the shortcut into a file, bundling it with the app as a resource, and offering it via a Share action for the user to install, or sharing it on iCloud and adding the iCloud link to the UI of my app. What is the recommended approach?
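For reference, the App Shortcuts half of this is straightforward; a minimal sketch, where OpenImageIntent and its behavior are hypothetical stand-ins for the app's real intent:

```swift
import AppIntents

// Hypothetical intent that receives an image file and opens it in the app.
struct OpenImageIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Image"
    static let openAppWhenRun = true

    @Parameter(title: "Image")
    var image: IntentFile

    func perform() async throws -> some IntentResult {
        // Hand the file to the app's import pipeline here.
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenImageIntent(),
            phrases: ["Open an image in \(.applicationName)"],
            shortTitle: "Open Image",
            systemImageName: "photo"
        )
    }
}
```

Chaining this after a system Take Photo/Take Screenshot action still has to happen in the Shortcuts editor, which is why the question of distributing a prebuilt shortcut (file or iCloud link) remains open.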
0
0
354
Dec ’25
Notification Tones of tonetype SystemSounds Are Not Routing to Connected Bluetooth HFP Accessory (WM500)
When an iOS device is connected to a Bluetooth accessory that uses the Hands-Free Profile (HFP), we are encountering incorrect audio routing behavior specifically for system notification tones.
- Accessory connected: the iOS device is successfully connected to a Bluetooth accessory (specifically, a WM500 device) using the HFP profile for voice communication.
- Voice audio: audio streams related to phone calls or voice communication (using the HFP/SCO link) are correctly routed to the WM500.
- Notification tones issue: system notification tones, played via the tonetype.systemsounds API, are not routed to the connected HFP accessory (WM500). Instead, they are incorrectly played through the iOS device's built-in speaker.

This causes a poor user experience, as critical application alerts and system notifications are missed when the user relies on the connected HFP accessory for all audio output.
0
0
64
Dec ’25
Shortcuts: How to add “-pressed” to a file name in a shortcut
Hi there, does anyone know how to modify this Image Compressor shortcut https://www.icloud.com/shortcuts/e13d8013598f4f33830386a956a163dd so that the image it creates has the original file name plus "-pressed"? E.g., "Image_123" becomes "Image_123-pressed". I know of the 'Rename File' action but can't make it work. Any help much appreciated :)
0
0
213
Jan ’26
App Intents with Custom Automation/Triggers
Currently, we are developing an all-in-one DualSense utility for macOS, and we are exploring how to integrate shortcuts into our app. Our vision is for the user to use the native Shortcuts app to choose the controller buttons that should trigger a shortcut action, such as opening Steam, turning on audio haptics, and more. As we explore this approach, we want to know whether we need to build the UI in our app to set the triggers, or whether we can do this inside Shortcuts. Can button presses recorded by our app trigger shortcuts? Can those button inputs be customized inside Shortcuts, or should we build that into our app? And if we have it in our app, can our app see, select, and trigger shortcuts?
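There is no public API for enumerating a user's shortcuts from an app, but Shortcuts does register a URL scheme that can run one by name. A sketch of triggering a named shortcut from a button-press handler on macOS; the shortcut name is hypothetical:

```swift
import AppKit

/// Runs a user-created shortcut by name via the Shortcuts URL scheme.
/// The user must already have a shortcut with this exact name.
func runShortcut(named name: String) {
    var components = URLComponents()
    components.scheme = "shortcuts"
    components.host = "run-shortcut"
    components.queryItems = [URLQueryItem(name: "name", value: name)]
    if let url = components.url {
        NSWorkspace.shared.open(url)
    }
}

// e.g. from a controller button handler:
// runShortcut(named: "Launch Steam")  // hypothetical shortcut name
```

On macOS there is also a shortcuts command-line tool (shortcuts run "Name"), which may suit a background utility better than opening a URL.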
0
0
391
Jan ’26
API to Programmatically Establish SCO Connection for HFP Accessories in iOS
When an iOS device is connected to a Bluetooth accessory that uses the Hands-Free Profile (HFP), we are encountering incorrect audio routing behavior specifically for system notification tones.
- Accessory connected: the iOS device is successfully connected to a Bluetooth accessory (specifically, a WM500 device) using the HFP profile for voice communication.
- Voice audio: audio streams related to phone calls or voice communication (using the HFP/SCO link) are correctly routed to the WM500.
- Notification tones issue: system notification tones, played via the tonetype.systemsounds API, are not routed to the connected HFP accessory (WM500). Instead, they are incorrectly played through the iOS device's built-in speaker.

The accessory team has suggested establishing an SCO connection to route the tones through the WM500, but iOS does not provide a public API (like Android's startBluetoothSco) to explicitly force the establishment of an SCO connection for notification tones. Is there any other approach to establishing an SCO connection on iOS so notification tones route through the WM500?
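One indirect approach sometimes used (offered as an assumption, not a confirmed fix for tone routing): configure the app's audio session for voice use with Bluetooth HFP allowed, which lets the system bring up the SCO link while the session is active, and play the alert through that session:

```swift
import AVFoundation

func activateHFPRoute() throws {
    let session = AVAudioSession.sharedInstance()
    // .playAndRecord with .allowBluetooth makes HFP (SCO) an eligible route
    // (the option is renamed .allowBluetoothHFP in recent SDKs);
    // .voiceChat mode biases routing toward the hands-free link.
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth])
    try session.setActive(true)
}
```

Note that this only affects audio the app itself plays through its own session; tones the system plays outside that session may not follow the route.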
0
0
134
Dec ’25
Terrible performance on 11th generation iPad with BLE attribute notification messages
We've been developing an iOS app in Swift for several years that runs on iPads; our proprietary device emits EEG signals via BLE to the app running on the iPad. The device emits the data as BLE notification messages, with the MTU set to the maximum size allowed between our device and the iPad.

When communicating with the app on a 10th generation iPad running iOS 18.5, our device takes less than 200 ms to transmit an interval of EEG signals, which occurs every 500 ms. Under the same conditions (same version of iOS and app, same device) but using an 11th generation iPad, it takes anywhere from 800 ms to 1.1 s (4x to 5x) to transmit an interval. Our device transmits the EEG signal using several ATT notification messages at the maximum MTU size. We are perplexed by such a huge step down in performance on the 11th generation iPads.

iPad generation | Chipset  | Firmware
10th            | BCM_4387 | 22.5.614.3457
11th            | SRS_7923 | HCI Rev. 2504 sub. 5003

We know the 10th generation iPad used a chipset manufactured by Broadcom, whereas the 11th generation iPads we've received use an SRS chipset, whose manufacturer I'm unfamiliar with. We question whether this performance degradation stems from the chipset manufacturer or the firmware revision when using attribute notification messages over BLE in this context. Using PacketLogger to log the communication between the iPads and our device, our analysis hasn't found any difference in the configuration settings exchanged between our device and the iPads that would account for the degradation. Fortunately, our device is designed to work in complex environments and contexts, so it has mechanisms accounting for transmission delays and interference.

I'd appreciate it if any other Apple developer or Apple staff is aware of degradation when transmitting BLE attribute notification messages with the newer Apple devices using this series of chipsets. If so:
- Are there any recommended solutions to improve this latency?
- Is this being addressed for 11th generation iPads?

Regards,
Steven Belbin
Principal Developer at NeuroServo Inc.
0
1
145
Jul ’25
"Show on all spaces" toggles OFF after programmatically setting wallpaper via AppleScript
I'm building an automated wallpaper updater that fetches images from an API and sets them as the desktop wallpaper on macOS Tahoe. The automation uses AppleScript combined with database manipulation to ensure the wallpaper applies to all spaces.

Current implementation (via Apple Shortcuts):

```sh
wallpaper_path="$1"
osascript -e "tell application \"System Events\" to tell every desktop to set picture to POSIX file \"$wallpaper_path\""
sqlite3 ~/Library/Application\ Support/Dock/desktoppicture.db "UPDATE data SET space=NULL WHERE space IS NOT NULL;" 2>/dev/null
killall -HUP Dock
```

Issue
- First run: works perfectly - sets the wallpaper on all spaces/desktops, and "Show on all spaces" is ON.
- After the first run: "Show on all spaces" automatically toggles OFF in System Settings.
- Second run onwards: the new wallpaper only updates on the active space; inactive spaces show the old wallpaper.
- Expected: "Show on all spaces" should remain ON after programmatic wallpaper changes.
- Actual: System Settings automatically disables it, breaking subsequent updates.

Tested workarounds (all failed):
- UPDATE data SET space=NULL to clear per-space entries
- Using every desktop instead of current desktop in AppleScript
- killall Dock vs killall -HUP Dock vs killall -USR1 Dock
- Clearing space_id entries from the pictures table
- Running DELETE FROM pictures WHERE space_id IS NOT NULL before setting

The database manipulation doesn't prevent macOS from automatically creating per-space entries and disabling the "Show on all spaces" toggle.

Question: is there a way to programmatically set the wallpaper while preserving the "Show on all spaces" setting on macOS Tahoe?

Environment:
- macOS: Tahoe (latest)
- Architecture: Apple Silicon
- Use case: daily automated wallpaper updates via Shortcuts
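For comparison, a supported (if more limited) route is the NSWorkspace API, which sets the picture per screen without touching the Dock database. A minimal Swift sketch; whether it preserves the "Show on all spaces" toggle on Tahoe is something to verify, not a guarantee:

```swift
import AppKit

/// Sets the given image as wallpaper on every attached screen.
func setWallpaper(at url: URL) throws {
    for screen in NSScreen.screens {
        try NSWorkspace.shared.setDesktopImageURL(url, for: screen, options: [:])
    }
}

// e.g. try setWallpaper(at: URL(fileURLWithPath: "/path/to/image.jpg"))
```

Note that this applies to the currently active space on each screen; there is no public per-space wallpaper API, which is why scripts end up poking the Dock's database in the first place.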
0
0
340
Feb ’26
BLE Notification & Write Latency/Batching on iOS (vs Android) – CoreBluetooth Real-Time Question
I am using a Raspberry Pi 5 (BLE 5.0) to read sensor data and send it via D-Bus and BlueZ to a Flutter application (flutter_blue_plus) on both iOS and Android. The goal is to display these real-time sensor updates directly on the device. On Android, the data transmission is immediate, and the real-time visualization is extremely smooth and fast. However, on iOS, both BLE write and notification commands arrive with noticeable latency, not only in real-time displays but also when comparing ordinary notification feedback between the Raspberry Pi terminal and the iOS app. It seems that iOS buffers several BLE packets internally and then dispatches them in batches, which always introduces an additional delay.

Additional setup details:
- I sample and transmit data every 25 ms, sending binary packets of 20 bytes (length shouldn't be a limiting factor).
- On the iOS side I am using an iPhone 15 Pro with iOS 18.6.2 (BLE 5.3).
- The Raspberry Pi (using btmon for logging) confirms after connection setup that the connection interval is fixed at 30 ms (and cannot be changed).
- I have tried sending BLE packets every 30 ms so that exactly one packet arrives per interval, but this made no difference; the latency and batched delivery remain.
- Interestingly, faster transmission rates (e.g. sending every 10 ms) make the real-time display look smoother on iOS, but the guaranteed overall system latency does not improve.
- I have also tried write-without-response and running the app in release mode (no debugging).

Is anyone familiar with this problem or a potential solution? Or is iOS simply not optimized for true real-time BLE data streaming and visualization? Any pointers, technical insights, or workarounds would be greatly appreciated.
0
0
216
Nov ’25
APDU Command Execution Issues with Core Bluetooth and Secure Element Communication
I'm experiencing intermittent failures when executing APDU (Application Protocol Data Unit) commands through Core Bluetooth to communicate with external secure elements. The communication flow involves establishing a BLE connection, discovering services and characteristics, and then sending structured APDU commands for card management operations. While the initial connection and characteristic discovery work reliably, I'm encountering inconsistent behavior during APDU command execution: commands either time out, return unexpected response codes, or fail to complete the expected transaction sequences. The issue appears to be more prevalent when sending multiple APDU commands in rapid succession or when the commands involve cryptographic operations. I've implemented proper error handling and retry mechanisms, but the failures seem to occur at the Core Bluetooth level rather than in my application logic. The peripheral device responds correctly to the same commands when tested with other platforms, suggesting the issue might be related to iOS-specific BLE behavior or timing constraints. I'm using standard Core Bluetooth APIs (CBPeripheral, CBCharacteristic) with proper delegate implementations and have verified that the peripheral remains connected throughout the operation. Has anyone encountered similar issues with APDU command execution over BLE on iOS, and are there known workarounds or best practices for ensuring reliable command delivery and response handling?
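One pattern that often helps with rapid sequential commands over Core Bluetooth (offered as a sketch, not a confirmed fix): serialize the APDU exchanges so each command is written only after the previous write is acknowledged, using .withResponse writes and the delegate callback as the pacing signal. Service/characteristic discovery is elided; the queue type is hypothetical:

```swift
import CoreBluetooth

final class APDUQueue: NSObject, CBPeripheralDelegate {
    private let peripheral: CBPeripheral
    private let characteristic: CBCharacteristic
    private var pending: [Data] = []
    private var inFlight = false

    init(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        self.peripheral = peripheral
        self.characteristic = characteristic
        super.init()
        peripheral.delegate = self
    }

    func send(_ apdu: Data) {
        pending.append(apdu)
        sendNextIfIdle()
    }

    private func sendNextIfIdle() {
        guard !inFlight, !pending.isEmpty else { return }
        let next = pending.removeFirst()
        inFlight = true
        // .withResponse gives an ATT-level acknowledgement via didWriteValueFor.
        peripheral.writeValue(next, for: characteristic, type: .withResponse)
    }

    // Called when the ATT write is acknowledged by the peripheral.
    func peripheral(_ peripheral: CBPeripheral,
                    didWriteValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        inFlight = false
        sendNextIfIdle()
    }
}
```

If the secure element returns its response on a notify characteristic, the same idea applies with the release point moved to didUpdateValueFor so a command is never sent while the previous response is outstanding.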
0
0
47
Oct ’25
SiriKit: INPlayMediaIntent with a targeted speaker
I've got a streaming radio app that loads an HLS stream into an AVAudioPlayer. I've set up an Intents extension that tells SiriKit my app must handle INPlayMediaIntent in-app, and I'm able to successfully initiate stream playback from my phone using the phrase "Play [app name]". My in-app intent handler looks like this:

```swift
completionHandler(INPlayMediaIntentResponse(code: .success, userActivity: nil))
DispatchQueue.main.async {
    AudioPlayerService.shared.play()
}
```

The audio player service, in its init, does the following:

```swift
try AVAudioSession.sharedInstance().setCategory(
    .playback,
    mode: .default,
    policy: .longFormAudio
)
```

Additionally, in my Info.plist, I have the AirPlay optimization policy set to Long Form Audio. Having said all that, when I try to route my app to play on a given HomePod speaker ("play [app name] on [speaker name]"), the speaker routing instructions are never followed. I've looked and haven't been able to find where I might instruct my app to follow the correct path here. I assumed I could not trigger this behavior manually, as I believe I don't really have any control over AirPlay routing. Is there any guidance on working with SiriKit to do the right thing with regard to audio routing?
0
0
147
Feb ’26
NFC application
Does mobile NFC support copying MIFARE cards?
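For context on what the public API offers: CoreNFC can read MIFARE tags (and send native commands), but iOS has no public API for emulating or cloning a card. A minimal read sketch, assuming the app has the Near Field Communication Tag Reading entitlement:

```swift
import CoreNFC

final class MifareReader: NSObject, NFCTagReaderSessionDelegate {
    private var session: NFCTagReaderSession?

    func begin() {
        session = NFCTagReaderSession(pollingOption: .iso14443, delegate: self)
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let first = tags.first, case let .miFare(tag) = first else {
            session.invalidate(errorMessage: "Unsupported tag")
            return
        }
        session.connect(to: first) { error in
            guard error == nil else { return }
            // The UID is readable; writing it onto another card (cloning) is not possible on iOS.
            let uid = tag.identifier.map { String(format: "%02X", $0) }.joined()
            print("MIFARE family: \(tag.mifareFamily.rawValue), UID: \(uid)")
            session.invalidate()
        }
    }
}
```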
0
0
73
May ’25
Add icon to Desktop shortcut
Is there a way, using a shell script or AppleScript, to add a custom icon to a desktop shortcut? I can create the shortcut in a script, but I have to change the icon manually. Thanks much.
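The scriptable route usually goes through the same API Finder uses when you paste an icon in Get Info. A small Swift sketch that a script could run (both paths are hypothetical):

```swift
import AppKit

// Stamps a custom icon onto a file, equivalent to Finder's
// Get Info paste-icon flow.
let iconPath = "/path/to/icon.png"               // hypothetical
let targetPath = "/Users/me/Desktop/MyShortcut"  // hypothetical

if let icon = NSImage(contentsOfFile: iconPath) {
    let ok = NSWorkspace.shared.setIcon(icon, forFile: targetPath, options: [])
    print(ok ? "Icon set" : "Failed to set icon")
}
```

Saved as icon.swift, this can be invoked from a shell script with swift icon.swift, keeping the whole workflow scripted.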
0
0
107
Jul ’25
The system does not call peripheralIsReadyToSendWriteWithoutResponse for a long time.
macOS/iOS acts as a BLE client. After successfully establishing a BLE connection, it sends large amounts of data to the peer device. After sending data for a period of time, the system does not deliver the peripheralIsReadyToSendWriteWithoutResponse callback for a long time, causing the data transmission to stall.
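For reference, the intended flow-control pattern around that callback (a sketch under the assumption the stall is backpressure-related, not a diagnosis of this case):

```swift
import CoreBluetooth

final class BulkSender: NSObject, CBPeripheralDelegate {
    private let peripheral: CBPeripheral
    private let characteristic: CBCharacteristic
    private var chunks: [Data]

    init(peripheral: CBPeripheral, characteristic: CBCharacteristic, chunks: [Data]) {
        self.peripheral = peripheral
        self.characteristic = characteristic
        self.chunks = chunks
        super.init()
        peripheral.delegate = self
    }

    func start() { drain() }

    private func drain() {
        // Write only while CoreBluetooth's outgoing buffer has room;
        // stop as soon as canSendWriteWithoutResponse goes false.
        while peripheral.canSendWriteWithoutResponse, !chunks.isEmpty {
            let chunk = chunks.removeFirst()
            peripheral.writeValue(chunk, for: characteristic, type: .withoutResponse)
        }
    }

    // Called when buffer space frees up again; resume draining here
    // rather than retrying on a timer.
    func peripheralIsReadyToSendWriteWithoutResponse(_ peripheral: CBPeripheral) {
        drain()
    }
}
```

If a sender already follows this pattern and the callback still never fires, that points at the stack (or the peer's connection parameters) rather than application-level pacing.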
0
0
56
Oct ’25