Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

Menu bar and Help window
When I select Help under my Help menu item (or press cmd-h), no window appears until the menu bar is touched again. myApp.swift file:

import SwiftUI
import QuickLook

@main
struct myApp: App {
    @Environment(\.openWindow) var openWindow
    @State var helpURL: URL? = nil

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commandsReplaced {
            CommandGroup(before: .appInfo) {
                Button("Quit") {
                    NSApp.terminate(nil)
                }.keyboardShortcut("q")
            }
            CommandGroup(before: .help) {
                //FIXME: only shows help if the menu bar is touched again.
                Button("Help") {
                    helpURL = Bundle.main.url(forResource: "help", withExtension: "txt")
                }
                .keyboardShortcut("h")
                .quickLookPreview($helpURL)
            }
        }
    }
}

How should this be coded so the Help preview appears immediately? Thanks
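One possible direction (a hedged, simplified sketch rather than a confirmed fix): instead of attaching quickLookPreview to a menu command, put the help content in its own Window scene and open it from the command with openWindow. The window id "help", the HelpView type, and the Cmd-? shortcut are illustrative choices not taken from the original post; the Quit command and commandsReplaced are omitted for brevity.

import SwiftUI

// Hedged sketch, not a confirmed fix: show help in a dedicated Window scene that
// is opened from the command, instead of driving quickLookPreview from the menu.
@main
struct HelpWindowSketchApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Main content")          // stand-in for ContentView
        }
        .commands {
            CommandGroup(before: .help) {
                HelpButton()
            }
        }

        // A separate scene for help; openWindow(id: "help") brings it up immediately.
        Window("Help", id: "help") {
            HelpView()
        }
    }
}

private struct HelpButton: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Help") {
            openWindow(id: "help")
        }
        // Cmd-H is the system Hide shortcut on macOS, so a different key may behave better.
        .keyboardShortcut("?", modifiers: [.command])
    }
}

private struct HelpView: View {
    var body: some View {
        // Illustrative: load and display the bundled help text.
        let text = Bundle.main.url(forResource: "help", withExtension: "txt")
            .flatMap { try? String(contentsOf: $0, encoding: .utf8) } ?? "Help file not found."
        ScrollView { Text(text).padding() }
            .frame(minWidth: 400, minHeight: 300)
    }
}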
1
0
55
Jul ’25
Inquiry about the data format for Live Caller ID Lookup
Hello! https://github.com/apple/live-caller-id-lookup-example/blob/main/Sources/PIRService/PIRService.docc/DataFormat.md The link above shows the data format that can be returned to the user who receives a call. I wonder if it is also possible to add other fields, for example "summary". I am currently in the design phase of an app that aims to present what the last call between the two parties was about; that information would come from an API that I will build according to Apple's principles and that is compatible with the Live Caller ID Lookup protocol. Adding a field that presents a short summary of the last call would therefore be very handy. Is that possible?
17
0
849
Jul ’25
Screentime API new issues on iOS 17.4.1 and 17.5.1
Hi, I have a released Screen Time app, ScreenZen. Over the last few days I've seen a disturbing spike in bug reports coming from people on iOS 17.4.1 and 17.5.1, with no update to the app itself. People reported they saw the issue immediately after updating their iOS version. Unfortunately it is not reproducible on all phones with those versions, so we haven't been able to reproduce it on our test phones. It appears the issue is that the ApplicationToken passed into ShieldActionExtension and ShieldConfigurationExtension does not match any of the ApplicationTokens that the user selected to block through FamilyControls. (The selected ApplicationTokens are loaded through a group UserDefaults, and they are indeed being loaded in the ShieldActionExtension in the bug reports.) This is preventing the app from loading the correct settings and handling the blocking accordingly. I am trying to isolate this better with a new release with better logging, but would appreciate any help on this issue.
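For context on the pattern described above, here is a minimal hedged sketch of persisting the FamilyActivitySelection in an App Group and checking the incoming token inside the shield data source; the suite name, key, and fallback shield text are assumptions, not taken from ScreenZen.

import FamilyControls
import ManagedSettings
import ManagedSettingsUI
import UIKit

// Hypothetical App Group suite and key; replace with your own identifiers.
enum SharedSelectionStore {
    static let suite = UserDefaults(suiteName: "group.example.screentime")!
    static let key = "familyActivitySelection"

    static func save(_ selection: FamilyActivitySelection) throws {
        suite.set(try JSONEncoder().encode(selection), forKey: key)
    }

    static func load() -> FamilyActivitySelection? {
        guard let data = suite.data(forKey: key) else { return nil }
        return try? JSONDecoder().decode(FamilyActivitySelection.self, from: data)
    }
}

// Sketch of a shield data source that compares the incoming token against the
// persisted selection, as described in the post.
class ShieldConfigurationExtension: ShieldConfigurationDataSource {
    override func configuration(shielding application: Application) -> ShieldConfiguration {
        let selection = SharedSelectionStore.load()
        let isSelected = application.token.map { selection?.applicationTokens.contains($0) ?? false } ?? false
        // If the token no longer matches (the reported mismatch), fall back to a default shield.
        return isSelected
            ? ShieldConfiguration(title: ShieldConfiguration.Label(text: "Blocked", color: .white))
            : ShieldConfiguration()
    }
}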
3
6
2.4k
Jul ’25
ApplicationTokens changing
We persist ApplicationTokens in a storage container that ShieldConfigurationExtension has access to. In rare cases, all the ApplicationTokens for a user seem to change. We know this because the Application parameter passed into the configuration(shielding application: Application) -> ShieldConfiguration function has a token that does not match (using ==) any of the ones we are persisting in storage. Interestingly, the persisted ones still work, so I don't believe storage has been corrupted or anything: we can use them to add or remove shields, we can use them to display labels of the apps they represent, etc. But they don't match what's passed into the ShieldConfiguration extension. If the user goes into the FamilyActivityPicker at this point and selects an app whose token we are already persisting, the FamilyActivitySelection will have a token matching the new one that is passed into ShieldConfigurationExtension, not the one we persisted when they last selected that app. This leads me to believe the tokens are updated/rotated in some cases. When and why does this happen, and how can we handle it gracefully?
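A hedged mitigation sketch for the rotation described above: re-persist the selection every time the picker is dismissed, so storage always holds the latest tokens. The view name, App Group suite, and key below are illustrative.

import SwiftUI
import FamilyControls

// Hedged mitigation sketch: whenever the picker is dismissed, overwrite the persisted
// selection so any rotated tokens replace the stale ones.
struct ShieldedAppsPicker: View {
    @State private var selection = FamilyActivitySelection()
    @State private var isPickerPresented = false

    var body: some View {
        Button("Choose apps to shield") { isPickerPresented = true }
            .familyActivityPicker(isPresented: $isPickerPresented, selection: $selection)
            .onChange(of: isPickerPresented) { presented in
                guard !presented else { return }   // the picker was just dismissed
                // Illustrative App Group suite and key; replace with your own identifiers.
                if let data = try? JSONEncoder().encode(selection) {
                    UserDefaults(suiteName: "group.example.screentime")?
                        .set(data, forKey: "familyActivitySelection")
                }
            }
    }
}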
7
4
1.6k
Jul ’25
Background Modes for Audio Playback
Summary: I'm developing an iOS audio app in Flutter that requires background audio playback for long-form content. Despite having a paid Apple Developer Program account, the "Background Modes" capability does not appear as an option when creating or editing App IDs in the Developer Portal, preventing me from enabling the required com.apple.developer.background-modes entitlement.

Technical Details: Users of the app expect uninterrupted playback when the app is backgrounded or the device is locked, similar to Audible, Spotify, or other audio apps that continue playing in the background.

The Problem: When building for device testing or App Store submission, Xcode shows:
Provisioning profile "iOS Team Provisioning Profile: com.xxxxx-vxxx" doesn't include the com.apple.developer.background-modes entitlement.
However, the "Background Modes" capability is completely missing from the Developer Portal when creating or editing any App ID. I cannot enable it because the option simply doesn't exist in the capabilities list.

What I've Tried:
Multiple browsers/devices: Safari, Chrome, Firefox, incognito mode, different computers
Account verification: confirmed the paid Individual Developer Program membership is active
New App IDs: created multiple new App IDs; the capability never appears for any of them
Documentation review: followed all Apple documentation for configuring background execution modes
Different regions: tried changing the portal language to English (US)
Cache clearing: logged out, cleared cookies, tried different sessions

Apple Support Response: I contacted Developer Support (Case #102633509713) and received generic documentation links; I was directed to the Developer Forums rather than technical escalation.

Has anyone else experienced the "Background Modes" capability missing from their Developer Portal? Has anyone successfully used the App Store Connect API to add background-modes when the GUI doesn't show it? What's the proper escalation path when Developer Support provides generic responses instead of technical assistance?

Things I have attempted to solve this:
audio_service package: implemented as a potential workaround, but it still requires the system-level entitlement
Manual provisioning profiles: cannot create profiles with the required entitlement if the capability isn't enabled on the App ID

Other perhaps important facts about the environment where I am building the app: macOS Sonoma, Xcode 15.x, Flutter 3.5.4+, Apple Developer Program (Individual, paid)
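Separately from the portal issue, long-form background playback also needs the audio session configured for playback at runtime; below is a minimal sketch of the native side (the Flutter bridging is omitted), assuming the "audio" UIBackgroundModes entry eventually makes it into Info.plist.

import AVFoundation

// Minimal sketch: configure the shared audio session for long-form playback.
// This complements, but does not replace, the "audio" background mode entitlement.
func configurePlaybackSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .spokenAudio)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}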
0
0
86
Jul ’25
A Summary of the WWDC25 Group Lab - watchOS (Part 2)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 2).

7. For widget (complication) update budgets, is there an overall budget, or are scheduled updates separate from APNS updates? For context, I have a complication that is updated on a fixed schedule (every 20 min), but there can be times of the day that are more "interesting" where pushes make sense.
Like timeline updates, the system budgets WidgetKit push notifications and delivers them opportunistically. You can use WidgetKit push notification updates in addition to timeline updates. For more information, see Updating widgets with WidgetKit push notifications.

8. It seems like the new Control Center widgets can be sourced from either the iPhone or directly on the Watch. Can we control whether a control appears in the watch list, or will it always be a combination of all controls from both sources?
iPhone controls will be automatically available on the companion Apple Watch, even if they don't have an associated watchOS app. When an iPhone control is tapped on the Apple Watch, the action is performed on the iPhone. Controls whose actions foreground the iOS app will not appear on Apple Watch. If a watchOS app has controls, no controls from the companion iOS app will appear on Apple Watch.

9. From a UI/UX perspective, what are the current practices for designing watchOS apps that feel native?
The WWDC23 session Design and build apps for watchOS 10 covers the details of watchOS design principles and how to apply them in your app using SwiftUI. Many SwiftUI APIs, such as NavigationSplitView, the vertical tab view, and list views, already implement the look and feel native to watchOS.

10. When adopting the new design system on watchOS, it seems like the main place we will use the glass effect is for our buttons in the toolbar? Standard buttons in system apps seem to continue to use a flat appearance and full width.
We leave the choice to you: you can use the new GlassButtonStyle API or .buttonStyle(.glass) to apply the Liquid Glass material to buttons. Learn when to use the Liquid Glass styles in Get to know the new design system.

11. Is there any way to gracefully migrate extensions when their bundle IDs have to change? For example, converting a multi-target watch app to single-target, which drops the .watchkitextension from both the app and WidgetKit extension bundle IDs.
Updating a watchOS app to single-target is covered in Technote TN3157: Updating your watchOS project for SwiftUI and WidgetKit. Xcode provides a tool that can do the update automatically, and the technote describes how to use it and how to clean up the project after the automatic update. If there's something the technote doesn't address, please reach out to us on the Developer Forums.

12. What is the status of Watch Connectivity? Is that still the preferred way for iOS and watchOS communications?
The Watch Connectivity framework is still supported, and it is appropriate for communication between a watchOS app and its companion iOS app. The system also provides other APIs for the apps to exchange data. For example, watchOS supports the Apple Push Notification service (APNs). If data for your widget changes on your server, your widget can receive a WidgetKit push notification and update accordingly; that's the preferred mechanism for widget updates. A minimal Watch Connectivity sketch follows this summary.
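As a companion to the Watch Connectivity answer above, here is a minimal sketch of activating a WCSession and sending a small message to the counterpart app; the ConnectivityManager type and the "stepCount" key are illustrative, not from the lab.

import WatchConnectivity

// Minimal sketch: activate the session and send a small dictionary to the counterpart app.
final class ConnectivityManager: NSObject, WCSessionDelegate {
    static let shared = ConnectivityManager()

    func start() {
        guard WCSession.isSupported() else { return }
        let session = WCSession.default
        session.delegate = self
        session.activate()
    }

    func send(stepCount: Int) {
        guard WCSession.default.isReachable else { return }
        // "stepCount" is an illustrative key, not part of any framework contract.
        WCSession.default.sendMessage(["stepCount": stepCount], replyHandler: nil) { error in
            print("Send failed: \(error)")
        }
    }

    // MARK: WCSessionDelegate
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}

    #if os(iOS)
    func sessionDidBecomeInactive(_ session: WCSession) {}
    func sessionDidDeactivate(_ session: WCSession) {}
    #endif
}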
0
0
73
Jul ’25
A Summary of the WWDC25 Group Lab - watchOS (Part 1)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 1).

1. I'm really excited about the new design system on all platforms. Liquid Glass is super cool. What do developers need to keep in mind when building for watchOS 26?
To adopt the new design system, start with updating your app for watchOS 10; if you have done so, your app will be mostly ready for watchOS 26. For more information, see Design and build apps for watchOS 10. You can then look into Liquid Glass specific APIs to fine-tune your app; this topic is covered in Adopting Liquid Glass. If you have SwiftUI views using any custom style, make sure they are still legible and fit with the new design system.

2. Something that really stood out to me was the updates to the Smart Stack, with the system prioritizing widgets when they're most relevant. Tell me more about these new opportunities for apps.
Workout apps that record workouts using HealthKit may be automatically suggested on the watch face and appear in the Smart Stack without adding a widget. Relevant widgets are a great way to present information related to a date, location, point-of-interest type, sleep schedule, or fitness condition in the Smart Stack when it is relevant. Relevant widgets don't need to display an empty state view when they are not relevant; they are only shown in the Smart Stack when relevant. The watchOS 26 Design ToolKit in the Apple Design Resources includes a set of templates that you can use to lay out your widgets.

3. Is the Wrist Flick gesture available to developers in the same way as Double Tap is?
The system uses Wrist Flick to dismiss notifications and incoming calls, silence timers and alarms, or return to the watch face. There is no separate API for the Wrist Flick gesture. Apps that use XCUIAutomation to make sure their user interface behaves as intended can use XCUIDeviceHandGesture.flick to automate tests that verify the app responds appropriately to the Wrist Flick gesture. For apps using automated testing, XCUIDeviceHandGesture.doubleTap can also be used to automate testing of the app with the Double Tap gesture. See XCUIDevice.perform(handGesture:).

4. Can HRV measurements be triggered on demand via API in watchOS? Are there guidelines or processes for enabling energy-intensive biometric sampling on development devices for IRB-approved research?
You don't have direct control over the sampling rate in watchOS. You can use HealthKit (HKQuantityTypeIdentifierHeartRateVariabilitySDNN, to be specific) to query the HRV data once the system has sampled and persisted it to HealthKit. If that doesn't help, we suggest that you file a feedback report with your concrete use case for us to investigate. Specific to IRB-approved research using Apple Watch or its companion iPhone, you might want to look at this FAQ and SensorKit to see if they can be of any help. A hedged HealthKit query sketch follows this summary.

5. What is the best advice for someone who is new to making a watchOS app for an app that's been on iOS and iPadOS?
You can start with exploring the system experience features on watchOS, such as notifications, controls, and widgets, and getting familiar with the system spaces, like the Smart Stack, watch face, and Control Center. Knowing the watchOS app design principles and practices is important as well; Design and build apps for watchOS 10 is a great resource for this topic. SwiftUI is an amazing cross-platform framework, and you will use it to create your watchOS app. If you're already using it, great! Keep in mind some watch-only constraints: compared to iPhone or iPad, Apple Watch has a limited battery and a smaller screen, which significantly impacts how people use your app and how your app works.

6. Was there any extension this year to the 7-day limit on querying Apple Health data on the watch?
There is no change to the limit this year. You can get this official limit at runtime using earliestPermittedSampleDate. There are some exceptions, so don't be surprised if you see some data types retained longer. The companion iPhone holds the full set of health data. If you need to access health data that has been purged from the Apple Watch, consider doing it in your iOS app and then passing the result to your watchOS app.
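To make the HRV answer (question 4) concrete, here is a hedged sketch of querying SDNN samples that the system has already persisted; the 24-hour window and the printing are illustrative, and it assumes read authorization for HRV has already been granted.

import HealthKit

// Minimal sketch: read HRV (SDNN) samples already written to HealthKit.
// Assumes HKHealthStore read authorization for heartRateVariabilitySDNN was requested earlier.
func fetchRecentHRV(store: HKHealthStore) {
    guard let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN) else { return }

    // Illustrative window: the last 24 hours.
    let start = Date().addingTimeInterval(-24 * 60 * 60)
    let predicate = HKQuery.predicateForSamples(withStart: start, end: Date())

    let query = HKSampleQuery(sampleType: hrvType,
                              predicate: predicate,
                              limit: HKObjectQueryNoLimit,
                              sortDescriptors: nil) { _, samples, error in
        guard let samples = samples as? [HKQuantitySample], error == nil else { return }
        for sample in samples {
            let ms = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
            print("HRV SDNN: \(ms) ms at \(sample.startDate)")
        }
    }
    store.execute(query)
}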
0
0
92
Jul ’25
How to get a phone into a state where it's possible to test text filtering?
I'm currently finding it impossible to get a text filtering extension to be invoked when there's an incoming text message. There isn't a problem with the app/extension, because this is the same app and code that is already developed, tested, and unchanged since I last observed it working. I know that if there's any history of the incoming number being "known" then the extension won't get invoked, and previously I found this no hindrance to testing provided that:
the incoming number isn't in contacts
there are no outgoing messages to that number
there are no outgoing phone calls to the number
This always used to work in the past, but not anymore. I've ensured the incoming text's number isn't in contacts; in fact I've deleted all the contacts. I've deleted the entire phone history, incoming and outgoing, and I've also searched in Messages and made sure there are no interactions with that number. There's logging in the extension, so I can see it's being invoked when turned on from the Settings app, but it's not getting invoked when there's a message. The one difference between now and when I used to have no problem with this: the phone now has iOS 18.5 on it. It's as if in iOS 18.5, once there has ever been any past association with a text number, it's not possible to remove that association. Has there been some known change in 18.5 that would affect this filtering behavior and prevent ridding the phone of the incoming message's number being treated as "known"?
Update: I completely reset the phone, and then I was able to see the message filter extension being invoked. That's not an ideal situation, though. What else needs to be done, beyond what I mentioned above, to get a phone to forget about a message's number and thus get a message filtering extension to be invoked when there's a message from that number?
0
0
146
Jul ’25
Translation API not working without the Apple Translate app installed
Hello, my app depends on the Translation framework (iOS 17.4+), but I've found that it doesn't work unless the Apple Translate app is installed on the device. After I deleted Apple's Translate app, I started getting the following errors:
Optional(Foundation.Locale.Language(components: Foundation.Locale.Language.Components(languageCode: Optional(en), script: nil, region: Optional(GB))))
Optional(Foundation.Locale.Language(components: Foundation.Locale.Language.Components(languageCode: Optional(es), script: nil, region: Optional(ES))))
Error sending 1 paragraphs Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Failed to translate input 0; returning error: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Received unbridged NSError to API, converting to `.internalError`: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
TranslationError(cause: Translation.TranslationError.Cause.internalError, sourceLanguage: nil, targetLanguage: nil)
This is an example of trying to translate text from English to Spanish, and I was receiving the error even though I have the dictionaries downloaded. Once I reinstalled Apple's Translate app, it started working again. This sadly means that users of my app must have the factory Translate app installed; otherwise they won't be able to use my app. Some people choose to delete the factory apps. Why is the framework not available then? :(
2
0
81
Jul ’25
Does Core Spotlight work with document-based apps?
I have a SwiftUI document-based app that for the sake of this discussion stores accounting information: chart of accounts, transactions, etc. Each document is backed by a SwiftData DB. I'd like to incorporate search into the app so that users can find transactions matching certain criteria, so I went to Core Spotlight. Indexing & search within the app seem to work well. The issue is that Spotlight APIs appear to be App based & not Document based. I can't find a way to separate Spotlight data by document. I've tried having each document maintain a UUID as a document-specific identifier and include the identifier in every CSSearchableItem. When performing a query I filter the results with CSUserQueryContext.filterQueries that filter by the document identifier. That works to limit results to the specific file for search operations. Index updates via CSSearchableIndexDelegate.reindex* methods seem to be App-centric. A user may have file #1 open, but the delegate is being asked to update CSSearchableItems for IDs in other files. Is there a proper way to use Spotlight for in-app search with a document-based app? Is there a way to keep Spotlight-indexed data local within the app & not make it available across the system? I.e. I'd like to search within the app only. System-level searches should not surface this data.
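As a sketch of the per-document scoping idea discussed above (an assumption about one workable approach, not a confirmed Apple-recommended pattern), the document's UUID can be used as the domainIdentifier so items can at least be indexed and deleted per document; the "transaction" naming is illustrative.

import CoreSpotlight
import UniformTypeIdentifiers

// Hedged sketch: scope searchable items to a document by using its UUID
// as the domainIdentifier.
func indexTransaction(id: String, title: String, documentID: UUID) {
    let attributes = CSSearchableItemAttributeSet(contentType: .text)
    attributes.title = title

    let item = CSSearchableItem(uniqueIdentifier: "\(documentID.uuidString)/\(id)",
                                domainIdentifier: documentID.uuidString,
                                attributeSet: attributes)

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error { print("Indexing failed: \(error)") }
    }
}

// Removing everything for one document when it closes or is deleted.
func removeIndex(for documentID: UUID) {
    CSSearchableIndex.default().deleteSearchableItems(withDomainIdentifiers: [documentID.uuidString]) { error in
        if let error { print("Deletion failed: \(error)") }
    }
}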
7
0
140
Jul ’25
The menu can't be shown in a background process in macOS 26 (beta)
After I upgraded to macOS 26 (beta), my program caused the system to pop up a window as shown in the following picture. My application is a process with only a tray icon. I found that my tray icon is not displayed in the current version, even though I clicked the "Always Allow" button. Here are my questions:
1. Will this related feature remain consistent in the official release?
2. How can I create a command-line process that only displays a system tray icon (no main window), like Alfred?
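For question 2, one common approach (sketched here under the assumption that a SwiftUI app is acceptable) is a MenuBarExtra scene combined with the LSUIElement Info.plist key, so the app shows only a status-bar item and no Dock icon; the name and menu contents are illustrative.

import SwiftUI
import AppKit

// Minimal sketch of a status-bar-only app: a MenuBarExtra scene and no WindowGroup.
// Setting LSUIElement ("Application is agent (UIElement)") to YES in Info.plist hides the Dock icon.
@main
struct TrayOnlyApp: App {
    var body: some Scene {
        MenuBarExtra("TrayOnly", systemImage: "star.fill") {
            Button("Do Something") {
                print("Tray action triggered")  // illustrative action
            }
            Divider()
            Button("Quit") {
                NSApplication.shared.terminate(nil)
            }
        }
    }
}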
2
1
71
Jul ’25
Quick Look Preview Extension works on macOS but not iOS
Hi, I have a document-based SwiftUI multiplatform app, where the document is saved as JSON. Obviously I don't want Quick Look to show the JSON of my file, so I made a Quick Look Preview extension for each platform. The macOS one works… okay, sometimes it's tricky to test and I need to use qlmanage to empty the cache or to show the preview, but it does work. I can even debug it. But the iOS one just never seems to be run. If I show Quick Look in the Files app on iOS, or if I AirDrop a file from my app to my iPhone, it shows as JSON, with an option to open it in my app. If I run the iOS Preview Extension in the debugger and launch the Files app, and then try to use Quick Look in a file there, it shows the JSON and the debugger just stays in the state 'Waiting to Attach'. The preview extensions are not data based; in both of them I have implemented func preparePreviewOfFile(at url: URL) async throws in PreviewViewController.swift. Pretty much the same code except one is using a UIHostingController and the other is using an NSHostingController. The only difference in the Info.plists for the two extensions is that the iOS one uses NSExtensionMainStoryboard while the macOS one uses NSExtensionPrincipalClass. This is how they were set up when I created them from the relevant Quick Look Preview Extension templates. I made a sample project with a much simpler document, UI, etc. and I have the same issue there. The macOS preview works, the iOS one never runs. I have checked that the correct preview extension is embedded in the target for each OS (under Embed Foundation Extensions in the Build Phases) Is there anything I need to do differently in iOS, or anything I might have inadvertently got wrong? Is there a way to run something similar to qlmanage on iOS, since that sometimes seems to help on macOS? Incidentally, I have also added Quick Look Thumbnail extensions, and they work on both platforms.
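For reference, here is a minimal sketch of the kind of preparePreviewOfFile implementation described above on the iOS side, hosting a SwiftUI view; the document decoding is stubbed out and PreviewContentView is an illustrative name.

import UIKit
import QuickLook
import SwiftUI

// Sketch of an iOS Quick Look preview extension controller that loads the
// document's JSON and hosts a SwiftUI view.
class PreviewViewController: UIViewController, QLPreviewingController {

    func preparePreviewOfFile(at url: URL) async throws {
        let data = try Data(contentsOf: url)
        let text = String(decoding: data, as: UTF8.self) // stand-in for real document decoding

        let host = UIHostingController(rootView: PreviewContentView(text: text))
        addChild(host)
        host.view.frame = view.bounds
        host.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(host.view)
        host.didMove(toParent: self)
    }
}

struct PreviewContentView: View {
    let text: String
    var body: some View {
        ScrollView { Text(text).padding() }
    }
}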
2
0
147
Jul ’25
Our customers' calendar events have disappeared
Our app provides a calendar that integrates with the default calendar app. Specifically, we use iOS EventKit to perform CRUD operations on calendar data. Recently, we have received reports from users that all of their events have disappeared. However, after reviewing our implementation and logs, we have not been able to identify the cause. Some users have also reported that all data in their default calendar app has disappeared as well. Does anyone have any idea what might be causing this? To delete an event within our app, users must press the delete button and then confirm the deletion in a dialog. Additionally, it is not possible to delete more than two events at once. We've seen many people in the community discussing a bug where calendar events disappear after updating to iOS 18. If you have any information about when or why this happens, we'd appreciate it if you could share your insights.
0
3
84
Jul ’25
Access resources in a Swift package from an XCFramework
I have an iOS app that includes a local Swift package. This Swift package contains some .plist files added as resources. The package also depends on an XCFramework. I want to read these .plist files from within the XCFramework. What I’d like to know is: Is this a common or recommended approach—having resources in a Swift package and accessing them from an XCFramework? Previously, I had the .plist files added directly to the main app target, and accessing them from the XCFramework felt straightforward. With the new setup, I’m trying to determine whether this method (placing resources in a Swift package and accessing them from an XCFramework) is considered good practice. For context: I am currently able to read the .plist files from the XCFramework by passing Bundle.module through one of the APIs exposed by the XCFramework.
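A minimal sketch of the pattern described in the last sentence, with the package exposing its resource bundle (or a loaded plist) so the XCFramework can be handed the data; the plist name Config.plist and the configure(resourceBundle:) call are hypothetical.

import Foundation

// Inside the Swift package: expose the resource bundle (or a loaded plist) to callers.
public enum PackageResources {
    // Bundle.module is synthesized by SwiftPM when the target declares resources.
    public static var resourceBundle: Bundle { Bundle.module }

    // Illustrative helper: load "Config.plist" from the package's resources.
    public static func loadConfig() -> [String: Any]? {
        guard let url = Bundle.module.url(forResource: "Config", withExtension: "plist"),
              let data = try? Data(contentsOf: url) else { return nil }
        return (try? PropertyListSerialization.propertyList(from: data, format: nil)) as? [String: Any]
    }
}

// From the app, hand the bundle (or the decoded values) to the framework through
// whatever API it exposes; configure(resourceBundle:) is a hypothetical example.
// SomeFramework.configure(resourceBundle: PackageResources.resourceBundle)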
3
1
128
Jun ’25
No such module 'JournalingSuggestions'
I followed this tutorial to add the JournalingSuggestions API, but it keeps showing me No such module 'JournalingSuggestions'. How can I fix this?

import SwiftUI
import JournalingSuggestions

struct ContentView: View {
    @State var suggestionTitle: String? = nil

    var body: some View {
        VStack {
            JournalingSuggestionsPicker {
                Text("Select Journaling Suggestion")
            } onCompletion: { suggestion in
                suggestionTitle = suggestion.title
            }
            Text(suggestionTitle ?? "")
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
2
0
52
Jun ’25
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
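For reference, here is a minimal sketch of reading the calibrated field discussed above via CMDeviceMotion; the 50 Hz update interval and the logging are illustrative choices.

import CoreMotion

// Minimal sketch: stream CMCalibratedMagneticField samples from device motion.
final class MagneticFieldReader {
    private let motionManager = CMMotionManager()

    func start() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // illustrative 50 Hz

        // Use a reference frame that includes magnetometer calibration.
        motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                               to: .main) { motion, error in
            guard let field = motion?.magneticField, error == nil else { return }
            let magnitude = (field.field.x * field.field.x +
                             field.field.y * field.field.y +
                             field.field.z * field.field.z).squareRoot()
            print("Calibrated field: \(magnitude) µT, accuracy: \(field.accuracy.rawValue)")
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}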
0
0
25
Jun ’25