Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic

Post / Replies / Boosts / Views / Activity

Does startMonitoring work when called from DeviceActivityMonitorExtension's eventDidReachThreshold function?
I want to start monitoring again from the function below in my DeviceActivityMonitorExtension. My startMonitoring function looks like this:

override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name, activity: DeviceActivityName) {
    super.eventDidReachThreshold(event, activity: activity)
    startMonitoring()
}

public func startMonitoring() {
    let startTime = DateComponents(hour: 0, minute: 0, second: 0)
    let endTime = DateComponents(hour: 23, minute: 59, second: 59) // DateComponents(hour: 11, minute: 0, second: 0)
    let schedule = DeviceActivitySchedule(
        intervalStart: startTime, // DateComponents(hour: 0, minute: 0, second: 0)
        intervalEnd: endTime,
        repeats: true
        // warningTime: DateComponents(minute: 1)
    )
    let selection: FamilyActivitySelection = savedSelection() ?? FamilyActivitySelection()
    let center = DeviceActivityCenter()
    let selections = self.savedSelection() ?? FamilyActivitySelection()
    let applications = selections.applicationTokens
    let categories = selections.categoryTokens
    let webCategories = selections.webDomainTokens
    let store = ManagedSettingsStore()
    store.shield.applicationCategories = ShieldSettings.ActivityCategoryPolicy.specific(categories, except: Set())
    store.shield.applications = applications
    store.shield.webDomains = webCategories
    let scheduleHard = DeviceActivitySchedule(
        intervalStart: startTime, // DateComponents(hour: 0, minute: 0, second: 0)
        intervalEnd: endTime,
        repeats: true
        // warningTime: DateComponents(minute: 1)
    )
    let event = DeviceActivityEvent(
        applications: selection.applicationTokens,
        categories: selection.categoryTokens,
        webDomains: selection.webDomainTokens,
        threshold: DateComponents(minute: 0) // timeLimitToUseApp, i.e. for 15 mins
    )
    do {
        try center.startMonitoring(
            .weekend,
            during: scheduleHard,
            events: [
                .weekend: event,
            ]
        )
        print("ScreenTime Monitoring Started")
    } catch let error {
        print(error.localizedDescription)
    }
}

Please provide a solution for starting monitoring from DeviceActivityMonitorExtension's eventDidReachThreshold function, or suggest another way to do this.
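A minimal sketch of one variation, assuming the .weekend names from the code above are defined elsewhere in the project and that stopping the existing registration before restarting is acceptable (this is a sketch, not a confirmed fix):

import DeviceActivity
import Foundation

// Sketch only. Assumes the .weekend DeviceActivityName / DeviceActivityEvent.Name
// extensions used in the code above exist in the project.
func restartMonitoring(schedule: DeviceActivitySchedule, event: DeviceActivityEvent) {
    let center = DeviceActivityCenter()
    // Stop the existing registration for this activity before starting it again.
    center.stopMonitoring([.weekend])
    do {
        try center.startMonitoring(.weekend, during: schedule, events: [.weekend: event])
        print("ScreenTime monitoring restarted")
    } catch {
        print("Restart failed: \(error.localizedDescription)")
    }
}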
0
0
325
Mar ’25
Action Extensions: How do Amazon & Google open their apps?
Both follow the same pattern: show the image that is being shared along with a CTA button about doing something with it in their app. When you tap the button, their app opens. Is there some kind of magic condition that tapping the button creates that makes extensionContext.open(_ URL: URL, completionHandler: ((Bool) -> Void)?) accept a URL for opening the app? Or are they just using the "walk the responder chain" hack, treating the user's intent to do something in their app as sufficient justification for using it? I've tried opening a registered URL scheme for my app synchronously with the button tap, but it still refuses to open (the callback returns false).
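For reference, a hedged sketch of the "walk the responder chain" workaround mentioned above; it is not an Apple-sanctioned approach, and the selector-based call is only an assumption about how those apps might do it:

import UIKit

extension UIViewController {
    // Sketch only: from an action extension's view controller, walk the responder
    // chain looking for an object (the host UIApplication) that still responds to
    // the legacy openURL: selector, and ask it to open the URL.
    func openHostApp(with url: URL) {
        let selector = NSSelectorFromString("openURL:")
        var responder: UIResponder? = self
        while let current = responder {
            if current.responds(to: selector) {
                current.perform(selector, with: url)
                return
            }
            responder = current.next
        }
    }
}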
0
0
42
Nov ’25
Rosetta 2 - Build 1.0.0.0.1.1744447383
Goal: Manually install an explicit version of Rosetta 2.

Background: Some customers and I have an old (Intel) app which worked perfectly with Rosetta 2. In the last week of April most machines were updated to macOS 15.4.1, and the app still starts, but certain functionality is broken. Some fields in the forms don't write back to the database, and some data can't be read from the database. (Most installations will be phased out over the next months, but it would be great to have the app fully working for data migration.)

First try: step back to 15.4 (clean install, install app, Rosetta installs as expected): no change, app still broken.
Second try: back to macOS 15 (clean install, install app, Rosetta installs as expected): app still broken (!). This is interesting, as the app worked for months on macOS 15.
Third try: back to macOS 14 (clean install, install app, Rosetta installs as expected): the app works as if nothing happened.
(All attempts on the same hardware, of course.)

Reasoning: Rosetta 2 was the only software (besides the app itself) installed after the clean macOS installs. My guess is that there might have been a change in Rosetta 2, since the app worked on macOS 15 until the 15.4.1 update was installed.

Checking versions (pkgutil --pkg-info com.apple.pkg.RosettaUpdateAuto):
Rosetta version on macOS 14: 1.0.0.0.1.1722778371
Rosetta version on macOS 15.4.1: 1.0.0.0.1.1744447383

To fully verify the cause, it would be great to uninstall Rosetta on the macOS 15.4.1 machine and explicitly install the lower version (1.0.0.0.1.1722778371), which must be available somewhere, as macOS 14 still receives it. I know how to uninstall; is there a possibility to manually install an explicit version of Rosetta 2?
0
1
90
May ’25
CNContact poster
Hi all, From what I’ve seen on forums and other sources, it appears that nothing can be done to set the contact poster programmatically. Setting the imageData property affects only the thumbnail image. Does anyone know if this is explicitly documented somewhere? I need this information for a POC document. I watched the iOS 17 keynote (where it was introduced), the Platforms State of the Union, and other WWDC videos, but I couldn’t find any mention of it. The Contacts framework documentation only explains what can be retrieved from this property and doesn’t mention any way to set the contact poster. If anyone has any information on this, please help! Thanks in advance!
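For reference, a minimal Contacts framework sketch showing the imageData route described above, which, as noted, only affects the thumbnail and not the poster; the lookup-by-identifier flow is illustrative:

import Contacts

// Illustrative only: updates CNContact.imageData, which changes the thumbnail.
// No public API for setting the contact poster appears to be documented.
func updateContactImage(identifier: String, pngData: Data) throws {
    let store = CNContactStore()
    let keys = [CNContactImageDataKey as CNKeyDescriptor]
    let contact = try store.unifiedContact(withIdentifier: identifier, keysToFetch: keys)
    guard let mutable = contact.mutableCopy() as? CNMutableContact else { return }
    mutable.imageData = pngData
    let request = CNSaveRequest()
    request.update(mutable)
    try store.execute(request)
}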
0
0
76
Mar ’25
Universal links stopped working, CDN responds with 404 for our domain
At some point in time, Universal Links stopped working for our app. As I understand it, an app reinstall or update caused the system to fetch the AASA file from the CDN, which started replying with 404 for our domain (https://app-site-association.cdn-apple.com/a/v1/app.link.digidentity.eu). In the meantime, nothing has changed inside our app or on our backend (https://app.link.digidentity.eu/.well-known/apple-app-site-association). Executing "curl -v https://app-site-association.cdn-apple.com/a/v1/app.link.digidentity.eu" returns the following result:

* IPv6: (none)
* IPv4: 17.253.15.197, 17.253.29.202, 17.253.37.203, 17.253.37.208, 17.253.57.197, 17.253.57.208, 17.253.29.196
*   Trying 17.253.15.197:443...
* Connected to app-site-association.cdn-apple.com (17.253.15.197) port 443
* ALPN: curl offers h2,http/1.1
* (304) (OUT), TLS handshake, Client hello (1):
*  CAfile: /etc/ssl/cert.pem
*  CApath: none
* (304) (IN), TLS handshake, Server hello (2):
* (304) (IN), TLS handshake, Unknown (8):
* (304) (IN), TLS handshake, Certificate (11):
* (304) (IN), TLS handshake, CERT verify (15):
* (304) (IN), TLS handshake, Finished (20):
* (304) (OUT), TLS handshake, Finished (20):
* SSL connection using TLSv1.3 / AEAD-CHACHA20-POLY1305-SHA256 / [blank] / UNDEF
* ALPN: server accepted http/1.1
* Server certificate:
*  subject: C=US; ST=California; O=Apple Inc.; CN=app-site-association.cdn-apple.com
*  start date: Jul 7 00:05:26 2025 GMT
*  expire date: Sep 30 19:08:48 2025 GMT
*  subjectAltName: host "app-site-association.cdn-apple.com" matched cert's "app-site-association.cdn-apple.com"
*  issuer: CN=Apple Public Server ECC CA 11 - G1; O=Apple Inc.; ST=California; C=US
*  SSL certificate verify ok.
* using HTTP/1.x
> GET /a/v1/app.link.digidentity.eu HTTP/1.1
> Host: app-site-association.cdn-apple.com
> User-Agent: curl/8.7.1
> Accept: */*
>
* Request completely sent off
< HTTP/1.1 404 Not Found
< Apple-Failure-Details: {"cause":"dial tcp: lookup app.link.digidentity.eu on 10.100.53.53:53: dial tcp 10.100.53.53:53: connect: connection refused"}
< Apple-Failure-Reason: SWCERR00302 Network error (temporary)
< Apple-From: https://app.link.digidentity.eu/.well-known/apple-app-site-association
< Apple-Try-Direct: true
< Cache-Control: max-age=3600,public
< Content-Length: 10
< Content-Type: text/plain; charset=utf-8
< Date: Thu, 21 Aug 2025 10:36:47 GMT
< Vary: Accept-Encoding
< Expires: Thu, 21 Aug 2025 10:36:57 GMT
< Age: 2952
< Via: http/1.1 uklon5-vp-vst-011.ts.apple.com (acdn/1.16221), https/1.1 uklon5-vp-vfe-007.ts.apple.com (acdn/4.16219), http/1.1 defra1-edge-lx-005.ts.apple.com (acdn/260.16276), http/1.1 defra1-edge-bx-006.ts.apple.com (acdn/260.16276)
< X-Cache: hit-fresh, hit-stale, hit-fresh, hit-fresh
< CDNUUID: e06b4b03-f97d-48f8-97bb-774359a39fa2-4464142837
< Connection: keep-alive
<
Not Found
* Connection #0 to host app-site-association.cdn-apple.com left intact

On our end, we did not find any reason why it would be unavailable for Apple to fetch. Is SWCERR00302 an indication of a problem on our end? Any help is appreciated.
0
0
157
Aug ’25
Critical SKAdNetwork Attribution
Subject/Title: Critical SKAdNetwork Attribution Failures (Bug Type: 237, Failure Type: 1201 in ASDErrorDomain)

Issue Summary
We are encountering repeated SKAdNetwork attribution failures (failureType: 1201 in ASDErrorDomain) for ad impression events processed through the ad network mj797d8u6f.skadnetwork. These failures are causing significant revenue losses, as ad impressions are not being properly attributed to installs. The issue occurs across multiple campaigns and involves both SKAdNetwork API 3.0 and 4.0, suggesting a systemic problem with attribution validation or network communication. This problem is critical as it disrupts advertisers’ ability to track conversions, optimize campaigns, and allocate budgets effectively.

Technical Details
Key Logs: Below are anonymized samples of the failed SKAdNetwork events:

Log Sample 1 (Failure):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "78523BD9-1F58-4738-B526-8A8A63203214" }
{ "advertisementStoryId": "3D2E7EBB-1A57-4DF8-9375-2C465F423038", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "failureType": 1201, "failureDomain": "ASDErrorDomain", "clientEventId": "0F456623-584F-4913-BBD3-C3FD1219D104", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736305200000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Log Sample 2 (Failure):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "0CBF612D-F0D9-449E-A34E-DE2DB92BEC0D" }
{ "advertisementStoryId": "946E568C-D2C1-478F-BFF3-4996C48F9B39", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "failureType": 1201, "failureDomain": "ASDErrorDomain", "clientEventId": "1A3D48FB-4452-4FD8-BB25-1195470A53DC", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736298000000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Log Sample 3 (Success Example for Comparison):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "BFEAC86B-8195-4DB0-96FF-2028107256AD" }
{ "advertisementStoryId": "946E568C-D2C1-478F-BFF3-4996C48F9B39", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "clientEventId": "F6265488-E0FB-448A-A406-3F7254BCA9D7", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736294400000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Failure Details:
Failure Type: 1201
Failure Domain: ASDErrorDomain
Ad Network ID: mj797d8u6f.skadnetwork
API Versions Affected: 3.0, 4.0
Timeframe of Failures: All logs occur within 2025-01-07 22:00:00 UTC to 23:00:00 UTC.

Environment:
OS Version: iOS 18.2.1 (Build 22C161)
Device Type: iPhone (hardwareFamily: iPhone)
App Configuration: Includes the ad network ID in the Info.plist under SKAdNetworkItems.

Impact Details
Financial Loss: Based on failure rates, we estimate $20–$65/day per advertiser for small campaigns and $75–$375/day per advertiser for larger campaigns. If 100 advertisers are affected, daily losses range from $2,000–$37,500. Over a week, losses could exceed $70,000 to $262,500 or more.
Operational Impact: Advertisers cannot track installs or optimize campaigns, leading to inefficient ad spending and potential budget reallocation to other networks. Damaged trust between advertisers and the ad network.
Reputation Risk: Continued failures harm the credibility of the SKAdNetwork framework, critical in a post-ATT (App Tracking Transparency) ecosystem.

Steps to Reproduce
1. Serve an ad impression through the ad network mj797d8u6f.skadnetwork.
2. Monitor SKAdNetwork attribution for that impression.
3. Observe repeated failures (failureType: 1201) despite the resultType: finalized status.

Recommendations for Investigation
Attribution Timeout: Verify if these failures stem from delayed responses or missed attribution windows.
Ad Network Configuration: Confirm the ad network’s integration complies with SKAdNetwork API 3.0 and 4.0 requirements.
Infrastructure Review: Investigate potential bottlenecks or failures in Apple’s attribution servers (ASDErrorDomain) or communication delays.

Contact Details
Name: [Your Full Name]
Role: [Your Role] (e.g., Ad Network Analyst/Developer)
Organization: [Your Company Name]
Email: [Your Email Address]
Phone: [Your Phone Number]

Submission Instructions
You can submit this report via the following channels:
Apple Feedback Assistant: https://feedbackassistant.apple.com/
Bug Reporting Tool: https://developer.apple.com/bug-reporting/
Apple DTS: https://developer.apple.com/support/technical/
0
0
386
Jan ’25
Timer app which works in background mode
I am developing a multi-timer app that works in background mode. At first, I could run multiple timers in the background by using the 'audio' background mode with a silent WAV file. However, the app was rejected: the 'audio' background mode should not be used by an app that is not an audio app. I want to know how to develop a timer app that works in the background on iOS, like the native iOS timer that alerts us when the time is up. That is the kind of app I want to develop. Sincerely,
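A common alternative to background audio for timers is to schedule a local notification for the moment each timer fires; a minimal sketch, assuming notification authorization has already been requested and the identifiers shown are illustrative:

import UserNotifications

// Sketch: instead of keeping the app alive in the background, schedule a local
// notification that fires when the timer expires.
func scheduleTimerNotification(id: String, title: String, fireIn seconds: TimeInterval) {
    let content = UNMutableNotificationContent()
    content.title = title
    content.body = "Timer finished"
    content.sound = .default

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: seconds, repeats: false)
    let request = UNNotificationRequest(identifier: id, content: content, trigger: trigger)

    UNUserNotificationCenter.current().add(request) { error in
        if let error {
            print("Failed to schedule timer notification: \(error)")
        }
    }
}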
0
0
51
Aug ’25
Background app wake-up when a Live Activity offline push arrives is not reliable
We have three problems when using push notifications with Live Activities.

1. What is the specific callback strategy for the activityUpdates property in ActivityKit? We found that in actual user scenarios there is a chance that we do not receive callbacks. From community experience, there appear to be resource-optimization strategies under which callbacks are not delivered, but from this perspective the explanation is vague. Is there any clear guidance on why callbacks are or are not delivered?

2. What is the specific wake-up strategy when a backgrounded app receives a Live Activity offline start push? From community experience, the system may wake the app for somewhere between 0 and 30 seconds due to resource-optimization strategies, or may not wake it or handle it at all. Is there an official description of the wake-up strategy? Or do we have to follow this description: wake-ups of apps using content-available pushes are heavily throttled, and you can expect 1–2 wake-ups per hour as a best-case scenario in the hands of your users, so this cannot be assumed to be a reliable on-demand wake-up mechanism for an app.

3. How can we determine whether the user has selected Allow or Always Allow for the Live Activity permission? When we use Live Activity offline push, there are two system prompts in iOS: the first prompt offers allow and disallow for the Live Activity, and the second offers always allow and disallow. Is there an interface that can directly determine which permission the user has chosen (allow / always allow)? (We can already detect the disallow status.) At present, we haven't seen anything in the official documentation or APIs that can distinguish allow from always allow. The difference matters because it affects the generation of the update token; without an update token, we cannot update our activity instance.
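For reference, a minimal ActivityKit sketch of the surfaces mentioned above; OrderAttributes is a hypothetical attributes type, and ActivityAuthorizationInfo only reports enabled/disabled, not allow versus always allow:

import ActivityKit

// Hypothetical ActivityAttributes type for illustration only.
struct OrderAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable { var status: String }
}

func observeLiveActivities() {
    // Only exposes whether Live Activities are enabled at all.
    guard ActivityAuthorizationInfo().areActivitiesEnabled else { return }

    Task {
        // Delivered when activities for this attribute type start, including push-to-start.
        for await activity in Activity<OrderAttributes>.activityUpdates {
            Task {
                // Each activity publishes its own update token for subsequent pushes.
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    print("Update token for \(activity.id): \(token)")
                }
            }
        }
    }
}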
0
1
657
Feb ’25
FinanceKit - Any way to get merchant location info from transactions?
Hi all — I’m building a Wallet-style transaction details view using FinanceKit and I’m running into a gap around merchant location. What I’m seeing FinanceKit gives me great core fields (amount, currency, status, dates, MCC, merchantName, transactionDescription), but I’m not seeing any address or place/location metadata on a Transaction. For example, a small/local merchant where I can plausibly infer a single place: Fetched transaction: Transaction( id: 8D142B16-3E0E-40B8-945A-2E7C0CF65F1D, accountID: 14939CF4-DBC3-4A9D-8292-5FEA495B8461, transactionAmount: 47.24 USD, creditDebitIndicator: .debit, transactionDescription: "Local Dental Care", originalTransactionDescription: "Local Dental Care", merchantCategoryCode: 8021, merchantName: "Local Dental Care", transactionType: .pointOfSale, status: .booked, transactionDate: 2025-08-20 22:27:50 +0000, postedDate: 2025-08-21 11:22:06 +0000 ) Because this appears to be a single-location practice, I can usually resolve it to a place using MapKit search heuristics. But for big-box chains, I don’t get enough signal to determine which store: Fetched transaction: Transaction( id: 3F8E9F74-7565-4D24-9038-8FD709184799, accountID: 14939CF4-DBC3-4A9D-8292-5FEA495B8461, transactionAmount: 441.77 USD, creditDebitIndicator: .debit, transactionDescription: "The Home Depot", originalTransactionDescription: "The Home Depot", merchantCategoryCode: 5200, merchantName: "The Home Depot", transactionType: .pointOfSale, status: .booked, transactionDate: 2023-12-27 23:07:02 +0000, postedDate: 2023-12-29 03:09:41 +0000 ) There’s no store number, address, phone, or any stable identifier. With hundreds of locations, I can’t deterministically choose a map pin or fetch the right brand assets. What I’m trying to achieve I’d like to replicate the Apple Wallet experience: show a small map snapshot and merchant visuals (logo/name that match Apple Maps / the Place Card) on the transaction detail screen. Without a location hint, I have to either: Ask users to pick a store manually, or Make a guess based on a coarse, app-defined region …neither of which feels great. Questions Is there any way in FinanceKit today to access merchant location or a resolvable identifier (e.g., address, city/state, store number, Apple Maps place identifier, network merchant ID/MID, terminal ID, etc.)? If not, can FinanceKit expose additional merchant metadata (even opt-in / privacy-preserving) to enable Wallet-like enrichment? A few examples that would unblock this: merchantAddress (or components: street/city/region/postalCode/country) merchantPhone (often unique per store) merchantIdentifier (stable per physical location, e.g., network merchant ID / store number) mapsPlaceURL or mapsPlaceIdentifier (linkage to the Apple Maps Place Card) brandAssetURL (logo/brand reference similar to what Wallet shows) With even one of the above, I could reliably: Render an accurate map snapshot, Fetch the correct brand assets, and Avoid prompting the user or inferring via fuzzy search. Context / constraints I do not want to (and shouldn’t need to) request or monitor the user’s device location to resolve a merchant’s store location. For small merchants, MapKit text search is often enough. For large chains, I need a store-level identifier. If there’s an existing field or recommended approach I’m missing, I’d love pointers. If not, please consider this a feature request for richer merchant metadata in FinanceKit so developers can build Wallet-quality transaction details. Thanks!
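For reference, a minimal sketch of the MapKit text-search heuristic mentioned above; the function name and region handling are illustrative:

import MapKit

// Sketch: resolve a merchant name to candidate places with a plain text search.
// Works reasonably for single-location merchants; ambiguous for chains, which is
// exactly the gap described in this post.
func searchPlaces(for merchantName: String, near region: MKCoordinateRegion?) async throws -> [MKMapItem] {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = merchantName
    if let region { request.region = region }
    let response = try await MKLocalSearch(request: request).start()
    return response.mapItems
}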
0
0
56
Aug ’25
A Summary of the WWDC25 Group Lab - watchOS (Part 2)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 2).

7. For widget (complication) update budgets, is there an overall budget, or are scheduled updates separate from APNs updates? For context, I have a complication that is updated on a fixed schedule (every 20 min), but there can be times of the day that are more "interesting" where pushes make sense.
Like timeline updates, the system budgets WidgetKit push notifications and delivers them opportunistically. You can use WidgetKit push notification updates in addition to timeline updates. For more information, see Updating widgets with WidgetKit push notifications.

8. It seems like the new Control Center widgets can be sourced from either the iPhone or directly on the Watch. Can we control whether a control appears in the watch list, or will it always be a combination of all controls from both sources?
iPhone controls will be automatically available on the companion Apple Watch, even if they don’t have an associated watchOS app. When an iPhone control is tapped on the Apple Watch, the action is performed on the iPhone. Controls whose actions foreground the iOS app will not appear on Apple Watch. If a watchOS app has controls, no controls will appear on Apple Watch from the companion iOS app.

9. From a UI/UX perspective, what are the current practices for designing watchOS apps that feel native?
The WWDC23 session Design and build apps for watchOS 10 covers the details of watchOS design principles and how to apply them in your app using SwiftUI. A lot of SwiftUI APIs, such as NavigationSplitView, the vertical tab view, and the list view, already implement the look and feel native to watchOS.

10. When adopting the new design system on watchOS, it seems like the main place we will use the glass effect is for our buttons in the toolbar? Standard buttons in system apps seem to continue to use a flat appearance and full width.
We leave the choice to you – you can use the new GlassButtonStyle API or .buttonStyle(.glass) to apply the Liquid Glass material to buttons (see the sketch below). Learn when to use the Liquid Glass styles in Get to know the new design system.

11. Is there any way to gracefully migrate extensions when their bundle IDs have to change? For example, converting a multi-target watch app to single-target, which drops the .watchkitextension from both the app and WidgetKit extension bundle IDs.
Updating a watchOS app to single-target is covered in Technote TN3157: Updating your watchOS project for SwiftUI and WidgetKit. Xcode provides a tool that can do the update automatically, and the technote describes how to use it and how to clean up the project after the automatic update. If there's something that technote doesn't address, please reach out to us on the Developer Forums.

12. What is the status of Watch Connectivity? Is that still the preferred way for iOS and watchOS communication?
The Watch Connectivity framework is still supported, and is appropriate for communication between a watchOS app and its companion iOS app. The system also provides other APIs for the apps to exchange data. For example, watchOS supports the Apple Push Notification service (APNs).
If data for your widget changes on your server, your widget can receive a WidgetKit push notification, and update accordingly. That’s the preferred mechanism for widget updates.
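For question 10 above, a minimal SwiftUI sketch of applying the glass style to a toolbar button, assuming a watchOS 26 SDK where the Liquid Glass button style is available; the view and button names are illustrative:

import SwiftUI

struct ToolbarGlassExample: View {
    var body: some View {
        Text("Hello")
            .toolbar {
                ToolbarItem(placement: .topBarTrailing) {
                    Button("Add", systemImage: "plus") { /* action */ }
                        .buttonStyle(.glass) // Liquid Glass appearance
                }
            }
    }
}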
0
0
130
Jul ’25
A Summary of the WWDC25 Group Lab - watchOS (Part 1)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 1).

1. I'm really excited about the new design system on all platforms. Liquid Glass is super cool. What do developers need to keep in mind when building for watchOS 26?
To adopt the new design system, start with updating your app for watchOS 10 – if you have done so, your app will be mostly ready for watchOS 26. For more information, see Design and build apps for watchOS 10. You can then look into Liquid Glass specific APIs to fine-tune your app. This topic is covered in Adopting Liquid Glass. If you have SwiftUI views using any custom style, make sure they are still legible and fit with the new design system.

2. Something that really stood out to me were the updates to the Smart Stack, with the system prioritizing widgets when they're most relevant. Tell me more about these new opportunities for apps.
Workout apps that record workouts using HealthKit may be automatically suggested on the watch face and appear in the Smart Stack without adding a widget. Relevant widgets are a great way to present information related to a date, location, point-of-interest type, sleep schedule, or fitness condition in the Smart Stack when it is relevant. Relevant widgets don't need to display an empty state view when they are not relevant; they are only shown in the Smart Stack when relevant. The watchOS 26 Design ToolKit in the Apple Design Resources includes a set of templates that you can use to lay out your widgets.

3. Is the Wrist Flick gesture available to developers in the same way as Double Tap is?
The system uses Wrist Flick to dismiss notifications and incoming calls, silence timers and alarms, or return to the watch face. There is no separate API for the Wrist Flick gesture. Apps that use XCUIAutomation to make sure their user interface behaves as intended can use XCUIDeviceHandGesture.flick to automate tests that verify that the app responds appropriately to the Wrist Flick gesture. For apps using automated testing, XCUIDeviceHandGesture.doubleTap can also be used to automate testing of the app with the Double Tap gesture. See XCUIDevice.perform(handGesture:).

4. Can HRV measurements be triggered on demand via API in watchOS? Are there guidelines or processes for enabling energy-intensive biometric sampling on development devices for IRB-approved research?
You don't have direct control over the sampling rate in watchOS. You can use HealthKit (HKQuantityTypeIdentifierHeartRateVariabilitySDNN, to be specific) to query the HRV data once the system has sampled and persisted the data to HealthKit (see the query sketch below). If that doesn't help, we suggest that you file a feedback report with your concrete use case for us to investigate. Specific to IRB-approved research using Apple Watch or its companion iPhone, you might want to look at this FAQ and SensorKit to see if they can be of any help.

5. What is the best advice for someone who is new to making a watchOS app for an app that's been on iOS and iPadOS?
You can start with exploring the system experience features on watchOS, such as notifications, controls, and widgets, and getting familiar with the system spaces, like the Smart Stack, watch face, and Control Center. Knowing the watchOS app design principles and practices is important as well; Design and build apps for watchOS 10 is a great resource for this topic. SwiftUI is an amazing cross-platform framework, and you will use it to create your watchOS app. If you're already using it, great! Keep in mind some watch-only constraints: compared to iPhone or iPad, Apple Watch has a limited battery and a smaller screen size, which significantly impacts how people use your app and how your app works.

6. Was there any extension this year to the 7-day limit on querying Apple Health data on the watch?
There is no change to the limit this year. You can get this official limit at runtime using earliestPermittedSampleDate. There are some exceptions, so don't be surprised if you see that some data types are retained longer. The companion iPhone holds the full set of the health data. If you need to access health data that has been purged from the Apple Watch, consider doing it with your iOS app, and then passing the result to your watchOS app.
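For questions 4 and 6 above, a hedged HealthKit sketch, assuming read authorization for HRV has already been granted; the function name is illustrative:

import HealthKit

// Sketch: query HRV (SDNN) samples already persisted to HealthKit, constrained to the
// window the local store is guaranteed to retain (earliestPermittedSampleDate()).
func fetchRecentHRV(from store: HKHealthStore, completion: @escaping ([HKQuantitySample]) -> Void) {
    guard let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN) else { return }

    let earliest = store.earliestPermittedSampleDate()
    let predicate = HKQuery.predicateForSamples(withStart: earliest, end: Date(), options: [])

    let query = HKSampleQuery(sampleType: hrvType,
                              predicate: predicate,
                              limit: HKObjectQueryNoLimit,
                              sortDescriptors: nil) { _, samples, _ in
        completion((samples as? [HKQuantitySample]) ?? [])
    }
    store.execute(query)
}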
0
0
126
Jul ’25
[macos 15.2 (24C101)] Custom input method does not work as expected
Environment: macOS 15.2 (24C101) with Xcode 16.2 (16C5032a)
Goal: I am trying to build a simple IMKInputController-based input method.
Problem: My .app bundle registers successfully and I can select it as an input source. When selected, it blocks keyboard input, but my handle method does not seem to execute or produce output. I have placed NSLog statements in my controller's init and handle methods.

Code for the controller:

import InputMethodKit

// The IMKTextInput protocol is provided by the framework.
// We don't need to define our own bridging protocol for this test.
public class HelloWorldController: IMKInputController {

    public override init!(server: IMKServer!, delegate: Any!, client inputClient: Any!) {
        super.init(server: server, delegate: delegate, client: inputClient)
        NSLog("HelloWorldIME: Controller has been initialized.")
    }

    public override func handle(_ event: NSEvent!, client sender: Any!) -> Bool {
        NSLog("HelloWorldIME: handle() method was called.")

        // ================== FINAL FIX APPLIED HERE ==================
        // 1. First, we ensure the client is a fundamental Objective-C object.
        guard let clientObject = sender as? NSObject else {
            NSLog("HelloWorldIME: Error - client object is not an NSObject.")
            return false
        }
        NSLog("HelloWorldIME: Successfully cast client to NSObject.")

        // 2. Now that we have an NSObject, we can safely check if it responds to the selector.
        let selector = #selector(IMKTextInput.insertText(_:replacementRange:))
        if !clientObject.responds(to: selector) {
            NSLog("HelloWorldIME: Error - client object does not respond to the insertText selector.")
            return false
        }
        NSLog("HelloWorldIME: Client responds to insertText. Preparing to insert text.")

        // 3. Since we've confirmed it responds, we can now safely treat it as an IMKTextInput
        //    and call the method.
        let client = clientObject as! IMKTextInput
        let stringToInsert = "A"
        let replacementRange = NSRange(location: NSNotFound, length: 0)
        client.insertText(stringToInsert, replacementRange: replacementRange)

        NSLog("HelloWorldIME: Called insertText with string '\(stringToInsert)'. Action complete.")
        // ========================================================

        return true
    }
}
0
0
127
Jun ’25
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
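For reference, a minimal CoreMotion sketch of reading CMCalibratedMagneticField via device-motion updates; the update interval and reference frame are illustrative choices:

import CoreMotion
import Foundation

// Sketch: deviceMotion.magneticField (CMCalibratedMagneticField) is produced by sensor
// fusion, whereas CMMagnetometerData from startMagnetometerUpdates is the raw signal.
let motionManager = CMMotionManager()

func startCalibratedFieldUpdates() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { motion, _ in
        guard let field = motion?.magneticField, field.accuracy != .uncalibrated else { return }
        let magnitude = sqrt(field.field.x * field.field.x +
                             field.field.y * field.field.y +
                             field.field.z * field.field.z)
        print("Calibrated field magnitude: \(magnitude) µT (accuracy: \(field.accuracy.rawValue))")
    }
}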
0
0
58
Jun ’25
Live caller id lookup - Which secret key is used for PIR database HE encryption on test env
Hi, I'm trying to set up a PIR service for Live Caller ID Lookup (in Python, but based on the Swift example: https://github.com/apple/live-caller-id-lookup-example). The Swift example provides utilities for database setup and encryption, but I can't find any specification of which key is used for database encryption or how the iOS system learns about this key in order to construct the PIR requests. So my question is: how does the PIR service communicate the secret key to the iOS system, or vice versa? (Specific to the test environment, before onboarding.)
0
0
291
Feb ’25
LiveCommunicationKit
What I want to achieve is that when the app is not running, upon receiving a notification, it displays an interface similar to CallKit with accept and decline buttons. Here is part of my code:

@available(iOS 17.4, *)
class LiveCommunicationManager: NSObject, ConversationManagerDelegate {

    static let shared = LiveCommunicationManager()
    var isInvalidate: Bool = false
    var configuration: ConversationManager!

    override init() {
        let config = ConversationManager.Configuration(
            ringtoneName: "notes_of_the_optimistic",
            iconTemplateImageData: UIImage(named: "AppIcon")?.pngData(), // PNG data for the icon
            maximumConversationGroups: 1,                                // maximum number of conversation groups
            maximumConversationsPerConversationGroup: 1,                 // maximum conversations per group
            includesConversationInRecents: false,                        // whether to show in the call history
            supportsVideo: false,                                        // whether video is supported
            supportedHandleTypes: [.generic, .phoneNumber, .emailAddress] // supported handle types
        )
        configuration = ConversationManager.init(configuration: config)
    }

    func reportIncomingCall(uuid: UUID, callerName: String) {
        configuration.delegate = self
        let local = Handle(type: .generic, value: callerName, displayName: callerName)
        let update = Conversation.Update(localMember: local, members: [local], activeRemoteMembers: [local])
        Task {
            do {
                try await configuration.reportNewIncomingConversation(uuid: uuid, update: update)
                print("Successfully reported the new incoming call")
            } catch {
                print("Failed to report the new incoming call: \(error.localizedDescription)")
            }
        }
    }

    func conversationManager(_ manager: ConversationManager, conversationChanged conversation: Conversation) {
        print("Conversation state changed")
    }

    func conversationManagerDidBegin(_ manager: ConversationManager) {
        print("Conversation has begun")
        manager.delegate = self
    }

    func conversationManagerDidReset(_ manager: ConversationManager) {
        print("Conversation is about to be cleared")
    }

    func conversationManager(_ manager: ConversationManager, perform action: ConversationAction) {
        print("Conversation was answered")
        configuration.invalidate()
    }

    func conversationManager(_ manager: ConversationManager, timedOutPerforming action: ConversationAction) {
        print("Conversation timed out")
    }

    func conversationManager(_ manager: ConversationManager, didActivate audioSession: AVAudioSession) {
        print("Audio session was activated")
    }

    func conversationManager(_ manager: ConversationManager, didDeactivate audioSession: AVAudioSession) {
        print("Audio session was deactivated")
    }
}

I set up the following in the AppDelegate:

func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    // Handle the offline push notification here
    completionHandler(.noData) // report the background task as finished
    if let aps = userInfo["aps"] as? [String: Any],
       let alert = aps["alert"] as? [String: Any] {
        // Handling logic for the silent push
        if #available(iOS 17.4, *) {
            let manager = LiveCommunicationManager.shared
            if manager.isInvalidate {
                return
            }
            if let msgType = userInfo["msgType"] as? Int {
                if msgType == 5 {
                    manager.configuration.invalidate()
                } else {
                    let callerName = alert["title"] as? String ?? "Fanvil"
                    manager.reportIncomingCall(uuid: UUID(), callerName: callerName)
                }
            }
        }
    }
}

Xcode has been configured with the necessary capabilities, such as Background Fetch, Voice over IP, Background Processing, and Push Notifications. The issue is that sometimes the code works as expected, allowing the app to wake up when not running and displaying the system interface with accept and decline buttons. However, after a few successful attempts, the app stops waking up, and no notification appears. But when I manually open the app, the didReceiveRemoteNotification method gets triggered. I'd like to know why this stops working after a few times.
0
1
213
Apr ’25
Request array with AppIntents
Hi, I’m trying to get an array of strings from the user using AppIntents, but I’m encountering an issue. The shortcut ends without prompting the user for input or saving the value, though it doesn’t crash. I need the user to enter multiple tasks into an array, but the current approach isn’t working as expected. Here’s the current method I’m using:

// Short code snippet showing the current method
private func collectTasks() async throws -> [String] {
    var collectedTasks: [String] = tasks ?? []
    while true {
        if !collectedTasks.isEmpty {
            let addMore = try await $input.requestConfirmation("Would you like to add another task?")
            if !addMore {
                break
            }
        }
        let newTask = try await $input.requestValue("Please enter a task:")
        collectedTasks.append(newTask)
    }
    return collectedTasks
}

The call:

func perform() async throws -> some IntentResult {
    let finalTasks = try await collectTasks()
    // Some more code
}

Any advice or suggestions would be appreciated. Thanks in advance!
0
0
332
Feb ’25
Filtering MMS Messages with Multimedia Content (Images, Videos, etc.)
Hi Apple Developer, I’m working on a message-filtering application and reviewing Apple's documentation on message filtering. The documentation clearly states that MMS messages can be filtered. (https://developer.apple.com/documentation/identitylookup/sms-and-mms-message-filtering) When we refer to MMS, it includes images, short videos, and other supported multimedia formats. However, the ILMessageFilterQueryRequest only provides the message body as a String, meaning we can access text and links but not images or other media files. Could you please confirm whether Apple allows third-party applications to access multimedia content sent from unknown numbers? Looking forward to your quick response. Thanks, Rijul Singhal
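For reference, a minimal sketch of a message filter query handler, illustrating that only the sender and the text body are exposed; the filtering rule shown is purely illustrative:

import IdentityLookup

// Sketch of an ILMessageFilterExtension query handler. Only text is available here;
// MMS attachments (images, videos) are not exposed to the extension.
final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {

    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        let response = ILMessageFilterQueryResponse()

        let body = queryRequest.messageBody?.lowercased() ?? ""
        let sender = queryRequest.sender ?? ""

        // Illustrative rule only.
        response.action = (body.contains("promo") || sender.hasPrefix("+1900")) ? .junk : .none
        completion(response)
    }
}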
0
0
278
Feb ’25
iMessage functionality
Hey guys! I've recently noticed a number of PaaS and CPaaS providers offering bulk outgoing messaging over iMessage, the same way it's done with SMS. I always thought that iMessage only allowed businesses to send outgoing messages after the user contacts their account first (to avoid users being spammed). But then there are the providers I mentioned above. Have you come across anything like this? Did Apple change the model so that businesses can now initiate conversations with users? If so, how does it work?
0
0
94
Oct ’25
How can I get the LiveCommunicationKit events?
I have code that looks like this:

import UIKit
import LiveCommunicationKit

@available(iOS 17.4, *)
class LiveCallKit: NSObject, ConversationManagerDelegate {

    @available(iOS 17.4, *)
    func conversationManager(_ manager: ConversationManager, conversationChanged conversation: Conversation) { }

    @available(iOS 17.4, *)
    func conversationManagerDidBegin(_ manager: ConversationManager) { }

    @available(iOS 17.4, *)
    func conversationManagerDidReset(_ manager: ConversationManager) { }

    @available(iOS 17.4, *)
    func conversationManager(_ manager: ConversationManager, perform action: ConversationAction) { }

    @available(iOS 17.4, *)
    func conversationManager(_ manager: ConversationManager, timedOutPerforming action: ConversationAction) { }

    @available(iOS 17.4, *)
    func conversationManager(_ manager: ConversationManager, didActivate audioSession: AVAudioSession) { }

    @available(iOS 17.4, *)
    func conversationManager(_ manager: ConversationManager, didDeactivate audioSession: AVAudioSession) { }

    @objc public enum InterfaceKind: Int, Sendable, Codable, Hashable {
        /// Reject / hang up
        case reject
        /// Answer
        case answer
    }

    var sessoin: ConversationManager
    var callId: UUID
    var completionHandler: ((_ actionType: InterfaceKind, _ payload: [AnyHashable: Any]) -> Void)?
    var payload: [AnyHashable: Any]?

    @objc init(icon: UIImage!) {
        let data: Data = icon.pngData()!
        let cfg: ConversationManager.Configuration = ConversationManager.Configuration(
            ringtoneName: "ring.mp3",
            iconTemplateImageData: data,
            maximumConversationGroups: 1,
            maximumConversationsPerConversationGroup: 1,
            includesConversationInRecents: false,
            supportsVideo: false,
            supportedHandleTypes: Set([Handle.Kind.generic]))
        self.sessoin = ConversationManager(configuration: cfg)
        self.callId = UUID()
        super.init()
        self.sessoin.delegate = self
    }

    @objc func toIncoming(_ payload: [AnyHashable: Any], displayName: String,
                          actBlock: @escaping (_ actionType: InterfaceKind, _ payload: [AnyHashable: Any]) -> Void) async {
        self.completionHandler = actBlock
        do {
            self.payload = payload
            self.callId = UUID()
            var update = Conversation.Update(members: [Handle(type: .generic, value: displayName, displayName: displayName)])
            let actNumber = Handle(type: .generic, value: displayName, displayName: displayName)
            update.activeRemoteMembers = Set([actNumber])
            update.localMember = Handle(type: .generic, value: displayName, displayName: displayName)
            update.capabilities = [.playingTones]
            try await self.sessoin.reportNewIncomingConversation(uuid: self.callId, update: update)
            try await Task.sleep(nanoseconds: 2_000_000_000)
        } catch {
        }
    }
}

I want to listen for the accept and decline button events, but I can't find a solution. Please give me a code demo.
0
0
197
Mar ’25