Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic

Post | Replies | Boosts | Views | Activity

After rejecting a call on iOS 18.0, CXCall's hasConnected is YES when it should be NO.
Dear Apple R&D Team,

I want to report a CallKit bug caused by the Voicemail feature in iOS 18.0. These are the header files referenced by our code:

#import <CallKit/CXCallObserver.h>
#import <CallKit/CXCall.h>

Our VoIP app uses the callObserver callback provided by CXCallObserverDelegate to monitor changes in the system phone state (see the attached picture). The problem we encountered: before upgrading to iOS 18.0, when we rejected a call we received a callObserver callback in which the CXCall's hasConnected was NO; after upgrading to iOS 18.0, when we reject a call, hasConnected is YES.

We checked the new features of iOS 18.0 and found this is caused by the new Voicemail feature. On iOS 18.0 devices, rejecting a call sends it to Voicemail by default, which means I rejected the call but callObserver told me the call was connected. This is inconsistent with the user's intuitive experience and also causes our app to respond incorrectly (it cannot resume audio automatically). The Voicemail feature is enabled by default on devices upgraded to iOS 18.0. After I reject a call, I have to wait for the caller to hang up before callObserver tells me the call has ended. This is a very poor experience: as long as the caller has not hung up, our app cannot detect that I hung up and assumes I am still on the call, even though I did hang up. Please refer to the code in the attached picture.

As we understand it, the state should flow like this:
- When I receive an incoming call, the callObserver callback fires with call.hasConnected NO and call.hasEnded NO.
- When I reject the call and it switches to Voicemail, the callback should fire with call.hasConnected NO and call.hasEnded YES.
- When I manually restore the call through Voicemail, the callback should fire with call.hasConnected YES and call.hasEnded NO.
- When the caller or I finally hang up, the callback fires with call.hasConnected YES and call.hasEnded YES.

However, the current behavior is:
- When I receive an incoming call, the callback fires with call.hasConnected NO and call.hasEnded NO.
- When I reject the call and it switches to Voicemail, the callback fires with call.hasConnected YES and call.hasEnded NO.
- When I manually restore the call through Voicemail, the callback does not fire at all.
- Only when the caller finally hangs up does the callback fire, with call.hasConnected YES and call.hasEnded YES.

Our request is very simple: we need to pause audio when a call comes in and resume audio when the call is hung up or rejected. This is our sample code:

- (void)callObserver:(CXCallObserver *)callObserver callChanged:(CXCall *)call API_AVAILABLE(ios(10.0)) {
    bool pause_audio = false;
    BOOL calling = !call.onHold && !call.hasConnected && !call.hasEnded;
    BOOL disconnected = !call.onHold && !call.hasConnected && call.hasEnded;
    BOOL connected = !call.onHold && call.hasConnected && !call.hasEnded;
    BOOL hang_up = !call.onHold && call.hasConnected && call.hasEnded;

    if (calling) {
        xc_log(XC_LOG_INFO, "A phone call is %s", call.outgoing ? "outgoing" : "incoming");
        pause_audio = true;
    } else if (disconnected) {
        xc_log(XC_LOG_INFO, "An %s phone call has been disconnected", call.outgoing ? "outgoing" : "incoming");
        // action of phone call end, resume audio
    } else if (connected) {
        xc_log(XC_LOG_INFO, "An %s phone call has just been connected", call.outgoing ? "outgoing" : "incoming");
        pause_audio = true;
    } else if (hang_up) {
        xc_log(XC_LOG_INFO, "An %s phone call has been hung up", call.outgoing ? "outgoing" : "incoming");
        // action of phone call end, resume audio
    } else {
        xc_log(XC_LOG_INFO, "Unknown telephony state occurred");
    }

    if (pause_audio) {
        dispatch_async(dispatch_get_main_queue(), ^{
            // action of phone call begin, pause audio
        });
    }
}

I would appreciate any help!
0
2
501
Oct ’24
How to expand Call Directory Extension capacity for adding more than 2_000_000 entries?
I've found an app that has a call blocking feature and is able to add more than 10_000_000 entries. As I understand it, the app doesn't use more than one extension for this, because I see only one in the Call Blocking & Identification settings menu. How can that be implemented? My limit right now is around 1_800_000 entries.
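For context on what that app is likely doing: in practice the ceiling is the extension's memory budget rather than a documented entry count, so the usual techniques are streaming the numbers from a pre-sorted file instead of holding them all in an array, and using incremental reloads so only the delta is re-added. A minimal sketch of that shape, assuming a hypothetical nextBatch(after:) helper that reads ascending numbers from disk:

import CallKit

final class CallDirectoryHandler: CXCallDirectoryProvider {

    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        if context.isIncremental {
            // Re-add or remove only the entries that changed since the last
            // reload instead of rebuilding the whole multi-million entry set.
            addChangedEntries(to: context)
        } else {
            addAllBlockingEntries(to: context)
        }
        context.completeRequest()
    }

    private func addAllBlockingEntries(to context: CXCallDirectoryExtensionContext) {
        // Entries must be added in ascending order. Reading them in small
        // batches from a pre-sorted file keeps peak memory low.
        var cursor: CXCallDirectoryPhoneNumber = 0
        while let batch = nextBatch(after: cursor), !batch.isEmpty {
            for number in batch {
                context.addBlockingEntry(withNextSequentialPhoneNumber: number)
            }
            cursor = batch.last!
        }
    }

    private func addChangedEntries(to context: CXCallDirectoryExtensionContext) {
        // context.addBlockingEntry(withNextSequentialPhoneNumber:) and
        // context.removeBlockingEntry(withPhoneNumber:) for the delta only.
    }

    // Hypothetical helper: returns the next chunk of ascending numbers stored
    // in the app group container, or nil when there are no more.
    private func nextBatch(after cursor: CXCallDirectoryPhoneNumber) -> [CXCallDirectoryPhoneNumber]? {
        return nil
    }
}

Whether this gets past ~2,000,000 entries depends on how much memory it saves on your data set; the hard limit itself isn't published, so treat it as a direction to profile rather than a guarantee.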
0
0
425
Nov ’24
Screen Time API - Device Activity Report
I need some assistance with the Screen Time API’s DeviceActivityReport extension. I know the extension is sandboxed but I need the data inside my app. Jomo is currently doing this so it’s not impossible. I see they’re saying it’s an estimate which is about 5 - 10 off of the actual screen time, but how are they doing this? Any attempt to store the screen time data inside some sort of database or UserDefaults always fails of course due to the sandbox. Any advice would be greatly appreciated!
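For reference, the supported flow keeps all aggregation inside the report extension and only renders the result; a minimal DeviceActivityReportScene along the lines of Apple's sample is sketched below (the .totalActivity context name and TotalActivityView are placeholders):

import DeviceActivity
import SwiftUI

extension DeviceActivityReport.Context {
    // Placeholder context; must match the context the host app passes to DeviceActivityReport.
    static let totalActivity = Self("Total Activity")
}

struct TotalActivityView: View {
    let totalActivity: String
    var body: some View { Text(totalActivity) }
}

struct TotalActivityReport: DeviceActivityReportScene {
    let context: DeviceActivityReport.Context = .totalActivity
    let content: (String) -> TotalActivityView

    func makeConfiguration(representing data: DeviceActivityResults<DeviceActivityData>) async -> String {
        // The per-app usage lives in `data`, but only inside this sandboxed process.
        let total = await data.flatMap { $0.activitySegments }
            .reduce(TimeInterval(0)) { $0 + $1.totalActivityDuration }
        let formatter = DateComponentsFormatter()
        formatter.allowedUnits = [.hour, .minute]
        formatter.unitsStyle = .abbreviated
        return formatter.string(from: total) ?? "No activity"
    }
}

Moving those values out of the extension (to UserDefaults, an App Group, or a database) is exactly what the sandbox is designed to block, so an app that shows screen time in its own UI is presumably estimating it on the app side rather than exporting it from this extension.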
0
0
460
Oct ’24
Core Spotlight searching only for title
I just added a way to donate my app's data to Core Spotlight using CSSearchableIndex, but I'm finding that Spotlight only searches the title of the CSSearchableItem I create. I know the index is working, because it always finds the item through the title property, but nothing else. This is how I'm creating the CSSearchableItem:

- (CSSearchableItem *)createSearchableItem {
    CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithContentType:UTTypeText];
    attributeSet.title = [self titleForIndex];
    attributeSet.displayName = [self titleForIndex];
    attributeSet.contentDescription = [self contentDescriptionForIndex];
    attributeSet.thumbnailData = [self thumbnailDataForIndex];
    attributeSet.textContent = [self contentDescriptionForIndex];

    CSSearchableItem *item = [[CSSearchableItem alloc] initWithUniqueIdentifier:[self referenceURLString]
                                                               domainIdentifier:@"com.cjournal.cjournal-Logs"
                                                                   attributeSet:attributeSet];
    item.expirationDate = [NSDate distantFuture];
    return item;
}

There are a lot of confusing tips around that say specifying textContent should work, and/or that setting displayName is essential, but none of these are working. Is there something I'm missing in my setup? Thanks.
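One attribute worth trying in addition to the above is keywords; in practice the system Spotlight UI reliably matches title, displayName, and keywords, while contentDescription/textContent are not always query-matched. A small sketch, shown in Swift for brevity, with the helper functions standing in for the Objective-C methods above:

import CoreSpotlight
import UniformTypeIdentifiers

// Placeholder stand-ins for the helpers used in the Objective-C snippet above.
func titleForIndex() -> String { "Sample title" }
func contentDescriptionForIndex() -> String { "Body text of the journal entry" }

let attributeSet = CSSearchableItemAttributeSet(contentType: .text)
attributeSet.title = titleForIndex()
attributeSet.displayName = titleForIndex()
attributeSet.contentDescription = contentDescriptionForIndex()
attributeSet.textContent = contentDescriptionForIndex()

// Keywords are matched by the system Spotlight search in addition to the title.
attributeSet.keywords = contentDescriptionForIndex()
    .components(separatedBy: .whitespacesAndNewlines)
    .filter { $0.count > 2 }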
0
0
81
Jun ’25
App Intents: requestConfirmation method not working with Siri invocation
Hello, I am implementing an App Intent which shows a confirmation dialog before proceeding with the operation execution. It works fine when the intent is started from a shortcut, but it always fails when started from Siri: I obtain the error message depicted in the attached screenshot ("An error occurred, try again"). That message appears as soon as the requestConfirmation method is called in the perform method of my App Intent:

try await requestConfirmation(actionName: .do, dialog: "app_intent_sim_confirmation_message") {
    SIMRechargeIntentSummaryView(...)
}
...

How can I solve the problem? Thanks
0
0
359
Oct ’24
Critical SKAdNetwork Attribution Failures (failureType 1201, ASDErrorDomain)
Subject/Title: Critical SKAdNetwork Attribution Failures (Bug Type: 237, Failure Type: 1201 in ASDErrorDomain)

Issue Summary
We are encountering repeated SKAdNetwork attribution failures (failureType: 1201 in ASDErrorDomain) for ad impression events processed through the ad network mj797d8u6f.skadnetwork. These failures are causing significant revenue losses, as ad impressions are not being properly attributed to installs. The issue occurs across multiple campaigns and involves both SKAdNetwork API 3.0 and 4.0, suggesting a systemic problem with attribution validation or network communication. This problem is critical as it disrupts advertisers' ability to track conversions, optimize campaigns, and allocate budgets effectively.

Technical Details
Key Logs: Below are anonymized samples of the failed SKAdNetwork events.

Log Sample 1 (Failure):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "78523BD9-1F58-4738-B526-8A8A63203214" }
{ "advertisementStoryId": "3D2E7EBB-1A57-4DF8-9375-2C465F423038", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "failureType": 1201, "failureDomain": "ASDErrorDomain", "clientEventId": "0F456623-584F-4913-BBD3-C3FD1219D104", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736305200000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Log Sample 2 (Failure):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "0CBF612D-F0D9-449E-A34E-DE2DB92BEC0D" }
{ "advertisementStoryId": "946E568C-D2C1-478F-BFF3-4996C48F9B39", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "failureType": 1201, "failureDomain": "ASDErrorDomain", "clientEventId": "1A3D48FB-4452-4FD8-BB25-1195470A53DC", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736298000000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Log Sample 3 (Success Example for Comparison):
{ "bug_type": "237", "timestamp": "2025-01-07 22:49:15.00 -0500", "os_version": "iPhone OS 18.2.1 (22C161)", "roots_installed": 0, "incident_id": "BFEAC86B-8195-4DB0-96FF-2028107256AD" }
{ "advertisementStoryId": "946E568C-D2C1-478F-BFF3-4996C48F9B39", "apiVersion": "3.0", "eventType": "adImpression", "resultType": "finalized", "anonymous": true, "clientEventId": "F6265488-E0FB-448A-A406-3F7254BCA9D7", "os": "iOS", "topic": "xp_amp_skad_perf", "adType": "app", "adNetworkId": "mj797d8u6f.skadnetwork", "eventTime": 1736294400000, "osBuildNumber": "22C161", "hardwareFamily": "iPhone", "api": "SKAdNetwork" }

Failure Details:
Failure Type: 1201
Failure Domain: ASDErrorDomain
Ad Network ID: mj797d8u6f.skadnetwork
API Versions Affected: 3.0, 4.0
Timeframe of Failures: All logs occur within 2025-01-07 22:00:00 UTC to 23:00:00 UTC.

Environment:
OS Version: iOS 18.2.1 (Build 22C161)
Device Type: iPhone (hardwareFamily: iPhone)
App Configuration: Includes the ad network ID in the Info.plist under SKAdNetworkItems.

Impact Details
Financial Loss: Based on failure rates, we estimate $20–$65/day per advertiser for small campaigns and $75–$375/day per advertiser for larger campaigns. If 100 advertisers are affected, daily losses range from $2,000–$37,500. Over a week, losses could exceed $70,000 to $262,500 or more.
Operational Impact: Advertisers cannot track installs or optimize campaigns, leading to inefficient ad spending and potential budget reallocation to other networks. Damaged trust between advertisers and the ad network.
Reputation Risk: Continued failures harm the credibility of the SKAdNetwork framework, critical in a post-ATT (App Tracking Transparency) ecosystem.

Steps to Reproduce
1. Serve an ad impression through the ad network mj797d8u6f.skadnetwork.
2. Monitor SKAdNetwork attribution for that impression.
3. Observe repeated failures (failureType: 1201) despite the resultType: finalized status.

Recommendations for Investigation
Attribution Timeout: Verify if these failures stem from delayed responses or missed attribution windows.
Ad Network Configuration: Confirm the ad network's integration complies with SKAdNetwork API 3.0 and 4.0 requirements.
Infrastructure Review: Investigate potential bottlenecks or failures in Apple's attribution servers (ASDErrorDomain) or communication delays.

Contact Details
Name: [Your Full Name]
Role: [Your Role] (e.g., Ad Network Analyst/Developer)
Organization: [Your Company Name]
Email: [Your Email Address]
Phone: [Your Phone Number]

Submission Instructions
You can submit this report via the following channels:
Apple Feedback Assistant: https://feedbackassistant.apple.com/
Bug Reporting Tool: https://developer.apple.com/bug-reporting/
Apple DTS: https://developer.apple.com/support/technical/
0
0
283
Jan ’25
Error: com.apple.AuthenticationServices.Authorization Error Code 1000
I am getting this error when trying to run the app on an actual iOS device using TestFlight. It works well on simulators. I am using Expo to build my app. I have done the following:
Added Sign in with Apple in Xcode for all the profiles (i.e. debug and release).
Set expo.ios.usesAppleSignIn to true in my app.json (I think because of this I see "Synced capabilities: Enabled: Sign In with Apple" while building my app using eas build).
Made sure that the provisioning profile has the Sign in with Apple capability.
What I am not able to understand is how the option provided in the provisioning profile works. Expo creates the profile with type "App Store". I think this is because I am building the Expo app with the production profile. Does this have anything to do with it? Does a profile with type "App Store" not work when I am testing the app using TestFlight? I am so confused right now. Can someone help me figure out the actual issue? I have seen almost all the posts regarding this error. None of those helped! Thanks in advance!
0
0
339
Nov ’24
Questions about the implementation of "SMS and MMS Message Filtering"
In the mobile application we are developing, we need to use the "SMS and MMS Message Filtering" functionality to analyze SMS messages received from unknown senders. The questions that arise about the use of this functionality are:
Does this functionality allow these messages to be sent to solutions deployed in Microsoft Azure?
If so, could we use REST APIs that we already have deployed in Azure?
If it is not possible to point it at an existing REST API, what would be the definition of the API to be implemented in the cloud for this service?
And finally, how would access permissions to this API deployed in the cloud be configured?
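For reference on the shape of the server integration: the extension never calls your REST API directly. It asks the system to forward the query to the HTTPS endpoint declared in the extension's Info.plist (ILMessageFilterExtensionNetworkURL) under a domain listed in the Associated Domains entitlement, and that endpoint can be hosted anywhere that serves HTTPS, which should include Azure. A hedged sketch of the extension side (the parsing and the .junk decision are placeholders; the request/response JSON formats are defined in the SMS and MMS Message Filtering documentation):

import IdentityLookup

final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {

    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        // iOS forwards the query to the configured HTTPS endpoint and hands
        // back whatever the server returned; the extension has no direct
        // network access of its own.
        context.deferQueryRequestToNetwork { networkResponse, error in
            let response = ILMessageFilterQueryResponse()
            response.action = .none

            if error == nil, let payload = networkResponse?.data {
                // Placeholder: parse the JSON body your server returned and
                // decide whether the message is junk, promotional, etc.
                response.action = payload.isEmpty ? .none : .junk
            }
            completion(response)
        }
    }
}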
0
0
212
Jan ’25
"Waiting for Handoff" Dialog Appearing in iOS 18 App Clip
When opening our App Clip from a Live Activity, the iOS system Handoff alert blocks our app on open. It is reproducible 100% of the time. The description in the system alert is: "Waiting for Handoff to {My App}". We never had this issue before and believe it is related to iOS 18. I don't have Handoff enabled anywhere in my app; all uses of NSUserActivity explicitly block Handoff with userActivity.isEligibleForHandoff = false. We have observed this same issue in other iOS apps that use Live Activities and App Clips. Is this an iOS 18 system-level bug?
0
0
478
Oct ’24
Message Filter Extension and multiple servers
In the documentation for a Message Filter Extension it states: If you have servers that can help your app extension determine how to handle a message, you must add the Associated Domains capability to your Xcode project and specify those domains. (https://developer.apple.com/documentation/sms_and_call_reporting/sms_and_mms_message_filtering/creating_a_message_filter_app_extension) The words servers and domains are in the plural. If it's possible to specify multiple servers/domains for a Message Filter Extension then how is that done? There's no documentation nor reference for that. If multiple domains can be added to the info.plist then what is the iOS behavior in that case? Can the extension supply/change which domain is used at run time?
0
0
423
Jan ’25
App Store Review always rejects my apps with 4.3
I created an app and submitted it to the App Store for review, and got a rejection with "4.3(a) - Design - Spam". This app was built brand new, and I didn't find any similar apps in the App Store. I searched this forum but am not sure if it is because I used Flutter to build my app. How can I get more specific details about why it was rejected?

Guideline 4.3(a) - Design - Spam
We noticed your app shares a similar binary, metadata, and/or concept as apps submitted to the App Store by other developers, with only minor differences. Submitting similar or repackaged apps is a form of spam that creates clutter and makes it difficult for users to discover new apps.

Next Steps
Since we do not accept spam apps on the App Store, we encourage you to review your app concept and submit a unique app with distinct content and functionality.
0
0
288
Dec ’24
Live Caller ID doesn't perform caching
I experimented a lot with Live Caller ID when it first appeared with the iOS 18 beta. Now I'm starting to pick it up again and have immediately noticed some detrimental differences between the behavior observed while it was in beta and how it currently behaves on iOS 18.3. The main difference is caching: if a call is made and data from a Live Caller ID lookup is displayed on the call screen, and the call is then made again immediately, that data is re-fetched from the server. It also takes a long time, about 5 or 6 seconds before the data is displayed on the call screen (with the beta it took about 3 seconds). In the data set, cache_expiry_minutes is set to 50, yet it's not being honored; there's no caching occurring at all. Yet caching did occur several months ago when the feature was in beta. What's happened to caching, and why is it no longer working when it used to? Another change: there used to be a notification displayed when a call was blocked, and this is no longer shown. Is this an intentional change or a bug?
0
2
340
Jan ’25
[macOS 15.2 (24C101)] Custom input method does not work as expected
Environment: macOS 15.2 (24C101) with Xcode 16.2 (16C5032a)
Goal: I am trying to build a simple IMKInputController-based input method.
Problem: My .app bundle registers successfully and I can select it as an input source. When selected, it blocks keyboard input, but my handle method does not seem to execute or produce output. I have placed NSLog statements in my controller's init and handle methods.

Code for the controller:

import InputMethodKit

// The IMKTextInput protocol is provided by the framework.
// We don't need to define our own bridging protocol for this test.
public class HelloWorldController: IMKInputController {

    public override init!(server: IMKServer!, delegate: Any!, client inputClient: Any!) {
        super.init(server: server, delegate: delegate, client: inputClient)
        NSLog("HelloWorldIME: Controller has been initialized.")
    }

    public override func handle(_ event: NSEvent!, client sender: Any!) -> Bool {
        NSLog("HelloWorldIME: handle() method was called.")

        // ================== FINAL FIX APPLIED HERE ==================
        // 1. First, we ensure the client is a fundamental Objective-C object.
        guard let clientObject = sender as? NSObject else {
            NSLog("HelloWorldIME: Error - client object is not an NSObject.")
            return false
        }
        NSLog("HelloWorldIME: Successfully cast client to NSObject.")

        // 2. Now that we have an NSObject, we can safely check if it responds to the selector.
        let selector = #selector(IMKTextInput.insertText(_:replacementRange:))
        if !clientObject.responds(to: selector) {
            NSLog("HelloWorldIME: Error - client object does not respond to the insertText selector.")
            return false
        }
        NSLog("HelloWorldIME: Client responds to insertText. Preparing to insert text.")

        // 3. Since we've confirmed it responds, we can now safely treat it as an IMKTextInput
        //    and call the method.
        let client = clientObject as! IMKTextInput
        let stringToInsert = "A"
        let replacementRange = NSRange(location: NSNotFound, length: 0)
        client.insertText(stringToInsert, replacementRange: replacementRange)
        NSLog("HelloWorldIME: Called insertText with string '\(stringToInsert)'. Action complete.")
        // ========================================================

        return true
    }
}
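One thing worth ruling out when handle() never logs: InputMethodKit only instantiates the controller if an IMKServer is created at launch and the Info.plist keys (InputMethodConnectionName, InputMethodServerControllerClass) point at it; for a Swift class the controller name may also need its module prefix. A minimal sketch of the app's entry point under those assumptions:

import Cocoa
import InputMethodKit

// The name passed here must match InputMethodConnectionName in Info.plist, and
// InputMethodServerControllerClass must resolve to HelloWorldController; otherwise
// InputMethodKit never creates the controller and handle(_:client:) is never called.
guard let connectionName = Bundle.main.infoDictionary?["InputMethodConnectionName"] as? String else {
    fatalError("InputMethodConnectionName missing from Info.plist")
}

let server = IMKServer(name: connectionName, bundleIdentifier: Bundle.main.bundleIdentifier)
NSLog("HelloWorldIME: IMKServer created: \(String(describing: server))")

NSApplication.shared.run()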
0
0
57
Jun ’25
App Clip works in TestFlight but not elsewhere
My app is available in TestFlight but has been rejected in App Review with the review feedback that the app clip "just shows a blank screen". However, in the TestFlight app, the App Clip works as expected and brings up the clip. It also works correctly from Xcode testing. Any ideas on what the problem could be? It is using the default App Clip link (appclip.apple.com)
0
0
300
Jan ’25
How to Archive iMessages via API with User Authorization Workflow?
I'm working on a solution to archive iMessages by using an API or similar mechanism. Here's the desired workflow:
1. The user provides their phone number to initiate the archiving process.
2. They receive a text message with a URL link.
3. Clicking on the link authorizes the archiving of their iMessages.
4. Once authorized, their text messages are archived.
So far, I've researched third-party services and APIs but haven't found any that offer this capability directly for iMessages.
Questions: Are there any APIs or frameworks (Apple or third-party) that support accessing and archiving iMessages programmatically?
0
0
410
Jan ’25
How to Filter App Usage for a Specific Time Period Using Screen Time API?
I am working on a SwiftUI app using the Screen Time API and the DeviceActivityReport view to display app usage data. My current implementation successfully shows daily app usage using a DeviceActivityFilter with the .daily(during:) segment. However, I need to filter this data to show app usage only for a specific time period during the day, e.g., 4:00 PM to 5:00 PM.

I created a DeviceActivityFilter with a .daily(during:) segment and passed a DateInterval for the desired time range:

let now = Date()
let startTime = calendar.date(bySettingHour: 16, minute: 0, second: 0, of: now)!
let endTime = calendar.date(bySettingHour: 17, minute: 0, second: 0, of: now)!
let timeInterval = DateInterval(start: startTime, end: endTime)

let filter = DeviceActivityFilter(
    segment: .daily(during: timeInterval),
    users: .all,
    devices: .init([.iPhone])
)

I applied this filter to the DeviceActivityReport view:

DeviceActivityReport(context, filter: filter)

Even with the DateInterval set for the specific time range, the report still shows the total daily usage for each app, instead of restricting the results to the specified 4:00 PM to 5:00 PM range.
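As far as I can tell, the DateInterval passed to .daily(during:) only bounds which days are reported; each segment still covers a whole day, which would explain the totals you're seeing. One possible workaround (an assumption, not something the documentation spells out) is to segment hourly and then sum only the segments that fall inside your window when aggregating in the report extension:

import DeviceActivity

// In the host app: segment hourly rather than asking .daily for a one-hour interval.
let window = DateInterval(start: startTime, end: endTime)   // the 16:00-17:00 range from above
let hourlyFilter = DeviceActivityFilter(
    segment: .hourly(during: window),
    users: .all,
    devices: .init([.iPhone])
)

// In the report extension's DeviceActivityReportScene: keep only the hourly
// segments that intersect the window. The window has to be recomputed or passed
// in here, because the extension shares no state with the host app.
func totalDuration(in window: DateInterval,
                   from data: DeviceActivityResults<DeviceActivityData>) async -> TimeInterval {
    await data
        .flatMap { $0.activitySegments }
        .filter { window.intersects($0.dateInterval) }
        .reduce(0) { $0 + $1.totalActivityDuration }
}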
0
1
391
Dec ’24
iOS 18 CarPlay - all apps freeze when selected
Ever since updating to iOS 18, CarPlay has been buggy for me. One of the constant issues is that after it connects, when I select an app (e.g. Google Maps or Spotify), the entire app will freeze. This happens regardless of the app I choose. Nothing in the app will be responsive unless I use the physical car controls to back out of it. Once I'm on the main view of CarPlay (where it shows all the apps, or where it shows the maps/audio mixes screen), CarPlay becomes responsive again. But since I can't use any of the apps as soon as I select one, I have to reboot my phone to resolve the issue. However, the issue will just happen again on a subsequent attempt to connect. Sometimes it will work OK; it happens again roughly every 2-3 times I connect. This never happened prior to iOS 18. Any suggestions?
0
0
328
Nov ’24
PTT Bluetooth transmission does not work as expected
Hello, I've been working to implement PTT in the way recommended by the documentation. The main issue is that the Bluetooth methods are opaque, so I cannot solve for what I need. The result is that I will have to resort to hacky approaches that the PTT framework seems intended to eliminate (playing silent clips, playing custom notification sounds, having long-running background audio sessions). I am testing with an Anker Soundcore Mini as well as AirPods Pro.

Here's the issue: there are two very different behaviours depending on whether I'm using a call/fullDuplex session or a halfDuplex session.

halfDuplex

Anker Soundcore Mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- if I have to use long press, it should at least not trigger Siri

AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit

fullDuplex/call

Anker Soundcore Mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- if I have to use long press, it should at least not trigger Siri

AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play maps to mute/unmute (even if media is playing)
Expected behaviour:
- this makes sense for call behaviour; I wish it worked this well for PTT

The intention here is to be able to fully interact with a channel hands-free. The current API seems to make that impossible. Is that by design? Reading all the docs suggests it's intended for transmit/stopTransmit to be doable just with the play/pause buttons, but even Apple hardware seems not to support that.
0
0
529
Nov ’24