I have a timer running on a widget which was working completely fine until iOS 17. However, in iOS 18 it has stopped displaying.
I have a timer displayed on a widget which was working fine until iOS 17. However, it is not displayed in the updated version. Could someone provide some insight into this?
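For context, a minimal sketch of the kind of widget timer in question, assuming it is rendered with SwiftUI's built-in timer text style; the view and property names here are illustrative, not the actual code:

import SwiftUI

// Illustrative widget view: renders a live-counting timer relative to endDate
// without requiring extra timeline reloads.
struct TimerWidgetView: View {
    let endDate: Date   // hypothetical value supplied by the timeline entry

    var body: some View {
        Text(endDate, style: .timer)
    }
}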
We have developed an application using Xamarin.Forms. Our iOS app works fine up to iOS 17, but after upgrading the OS to iOS 18 the app no longer works properly.
Visual Studio for Mac 2022
Xcode 16
Minimum OS version 15.4
Xamarin.iOS version 16.4.023
We found some Firebase crashes in QLPreviewController on iOS 18.1+.
The crash info points at QLPreviewController code that we haven't changed for some years.
Please help with this.
Thanks in advance.
// stack info from Firebase
Fatal Exception: NSInvalidArgumentException
*** -[NSURL URLByAppendingPathComponent:]: component, components, or pathExtension cannot be nil.
0   CoreFoundation       __exceptionPreprocess
1   libobjc.A.dylib      objc_exception_throw
2   Foundation           -[NSURL(NSURLPathUtilities) URLByAppendingPathComponent:]
3   QuickLookUICore      +[NSURL(_QL_Utilities) _QLTemporaryFileURLWithType:filename:]
4   QuickLookUICore      +[NSURL(_QL_Utilities) _QLTemporaryFileURLWithType:uuid:]
5   QuickLook            -[QLPreviewController(ScreenshotsSupport) screenshotService:generatePDFRepresentationWithCompletion:]
6   ScreenshotServices   __82+[SSScreenshotMetadataHarvester _grabPDFRepresentationForIdentifier:withCallback:]_block_invoke_3
7   libdispatch.dylib    _dispatch_call_block_and_release
8   libdispatch.dylib    _dispatch_client_callout
9   libdispatch.dylib    _dispatch_main_queue_drain
10  libdispatch.dylib    _dispatch_main_queue_callback_4CF
11  CoreFoundation       __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
12  CoreFoundation       __CFRunLoopRun
13  CoreFoundation       CFRunLoopRunSpecific
14  GraphicsServices     GSEventRunModal
15  UIKitCore            -[UIApplication _run]
16  UIKitCore            UIApplicationMain
17  Glip                 main + 13 (main.swift - Line 13)
Hi,
I'm implementing a BADownloaderExtension in my app for essential assets. I would like to treat the install case differently from the update case; however, whether I "install" or "update" the app (via TestFlight), I always end up receiving a BAContentRequest of type .install. I can simulate an update via xcrun, but I can't seem to hit that case in the wild. Is this expected?
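To illustrate the branching in question, a rough sketch follows, assuming the download set is built from the request type; the surrounding extension protocol conformance is omitted since its exact signature varies by SDK version, and the case bodies are placeholders:

import BackgroundAssets

// Sketch only: decide what to schedule based on the content request type.
func downloads(for request: BAContentRequest) -> Set<BADownload> {
    switch request {
    case .install:
        // Fresh install: schedule the full essential-asset set.
        return []
    case .update:
        // App update: schedule only the changed assets.
        return []
    case .periodic:
        // Periodic refresh opportunity.
        return []
    @unknown default:
        return []
    }
}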
I have developed a standalone watchOS app which runs a stopwatch.
I want to develop a shortcut that launches the stopwatch. So far I have created the intent file and added the basic code (shown below), but the intent doesn't show up in the Shortcuts app.
In the build log I can see that the intent metadata is extracted, so the toolchain can see it, but for some reason it doesn't show up on the watch.
I downloaded Apple's demo intent app and the intents do show up on the watch there. The only obvious difference is that the Apple app is developed as an iOS app with a watchOS companion, whereas mine is standalone.
Can anyone point me to where I should look for an indicator of the problem?
Many thanks!
//
//  StartStopwatch.swift
//  LapStopWatchMaster
//

import AppIntents
import Foundation

struct StartStopWatchAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Start Stopwatch"
    static let description = IntentDescription("Starts the stopwatch and subsequently triggers a lap.")

    // Bring the app to the foreground when the intent runs.
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Implement your app logic here
        return .result(value: "Started stopwatch.")
    }
}
How can I make my app support this recommendation? Is there any relevant documentation?
Anyone know what exactly this means? I have enabled the Near Field Communication Tag Reading capability in the Xcode project and, just to make sure, added it by hand in the App ID configuration on the developer website, yet I keep getting the error stated in the title:
"Provisioning profile "iOS Team Provisioning Profile: --------" doesn't support the Near Field Communication Tag Reading capability."
I've taken an Apple demo and it works right out of the box.
Any ideas?
Hello everyone,
I’m currently receiving feedback from clients in a production environment who are encountering a BadDeviceToken error with Live Activities, which is preventing their states from updating. However, for other clients, the token is working fine and everything functions as expected.
I’m collaborating with the back-end developers to gather more information about this issue, but the only log message we’re seeing is:
Failed to send a push, APNS reported an error: BadDeviceToken
I would greatly appreciate it if anyone could provide some insight or information on how to resolve this issue.
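For context, the sketch below shows roughly how the device token is typically collected for a Live Activity started with pushType: .token; the attributes type and the backend call are hypothetical, and this is only an illustration of the token flow, not the app's actual code:

import ActivityKit

// Hypothetical attributes type for illustration only.
struct OrderAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double
    }
}

func startActivityAndForwardTokens() async throws {
    let activity = try Activity<OrderAttributes>.request(
        attributes: OrderAttributes(),
        content: .init(state: .init(progress: 0), staleDate: nil),
        pushType: .token
    )

    // Tokens can rotate, so every update is forwarded to the server.
    // Pushing to a stale token, or to a token from the other APNs
    // environment, is one way BadDeviceToken can surface.
    for await tokenData in activity.pushTokenUpdates {
        let token = tokenData.map { String(format: "%02x", $0) }.joined()
        // send `token` to the backend here (hypothetical endpoint)
        _ = token
    }
}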
Hello,
I have a question related to the public iTunes Search API: https://performance-partners.apple.com/search-api
Do all the books have an ISBN associated? I used to do queries like:
https://itunes.apple.com/lookup?isbn=9781501110368. That book is available on Apple Books here: https://books.apple.com/us/book/it-ends-with-us/id1052928247 and the endpoint above returns information about it.
However for newer books like:
https://itunes.apple.com/lookup?isbn=9781419766954
https://itunes.apple.com/lookup?isbn=9781250288776
Nothing comes back anymore, even though those books exist there. The URLs for the two above are:
https://books.apple.com/us/book/hot-mess-diary-of-a-wimpy-kid-19/id6476554491
https://books.apple.com/mt/book/the-mirror/id6474420363
For newer books, starting around the beginning of September 2024, nothing seems to come back when you search them by ISBN.
Thanks
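A minimal sketch of how the lookup can be checked programmatically, assuming the response keeps the usual resultCount/results JSON shape (the function name is just illustrative):

import Foundation

// Returns the resultCount from the lookup response; it comes back as 0
// for the newer ISBNs listed above.
func lookupISBN(_ isbn: String) async throws -> Int {
    let url = URL(string: "https://itunes.apple.com/lookup?isbn=\(isbn)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    return json?["resultCount"] as? Int ?? 0
}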
So, I'm trying to create my own text-to-speech setup. The problem I'm having is that whenever I do a test run, the speech gets a bit choppy at the start, kind of skipping over a word or a few characters.
A few details:
I've essentially built a separate class for handling the speech events.
AVSpeechSynthesizer is set up as a private variable of the class, so I don't expect deallocation to be the issue, especially since the problem is at the start.
I've got a queue set up, for what it's worth, so that shouldn't be a problem.
I'd appreciate any advice.
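One setup that is sometimes suggested for clipped starts is sketched below, assuming an iOS app and that the choppiness comes from the audio session not being active before the first utterance; the class and values here are illustrative:

import AVFoundation

// Illustrative speech manager: keeps the synthesizer alive and activates
// the audio session up front so the first utterance isn't clipped.
final class SpeechManager {
    private let synthesizer = AVSpeechSynthesizer()

    init() {
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playback, mode: .spokenAudio)
        try? session.setActive(true)
    }

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.preUtteranceDelay = 0.1   // small lead-in before audio starts
        synthesizer.speak(utterance)
    }
}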
How can I show my VoIP calling app in the same list as FaceTime and WhatsApp, as shown in the image?
My app implements VoIP calls and is integrated with CallKit.
Any tip would be appreciated!
Hello, I've been working to implement PTT in the way recommended by the documentation. The main issue is that the Bluetooth methods are opaque, so I cannot solve for what I need. The result is that I will have to resort to the hacky approaches the PTT framework seems intended to replace (playing silent clips, playing custom notification sounds, keeping long-running background audio sessions).
I am testing with an Anker Soundcore Mini as well as AirPods Pro.
Here's the issue: there are two very different behaviours depending on whether I'm using a call/fullDuplex session or a halfDuplex session.
halfDuplex

Anker Soundcore Mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- if I have to use long press, it should at least not trigger Siri

AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit

fullDuplex/call

Anker Soundcore Mini
Current behaviour:
- long press activates Siri
- pressing again after Siri is active starts transmission
- long press activates Siri again
- pressing again after Siri is active stops transmission
- pause/play routes to the ongoing media session and plays music
Expected behaviour:
- play/pause should map to transmit/stopTransmit
- if I have to use long press, it should at least not trigger Siri

AirPods Pro
Current behaviour:
- long press changes noise cancellation
- pause/play maps to mute/unmute (even if media is playing)
Expected behaviour:
- this makes sense for call behaviour; I wish it worked this well for PTT

The intention here is to be able to fully interact with a channel hands-free. The current API seems to make that impossible. Is that by design? Reading all the docs suggests that transmit/stopTransmit is intended to be doable just with the play/pause buttons, but even Apple hardware doesn't seem to support that.
Hello,
I’m experiencing an issue with Siri on iOS where it prioritizes a contact from the wrong account, even though I’ve set a default account for Contacts.
Details of the issue:
I have two contact groups:
Exchange (Outlook) — my default account.
iCloud.
There’s a contact, "Alena Jorse," which exists in both groups:
In the Exchange group, the name is saved as Alena Jorse.
In the iCloud group, it is saved as Alena Jorse with double-strike formatting (e.g., "Alena Jorse**").
Both contacts have the same phone number.
When I ask Siri to call "Alena Jorse," it selects the iCloud contact ("Alena Jorse**") instead of the Exchange contact, despite Exchange being set as the default Contacts account in my iOS settings.
Expected Behavior:
Siri should prioritize the contact from the default account (Exchange) and ignore other accounts unless specified.
Steps to Reproduce:
Have duplicate contacts in two groups (Exchange and iCloud) with the same phone number.
Set Exchange as the default Contacts account.
Ask Siri to call the contact.
Troubleshooting Steps Taken:
Ensured the default account is set to Exchange.
Verified both contacts have the same phone number.
Tested by disabling iCloud temporarily, which resolves the issue (but is not a viable long-term solution).
Request:
Could you please advise if this is intended behavior or a bug? If it’s a configuration issue, how can I ensure Siri prioritizes the default account for contacts? If it’s a bug, could this be investigated further?
Thank you for your assistance.
Hi,
I am trying to determine whether the Mac that is running my app has an active screen sharing session or not. Is there a way to detect this, potentially using system APIs or a system command?
Any help would be greatly appreciated, thank you!
I'm working with the FamilyControls API and am running into an issue with sharing ActivityTokens between devices in the same family sharing network.
Based on this documentation, ActivityTokens are only accessible and readable by other members in the family sharing network. My app is based on the idea that if one user selects the Games category in the FamilyActivityPicker, then this token can be shared with another device in the same family-sharing network and this other device can read and display the category.
So my question is:
If a user in the network selects an activity category in the FamilyActivityPicker, can this category token be shared, read, and used by another user in the family-sharing network?
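For what it's worth, a minimal sketch of the serialization side is below, assuming the selection is moved between devices as encoded data (FamilyActivitySelection is Codable); whether the decoded tokens are actually usable on the other family member's device is exactly the open question:

import FamilyControls
import Foundation

// Encode the user's picker selection (which carries the category tokens)
// so it can be transported to another device, then decode it there.
func roundTrip(_ selection: FamilyActivitySelection) throws -> FamilyActivitySelection {
    let data = try JSONEncoder().encode(selection)
    return try JSONDecoder().decode(FamilyActivitySelection.self, from: data)
}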
After updating to watchOS 11.1, updates using WidgetCenter.shared.reloadAllTimelines() in a WKRefreshBackgroundTask stopped working. When the background task is triggered, it gets data from the phone and updates the WidgetKit complications. But now the call to WidgetCenter.shared.reloadAllTimelines() does not update the complications.
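For reference, a minimal sketch of the pattern described, assuming a WKExtensionDelegate-based watch app; the data fetch from the phone is elided:

import WatchKit
import WidgetKit

// Handle the background refresh task, ask WidgetKit to rebuild the
// complication timelines, then mark the task complete.
final class ExtensionDelegate: NSObject, WKExtensionDelegate {
    func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
        for task in backgroundTasks {
            if task is WKApplicationRefreshBackgroundTask {
                // ...fetch data from the phone here...
                WidgetCenter.shared.reloadAllTimelines()
            }
            task.setTaskCompletedWithSnapshot(false)
        }
    }
}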
There is a bug when trying to open a push notification for an AppIntent from the lock screen.
Hello,
I'm trying to add a working shortcut to my app that will open the Privacy & Security page in System Settings at the Security section, where the prompt to allow a system extension appears.
Typically, opening x-apple.systempreferences:com.apple.settings.PrivacySecurity.extension from the Terminal only opens the Privacy & Security page.
I want to emulate the button from this system window.
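A minimal sketch of opening that same pane from inside the app; note this only reproduces the Terminal behaviour above, i.e. it lands on the Privacy & Security page rather than the specific extension prompt:

import AppKit

// Open the Privacy & Security pane using the same URL scheme as the Terminal command.
func openPrivacySecurityPane() {
    guard let url = URL(string: "x-apple.systempreferences:com.apple.settings.PrivacySecurity.extension") else { return }
    NSWorkspace.shared.open(url)
}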
Hello,
I’ve implemented a feature in my app using AppIntent. When the app is not running in the background and is launched for the first time via a shortcut, both application:didFinishLaunchingWithOptions: and applicationWillEnterForeground: are called.
Normally, on the first launch, applicationWillEnterForeground: is not invoked. However, this behavior seems to occur only when the app is launched through a shortcut.
I’d like to understand why applicationWillEnterForeground: is being called in this scenario.
For reference, the AppIntent has openAppWhenRun set to true.
Thank you in advance for your help!
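A minimal sketch of how the ordering can be confirmed with logging, assuming a UIKit app delegate; the subsystem string is a placeholder:

import UIKit
import os

// Log both callbacks to observe the ordering when the app is cold-launched
// from a shortcut versus launched normally.
class AppDelegate: UIResponder, UIApplicationDelegate {
    private let log = Logger(subsystem: "com.example.app", category: "lifecycle")

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        log.debug("application(_:didFinishLaunchingWithOptions:)")
        return true
    }

    func applicationWillEnterForeground(_ application: UIApplication) {
        log.debug("applicationWillEnterForeground(_:)")
    }
}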