We have identified an issue on iOS 18 and iOS 18.1 (developer beta) where App Clips invoked via NFC tags or QR codes without a pre-configured advanced App Clip experience (i.e., those that should fall back to the default App Clip experience) are not functioning as expected. The issue is specific to iOS 18; the same invocations work correctly on devices running iOS 17.x.
Steps to Reproduce:
Set up two scenarios:
One scenario where an App Clip has a pre-configured advanced app clip experience (with metadata such as title, subtitle, image).
Another scenario where the App Clip is invoked without any pre-configured experience (should use the default App Clip experience).
On an iOS 18 or iOS 18.1 device:
For the default App Clip experience (no pre-configured advanced App Clip experience):
Scan the NFC tag when the phone is locked.
Scan the NFC tag when the phone is unlocked.
Scan the QR code.
For the pre-configured App Clip experience:
Perform the same tests (NFC and QR code scans).
Test the same scenarios on an iOS 17 device for comparison.
Expected Behavior:
For default App Clip experience invocations (NFC or QR):
Scanning NFC or QR should still trigger the App Clip card, even without metadata or a pre-configured advanced experience, on both locked and unlocked devices.
For pre-configured advanced App Clip experience invocations:
The App Clip card should display correctly with the configured metadata and behave as expected on both locked and unlocked devices.
Observed Behavior on iOS 18 and iOS 18.1:
For default App Clip experience invocations:
When scanning the NFC tag on a locked device, an error message is shown, e.g., "App Clip Unavailable" or "The operation couldn't be completed. (CPSErrorDomain error 2.)".
When scanning the NFC tag on an unlocked device, the system redirects straight to the web browser instead of displaying the App Clip card (or even the usual NFC banner at the top of the screen).
Scanning the QR code also prompts the user to open the web browser, just like scanning a non-App Clip QR code, skipping the expected App Clip experience.
For pre-configured advanced App Clip experience invocations:
The App Clip behaves as expected, showing the correct card with metadata and functioning properly on both locked and unlocked devices.
Notes:
The issue is only observed on iOS 18 and iOS 18.1, while the expected behavior is working fine on iOS 17.
This may indicate a regression or change in behavior introduced with iOS 18 that affects App Clip invocations that do not have a pre-configured experience.
We are currently experiencing a strange issue with our iPhone app. As the title says, NSUserDefaults is losing our custom keys and values when the phone is rebooted but not yet unlocked, and this happens in a very specific scenario involving ActivityKit.
Context:
We are using NSUserDefaults in the app to store user data (e.g. the username).
Issue: After a restart, reads of our cached data fail with a permission error.
Scenario: A Live Activity is showing in the Dynamic Island, the phone is restarted, and after unlocking the phone the user taps the Dynamic Island to open the app; at that point every read of cached data fails.
Reasons for the error:
After the restart, the app's files are still in a locked (protected) state. Tapping the Dynamic Island launches the app, and when the startup code retrieves cached data the read fails because the data is not yet accessible.
All storage reads, including files, NSUserDefaults, the Keychain, and plist retrieval, return errors.
The error message is as follows:
{
"errorCode": "-25308",
"errorDesc": "Error Domain=com.samsoffes.sskeychain Code=-25308 "(null)"",
"serviceName": "com.qunar.qunarclient8",
"account": "iid"
}
The data returned at this time is in a protected state, as reported by [UIApplication sharedApplication].isProtectedDataAvailable.
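In case it clarifies things, here is a minimal sketch of the workaround we are experimenting with: only touching the cache once protected data is available (the class shown and the "username" key are placeholders).
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        if application.isProtectedDataAvailable {
            loadCachedUserData()
        }
        // Otherwise wait for the callback below before touching the cache.
        return true
    }

    // Called once data protection has unlocked our files, defaults, and keychain items.
    func applicationProtectedDataDidBecomeAvailable(_ application: UIApplication) {
        loadCachedUserData()
    }

    private func loadCachedUserData() {
        // "username" is a placeholder key.
        let username = UserDefaults.standard.string(forKey: "username")
        print("Cached username: \(username ?? "<missing>")")
    }
}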
Any help or idea will be truly appreciated :)
Apple provides HTML Embeds for Music Kit JS, as mentioned here.
There is a tool here that you can use to create embeds.
Scroll to Preview Player.
Now, if you set your browser width to less than 490px (right when the thumbnail gets smaller), the player will move vertically if you drag up. Not by much; it's hard to see without a contrasting background. It's more noticeable in dark mode, where the white background stands out clearly, like this:
I'm misusing "drag" here: a simple, vertical scroll will cause this.
Since this happens on Apple's own site, I'm fairly certain that this is not a "me problem," but who knows!
Anyhow, I'm not sure if anyone has come across this. These embeds are pretty handy, and I'm loath to roll my own player just to fix this.
Any constructive advice would be appreciated.
When opening our App Clip from a Live Activity, the iOS system Handoff alert blocks our app on open. It is reproducible 100% of the time. The description in the system alert is: Waiting for Handoff to {My App}. We never had this issue before and believe it is related to iOS 18. I don't have Handoff enabled anywhere in my app.
All of our uses of NSUserActivity explicitly block Handoff:
userActivity.isEligibleForHandoff = false
We have been able to locate this same issue in other iOS apps that use Live Activities and App Clips. Is this an iOS 18 system-level bug?
I am trying to parse CMSensorDataList on watchOS. The maximum batch comprises 30 minutes of data at a 50 Hz sampling rate, which is 90,000 records per 30 minutes. We iterate over each item and finally write the data to a CSV file. Because this processing is slow and watchOS enforces strict execution limits, the work is suspended whenever the app goes to the background, so parsing any significant span of data takes far too long. My question is: is there a way we can serialize this CMSensorDataList as a whole and transfer it to the phone using WCSession? Or is there another effective way to achieve this?
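For reference, a rough sketch of the direction we are considering on the watch side: write the whole batch to a CSV file and let the system deliver it with WCSession's transferFile. The Sequence conformance is a workaround because CMSensorDataList only exposes NSFastEnumeration; the file name and metadata keys are placeholders, and the WCSession is assumed to already be activated.
import CoreMotion
import WatchConnectivity

// Workaround so CMSensorDataList can be used with for-in from Swift.
extension CMSensorDataList: Sequence {
    public func makeIterator() -> NSFastEnumerationIterator {
        NSFastEnumerationIterator(self)
    }
}

func exportAndTransfer(_ list: CMSensorDataList) throws {
    var csv = "timestamp,x,y,z\n"
    for case let sample as CMRecordedAccelerometerData in list {
        let a = sample.acceleration
        csv += "\(sample.startDate.timeIntervalSince1970),\(a.x),\(a.y),\(a.z)\n"
    }
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("accelerometer-batch.csv")
    try csv.write(to: url, atomically: true, encoding: .utf8)
    // transferFile queues the upload and the system completes it in the background,
    // even if the watch app is suspended.
    _ = WCSession.default.transferFile(url, metadata: ["kind": "accelerometer-batch"])
}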
I updated my app to a new version, and the widgets that use IntentConfiguration created in the old version no longer work, yet they still sit on the Home Screen. Is there any way to keep the old widgets working?
I'm using Live Activity features in my app, but I want to customize the user experience across different Apple devices. Specifically, I'd like to:
Keep Live Activity enabled and functioning on the iPhone
Disable or prevent Live Activity from appearing on the connected Apple Watch
Is this level of device-specific control possible with Live Activity? If so, what's the best approach to implement this functionality?
What I've tried:
I've looked through Apple's documentation on Live Activity, but couldn't find specific information about device-level control. I've experimented with ActivityKit, but haven't found a clear way to distinguish between iPhone and Apple Watch when pushing updates.
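For reference, this is roughly how we start the activity (simplified; MyActivityAttributes and its content state are placeholder types). We could not find any parameter on this request that targets a specific device:
import ActivityKit

struct MyActivityAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double
    }
}

func startLiveActivity() throws {
    let attributes = MyActivityAttributes()
    let initialState = MyActivityAttributes.ContentState(progress: 0)
    let content = ActivityContent(state: initialState, staleDate: nil)
    // Nothing on this request (or on ActivityContent) appears to control whether
    // the activity is mirrored to a paired Apple Watch.
    _ = try Activity.request(attributes: attributes,
                             content: content,
                             pushType: .token)
}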
Hi,
We’ve developed a workout app with a Live Activity feature to help users launch the mirroring view on iPhone, similar to the built-in workout app for biking activities.
While Live Activities are now available on watchOS 11, the integration feels a bit off for our Workout app. Is there a way to disable or exclude our Live Activity from appearing on watchOS?
Currently, when a user starts a workout, the Live Activity appears at the bottom of the screen, requiring users to tap the screen before they can use our app. The built-in Workout app doesn’t have this issue.
Additionally, our Live Activity appears in the Smart Stack, duplicating content with the built-in Workout Live Activity.
We’re unsure if we missed any keys or settings to exclude Live Activity from watchOS.
We are encountering an issue with the universal link functionality in our app; it was previously working as expected but has now stopped functioning.
We have followed all the steps to configure universal links and ensured the necessary settings are in place. The associated domains are enabled within our app's capabilities, with the following domains listed:
We have also verified the apple-app-site-association files for both servers, which are accessible via the following URLs:
https://app.digiboxx.com/apple-app-site-association
These files appear to be correctly formatted according to Apple's guidelines. However, despite this, links such as https://app.digiboxx.com/share/123456 are no longer redirecting to the app.
This is a significant issue for our customers, and we would appreciate your help in resolving the matter.
Our application already supports an iMessage Extension, allowing users to create and send custom or trending stickers.
On iOS 17.0, a popup menu replaces the old tab in iMessage with a "Stickers" option, and iMessage extensions are put in the "More" option.
The "Stickers" page only shows the Sticker Pack Extension. However, an application can only support Sticker Pack Extension or iMessage Extension. Get this error: "Multiple message payload provider extensions found in app but only one is allowed".
Is there any workaround here? We want our application to keep the iMessage extension but also provide sticker packs.
Description:
I have developed an iOS app that includes a sticker pack feature. However, when adding stickers to iMessage, they are not appearing as expected. Despite following the standard procedures, the stickers are not visible in the iMessage app.
The issue persists even after ensuring compatibility with the latest iOS version. Attached is an image that highlights the problem.
Any guidance or suggestions to resolve this issue would be greatly appreciated. Thank you!
Can you initiate a live activity from a Watch app? From what I can tell you can only do it from an iOS app and then have the Watch mirror it, is that true?
If you were building a standalone timer app for watchOS, for instance, and wanted the timer to show up automatically in the Smart Stack when the app is in the background, is this possible?
Thanks
I've noticed delays with the Live Caller ID Lookup feature, taking around 3 to 6 seconds to complete, even on repeated lookups. This seems odd since there's no server activity during these repeats, suggesting the information might be coming from a cache. Most of the time, it’s fast, but there are cases when it's unexpectedly slow, and I haven’t quite figured out the pattern yet. Is anyone else seeing this issue?
FB number FB15372765 - with sysdiagnose and video demonstrating the delay.
Dear Everyone,
We are experiencing a crucial issue with our app CERRET where it consistently pauses after being in the background for more than 9 hours.
To manage this issue, we have implemented background tasks, silent push notifications, and location updates, but the problem persists.
We have also tried enabling Background App Refresh, disabling Low Power Mode, updating the app and iOS, and even restarting the phone, but the problem remains.
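In case it helps, this is roughly how our background location updates are configured (a simplified sketch; delegate handling and Info.plist usage descriptions are omitted):
import CoreLocation

let locationManager = CLLocationManager()

func startBackgroundLocationUpdates() {
    // Requires "Always" authorization and the location background mode capability.
    locationManager.requestAlwaysAuthorization()
    locationManager.allowsBackgroundLocationUpdates = true
    locationManager.pausesLocationUpdatesAutomatically = false
    locationManager.startUpdatingLocation()
}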
Could you please provide guidance on how to resolve this? It is crucial for my use-case to have continuous background operation of CERRET.
Thank you for your assistance.
Does anyone know how to get the URL for "General > VPN & Device Management" on iOS 18?
My Weather beach app displays the time according to the country in the weather link under iOS 17, but not under iOS 18.
Either NSTimeZone.resetSystemTimeZone() is no longer working on iOS 18, or this is a bug.
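For reference, a minimal example of how we call it; the expectation is that after the reset, the system time zone reflects the device's current setting:
import Foundation

// Clear Foundation's cached system time zone so the next lookup
// reflects the device's current Settings value.
NSTimeZone.resetSystemTimeZone()
let zone = NSTimeZone.system
print("System time zone: \(zone.identifier)")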
Hello, I am currently developing an application using SensorKit to retrieve visit data. While the data retrieval works smoothly on one iPhone (iPhone 14, iOS 18.0.1), it fails on other devices, including:
iPhone 15 Pro Max with iOS 18.1 Beta
Another iPhone 14 with iOS 18.0
I’ve verified that the entitlements are configured properly, and the app has the necessary SensorKit visit permissions across all devices. Despite these steps, only one of the phones is able to retrieve the visit data correctly.
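For context, a simplified sketch of how we request access and start the fetch on each device (delegate callbacks and the fetch time window are omitted for brevity):
import SensorKit

let visitsReader = SRSensorReader(sensor: .visits)

func requestAndFetchVisits() {
    // Requires the SensorKit entitlement and approved access to the visits sensor.
    SRSensorReader.requestAuthorization(sensors: [.visits]) { error in
        if let error = error {
            print("SensorKit authorization failed: \(error)")
            return
        }
        visitsReader.startRecording()
        let request = SRFetchRequest()
        request.device = SRDevice.current
        // from/to fetch window configuration omitted here.
        // Results and errors arrive via SRSensorReaderDelegate callbacks.
        visitsReader.fetch(request)
    }
}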
Is there any minimum hardware requirement or compatibility issue with certain models or configurations that I should be aware of for using SensorKit visits?
Any guidance or insight would be greatly appreciated!
Thank you.
Hello there,
We are developing our own server for live caller ID service, and we have some questions for end-to-end testing:
According to the official documentation, the OS on the user's iPhone issues an OHTTP request to the third party's gateway. Is it possible to verify this behavior in a local environment using a physical device?
If the answer to question 1 is no, will Apple provide other beta testing methods, for example via TestFlight?
Any suggestion helps. Thanks!
I am using a BGProcessingTaskRequest to fetch from an API and keep my app up to date. Sometimes the background task executes within 10 minutes, sometimes it takes more than 10 minutes, and other times it never executes at all.
In my case I set BGProcessingTaskRequest.earliestBeginDate to just 1 minute from now. My implementation is below.
The register function is called before the app finishes launching:
import BackgroundTasks

let taskId = "_________"

// Register the launch handler; this must run before the app finishes launching.
func registerBackgroundTasks() {
    BGTaskScheduler.shared.register(forTaskWithIdentifier: taskId, using: nil) { task in
        self.handleBackgroundProcessRequest(task: task as! BGProcessingTask)
    }
    print("Receiver called")
}
The scheduleBackgroundProcessingTask function is called when the application enters the background:
func scheduleBackgroundProcessingTask() {
    let request = BGProcessingTaskRequest(identifier: taskId)
    request.requiresNetworkConnectivity = false // Set to true if the task needs network access. Defaults to false.
    request.requiresExternalPower = false
    request.earliestBeginDate = Date(timeIntervalSinceNow: 1 * 60) // Ask the system to run no earlier than 1 minute from now.
    do {
        try BGTaskScheduler.shared.submit(request)
        print("Background processing task submitted")
    } catch {
        print("Could not schedule background process: \(error)")
    }
}
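For completeness, a simplified sketch of what our handler looks like (doAPIRefresh stands in for the real network call); it reschedules the next request, sets an expiration handler, and calls setTaskCompleted:
func handleBackgroundProcessRequest(task: BGProcessingTask) {
    // Schedule the next run before doing any work.
    scheduleBackgroundProcessingTask()

    task.expirationHandler = {
        // The system is about to cut us off; finish up quickly.
        task.setTaskCompleted(success: false)
    }

    doAPIRefresh { success in
        task.setTaskCompleted(success: success)
    }
}

// Placeholder for the actual network refresh.
func doAPIRefresh(completion: @escaping (Bool) -> Void) {
    completion(true)
}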
Could anyone share any thoughts on my problem, or kindly clarify why a BGProcessingTaskRequest runs at such unpredictable times?
Recently, we have started seeing this countdown in the Dynamic Island; it enables a kind of listening mode and the app completely loses the talk button. This started very recently and I can't make out what it really is. I would like to know what this UI is and how I can bring back the talk button.