Hello,
Back in January 2024, I filed a bug report regarding a cache being kept by the macOS Wallpaper Agent. This cache contains every image ever set as a user's wallpaper, and at the time the issue was reported, it never cleared, leaving hundreds of gigabytes wasted on users' disks in some cases.
FB13536275
This issue was ultimately fixed in macOS 15.1 beta 6, and remained fixed for the duration of macOS 15. The fix was excellent - the cache was reduced to storing just 2-3 days' worth of images.
Sadly, we've discovered that this issue is back in macOS Tahoe. The cache has moved locations, and once again is not clearing. We have filed this bug again, less than a year after it was first fixed:
FB20636593
We develop an app called 24 Hour Wallpaper that keeps the wallpaper in sync with the time of day. This necessitates that the app regularly changes the wallpaper, which sadly now results in an infinitely growing and useless cache of BMP files generated by the system.
Since we waited 10 months for this to get fixed the first time, we expect to wait at least that long to get it fixed again, and we have no confidence that it will stay fixed, because the last fix lasted less than a year. This leaves us in a bad position, as people can't use our app without the cache growing arbitrarily and ultimately filling their disk completely.
We've already had customers call Apple to complain about this, and the good news is that the support agents understand that this is a problem with macOS, not with our app.
What we've decided to do is add a feature to the app that monitors the size of this cache and periodically deletes it. We're required to get the user's permission the first time we do this, but after that the permission is cached, so the app can keep the cache folder clean regardless of whether macOS is doing its job or not.
We haven't seen any side effects or problems as a result of doing this. We've seen other apps like CleanMyMac do this without any problems. We're wondering if there is anything we should be aware of regarding this cache's behavior before releasing this flushing feature.
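For anyone curious, the flushing logic is essentially a FileManager sweep of the cache folder once the user has granted access to it. This is a simplified sketch: the folder URL, the 1 GB threshold, and the function names are placeholders, not the shipping code.

```swift
import Foundation

/// Returns the total size in bytes of all regular files under `folder`.
func folderSize(at folder: URL) throws -> Int64 {
    let keys: [URLResourceKey] = [.isRegularFileKey, .fileSizeKey]
    guard let enumerator = FileManager.default.enumerator(at: folder,
                                                          includingPropertiesForKeys: keys) else {
        return 0
    }
    var total: Int64 = 0
    for case let fileURL as URL in enumerator {
        let values = try fileURL.resourceValues(forKeys: Set(keys))
        if values.isRegularFile == true {
            total += Int64(values.fileSize ?? 0)
        }
    }
    return total
}

/// Deletes everything inside the cache folder once it grows past `limit` bytes.
/// Assumes the user has already granted the app access to this location.
func flushWallpaperCacheIfNeeded(at cacheFolder: URL, limit: Int64 = 1_000_000_000) throws {
    guard try folderSize(at: cacheFolder) > limit else { return }
    let contents = try FileManager.default.contentsOfDirectory(at: cacheFolder,
                                                               includingPropertiesForKeys: nil)
    for item in contents {
        try FileManager.default.removeItem(at: item)
    }
}
```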
Thanks for your time,
-josh
My app is a VoIP softphone for Mac that allows people to make phone calls to regular phone numbers. The app has existed since before the Mac App Store. The app declares itself to the system as capable of handling tel: URLs. Until now, people could change the default handler for tel: URLs in FaceTime settings (Default for calls).
In macOS Tahoe 26, this doesn't seem to be possible any more. That option is gone from the FaceTime settings.
Is it completely gone or has it been moved somewhere else? If there is no UI control for this any more, is it possible to change it programmatically?
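For context, the kind of programmatic change I have in mind is along these lines, though I don't know whether NSWorkspace's scheme-handler API is still honored for tel: on Tahoe. This is an untested sketch, not a confirmed solution.

```swift
import AppKit

// Ask macOS to make this app the default handler for tel: URLs.
// On recent macOS versions the user is asked to confirm the change.
func requestDefaultTelHandler() async {
    let myAppURL = Bundle.main.bundleURL
    do {
        try await NSWorkspace.shared.setDefaultApplication(at: myAppURL,
                                                           toOpenURLsWithScheme: "tel")
        NSLog("tel: handler change requested")
    } catch {
        NSLog("Could not change tel: handler: \(error)")
    }
}
```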
Our app is unable to write to its own sandbox container on macOS when run via “My Mac (Designed for iPad)”. This is not an issue when the app runs on iPhone or on iPad. This seems to affect all attempts to write to the file system including:
UserDefaults
Core Data (SQLite)
Firebase (Analytics, Crashlytics, Sessions)
File creation (PDFs, temp files, etc.)
We're seeing the following errors in the console:
Operation not permitted / NSCocoaErrorDomain Code=513: Permissions error when writing to disk.
CFPrefsPlistSource: Path not accessible: Failure to write to UserDefaults.
Cannot synchronize user defaults to disk: UserDefaults write blocked.
CoreData: No permissions to create file: Core Data SQLite store can't be created.
Firebase: Failed to open database: Firebase can't initialize local storage.
CGDataConsumerCreateWithFilename: failed to open ... for writing: PDF generation fails due to temp directory access issues.
I created a test project to try to reproduce the issue but was unable to do so, even when setting all the build settings the same as in the project having issues.
Issue with App Clip Card Showing "App Clip unavailable" for DIGI LIVE App
In my DIGI LIVE application, I have an App Clip configured for iOS 17.6+ (the file exceeds the 15 MB limit), and I have made all the necessary preparations for the App Clip to work correctly. However, the App Clip card constantly displays an "App Clip unavailable" error for various users (depending on their system language).
The URL associated with the App Clip invocation is: https://ar.digi-live.de
The AASA file is located at /.well-known/apple-app-site-association and contains the required fields for the App Clip:
{
    "applinks": {
        "details": [
            {
                "appIDs": ["N9QR6LF765.de.digilive.app"],
                "components": [
                    {
                        "#": "no_universal_links",
                        "exclude": true,
                        "comment": "Matches any URL whose fragment equals no_universal_links and instructs the system not to open it as a universal link"
                    },
                    {
                        "/": "*",
                        "comment": "Matches any URL like ar.digi-live.de"
                    }
                ]
            }
        ]
    },
    "appclips": {
        "apps": ["N9QR6LF765.de.digilive.app.Clip"]
    }
}
If I use the default App Clip link (https://appclip.apple.com/id?p=de.digilive.app.Clip) to launch it, the "App Clip unavailable" error no longer appears. But if I delete the App Clip from the settings, the error comes back, and it can only be launched again using the default link.
It feels like the problem is with the App Clip loading at the moment the App Clip card is triggered — it simply fails to load.
Video
https://storage.yandexcloud.net/mmrs/files/digilive/IMG_2259.mp4
There are some campaigns that have data in the UI (3000+ impressions), but when I try to fetch the keywords report via the API (/api/v5/reports/campaigns/{campaign-id}/keywords) grouped by CountryOrRegion, it returns no data; when fetched without CountryOrRegion, it has data.
It is a single-country campaign; however, I have other single-country campaigns that do return data when grouped by CountryOrRegion.
Hello,
I would like to understand the update behavior for App Clips.
Let's consider a scenario where a user has an App Clip on their device from a previous interaction. If I, as the developer, then publish a new version of the App Clip to the App Store, what is the expected behavior?
My main questions are: Will the App Clip be automatically updated in the background? Or, is user action required to get the new version, for example, by deleting the old one and re-launching it from a Smart App Banner or QR code?
Any information on this process would be greatly appreciated.
Thank you.
Hi,
We've noticed that this issue occurs more frequently after upgrading to iOS 18.4.1 and can result in one-way audio.
Our app uses CallKit with WebRTC to establish VoIP connections.
However, on iOS 18.4.1, CallKit no longer triggers:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession)
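For context, our audio start and stop are driven entirely by these CallKit callbacks. This is a simplified sketch: CallManager is a placeholder name, and the RTCAudioSession calls assume the Google WebRTC framework we integrate.

```swift
import CallKit
import AVFoundation
import WebRTC

final class CallManager: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Tear down any ongoing calls here.
    }

    // On iOS 18.4.1 this callback sometimes never arrives, so the WebRTC
    // audio unit never starts and the call ends up with one-way audio.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = true
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().isAudioEnabled = false
        RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    }
}
```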
We're currently comparing the occurrence rate across different iOS versions to better understand the impact.
Could you please help analyze the root cause of this issue?
I'm implementing the PushToTalk framework and have encountered an issue where channelManager(_:didActivate:) is not called under specific circumstances.
What works:
App is in foreground, receives PTT push → didActivate is called ✅
App receives audio in foreground, then is backgrounded → subsequent pushes trigger didActivate ✅
What doesn't work:
App is launched, user joins channel, then immediately backgrounds
PTT push arrives while app is backgrounded
incomingPushResult is called, I return .activeRemoteParticipant(participant)
The system UI shows the speaker name correctly
However, didActivate is never called
Audio data arrives via WebSocket but cannot be played (no audio session)
Setup:
Channel joined successfully before backgrounding
UIBackgroundModes includes push-to-talk
No manual audio session activation (setActive) anywhere in my code
AVAudioEngine setup only happens inside didActivate delegate method
Issue persists even after channel restoration via channelDescriptor(restoredChannelUUID:)
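For reference, the delegate wiring behind the setup above is roughly this. It is a trimmed sketch: PTTManager, the push payload key, and the engine setup are placeholders, and the other required delegate methods are omitted for brevity.

```swift
import PushToTalk
import AVFAudio
import UIKit

final class PTTManager: NSObject, PTChannelManagerDelegate, PTChannelRestorationDelegate {
    let audioEngine = AVAudioEngine()

    // Called for every PTT push, foreground or background.
    func incomingPushResult(channelManager: PTChannelManager,
                            channelUUID: UUID,
                            pushPayload: [String: Any]) -> PTPushResult {
        let name = pushPayload["speakerName"] as? String ?? "Unknown"
        return .activeRemoteParticipant(PTParticipant(name: name, image: nil))
    }

    // Expected next, but never called in the backgrounded-before-first-audio case.
    func channelManager(_ channelManager: PTChannelManager,
                        didActivate audioSession: AVAudioSession) {
        try? audioEngine.start()   // playback of the WebSocket audio starts here
    }

    func channelManager(_ channelManager: PTChannelManager,
                        didDeactivate audioSession: AVAudioSession) {
        audioEngine.stop()
    }

    func channelDescriptor(restoredChannelUUID channelUUID: UUID) -> PTChannelDescriptor {
        PTChannelDescriptor(name: "My Channel", image: nil)
    }

    // Other required PTChannelManagerDelegate methods omitted for brevity.
}
```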
Question:
Is this expected behavior or a bug? If expected, what's the correct approach to handle
incoming PTT audio when the app is backgrounded and hasn't received audio while in the
foreground yet?
Our app uses a 24-hour DeviceActivityMonitor repeating schedule to send users notifications for every hour of screen time they spend on their phone per day. Notifications are sent from eventDidReachThreshold callbacks at 1, 2, 3, etc, hour thresholds to keep them aware of their screen time.
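For context, the monitor extension is essentially this. It is a simplified sketch: the class name, event identifiers, and notification wording are placeholders.

```swift
import DeviceActivity
import UserNotifications

// Runs inside the Device Activity Monitor extension target.
final class ScreenTimeMonitor: DeviceActivityMonitor {
    // One event per hourly threshold (1h, 2h, 3h, ...), registered when
    // the 24-hour repeating schedule is started from the main app.
    override func eventDidReachThreshold(_ event: DeviceActivityEvent.Name,
                                         activity: DeviceActivityName) {
        super.eventDidReachThreshold(event, activity: activity)

        let content = UNMutableNotificationContent()
        content.title = "Screen time update"
        content.body = "You've reached another hour of screen time today."
        let request = UNNotificationRequest(identifier: event.rawValue,
                                            content: content,
                                            trigger: nil)
        UNUserNotificationCenter.current().add(request)
    }
}
```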
We have recently received an influx of emails from our users that after updating to iOS 17.6.1 their DeviceActivityMonitor notifications are saying their screen time was much higher than what is shown in DeviceActivityReport and their device's Screen Time settings.
These users have disabled "Share Across Devices" - but I suspect the DeviceActivityMonitor is still getting screen time from their other devices even though that setting is turned off.
Has anybody else noticed this, understands what is causing this, or could recommend a fix that we can tell our users to do?
0  CoreText  TDecorationRun::CalculateGlyphIntersections(CGAffineTransform, TRun const&, double, double, std::__1::function<void (double, double)> const&) const + 1704
1  CoreText  TDecorationRun::CalculateGlyphIntersections(CGAffineTransform, TRun const&, double, double, std::__1::function<void (double, double)> const&) const + 1440
2  CoreText  void TDecorationRun::DrawDecorationRun<(anonymous namespace)::TRunAdapter>(CGContext*, (anonymous namespace)::TRunAdapter, (anonymous namespace)::TRunAdapter, double)::'lambda'(CGPoint, CGPoint)::operator()(CGPoint, CGPoint) const + 508
3  CoreText  TDecorator::DrawDecoration(TLineDrawContext const&, TLine const&, TInlineVector<DecorationOverride, 30ul> const*) + 2356
4  CoreText  TLine::DrawUnderlines(CGContext*) const + 104
5  CoreText  TLine::DrawGlyphs(CGContext*) const + 292
Noticing a few issues with Screen Savers in macOS Tahoe developer beta 1:
The command
open x-apple.systempreferences:com.apple.ScreenSaver-Settings.extension
no longer works to open the Screen Savers preference pane. The reason? There is no longer a standalone Screen Saver preference pane - the settings still exist, but they're now a modal dialog (that cannot be resized) opened from within the Wallpaper preference pane by clicking a button.
Something funny is happening with legacyScreensaver - I think that
ScreenSaverView.init(frame:isPreview:)
may be getting passed wrong values, e.g. the isPreview boolean is false in Preview mode?
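To check this, I'm logging the flag straight from the initializer of a bare-bones saver. ProbeSaverView is just a throwaway subclass for this sketch.

```swift
import ScreenSaver

final class ProbeSaverView: ScreenSaverView {
    override init?(frame: NSRect, isPreview: Bool) {
        // Suspicion from above: this logs isPreview = false even when the
        // view is being created for the preview in System Settings.
        NSLog("ProbeSaverView init, isPreview = \(isPreview)")
        super.init(frame: frame, isPreview: isPreview)
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }
}
```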
I've submitted Feedback FB17895600 about some of these, and will report back more as I figure things out.
Please feel free to add to this thread, thanks!
Hello,
I’m developing an app for both iOS and macOS using Mac Catalyst.
On the iOS target, the DeclaredAgeRange framework works fine: I enabled the capability, set the entitlement, and can call the APIs successfully.
However, when I build the Mac Catalyst target, I get compile-time errors because DeclaredAgeRange cannot be imported.
Environment
Xcode 26.0.1
Deployment Target (iOS): 13.0
Mac Catalyst target enabled (same bundle ID, same source code base as iOS version)
iOS build works well with DeclaredAgeRange
What I tried
Added the Declared Age Range capability in Signing & Capabilities for iOS target → works fine.
Tried setting Mac Catalyst Deployment Target to macOS 15.0+ and rebuilding → still failed to import.
Because it didn’t help, I reverted and removed the separate macOS target setting.
On Catalyst, the module simply isn’t available (import DeclaredAgeRange fails).
From Apple’s documentation, my understanding is that DeclaredAgeRange should also be supported on Mac Catalyst (SDK 26+). But in practice I cannot import it in my Catalyst build.
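For now, the only way I can keep a single code base compiling is to guard the import, which obviously just disables the feature on Catalyst. This is a workaround sketch (the function name is mine), not a solution.

```swift
#if canImport(DeclaredAgeRange)
import DeclaredAgeRange
#endif

func requestAgeRangeIfAvailable() {
    #if canImport(DeclaredAgeRange)
    // iOS path: call the DeclaredAgeRange APIs as usual.
    #else
    // Mac Catalyst path: module not importable in my build, so the feature is skipped.
    #endif
}
```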
Questions
Is DeclaredAgeRange actually supported on Mac Catalyst right now?
If so, are there additional setup steps (capabilities, entitlements, provisioning, etc.) specific to Catalyst?
Has anyone successfully imported and used DeclaredAgeRange in a Catalyst project?
Any guidance or confirmation would be greatly appreciated.
Thanks in advance.
Hi all,
I'm trying to add Spotlight support to a macOS app that handles custom virtual machine bundles with the .vpvm extension. I’ve followed the current documentation and used the modern CSImportExtension approach with a Spotlight Importer extension target.
Here’s what I’ve done:
App Info.plist:
Declared com.makeprog.vpvm as a UTI conforming to com.apple.package.
Registered it under UTExportedTypeDeclarations and CFBundleDocumentTypes.
Spotlight Importer Extension:
Added a new macOS target using the Spotlight Import Extension template.
Set the NSExtensionPointIdentifier to com.apple.spotlight.import.
Used CSSupportedContentTypes = com.makeprog.vpvm.
Implemented a minimal update(_ attributes:forFileAt:) method that sets displayName, title, and contentDescription (see the sketch further below).
Other steps:
Verified that the .appex is embedded under Contents/PlugIns/.
Confirmed it appears in mdimport -e output with correct UTI.
Used mdimport -m -d2 -t /path/to/file.vpvm, but I still get:
Imported '/path/to/file.vpvm' of type 'com.makeprog.vpvm' with no plugIn.
The extension is never invoked. I’ve also tried:
Ensuring the .vpvm file is a valid directory bundle.
Restarting Spotlight / rebuilding index.
Ensuring the app and extension are properly signed.
Installing the app in a test virtual machine.
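The update implementation mentioned above is essentially this minimal sketch (the class name is whatever the template generated; the attribute values are placeholders):

```swift
import CoreSpotlight

class ImportExtension: CSImportExtension {
    override func update(_ attributes: CSSearchableItemAttributeSet,
                         forFileAt contentURL: URL) throws {
        // Never reached: mdimport reports "no plugIn" for .vpvm files.
        attributes.displayName = contentURL.deletingPathExtension().lastPathComponent
        attributes.title = attributes.displayName
        attributes.contentDescription = "Virtual machine bundle"
    }
}
```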
Question:
Has anyone successfully used CSImportExtension for custom UTIs?
Is there something additional I need to do for the extension to be recognized and triggered?
Any advice or examples would be greatly appreciated!
Thanks in advance.
The name of our app is a portmanteau, and Siri is getting it slightly wrong. It’s putting the emphasis on the second and fourth syllables instead of the first and third. Kind of like if someone started singing /tinˈeɪʒd muˌtent/. It’s just wrong enough to be funny. It still recognizes the correct name when someone says “hey Siri”.
We’ve already tried adding CFBundleSpokenName and INAlternativeAppNamePronunciationHint to info.plist, but neither is changing how Siri says it. We can’t put it in AppIntentVocabulary.plist because we don’t know the key path for the app name.
I implemented the AlarmKit framework in my app.
It works fine when the device's screen is off,
but it doesn't fire when the screen is on.
No sound, no Dynamic Island widget. What's wrong... is this just me?
I tried other apps on the App Store and they worked fine.
Maybe it's because they were built with SwiftUI?
I'm a Flutter developer. Could this happen just because my app is non-native? I'm so frustrated.
We are not receiving incoming calls from blocked numbers on iOS versions below 26, but from iOS 26 onwards we are receiving those incoming calls.
Can you please provide any solutions to fix this issue?
Hello, developers! I’m interested in adding my native language and flag to iOS, iPadOS, and macOS. Could you please recommend a solution or provide some valuable advice on how to achieve this? I’d greatly appreciate your assistance. Thanks!
Explanation of the issue
When tethering is enabled and a wireless connection is established,
there are instances where an IP address is not assigned.
Steps to Reproduce the Issue (if possible)
Enable iPhone tethering and connect wirelessly using 802.11ax.
Expected Result
The iPhone assigns an IP address, enabling network connectivity.
Actual Result Observed
DHCP negotiation fails.
After the client attempts communication with the DHCP server via a DHCP Discover, a DHCP Offer is returned from the iPhone.
If that Offer is missed, the client retries by performing another DHCP Discover.
However, the iPhone does not issue another DHCP Offer no matter how many times the client retries.
The IP address is not assigned unless the wireless connection is disconnected and reconnected.
If the Offer for the initial Discover is missed, does this invalidate Offers for subsequent Discover retries?
The above issue has been confirmed on iPhone 17 Pro and iPhone 16. It does not appear to occur on iPhone 15.
Hello Team,
We are currently experiencing a challenge deploying our application to the production environment. This is because the previous deployment was done using an old account, whereas we have now created a new organizational account that we are using to deploy the updated and modified version of the application.
Kindly advise on the best practice to resolve this and proceed with the deployment.
Hi,
I'm trying to add an extension to my app on iOS 26.
I've followed the instructions on https://developer.apple.com/documentation/extensionfoundation/adding-support-for-app-extensions-to-your-app
and made it as far as being able to launch the extension:
let monitor = try await AppExtensionPoint.Monitor(appExtensionPoint: .localWebServerExtension)
currentIdentity = monitor.identities.first
if let currentIdentity = currentIdentity {
    let myConfig = AppExtensionProcess.Configuration(appExtensionIdentity: currentIdentity, onInterruption: { NSLog("extension was terminated") })
    myProcess = try await AppExtensionProcess(configuration: myConfig)
    myConnection = try myProcess?.makeXPCConnection()
}
None of these calls throw, and when I examine myProcess from inside that code, it seems to be normal (there's a pid, for example).
Yet the code inside my extension seems to not be executed: breakpoints are not triggered, NSLog() calls do not appear on the console. The onInterruption() callback is also not triggered, or at least it does not appear on the console either.
I've probably missed something obvious, but what could it be?