I'm encountering an issue with front camera video recordings via browser (Safari/Chrome) on devices running iOS/iPadOS 18 and above:
On iPad, the recorded video appears upside down.
On iPhone, the recorded video is rotated 90 degrees.
The rear camera functions correctly without orientation issues.
This problem seems specific to browser-based recordings, as the native Camera app records videos with the correct orientation.
Has anyone else experienced this behavior? Is there a known workaround or fix?
The preview while recording is fine; only the recorded video is oriented incorrectly.
Hello,
We're seeing an intermittent issue when playing back FairPlay-protected HLS downloads while the device is offline.
Assets are downloaded using AVAggregateAssetDownloadTask with FairPlay protection.
After download, asset.assetCache.isPlayableOffline == true.
On first playback attempt (offline), ~8% of downloads fail.
Retrying playback always works. We recreate the asset and player on each attempt.
During the playback setup, we try to load variants via:
try await asset.load(.variants)
This call sometimes fails with:
Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={NSUnderlyingError=0x105654a00 {Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={NSDescription=The Internet connection appears to be offline.}}, NSErrorFailingURLStringKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, NSErrorFailingURLKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, NSURL=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, AVErrorFailedDependenciesKey=(
    "assetProperty_HLSAlternates"
), NSLocalizedDescription=The Internet connection appears to be offline.}
This variant load is used to determine available audio tracks, check for Dolby support, and apply user language preferences.
After this step, the AVPlayerItem also fails via Combine’s publisher for .status.
However, retrying the entire process immediately after (same offline conditions, same asset path, new AVURLAsset) results in successful playback.
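For reference, a minimal sketch of that retry flow (a fresh AVURLAsset per attempt, as described above; names are illustrative):

import AVFoundation

// Hedged workaround sketch: recreate the asset and retry .variants once,
// since the second attempt reliably succeeds in our testing.
func loadVariantsWithRetry(makeAsset: () -> AVURLAsset,
                           attempts: Int = 2) async throws -> [AVAssetVariant] {
    var lastError: Error?
    for _ in 0..<attempts {
        let asset = makeAsset() // fresh AVURLAsset per attempt
        do {
            return try await asset.load(.variants)
        } catch {
            lastError = error // first attempt may fail with NSURLErrorDomain -1009
        }
    }
    throw lastError ?? CancellationError()
}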
Assets are represented using the following class:
public class DownloadedAsset: AVURLAsset {
    public let id: String
    public let localFileUrl: URL
    public let fairplayLicenseUrlString: String?
    public let drmToken: String?

    var isProtected: Bool {
        return fairplayLicenseUrlString != nil
    }

    public init(id: String,
                localFileUrl: URL,
                fairplayLicenseUrlString: String?,
                drmToken: String?) {
        self.id = id
        self.localFileUrl = localFileUrl
        self.fairplayLicenseUrlString = fairplayLicenseUrlString
        self.drmToken = drmToken
        super.init(url: localFileUrl, options: nil)
    }
}
We use user-selected quality levels to control bitrate and multichannel (e.g. Dolby 5.1) downloads:
let downloadQuality = UserDefaults.standard.downloadVideoQuality
let bitrate: Int
let shouldDownloadMultichannelTracks: Bool

switch downloadQuality {
case .dataSaver:
    shouldDownloadMultichannelTracks = false
    bitrate = 596564
case .standard:
    shouldDownloadMultichannelTracks = false
    bitrate = 1503844
case .best:
    shouldDownloadMultichannelTracks = true
    bitrate = 7038970
}

var selections = multichannelIdentifiedMediaSelections
if !shouldDownloadMultichannelTracks {
    selections = selections.filter { !$0.isMultichannel }
}

let task = session.aggregateAssetDownloadTask(
    with: asset,
    mediaSelections: selections.map { $0.mediaSelection },
    assetTitle: title,
    assetArtworkData: nil,
    options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: bitrate]
)
Seen on devices running iOS 16, iOS 17, and iOS 18.
What could cause the initial failure of an otherwise valid, offline-ready FairPlay HLS asset?
Could .load(.variants) internally trigger a failed network resolution, even when offline?
Is there an internal caching or initialization behavior in AVFoundation that might explain why the second attempt works?
Any guidance would be appreciated.
Topic: Media Technologies
SubTopic: Streaming
Tags: FairPlay Streaming, iOS, HTTP Live Streaming, AVFoundation
Hello everyone,
I’d like to propose Sense & Store — a seamless integration between Safari and the App Store, powered by on-device AI, designed to understand what users are reading, searching, or selecting in Safari, and suggest relevant apps that match their current context or intention.
🔍 Key Idea:
“Sense” the user’s need through intelligent analysis of web content, then “Store” — offer the most relevant app, either already installed or available in the App Store.
🌟 Core Features:
• AI-powered context detection directly inside Safari
• Real-time app suggestions based on user intent
• Smart overlays when selecting text or data (e.g., phone numbers, emails, tools)
• Privacy-first: All AI runs on-device (Apple Neural Engine)
• Instant App Launch or Installation via StoreKit
✅ Examples:
• Reading an article on productivity? → Suggests Notion or Things.
• Looking up meditation tips? → Recommends Calm or Headspace.
• Selecting a phone number? → Offers CRM or spam blocker apps.
• Exploring code samples? → Suggests Pythonista or developer tools.
🔒 Privacy & Performance:
• 100% on-device intelligence (no data sent to servers)
• Follows Apple’s privacy framework
• Works with SafariKit + StoreKit + CoreML
⸻
I’m happy to provide a full prototype roadmap and technical architecture. Feedback and collaboration are welcome!
Would love to hear your thoughts — especially from developers who build for Safari, App Clips, or work with CoreML.
Thanks!
Jose Luiz Horta Barbosa Maurity Cruz - Apple lover...
Howdy. I'm trying to access media from a user's song library and receive:
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
I'm told I need to add a Media Library Access Capability. Nothing like this shows up in Xcode under Signing & Capabilities > +Capabilities. Also I can't find anything like this in my account in dev.apple.com.
How do I enable myself and a test user using another iPhone device to access my music and their music respectively?
Thanks!
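For reference, a minimal sketch of the standard MusicKit runtime authorization flow (this assumes an NSAppleMusicUsageDescription entry in Info.plist; it is not a confirmed fix for the entitlement error above):

import MusicKit

// Sketch: ask the user for music library access before any MusicKit request.
func requestMusicAccess() async -> Bool {
    let status = await MusicAuthorization.request()
    return status == .authorized // .denied / .restricted / .notDetermined otherwise
}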
Topic: Media Technologies
SubTopic: General
Tags: App Tracking Transparency, Media Player, iOS, MusicKit
I added a view controller in the storyboard, added a table view to that view, and added a cell to the table. When I run the app, navigate to that page, and use VoiceOver, I find that swiping up or down with three fingers makes VoiceOver announce a sentence in English. Without changing the accessibility of the cell, how can I make the announcement spoken after a three-finger swipe up or down be in Chinese?
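A minimal sketch of one possible approach, assuming the English sentence is the scroll status VoiceOver speaks after a three-finger scroll: UIScrollViewAccessibilityDelegate lets the table view's delegate supply that string, so a Chinese string can be returned without touching the cell's accessibility. Setting accessibilityLanguage is an additional assumption, not a verified fix.

import UIKit

class ListViewController: UIViewController, UITableViewDelegate, UIScrollViewAccessibilityDelegate {

    @IBOutlet var tableView: UITableView!

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.delegate = self
        // Assumption: hints VoiceOver to speak this view's strings in Chinese.
        tableView.accessibilityLanguage = "zh-CN"
    }

    // Replaces the default English "Page X of Y" style announcement
    // spoken after a three-finger scroll.
    func accessibilityScrollStatus(for scrollView: UIScrollView) -> String? {
        let pageHeight = max(scrollView.bounds.height, 1)
        let page = Int(scrollView.contentOffset.y / pageHeight) + 1
        let pages = max(Int(ceil(scrollView.contentSize.height / pageHeight)), 1)
        return "第 \(page) 页，共 \(pages) 页"
    }
}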
We are seeing a significant number of crash reports from iOS 18 users involving the AVKit framework, starting from the line [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which appears to come from Apple's internal SDK. We found two kinds of crashes:
UI modification on background thread
From the stack trace, it seems that when AVPictureInPictureController is deallocated and its view is removed from its superview, the code somehow executes on a background thread, because the line _AssertAutoLayoutOnAllowedThreadsOnly is highlighted just before the crash.
But I've checked our code that interacts with AVPictureInPictureController: in the locations where we deallocate the object, it is always called on the main thread, inside viewDidLoad and deinit of a UIViewController subclass. From the logs, the crash seems to happen when the user opens another piece of content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController is the hint to this issue; something is probably broken there, since this issue happens across our app versions but only for iOS 18 users.
Unfortunately, I have not been able to reproduce this issue yet; one of my colleagues reproduced it once but hasn't been able to do it again since. The reports keep rising each day, up to 1.3k events in the last 30 days now.
Over release object
This one has fewer reports than the first, but I decided to include it since it might hold relevant information about the first crash: the start of the stack trace is similar. The crash timing also seems similar: we deallocate an existing AVPictureInPictureController and later replace it with a new one. It is likewise found only on iOS 18 and also points to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have not been able to reproduce this one so far either.
Oh, and both of the issues happened on both iPhone and iPad.
We'd appreciate any advice on what we can do to avoid this in the future, and any hint as to why it could happen.
I have reported this issue with bug number: FB15620734
I also attached one sample crash report for each of the crashes here.
non ui thread access.crash
over release.crash
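For reference, a hedged sketch of the main-thread teardown pattern described above (pipController is an illustrative property name; this is what we believe should already be safe, not a confirmed fix):

import AVKit

final class PlayerCoordinator {
    private var pipController: AVPictureInPictureController?

    // Replace the active PiP controller, forcing the release of the old
    // instance to happen on the main thread.
    func replacePiP(with newController: AVPictureInPictureController?) {
        let performSwap = { [weak self] in
            self?.pipController?.delegate = nil
            self?.pipController = newController
        }
        if Thread.isMainThread {
            performSwap()
        } else {
            DispatchQueue.main.async(execute: performSwap)
        }
    }
}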
We just dropped support for iOS 16 in our app and migrated to the new properties on Locale to extract the language code, region, and script. However, after doing this we are seeing an issue where the script property is returning a value when the language has no script.
Here is the initializer that we are using to populate the values. The identifier is coming from the preferredLanguages property that is found on Locale.
init?(identifier: String) {
    let locale = Locale(identifier: identifier)
    guard
        let languageCode = locale.language.languageCode?.identifier
    else {
        return nil
    }

    language = languageCode
    region = locale.region?.identifier
    script = locale.language.script?.identifier
}
Whenever I inspect locale.language I see all of the correct values. However, when I inspect locale.language.script directly it is always returning Latn as the value. If I inspect the deprecated locale.scriptCode property it will return nil as expected.
Here is an example from the debugger for en-AU. I also see the same for other languages such as en-AE, pt-BR.
Since the language components show the script as nil, then I would expect locale.language.script?.identifier to also return nil.
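A minimal reproduction of the mismatch (the comments reflect the values observed above; the expected value comes from the deprecated API):

import Foundation

let locale = Locale(identifier: "en-AU")
print(locale.language.languageCode?.identifier ?? "nil") // "en"
print(locale.region?.identifier ?? "nil")                // "AU"
print(locale.language.script?.identifier ?? "nil")       // "Latn" (observed)
print(locale.scriptCode ?? "nil")                        // "nil" (deprecated API, expected)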
I have been appealing to Apple to delete a rejected build, since we have a new API and an upgraded build for a new iOS version to submit, but we must bump the version from 2 to 3.
But the review bot, I believe, doesn't read my appeal; it keeps responding with exactly the same rejection notes as before.
I don't see the [+] button in the top left, so I can't add a new version. I sent emails, made appeals via the "Review Board" link, and submitted a case. There has been no response. It's been over a month.
Can anyone recommend a solution, please?
When my app starts it loads data (of vehicle models, manufacturers, ...) from JSON files into CoreData. This content is static.
Some CoreData entities have fields that can be set by the user, for example an isFavorite boolean field.
How do I tell CloudKit that my CoreData objects are 'static' and must not be duplicated on other devices (which will also load them from the JSON files)?
In other words, how can I make sure that the CloudKit knows that the record created from JSON for vehicle model XYZ on one device is the same record that was created from JSON on any other device?
I'm using NSPersistentCloudKitContainer.
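Not a confirmed answer, but one pattern worth noting: NSPersistentCloudKitContainer does not support unique constraints, so Apple's CloudKit sync sample deduplicates after import instead. A sketch of that idea, where "jsonID" is a hypothetical stable identifier stored from the JSON:

import CoreData

// Hedged sketch: keep exactly one object per jsonID, deleting duplicates that
// arrive from other devices. User-set fields (e.g. isFavorite) should be merged
// into the surviving object before deleting, which is omitted here.
func deduplicateVehicleModels(in context: NSManagedObjectContext) {
    context.perform {
        let request = NSFetchRequest<NSManagedObject>(entityName: "VehicleModel")
        request.sortDescriptors = [NSSortDescriptor(key: "jsonID", ascending: true)]
        guard let models = try? context.fetch(request) else { return }
        var seen = Set<String>()
        for model in models {
            guard let id = model.value(forKey: "jsonID") as? String else { continue }
            if !seen.insert(id).inserted {
                context.delete(model) // a duplicate of an already-seen jsonID
            }
        }
        if context.hasChanges { try? context.save() }
    }
}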
My Objective-C Catalyst app, when built with Xcode 16.x/iOS 18, does not have a visible Tab Bar when run on Sequoia. The app starts up in the first tab, but there is no way to access the other tabs. The same app, when run on macOS Sonoma (or macOS Catalina), has a normal Tab Bar.
The app's initial view controller is a UITabBarController with 3 tabs. The main tab is a UISplitViewController. The minimum macOS deployment target is 10.15.
If the app is built on Sonoma with Xcode 15.x/iOS 17, the Tab Bar is normal on macOS Sonoma, Sequoia, and Catalina.
I've tried without success:
if (@available(macCatalyst 18.0, *)) {
    self.tabBarController.tabBarHidden = false;
} else {
    // Fallback on earlier versions
}
I wonder if this console log message has anything to do with the problem:
CLIENT OF UIKIT REQUIRES UPDATE: This process does not adopt UIScene lifecycle. This will become an assert in a future version.
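If the scene-lifecycle warning is related, adoption would look roughly like the sketch below (shown in Swift for brevity even though the app is Objective-C; it also requires a UIApplicationSceneManifest entry in Info.plist, and whether this actually restores the tab bar on Sequoia is unverified):

import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        // Load the storyboard's initial UITabBarController, as before.
        window.rootViewController = UIStoryboard(name: "Main", bundle: nil)
            .instantiateInitialViewController()
        self.window = window
        window.makeKeyAndVisible()
    }
}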
I'm not sure if I have found a bug in iOS or if it's just unexpected behavior in my implementation. I have a gomobile library that sets up a local HTTP server and needs to be able to write to temporary storage. If I use the shared library from my main app's process, it can write to FileManager.default's temporary storage.
While Xcode is running a debug session, I can use that same process from my File Provider replicated extension and it works fine. However, when my File Provider extension starts the gomobile shared library directly, instead of via the app first, the library fails to write anything: not to the FileManager.default temporary storage, not to the temporary storage of the NSFileProviderManager for my file provider domain, and not even to the app group library.
It is odd, because I have a Swift URL extension that confirms the temporary storage can be written to from Swift. I have monitored console logs for fileproviderd and my extension, and have tried writing data to a log file; nothing seems to catch exactly what causes the File Provider extension to crash and restart.
I also cannot keep the shared gomobile server running in the background on iOS, even if I force the user to "authenticate" with the main app first. I'm pretty sure the File Provider extension needs to run the gomobile library itself for this to work right.
I'm wondering if something in the iOS sandbox could be preventing the File Provider extension from letting a C-based gomobile shared library access temporary storage.
Any guidance on further things to try would be greatly appreciated. I have tried every avenue I can think of.
I cannot run just the appex itself on either my M4 Pro MacBook or my iPhone, so attaching the debugger has been tricky, and I don't see much in the way of useful logs in the Console app either, just a swarm of noise.
I'm fairly confident the issue is with writing to temporary storage from the gomobile C library and not much else. The app was working great as "Designed for iPad" on macOS, which seems rather ironic: an iOS code base runs better on macOS than on my iPhone 16 Pro Max. I'm all for the sandbox; I just wish it didn't treat C-level gomobile libraries differently than it treats Swift code.
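For anyone comparing notes, a sketch of the kind of Swift-side write probe mentioned above (the directory URL is whatever the extension is testing, e.g. FileManager.default.temporaryDirectory or the app group container):

import Foundation

// Returns true if a throwaway file can be written and removed in `directory`.
func canWrite(to directory: URL) -> Bool {
    let probe = directory.appendingPathComponent(UUID().uuidString)
    do {
        try Data("probe".utf8).write(to: probe)
        try FileManager.default.removeItem(at: probe)
        return true
    } catch {
        NSLog("Write probe failed at %@: %@", directory.path, String(describing: error))
        return false
    }
}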
Hello,
I recently enrolled in the Apple Developer Program and created an App ID with the bundle ID com.echo.eyes.voice.
I am trying to enable Speech Recognition in the App ID capabilities list, but the option does not appear — even after waiting over a week since my membership was activated.
I’ve already:
Confirmed my Apple Developer account is active
Checked the Identifiers section in the Developer portal
Tried editing the App ID, but Speech Recognition is not listed
Contacted both Developer Support and Developer Technical Support (Case #102594089120), but was told to post here for help
My app uses Capacitor + the @capacitor-community/speech-recognition plugin. I need the com.apple.developer.speech-recognition entitlement to appear so I can use native voice input in iOS.
I would really appreciate help from an Apple engineer or anyone who has faced this issue.
Thank you,
— Daniel Colyer
I have a UITextView that contains paragraphs with bulleted lists (via NSTextList). I also implement NSTextContentStorageDelegate.textContentStorage(_:textParagraphWith:) in order to apply some custom attributes to the text without affecting the underlying attributed text. My implementation returns a new NSTextParagraph that modifies the foreground color of the text. I based this on the example in the WWDC21 session "Meet TextKit 2".
UITextView stops rendering the bullets when I implement the delegate function and return a custom paragraph. Why?
func textContentStorage(_ textContentStorage: NSTextContentStorage, textParagraphWith range: NSRange) -> NSTextParagraph? {
    guard let originalText = textContentStorage.textStorage?.attributedSubstring(from: range) else { return nil }

    let updatedText = NSMutableAttributedString(attributedString: originalText)
    updatedText.addAttribute(.foregroundColor, value: UIColor.green, range: NSRange(location: 0, length: updatedText.length))

    let paragraph = NSTextParagraph(attributedString: updatedText)

    // Verify that the text still contains NSTextList
    if let paragraphStyle = paragraph.attributedString.attribute(.paragraphStyle, at: 0, effectiveRange: nil) as? NSParagraphStyle {
        assert(!paragraphStyle.textLists.isEmpty)
    } else {
        assertionFailure("Paragraph has lost its text lists")
    }

    return paragraph
}
I have filed a bug report for this (FB17734946), but I'm posting it here verbatim in case others have the same issue and in hopes of getting attention from an Apple engineer sooner.
When calling setNeedsDisplayInRect on a CATiledLayer - or a UIView whose backing layer is CATiledLayer - one would expect to re-draw only a region identified by the rect passed to the method. This is even written in the documentation for the class:
"Regions of the layer may be invalidated using the setNeedsDisplayInRect: method however the update will be asynchronous. While the next display update will most likely not contain the updated content, a future update will."
However, upon calling this method, CATiledLayer redraws whole contents instead of just the tile at the specified rect, and it flashes when doing so. It behaves exactly the same as if one had called setNeedsDisplay without passing any rect; all contents are cleared and re-drawn again. I'm 100% sure I've passed in the correct rect of the exact tile that I need to redraw. I have even tried passing much smaller rects, but still the same. (And yes, the rect I've passed accounts for the current level of detail.)
I have found the GitHub repo https://github.com/frankus/NetPhotoScroller, which, based on the discussion at https://forums.macrumors.com/threads/catiledlayer-blanks-out-tiles-when-redrawing.1333948/, aims to solve these issues by using two private methods of the CATiledLayer class:
- (void)setNeedsDisplayInRect:(CGRect)r levelOfDetail:(int)level;
- (BOOL)canDrawRect:(CGRect)rect levelOfDetail:(int)level;
I have explored the repo in detail; however, I wasn't able to test exactly this code from the GitHub repo. I have tried using those two private methods myself (through an Objective-C class that declares the methods in its header file and a Swift class that inherits from it), but I couldn't solve the issue; the flashing and the full re-draw are still there.
After doing a lot of research, the conclusion seems to be that one cannot use CATiledLayer with contents that are downloaded remotely, on demand, as tiles are being requested.
I have, however, found one interesting thing which seems to work so far: before calling setNeedsDisplayInRect (or just setNeedsDisplay, as they behave the same for CATiledLayer in my testing), cache the current layer's contents, and after calling setNeedsDisplay (or setNeedsDisplayInRect), restore the contents back to the layer. This prevents flashing and preserves any tiles that were drawn at the time of the re-draw.
let c = tiledLayer.contents
tiledLayer.setNeedsDisplay(tileRect)
tiledLayer.contents = c
However! Docs clearly state the warning:
Do not attempt to directly modify the contents property of a CATiledLayer object. Doing so disables the ability of a tiled layer to asynchronously provide tiled content, effectively turning the layer into a regular CALayer object.
I believe this warning refers to setting the contents property to some raw content, like image data, and that it may be safe to re-apply the existing contents (which in my testing are of type CAImageProvider) -- but I can't rely on an implementation detail in my production app.
I have tested this and confirmed that the bug appears on:
iPhone 14 Pro, iOS 18.5
iPhone 13 Pro, iOS 17.5.1
iPhone 5s, iOS 15.8.3
iPad Pro 1st gen, iPadOS 18.4.1
a couple simulator versions
I can also confirm that the fix (to re-apply contents property) is also working properly on all these versions.
Is this expected behavior, that tiled layer redraws itself entirely instead of redrawing specific tiles?
Is it safe to modify contents of a CATiledLayer by re-applying the existing contents?
If not, is there an alternative to avoid flashing?
Hello Apple Developer Community,
I'm investigating Core ML model loading behavior and noticed that even when the compiled model path remains unchanged after an APP update, the first run still triggers an "uncached load" process. This seems to impact user experience with unnecessary delays.
Question: Does Core ML provide any public API to check whether a compiled model (from a specific .mlmodelc path) is already cached in the system?
If such API exists, we'd like to use it for pre-loading decision logic - only perform background pre-load when the model isn't cached.
Has anyone encountered similar scenarios or found official solutions? Any insights would be greatly appreciated!
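I'm not aware of a public cache-query API (hence the question); the closest proxy I can think of is timing the load itself, sketched below, where a long first load suggests an uncached load:

import CoreML
import Foundation

// Sketch: time the load as a rough, indirect signal of cache state.
func timedLoad(of compiledModelURL: URL) async throws -> MLModel {
    let start = Date()
    let model = try await MLModel.load(contentsOf: compiledModelURL,
                                       configuration: MLModelConfiguration())
    let elapsed = Date().timeIntervalSince(start)
    print("Model load took \(elapsed)s (a long first load suggests an uncached load)")
    return model
}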
I’d like to create a button on my iPhone Home Screen that changes settings if I am using an external lens mounted to my phone.
This is what I have, along with notes on what I hope to do, but it does not work: I am only able to open either the Camera settings or the Camera Preserve Settings screen; nothing will toggle Macro Control.
There is an additional step within the camera app as well but it can be skipped for brevity.
Name: Telephoto and macro lenses
Check if macro settings are already enabled or not. (Settings > Camera > Macro Control toggle) as well as (Settings > Camera > Preserve Settings > Macro Control toggle)
Non-working shortcut code:
Verification question: "Are you using an external lens?"
If yes,
Notification: "Turning on Macro Control and preserve settings for macro control." (Run below immediately)
prefs:root=CAMERA&path=Turn%20On%20Macro_Control
And
prefs:root=CAMERA&path=CameraPreserveSettingsSwitch&path=Turn%20On%20Macro_Control
If no,
Notification: "Turning off Macro Control and Preserve Settings for macro control." (Run below immediately)
prefs:root=CAMERA&path=Turn%20Off%20Macro_Control
And
prefs:root=CAMERA&path=CameraPreserveSettingsSwitch&path=Turn%20Off%20Macro_Control
Hi,
I am trying to remove the audio controls for my app on the lock screen. Since I use WKWebView, there are 3 audio tags in my HTML, and I play and pause them via JS. However, if I do not play any sound since app launch, there are no audio controls on the lock screen. But if I play one of those 3 files (they are even less than 3-second sound effects, e.g. for buttons), the audio controls appear on the lock screen.
Note that even when the sounds are paused via pause() or not playing at all, they are listed on the lock screen.
What I have tried so far without success
MPNowPlayingInfoCenter.default().nowPlayingInfo = [:]
and
try audioSession.setCategory(.playback, mode: .default, options: [])
try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
and
UIApplication.shared.endReceivingRemoteControlEvents()
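Combined into one self-contained sketch (audioSession is assumed to be AVAudioSession.sharedInstance()):

import AVFoundation
import MediaPlayer
import UIKit

// The three attempts above, in one place.
func clearLockScreenAudioControls() {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [:]
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playback, mode: .default, options: [])
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
    } catch {
        print("Audio session deactivation failed: \(error)")
    }
    UIApplication.shared.endReceivingRemoteControlEvents()
}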
Another problem is that the app scales with the iOS system setting "Display Zoom". Is there a way to prevent that?
I am on the latest Xcode version, 16.3, and iOS 18.
I have no background mode in my Capabilities.
Nothing worked so far. Has anyone an idea?
Greetings
Hi Apple engineering team,
I’m trying to integrate the new Live Caller ID Lookup (PIR) on iOS using your pir-service-example code as well as a custom mock server in Vapor, but the extension never advances past the /issue/token-key-for-user-token step. I’ve tried both:
1. Official Example
Cloned https://github.com/apple/pir-service-example
Ran PIRService locally
Confirmed that
GET /.well-known/private-token-issuer-directory → 200
GET /issue/token-key-for-user-token → 200 (DER bytes, correct SPKI)
No POST /issue ever fires
2. Mock Server (Vapor)
Implemented all five endpoints (/config, /.well-known/private-token-issuer-directory, /issue/token-key-for-user-token, /issue, /queries)
Verified with curl and openssl asn1parse that:
GET /.well-known/private-token-issuer-directory
Content-Type: application/private-token-issuer-directory
{ "issuer-request-uri":"https://…/issue", "token-keys":[…] }
GET /issue/token-key-for-user-token
Content-Type: application/octet-stream
<DER bytes>
Added Cache-Control: public, max-age=3600 on directory and SPKI
Stubbed POST /issue to always return { "token": "" }
Still no POST /issue request from the extension
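For concreteness, a sketch of the directory route in Vapor with the header and caching details listed above (the URL and token-keys values are placeholders):

import Vapor

func routes(_ app: Application) throws {
    // Issuer directory with the expected media type and a cache lifetime.
    app.get(".well-known", "private-token-issuer-directory") { _ -> Response in
        let body = #"{"issuer-request-uri":"https://example.com/issue","token-keys":[]}"#
        var headers = HTTPHeaders()
        headers.add(name: .contentType, value: "application/private-token-issuer-directory")
        headers.add(name: .cacheControl, value: "public, max-age=3600")
        return Response(status: .ok, headers: headers, body: .init(string: body))
    }
}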
Reproduction Steps
Install and enable a Live Lookup extension pointing to my server.
Trigger an incoming call on device.
Watch the server logs: only the two GETs appear, never /issue or /queries.
Expected Behavior
After fetching the SPKI DER, the framework should issue a POST /issue call (Privacy Pass flow) and then POST /queries.
Observed Behavior
Stuck in an infinite loop of:
GET /.well-known/private-token-issuer-directory
GET /issue/token-key-for-user-token
(repeat…)
No progression to the /issue or /queries endpoints.
What I’ve Tried
Verified JSON kebab-case and headers exactly match examples
Confirmed SPKI DER is valid via openssl asn1parse
Added Cache-Control headers
Tested on real device, localhost url, and ngrok public URL
Mocked a valid-looking token response
Could you advise what additional requirement or format detail I'm missing that prevents the extension from advancing past /issue/token-key-for-user-token?
These are the main files:
LiveLookupExtension.swift
routes.swift
service-config.json
Thanks in advance!