Is it possible to start screen recording (through Control Center) without user prompt?
I mean, ask for the user's permission the first time, and after that start and stop recording programmatically only?
I need to record screen only for specific events.
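For context, the closest API I am aware of for programmatic start/stop inside an app is ReplayKit's RPScreenRecorder, which shows its own consent prompt and captures only this app's screen, not the system-wide Control Center recording. A rough sketch (RecordingController is just an illustrative name):

import ReplayKit

final class RecordingController {
    private let recorder = RPScreenRecorder.shared()

    // Start capturing this app's screen when a specific event begins.
    func startForEvent() {
        guard recorder.isAvailable, !recorder.isRecording else { return }
        recorder.startRecording { error in
            if let error = error { print("Failed to start recording: \(error)") }
        }
    }

    // Stop capturing and optionally present the preview for saving/sharing.
    func stopForEvent() {
        recorder.stopRecording { previewController, error in
            if let error = error { print("Failed to stop recording: \(error)") }
            // Present previewController here if the user should save or share the clip.
        }
    }
}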
Problem Description
I am using CLLocationManager to obtain the device's compass heading (direction), and I have encountered an abnormal behavior:
When the user is stationary: After calling startUpdatingHeading(), the CLHeading object returned in the locationManager(_:didUpdateHeading:) callback correctly reflects the device’s actual physical orientation (i.e., the direction the top of the device is pointing) in terms of magnetic north / true north, via the magneticHeading and trueHeading properties. When I rotate the device, the heading values change accordingly — this is the expected behavior.
But when the user is in motion (e.g., driving a car): Even if I rotate the device, the values of magneticHeading and trueHeading no longer reflect the device’s actual orientation. Instead, they consistently return what appears to be the user's or vehicle's travel direction (forward direction). In other words, the compass behaves as if it is reporting the direction of motion rather than the device’s actual facing direction.
Only after the user has completely stopped moving does rotating the device again result in magneticHeading and trueHeading reflecting the actual device orientation as expected.
However, on another device running iOS 16 (iPhone XR), this behavior does not occur — everything works normally.
Expected Behavior
I expect that regardless of whether the user is moving or not, the CLHeading values returned by CLLocationManager should always represent the physical orientation of the device itself (i.e., which direction the top of the device is pointing), as a standard compass should.
Actual Behavior
User is stationary, rotating the device: magneticHeading / trueHeading change properly according to the device’s actual orientation
User is in motion (e.g., driving): magneticHeading / trueHeading remain fixed to the direction of motion (travel direction), and do not change when the device is rotated
User stops moving, then rotates the device: Compass behaves normally again, reflecting the actual device orientation
Environment Information
iOS Version: iOS 26.0.1
Device Models: iPhone 15 Pro / iPhone 17 Pro
Xcode Version: Xcode 26.0.1
Language: Objective-C
Questions
Is this a known issue in iOS? Are there any related radars or official documentation about it?
Have other developers encountered similar issues, especially where CLHeading behaves incorrectly when the user is in motion?
Do I need to set any specific parameters in CLLocationManager (such as headingOrientation) to resolve or work around this issue?
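For reference, here is a simplified Swift sketch of how the heading updates are set up (our project is Objective-C); the headingOrientation line is the parameter in question:

import CoreLocation

final class HeadingProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // Report headings relative to the top edge of the device in portrait.
        manager.headingOrientation = .portrait
        manager.headingFilter = 1 // degrees of change required before a new callback
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        // On the affected devices, these values track the travel direction while
        // driving, instead of the device's physical orientation.
        print("magnetic: \(newHeading.magneticHeading), true: \(newHeading.trueHeading)")
    }
}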
🙏 Thank you for your help — any insights, experiences, or official feedback regarding this issue would be greatly appreciated!
When I try to add the external test group to the current version, there is an error processing the request.
Context
I’m deploying large language models on iPhone using llama.cpp. A new iPhone Air (12 GB RAM) reports a Metal MTLDevice.recommendedMaxWorkingSetSize of 8,192 MB, and my attempt to load Llama-2-13B Q4_K (~7.32 GB weights) fails during model initialization.
Environment
Device: iPhone Air (12 GB RAM)
iOS: 26
Xcode: 26.0.1
Build: Metal backend enabled llama.cpp
App runs on device (not Simulator)
What I’m seeing
MTLCreateSystemDefaultDevice().recommendedMaxWorkingSetSize == 8192 MiB
Loading Llama-2-13B Q4_K (7.32 GB) fails to complete. Logs indicate memory pressure / allocation issues consistent with the 8 GB working-set guidance.
Smaller models (e.g., 7B/8B with similar quantization) load and run (8B Q4_K provides around 9 tokens/second decoding speed).
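For completeness, the working-set value above is read with a straightforward query like this (simplified):

import Metal

// Query the GPU's recommended working set size and report it in MiB.
if let device = MTLCreateSystemDefaultDevice() {
    let bytes = device.recommendedMaxWorkingSetSize // UInt64, in bytes
    print("recommendedMaxWorkingSetSize: \(Double(bytes) / (1024 * 1024)) MiB")
    print("hasUnifiedMemory: \(device.hasUnifiedMemory)")
}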
Questions
Is 8,192 MB an expected recommendedMaxWorkingSetSize on a 12 GB iPhone?
What values should I expect on other 2025 devices, including iPhone 17 (8 GB RAM) and iPhone 17 Pro (12 GB RAM)?
Is it strictly enforced by Metal allocations (heaps/buffers), or advisory for best performance/eviction behavior?
Can a process practically exceed this for long-lived buffers without immediate Jetsam risk?
Any guidance for LLM scenarios near the limit?
Hello Team,
I am trying to delete a photo from Photos, and for that I used this method:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Request deletion of the asset inside the change block.
    [PHAssetChangeRequest deleteAssets:@[assetToDelete]];
} completionHandler:^(BOOL success, NSError *error) {
    // On some devices this handler is never called (see below).
}];
This method pops up a dialog with Don't Allow or Delete. But sometimes, on some iPhones, the PHAssetChangeRequest deleteAssets call does not respond, so the completionHandler is never called, and after that I can't perform any other PHPhotoLibrary operation.
If I restart my iPhone, then it works. Many users of my app have complained about this issue. I have an iPhone 11 with iOS 15.3, but some iOS 12, 14, and 16 users also face the same issue.
So what exactly is the issue? Is it related to iOS or to the method?
Thanks,
Ankur
Problem Details: Could not launch “App”
Reproduction Route: Install Xcode 26.0 > Connect to iPhone 6 (iOS 12) > Run app
We tried this solution, but it didn't work:
To make Xcode 26 recognize and run apps on an iOS 12 physical device, you can manually add the missing device support files. Go to /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport/ on your Mac, where you'll see folders like 17.0 or 18.0, and download the matching iOS 12 folder (for example, 12.4) from the community-maintained repository.
In our project, we are now using UIScene. When we tap an input text field to bring up the keyboard and then rotate the device, the keyboard does not update:
How can we fix this issue?
Hello, my iOS apps are exiting right after launch on a few of our iOS devices. I tried a couple of my apps that are deployed to our fleet and they do the same thing. If I run the app(s) in the Simulator they work fine, and if I run the app(s) on the offending devices from Xcode they work fine as well. Once I stop the run in Xcode, the app on the device will not launch.
I'm thinking something is missing like a certificate etc. Just not sure.
Any ideas on how to troubleshoot this? I would really like to get this fixed.
After the macOS Sequoia update, my app seems to have an issue with Bluetooth communication between macOS and iOS; the app uses CoreBluetooth for Central-Peripheral communication.
Setup:
The iPhone (in my case: iPhone 14 Pro with iOS 18.0 (22A3354)) acts as the Central, and the Mac (in my case: 14" MacBook Pro 2023 with macOS 15.0 (24A335)) as the Peripheral.
I’ve implemented a mechanism where the Central (iPhone) sends a message to the Peripheral (Mac) every 15 seconds to keep the connection alive (Because it needs to wait for notify characteristic updates).
I never noticed this kind of issue before, but with macOS Sequoia I get it permanently.
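To illustrate the keep-alive, the Central does essentially the following every 15 seconds (a simplified sketch, not the exact attached code; the peripheral and characteristic come from the usual discovery callbacks):

import CoreBluetooth

final class KeepAlive {
    private var timer: Timer?

    func start(peripheral: CBPeripheral, characteristic: CBCharacteristic) {
        timer = Timer.scheduledTimer(withTimeInterval: 15, repeats: true) { _ in
            // Write a small payload with response so the peripheral acknowledges it.
            let payload = Data("ping".utf8)
            peripheral.writeValue(payload, for: characteristic, type: .withResponse)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}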
Issue:
The connection drops unexpectedly after a period of time (sometimes 20 seconds, sometimes a few minutes) with CBErrorDomain - code 6: The connection has timed out unexpectedly.
Sample Code:
Peripheral (Mac):
ContentView (Peripheral).txt
ContentViewModel (Peripheral).txt
Central (iPhone):
ContentView (Central).txt
ContentViewModel (Central).txt
Reproduce:
I attached sample code including the Central-Sample (for iPhone) and Peripheral-Sample (for Mac).
Just run the Peripheral-Sample (after granting Bluetooth permissions).
Then run the Central-Sample and select the Mac device in the list
After selecting it should connect, discover the service & characteristic and should start writing messages to it.
After some time, the func centralManager(_ central: CBCentralManager, didDisconnectPeripheral peripheral: CBPeripheral, error: (any Error)?) should get called with the timed out unexpectedly error.
Could anyone please look into this issue and advise on whether there’s a known bug or any workaround? Any guidance would be greatly appreciated, as this impacts the stability of Bluetooth communication between the devices.
Thanks in advance.
Logs:
I also ran Console.app during this issue, which captured these errors (in case this is helpful):
console_logs.txt
Hi everyone,
I'm encountering an unexpected behavior with modal presentations in UIKit. Here’s what happens:
I have UIViewControllerA (let’s call it the "orange" VC) pushed onto a UINavigationController stack.
I present UIViewControllerB (the "red" VC, inside its own UINavigationController as a .formSheet) modally over UIViewControllerA.
After a short delay, I pop UIViewControllerA from the navigation stack.
Issue:
After popping UIViewControllerA, the modal UIViewControllerB remains visible on the screen and in memory. I expected that dismissing (popping) the presenting view controller would also dismiss the modal, but it stays.
Expected Behavior:
When UIViewControllerA (orange) is popped, I expect the modal UIViewControllerB (red) to be dismissed as well.
Actual Behavior:
The modal UIViewControllerB remains on screen and is not dismissed, even though its presenting view controller has been removed from the navigation stack.
Video example: https://youtube.com/shorts/sttbd6p_r_c
Question:
Is this the expected behavior? If so, what is the recommended way to ensure that the modal is dismissed when its presenting view controller is removed from the navigation stack?
Code snippet:
import UIKit

class MainVC: UIViewController {
    private weak var orangeVC: UIViewController?

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .blue
        let dq = DispatchQueue.main
        dq.asyncAfter(deadline: .now() + 1) { [weak self] in
            // Push the "orange" view controller onto the navigation stack.
            let vc1 = UIViewController()
            vc1.view.backgroundColor = .orange
            vc1.modalPresentationStyle = .overCurrentContext
            self?.navigationController?.pushViewController(vc1, animated: true)
            self?.orangeVC = vc1
            dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                // Present the "red" view controller modally as a form sheet from the orange VC.
                let vc2 = UIViewController()
                vc2.view.backgroundColor = .red
                vc2.modalPresentationStyle = .formSheet
                vc2.isModalInPresentation = true
                let nav = UINavigationController(rootViewController: vc2)
                if let sheet = nav.sheetPresentationController {
                    sheet.detents = [.medium()]
                }
                self?.orangeVC?.present(nav, animated: true)
                dq.asyncAfter(deadline: .now() + 1) { [weak self] in
                    // Pop the orange VC; the red modal unexpectedly stays on screen.
                    self?.navigationController?.popViewController(animated: true)
                }
            }
        }
    }
}
Thank you for your help!
Hello, is it allowed to include the action=write-review URL parameter in customer support emails to direct users to the App Store review page?
Example: https://apps.apple.com/app/id[APP_ID]?action=write-review
I want to make it easy for customers to leave feedback after positive support interactions, but only if it's compliant.
Our app supports silent push, and we use the code below to check whether the app was launched by a silent push:
if let remoteNotification = launchOptions?[UIApplication.LaunchOptionsKey.remoteNotification] as? [AnyHashable: Any],
   let aps = remoteNotification["aps"] as? [AnyHashable: Any],
   let contentAvailable = aps["content-available"] as? Int,
   contentAvailable == 1 {
    isSilentNotification = true
}
This check runs when the app launches, in:
application(_:didFinishLaunchingWithOptions:)
But after migrating to UIScene, launchOptions is always nil, and we cannot tell whether the app was launched by a silent push.
I read the developer documentation:
it says:
If the app supports scenes, this is nil.
Continue using UIApplicationDelegate's application(_:didReceiveRemoteNotification:fetchCompletionHandler:) to process silent remote notifications after scene connection.
But the time at which application(_:didReceiveRemoteNotification:fetchCompletionHandler:) is called is too late for us.
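For reference, the documented fallback looks roughly like this inside the app delegate (isSilentNotification is the flag from the code above); the problem is that it only fires after the scene has already connected:

// In the UIApplicationDelegate:
func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    if let aps = userInfo["aps"] as? [AnyHashable: Any],
       (aps["content-available"] as? Int) == 1 {
        isSilentNotification = true // too late for our launch-time logic
    }
    completionHandler(.noData)
}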
So, apart from application(_:didReceiveRemoteNotification:fetchCompletionHandler:), are there any other ways to obtain the silent-push flag at launch, before the didReceiveRemoteNotification callback has fired?
In iOS 26.1 beta 4, under MDM restrictions that disable the camera via a configuration profile, the Camera and FaceTime apps are hidden as expected. However, other third-party apps can still access and use the camera function normally. This is unreasonable.
We've identified a regression in iOS 26.0 and 26.1 Beta 4 where AVSpeechSynthesisVoice(language:) no longer respects user-selected voices from Accessibility settings.
Issue: When users select a specific voice in Settings → Accessibility → Spoken Content → Voices, calling AVSpeechSynthesisVoice(language:) returns the system default voice instead of the user's selection. This worked correctly in iOS 18.6.2.
Particularly affects:
Third-party speech synthesis voices (CereProc, Grammatek, etc.)
Apps relying on automatic voice selection based on user preferences
Example:
// User selected CereProc Heather for en-GB in Accessibility settings
let voice = AVSpeechSynthesisVoice(language: "en-GB")
print(voice?.name) // iOS 18.6.2: "HEATHER", iOS 26: "Daniel" (system default)
Interesting observation: The new Accessibility Reader feature in iOS 26 correctly uses the user-selected voice, but Tap to Speak and the API both ignore the setting.
Tested methods:
AVSpeechSynthesisVoice(language:)
AVSpeechUtterance auto-selection
Reflection for new APIs
All return the system default voice, not the user's preference.
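For anyone reproducing this, listing the installed voices for the locale is a quick way to compare what's available against what the language-based initializer actually picks:

import AVFoundation

// List every installed en-GB voice, then see which one the initializer picks.
let installed = AVSpeechSynthesisVoice.speechVoices().filter { $0.language == "en-GB" }
for voice in installed {
    print(voice.name, voice.identifier, voice.quality.rawValue)
}

let chosen = AVSpeechSynthesisVoice(language: "en-GB")
print("Initializer picked:", chosen?.name ?? "nil")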
Filed: FB[20271264]
Has anyone else encountered this? Any known workarounds to programmatically access the user's preferred voice selection?
We have received several reports that our app cannot display UITableView cells on iOS 26, but users say they can still select cells with a single tap, and the UITableView didSelectRowAt delegate does respond!
I have filed a feedback report but received no response. Does anyone else have the same bug?
You can see that the page is blank; I have a video a user sent to me that proves he can select cells by tapping.
We cannot reproduce the bug and don't know how to fix it; we think this is a bug in iOS 26, so we are here for some help. This bug blocks distribution of our new version (which supports iOS 26).
This is the feedback https://feedbackassistant.apple.com/feedback/20677046
We are using a three-column split view as the root of our app and want to hide only the supplementary column in some cases, so it behaves like a two-column split view.
With the existing APIs we are unable to achieve this, since they hide the primary column as well and do not give the expected results:
.hide(.supplementary)
setViewController(nil, for: .supplementary)
But we have seen this behavior in the native Notes app when using the View as List and Gallery options.
Is there any way to achieve this while maintaining the three-column split view itself?
New to iOS development, and I've been trying to make heads or tails of the documentation. I know there is a difference between the data fields returned for songs from the user library and from the catalog, but whenever I search on the Apple site I can't find a list of each. For example, I'm trying to get the releaseDate of a song in my library, but it seems I'll have to cross-query the catalog entry using either song.catalogID or song.isrc, yet when I try to use them I can't find a cross reference between the two. I'm totally turned around.
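From what I can tell (treat this as a hypothetical sketch, I have not verified it is the intended path), the cross-query would look something like fetching the catalog counterpart by ISRC and reading releaseDate from that:

import MusicKit

// Hypothetical helper: look up the catalog counterpart of a library song by ISRC
// and read fields (like releaseDate) that the library item may not carry.
func catalogReleaseDate(for librarySong: Song) async throws -> Date? {
    guard let isrc = librarySong.isrc else { return nil }
    let request = MusicCatalogResourceRequest<Song>(matching: \.isrc, equalTo: isrc)
    let response = try await request.response()
    return response.items.first?.releaseDate
}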
I'm also trying to determine whether a song in my library has been favorited or not. isFavorited (or something similar) doesn't seem to be a thing. I'm using the code below and trying to find a way to display a solid star if the song has been favorited or an empty one if it's not. It seems like a basic request, but I can't find anything on how to do it. I've searched the docs, googled, and experimented.
Does Apple want us to query the user's Favorited Songs playlist or something? How do I know which playlist that is?
I know isFavorited isn't a thing; I'm just using it here so you can see what my intention is:
HStack(spacing: 10) {
    Image(systemName: song.isFavorited ? "star.fill" : "star")
        .foregroundColor(song.isFavorited ? .yellow : .gray)
    Image(systemName: "magnifyingglass")
}
Hi Guys,
I want to help my client enable Developer Mode, but they will not accept connecting their devices to any other machine (a Mac with Xcode) to enable it.
There are nearly 10 people who need to enable Developer Mode, but I think that without a Mac we can't enable Developer Mode on some devices. So I need clarification about iOS versions: we only want to list which iOS versions don't show the Developer Mode option by default. Please list those iOS versions,
Like below:
Developer Mode option available by default: iOS 17.4.1
Developer Mode option not available by default: iOS 17.5.1
Area: Software Update
Type of Feedback: Application Bug
Description
Device: iPhone 13 Pro running iOS 26
Build environment: Xcode 16.4
Problem description:
When a text field has secureTextEntry = YES and Password Autofill / Passkeys is active, the autofill panel is not included in the rect reported from the keyboard notifications (UIKeyboardFrameEndUserInfoKey or others).
As a result, when calculating the offset to move the screen up and reveal the hidden input field, the field is not displayed correctly because the reported keyboard height is smaller than the actual visible height.
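For context, the offset calculation is essentially the standard pattern below (a simplified Swift sketch; LoginViewController and scrollView are placeholder names, not our exact code):

import UIKit

final class LoginViewController: UIViewController {
    private let scrollView = UIScrollView()

    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(
            self, selector: #selector(keyboardWillChangeFrame(_:)),
            name: UIResponder.keyboardWillChangeFrameNotification, object: nil)
    }

    @objc private func keyboardWillChangeFrame(_ notification: Notification) {
        guard let endFrame = notification.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? CGRect else { return }
        // Convert the reported end frame into this view's coordinate space.
        let frameInView = view.convert(endFrame, from: view.window)
        let overlap = max(0, view.bounds.maxY - frameInView.minY)
        // On iOS 26 with secureTextEntry and AutoFill active, this overlap is
        // smaller than the area actually covered on screen, so the field stays hidden.
        scrollView.contentInset.bottom = overlap
    }
}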
Observed behavior:
This only occurs on devices running iOS 26, with the app built with Xcode 16.4.
On previous versions of iOS, with the same settings (secureTextEntry and Autofill active), the rect correctly includes the autofill panel height, and the UI works as expected.
I tested with both UIKeyboardDidShowNotification and UIKeyboardWillChangeFrameNotification, and in both cases the behavior is the same: the height is incorrect (smaller than expected with the autofill panel).
What I expect / questions:
That UIKeyboardFrameEndUserInfoKey (or the related notification) correctly reports the total area covered by the keyboard, including any password autofill panel, when secureTextEntry is active.
That the new behavior in iOS 26 be documented if this omission is intentional, or otherwise considered a bug if it is not.
If there is any official workaround suggested by Apple for developers affected by this issue while a fix is provided.
Thank you for your support.
I'm developing an iOS application in Swift that performs API calls using URLSession.shared. The requests work correctly when the app is in the foreground. However, when the app transitions to the background (for example, when the user switches to another app), the ongoing API calls are either paused or do not complete as expected.
What I’ve tried:
Using URLSession.shared.dataTask(with:) to initiate the API requests
Observing application lifecycle events like applicationDidEnterBackground, but haven't found a reliable solution to allow requests to complete when backgrounded
Goal:
I want certain API requests to continue running or be allowed to complete even if the app enters the background.
Question:
What is the correct approach to allow API calls to continue running or complete when the app moves to the background?
Should I be using a background URLSessionConfiguration instead of URLSession.shared?
If so, how should it be properly configured and used in this scenario?
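For reference, a minimal sketch of the background-session approach mentioned in the question (the session identifier and class name are illustrative, not a definitive implementation):

import Foundation

// A background session hands the transfer to the system so it can finish after the
// app is suspended; delegate callbacks arrive when the app is resumed or relaunched.
final class BackgroundAPIClient: NSObject, URLSessionDownloadDelegate {
    static let shared = BackgroundAPIClient()

    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.app.api") // illustrative identifier
        config.sessionSendsLaunchEvents = true
        config.isDiscretionary = false
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func send(_ request: URLRequest) {
        // Background sessions support upload/download tasks only (no plain data tasks),
        // so an API call is expressed as a download (or upload) task.
        session.downloadTask(with: request).resume()
    }

    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Read the response body from `location` before this method returns.
    }
}

For short requests that only need a few extra seconds after the app is backgrounded, wrapping the work in UIApplication.shared.beginBackgroundTask(expirationHandler:) is the other common option.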